Tiiny AI has introduced a pocket-sized personal AI computer designed to run large artificial intelligence models locally, positioning the device as an alternative to cloud-based AI services amid growing concerns over data privacy, recurring usage costs and dependence on remote infrastructure.
The product, called the Tiiny AI Pocket Lab, was unveiled this week at the Consumer Electronics Show (CES), where it attracted attention from developers and analysts exploring ways to deploy AI without relying on cloud platforms. Unlike most mainstream AI services, which require internet connectivity and usage-based pricing, the Pocket Lab is built to operate entirely on-device, without subscriptions or token-based fees.
At CES demonstrations, the company showed the Pocket Lab running large language models with up to 120 billion parameters fully offline, delivering decoding speeds exceeding 20 tokens per second. Tiiny AI said the performance is intended for practical, everyday use rather than experimental or proof-of-concept workloads.
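For a sense of scale, that decode rate can be translated into rough response times; the reply length and tokens-per-word ratio in the sketch below are illustrative assumptions, not figures from Tiiny AI.

# Rough latency estimate from the decode rate demonstrated at CES (20+ tokens/second).
# Reply length and tokens-per-word are assumed values for illustration only.
decode_rate_tps = 20      # tokens per second, the lower bound quoted in the demo
response_words = 300      # assumed length of a typical chat reply
tokens_per_word = 1.3     # common rule of thumb for English text

tokens = response_words * tokens_per_word
print(f"~{tokens / decode_rate_tps:.0f} s to decode a {response_words}-word reply")
# prints: ~20 s to decode a 300-word reply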
The launch comes as enterprises and individual users increasingly reassess how AI systems handle sensitive data and how costs scale over time. While cloud-based AI platforms offer flexibility and scale, they often require users to send data to third-party servers and commit to ongoing usage fees. Advances in inference efficiency, however, are making local AI deployments more viable, even on compact hardware.
Samar, go-to-market director at Tiiny AI, said the company sees a shift in how people think about AI ownership. As users become more conscious of data governance and long-term costs, he said, local AI systems offer a model closer to owning a personal computer rather than renting AI capabilities on demand.
Tiiny AI positioned the Pocket Lab as a companion device rather than a replacement for laptops or desktops. The device connects to a host computer via plug-and-play and takes on AI inference itself, offloading the work from the host so that even older computers can access advanced AI models without hardware upgrades.
Alongside the hardware, the company also introduced TiinyOS, an on-device software platform that allows users to download and run open-source language models and AI agents with minimal setup. TiinyOS also includes developer tools for building and deploying local AI workflows without reliance on cloud infrastructure.
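Tiiny AI has not published programming details for TiinyOS, but fully local inference with a downloaded open-weight model generally follows the pattern below; this sketch uses the open-source llama-cpp-python runtime and a placeholder model file, not TiinyOS's own tooling.

# Illustrative sketch of on-device inference with an open-weight model.
# Uses llama-cpp-python; the model path is a placeholder, not a TiinyOS artifact.
from llama_cpp import Llama

llm = Llama(model_path="models/open-model.Q4_K_M.gguf")   # quantized local weights
result = llm(
    "Summarize the benefits of on-device AI in one sentence. Answer:",
    max_tokens=64,
)
print(result["choices"][0]["text"])   # generated locally, with no cloud call involved

In such a setup everything, including the model weights, lives on the device itself, which is the property the company emphasizes for data-sensitive workloads.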
Pocket Lab is scheduled to launch on Kickstarter in February, with an early-bird price of $1,399. The company said the pricing is intended to make local AI more accessible rather than position the device as a high-end workstation. The system includes 80GB of LPDDR5X memory, a configuration that typically accounts for a large portion of the overall hardware cost.
Interest in local-first AI devices has been growing, particularly as discussions around enterprise security, data sovereignty and the economics of AI services intensify. Tiiny AI said it has received confirmation from Guinness World Records that Pocket Lab will be certified as the smallest mini PC capable of running a 100-billion-parameter language model locally.