New software lets you run a private AI cluster at home with networked smartphones, tablets, and computers — Exo software runs Llama and other AI models

exo allows you to run an AI cluster at home
(Image credit: Shutterstock)

Big AI developers like OpenAI, Google’s Gemini team, and the folks behind Microsoft Copilot have massive data centers at their disposal for AI workloads. Thanks to the work of a team of developers, new software could allow you to run your own AI cluster at home using your existing smartphones, tablets, and computers.

The experimental exo software splits a large language model (LLM) across some or all of the computing devices in your home, letting you run a personal chatbot or other AI project. Supported devices include Android phones and tablets, as well as computers running macOS or Linux.

Once compiled and running, exo automatically discovers other devices on your network and adds them to the cluster. Devices are treated as equals, connected peer-to-peer rather than in a master-worker arrangement. While exo supports various partitioning strategies to distribute the work across devices, it defaults to a ring memory-weighted scheme that allocates the workload based on how much memory each device has.
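To see roughly how memory-weighted partitioning works, here is a minimal sketch in Python. It is illustrative only, not exo's actual code: each device receives a contiguous slice of the model's layers proportional to its share of the cluster's total memory, and the device names, memory figures, and the `partition_layers` helper are all hypothetical.

```python
def partition_layers(devices, num_layers):
    """Assign each device a contiguous [start, end) range of model layers,
    sized proportionally to its share of total cluster memory.

    devices: list of (name, memory_gb) tuples.
    Returns a dict mapping name -> (start_layer, end_layer).
    """
    total_mem = sum(mem for _, mem in devices)
    ranges = {}
    start = 0
    for i, (name, mem) in enumerate(devices):
        if i == len(devices) - 1:
            end = num_layers  # last device absorbs any rounding remainder
        else:
            end = start + round(num_layers * mem / total_mem)
        ranges[name] = (start, end)
        start = end
    return ranges

# Hypothetical example: a 32-layer model split across three home devices.
cluster = [("macbook", 16), ("android-tablet", 8), ("linux-desktop", 8)]
print(partition_layers(cluster, 32))
```

In a ring arrangement like exo's, each device would then run its slice of layers and pass activations to the next device in the ring until a full forward pass completes.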

The exo software also supports iOS, but the developers say the code needs more work before it is ready for mainstream use. The team has pulled the iOS version for now, though it will grant access to anyone who emails the lead developer.

The developers plan further refinement and new features, and they run a bounty program to encourage contributions. As of this writing, open bounties include support for LLaVA, batched requests, a radio networking module, and pipeline-parallel inference. Even in its current state, it looks like a cool project to experiment with.

Jeff Butts
Contributing Writer

Jeff Butts has been covering tech news for more than a decade, and his IT experience predates the internet. Yes, he remembers when 9600 baud was “fast.” He especially enjoys covering DIY and Maker topics, along with anything on the bleeding edge of technology.