4 iPhone Apps to Run AI / LLMs Locally
Running an AI model locally doesn’t have to be scary, and you don’t need to be a tech genius to get started. You do need hardware that can handle your model of choice, but your iPhone already has plenty of power built in. Here are 4 iPhone apps that let you run LLMs locally on iOS:

Apollo: a handy app that lets you run AI on your phone. It can run open models like Llama 3 as well as closed models like GPT. It works with LM Studio and Ollama.

MindKeep: a nifty app that you can use to run AI locally on your phone. It gives you access to OpenAI, Claude, and DeepSeek, and it works with MindKeep Desktop, LM Studio, or Ollama for AI chats.

NoProbLlama: another iPhone app that lets you interact with LLMs locally. It works with Ollama and LM Studio and supports models like DeepSeek R1, Gemma, Dolphin, and Dia-1.6B.

LLM Pigeon: a private AI app that lets you run chatbots on your Mac and access them on your phone. Just install LLM Pigeon Server on your Mac, and you are set. This app relies on CloudKit to relay answers back to your phone.
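Several of the apps above talk to an Ollama (or LM Studio) server running on your Mac. As a rough sketch, here is one way to expose Ollama on your local network so a phone app on the same Wi-Fi can reach it. The IP address shown is a placeholder for your Mac's actual LAN address:

```shell
# Assumes Ollama is already installed on the Mac.
# By default Ollama listens only on localhost; binding it to all
# interfaces lets devices on the same network connect.
OLLAMA_HOST=0.0.0.0 ollama serve

# In another terminal, pull a small model for the phone app to use.
ollama pull llama3

# In the iPhone app's settings, point the server address at your
# Mac's LAN IP on Ollama's default port, e.g.:
#   http://192.168.1.20:11434   (replace with your Mac's real IP)
```

Keep in mind that binding to 0.0.0.0 makes the server reachable by anything on your network, so only do this on a network you trust.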
Stay tuned, as we will cover more local AI iPhone apps here in the future.
