Have a question or found a bug? Check the FAQ below or open an issue on GitHub. We're happy to help.
How do I get started?
On first launch, im.ai will guide you to download a model. Larger models (7B+) give better quality but need more RAM and storage. Quantized variants (Q4, Q8) balance size and quality well. iPhones support up to about 4B parameters; Apple Silicon Macs handle larger models comfortably.
How do I use cloud models?
Tap the model picker, select a cloud provider, and enter your API key; it's stored securely in the system Keychain. You can create an API key in each supported provider's dashboard.
What are the system requirements?
macOS 15.7 or later, or iOS 18.6 or later. Apple Silicon Macs perform best for local models. Intel Macs can run local inference but more slowly.
What local models are available?
Which models appear in the picker depends on your device RAM — im.ai only shows models that fit. Each model is offered in Q4 and/or Q8 quantization; Q4 uses less memory, Q8 is higher quality. Current model catalog:
| Family | Sizes |
|---|---|
| Falcon | H1 0.5B, H1R 7B, Falcon3 Mamba 7B |
| Qwen3 | 0.6B, 1.7B, 4B, 14B; VL 2B, VL 4B, VL 8B |
| Qwen2.5 Coder | 7B, 14B |
| Qwen3.5 | 9B; Opus 2B, 4B, 9B |
| GPT | oss 20B, 5-3 4B |
| LFM2 | VL 450M, 700M, 1.2B, 2.6B, 8B A1B, 24B A2B |
| LFM2.5 | VL 1.6B, 1.2B |
| Gemma 3 | 270M, 1B, 4B |
| Gemma 4 | E2B, E4B, 26B A4B |
| SmolLM | 2 360M, 3 3B 128K |
| Granite | 3.1 1B, 4.0 350M, 4.0 Tiny, 4.0 Micro, 4.0 H 1B |
| Deepseek | R1 1.5B, R1 8B, 7B; Deepthink 7B |
| OLMo | 2 1B, 3 7B |
| Hermes | 3 Llama 3.2 3B, 4 14B |
| Llama | 3.3 8B |
| Phi | 4 Mini |
| Swe Dev | 7B |
| GLM | 4.6V Flash |
How much storage do models use?
Small models (under 1B) use a few hundred MB. Mid-size models (7-8B Q4) use around 4-5 GB. Larger models (14B+) can need 8 GB or more. Swipe left on any downloaded model in the model picker to delete it and reclaim space.
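As a rule of thumb, a quantized model's download size is roughly parameter count × bits per weight ÷ 8, plus some overhead for metadata and layers kept at higher precision. A minimal sketch of that estimate (the 1.2 overhead factor is an assumption for illustration, not a measured value):

```python
def estimate_model_size_gb(params_billions: float, quant_bits: int,
                           overhead: float = 1.2) -> float:
    """Rough on-disk size of a quantized model, in GB.

    params_billions: parameter count in billions (e.g. 8 for an 8B model)
    quant_bits:      bits per weight (4 for Q4, 8 for Q8)
    overhead:        fudge factor for metadata and non-quantized layers
                     (assumed, not measured)
    """
    return params_billions * quant_bits / 8 * overhead

# An 8B model at Q4 lands in the 4-5 GB range described above:
print(f"8B Q4  ≈ {estimate_model_size_gb(8, 4):.1f} GB")
print(f"14B Q8 ≈ {estimate_model_size_gb(14, 8):.1f} GB")
```

This also shows why a Q4 variant takes about half the space (and RAM) of the same model at Q8: the size scales linearly with bits per weight.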
How do I return to a previous conversation?
Tap the sidebar icon in the top-left corner. Your chats are listed there, grouped by Today, Last Week, and Last Month. Tap any entry to reopen it.
How do I delete a downloaded model?
Open the model picker, find the downloaded model, and swipe left on it to reveal the Delete button.
How do I delete an old conversation?
Open the sidebar and swipe left on a chat to delete it. To clear all conversations at once, tap the recycle icon next to the Chats heading.
Can I use it offline?
Yes. Local models work fully offline. Cloud providers require an internet connection.
Where is my conversation history stored?
Entirely on your device. Nothing is sent to any im.ai server.
Is it really free?
Yes. No in-app purchases, no subscriptions, no premium tier. Cloud providers may charge for API usage beyond their free tiers.
My API key is showing as invalid. What do I do?
Double-check that you've copied the key correctly with no leading or trailing spaces. Confirm the key is active in your provider's dashboard. Re-enter it in the model picker settings.
The app crashed or a model won't load. What should I do?
Try a Q4 variant instead of Q8, or pick a smaller model. Close other apps to free RAM. If the issue persists, open a GitHub issue with your device model and iOS/macOS version.
The best way to reach us is via GitHub. Bug reports, feature requests, and questions are all welcome.