
Come and explore Local AI Cat... I mean, Chat!
Run small LLMs privately on your iPhone or Mac - no internet required.
This app lets you interact with local AI models directly on an iPhone 15 or later, or a Mac with an M-series chip, without relying on cloud-based processing. With a privacy-first approach, all AI computations happen entirely on-device, ensuring your conversations stay private.
Privacy First:
All processing is done locally on your device
No data is sent to external servers
No tracking, no ads, no analytics
Features:
On-Device AI Chat - No internet needed, full privacy
Fast & Efficient - Optimized LLM inference with Core ML / MLC LLM
Supports DeepSeek R1 (1.5B) - A powerful yet lightweight model
Read Aloud - Have responses read out loud
Cat Mode (early access)