Adrien Grondin
Mission
To provide a 100% private, offline, and secure AI experience by running powerful LLMs locally on Apple devices without data collection or cloud dependence.
Founding Story
Locally AI was created by Adrien Grondin, an iOS developer and AI enthusiast, to address the privacy concerns of cloud-based AI. By leveraging Apple's MLX framework and the efficiency of Apple Silicon, he aimed to make powerful generative AI accessible directly on-device.
Leadership
Founders
Adrien Grondin
iOS Developer at Match Group (working on Match and Meetic apps). AI and ML enthusiast specializing in on-device intelligence and Apple platforms.
Executive Team
Adrien Grondin
Founder and Lead Developer
Senior iOS Developer with expertise in building scalable mobile applications and integrating on-device machine learning frameworks.
Business Model
Revenue Model
Freemium: The app is free to download and use for basic features, with an optional 'Pro' in-app purchase for advanced capabilities and unlimited use.
Pricing Tiers
Free: Access to basic local models and standard chat features.
Pro: Unlocks advanced models, file analysis, voice mode, and enhanced performance features.
Target Markets
- Privacy-conscious individuals
- Developers and researchers
- Enterprises with strict data privacy requirements
- Apple device power users
- Individual Apple users globally
Use Cases
- Private personal assistant
- Secure document summarization
- Offline coding assistance
- On-device data analysis
- Local AI research and experimentation
History & Milestones
Integrated Qwen 3.5 small models, expanding the selection of high-performance local LLMs.
Officially launched Locally AI for Mac on Product Hunt, bringing the full suite of local AI tools to macOS.
Added support for Apple's Foundation Models framework, enabling on-device chat and summarization using native OS features.
Introduced support for over 15 file types, including PDF, CSV, and source code files, for local data analysis.
1 AI Tool by Adrien Grondin
Locally AI: Run open-source AI models like Llama, Gemma, Qwen, and DeepSeek completely offline and privately on your iPhone, iPad, and Mac, optimized for Apple Silicon.
