SRSWTI
Building the world's fastest retrieval and inference algorithms, with a focus on local-first, privacy-preserving AI on Apple Silicon.
At a Glance
- AI Developers
- Privacy-conscious Enterprises
- Mac Power Users
AI Tools by SRSWTI
Bodega Inference Engine
LLM Inference for Apple Silicon
Latest News
Benchmarks of the Bodega Inference Engine against LM Studio show high throughput on the Mac Studio.
SRSWTI Research Labs showcased in AI Twitter Recap and Latent Space newsletter.
Bodega Inference Engine listed on EveryDev as a top tool for Apple Silicon inference.
Launch of Bodega Beam for private global file transfers.
Products & Services
- Bodega Inference Engine: high-performance, enterprise-grade LLM inference engine optimized for Apple Silicon, with speculative decoding and continuous batching.
- Bodega OS: a privacy-focused operating system for local AI intelligence, currently in the Bodega OS Pioneer Program.
- Bodega Beam: a secure, unlimited, and private file transfer platform.
- srswti-axis: a machine learning library for advanced data retrieval and explainable AI.
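Speculative decoding, one of the techniques listed for the inference engine, lets a small "draft" model cheaply propose several tokens that the large "target" model then verifies in bulk, keeping the longest agreeing prefix. The following is a minimal toy sketch of that accept/reject loop; both models are stand-in functions invented for illustration, not Bodega's actual implementation.

```python
def draft_next(context):
    # Hypothetical cheap draft model: greedy next-token guess.
    return (sum(context) + 1) % 10

def target_next(context):
    # Hypothetical expensive target model: the ground truth the
    # draft must match. It disagrees when the context sum is a
    # multiple of 5.
    s = sum(context)
    return (s + 1) % 10 if s % 5 else (s + 2) % 10

def speculative_step(context, k=4):
    # 1) Draft model proposes k tokens autoregressively.
    proposal, ctx = [], list(context)
    for _ in range(k):
        t = draft_next(ctx)
        proposal.append(t)
        ctx.append(t)
    # 2) Target model verifies each proposed token in order,
    #    accepting the agreeing prefix.
    accepted, ctx = [], list(context)
    for t in proposal:
        correct = target_next(ctx)
        if correct == t:
            accepted.append(t)
            ctx.append(t)
        else:
            # 3) On the first mismatch, keep the target model's
            #    token instead and stop this round.
            accepted.append(correct)
            break
    return accepted

print(speculative_step([1, 2, 3]))  # → [7, 4, 8, 7]
```

When the draft model agrees often, each expensive verification pass yields multiple accepted tokens, which is where the throughput gain comes from.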
Market Position
Positioned as a faster, more efficient alternative to LM Studio and Ollama on Apple Silicon hardware, with an emphasis on inference per watt.
Leadership
Founders
Rohit Tiwari
Co-founder & CTO at SRSWTI; previously a data scientist at Tartan and GooseAI and a machine learning engineer at TRILL Marketplace. Education: New York University.
Executive Team
Rohit Tiwari
Chief Technology Officer
Expert in deep learning, LLMs, and high-performance inference engines.
Rajat Tiwari
Founding Engineer
Full-stack and AI engineer; previously an entrepreneur and software engineer in Dallas, TX.
Founding Story
Started as an AI research lab to solve the performance bottlenecks of local LLM inference, specifically optimizing for Apple's Metal architecture.
Business Model
Revenue Model
Currently focused on open-source tools and the Pioneer Program for Bodega OS; likely to move toward enterprise licensing for the inference engine.
Pricing Tiers
Full access to Bodega Inference Engine and core libraries.
Target Markets
- AI Developers
- Privacy-conscious Enterprises
- Mac Power Users
Use Cases
- Local LLM deployment
- Privacy-sensitive enterprise AI
- High-performance developer tools
- Secure global file transfer
Community
- Developer Community
- Early adopters of the Bodega OS Pioneer Program