AI Topic: Local Inference
Tools and platforms for running AI inference locally without cloud dependence.
AI Tools in Local Inference (14)
Liner
Free desktop tool for training machine learning models on your own data without writing code.
PaddlePaddle
An open-source deep learning platform developed by Baidu for industrial-grade AI development and deployment.
Perplexica
An open-source AI-powered search engine that uses machine learning to understand queries and provide accurate, source-cited answers.
Chutes AI
Serverless GPU inference platform for deploying and running AI models with pay-per-use pricing.
Dyad
Free, local, open-source AI app builder that lets you download and run full-stack AI apps on macOS and Windows without signing up.
jax-js
A pure JavaScript port of Google's JAX ML framework that compiles and runs neural networks directly in the browser via auto-generated WebGPU kernels, with autodiff, JIT, and vectorization built in.
Jan
Open-source AI desktop and web app that runs local and cloud models, provides assistants, connectors, and a local OpenAI-compatible API for self-hosted workflows.
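A local OpenAI-compatible API like the one Jan exposes accepts the same request shape as the hosted OpenAI chat endpoint, so standard clients can be pointed at localhost. The sketch below builds such a request body; the base URL, port, and model name are assumptions for illustration, not values documented here.

```python
import json

# Hypothetical local endpoint; the actual host/port depends on your setup.
BASE_URL = "http://localhost:1337/v1"

def chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for an OpenAI-style /chat/completions call."""
    return {
        "model": model,  # example model name, assumed installed locally
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

body = chat_request("llama3", "Hello!")
print(json.dumps(body))
# With a server running, this body could be sent with e.g.:
#   requests.post(f"{BASE_URL}/chat/completions", json=body).json()
```

Because the wire format matches OpenAI's, the same payload works unchanged whether the model runs locally or in the cloud, which is the point of self-hosted OpenAI-compatible servers.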
Keras
An open-source, high-level deep learning API for building, training, and deploying neural networks across JAX, TensorFlow, and PyTorch backends.
vLLM
An open-source, high-performance library for serving and running large language models, with GPU-optimized inference and efficient memory and batch management.
AI Backends
Self-hosted, open-source AI API server that exposes unified REST endpoints and supports multiple LLM providers for integration into applications.