Perplexica

Perplexica is an open-source, AI-powered search engine that serves as an alternative to Perplexity AI. It searches the web, interprets user questions, and provides clear, cited answers, using embeddings and similarity search to find and rank relevant sources. The tool connects to a variety of AI models and delivers comprehensive search results with source attribution.

  • Multiple Search Modes including All Mode for general queries, Writing Assistant for text generation without web search, Academic Search for scholarly articles, YouTube Search for video content, Wolfram Alpha Search for computational queries, and Reddit Search for community discussions
  • Local LLM Support allows users to connect to local Ollama models for privacy-focused searches without relying on external API services
  • Multiple AI Model Integration supports various providers including OpenAI, Groq, Anthropic Claude, and local models through Ollama
  • SearXNG Integration uses the open-source metasearch engine SearXNG as the primary search backend for comprehensive web results
  • Docker Deployment provides easy setup through Docker Compose with pre-configured containers for both the application and SearXNG
  • Focus Modes enable specialized searches tailored to specific content types like academic papers, videos, or computational answers
  • Source Citations automatically include references and links to original sources for all generated answers
  • Customizable Configuration allows users to set their preferred AI models, API keys, and search parameters through environment variables (a brief example follows this list)
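
As an illustration, a minimal, hypothetical environment setup might look like the sketch below; the exact variable names depend on the Perplexica version, so treat OPENAI_API_KEY, GROQ_API_KEY, and OLLAMA_API_URL as placeholders and check the project's sample configuration for the keys it actually reads.

    # Hypothetical variable names; confirm against Perplexica's sample configuration
    export OPENAI_API_KEY="sk-..."                    # hosted provider key (optional if using Ollama)
    export GROQ_API_KEY="gsk_..."                     # alternative hosted provider
    export OLLAMA_API_URL="http://localhost:11434"    # local Ollama endpoint for privacy-focused setups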

To get started, clone the repository and use Docker Compose to spin up the application. Configure your preferred AI model provider through environment variables or the settings interface. The tool requires either local Ollama models or API keys from supported providers like OpenAI or Anthropic.
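
A minimal setup sketch, assuming the repository lives at https://github.com/ItzCrazyKns/Perplexica and that the default Compose file serves the web UI on port 3000 (both worth confirming against the project's README):

    # Clone the repository (URL assumed; verify in the official README)
    git clone https://github.com/ItzCrazyKns/Perplexica.git
    cd Perplexica

    # If the repository ships a sample config file, copy it and fill in your
    # provider keys before starting (the README gives the exact filename)

    # Start Perplexica and the bundled SearXNG container in the background
    docker compose up -d

    # Open the web UI (port assumed) at http://localhost:3000 and finish setup
    # in the settings interface, e.g. add API keys or point it at a local
    # Ollama instance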

Pricing and Plans

Open Source: Free
Free and open-source under the MIT license

  • Full source code access
  • All search modes
  • Local LLM support
  • Docker deployment
  • Multiple AI provider integration
  • Self-hosted

System Requirements

Operating System: Windows, macOS, Linux
Memory (RAM): 4GB minimum
Processor: Any modern CPU
Disk Space: 2GB

AI Capabilities

  • Natural language understanding
  • Semantic search
  • Similarity searching
  • Embeddings
  • Multi-model inference
  • Source attribution