EveryDev.ai

    Apertus

    Local Inference
    Featured

    Apertus is a fully open large language model from the Swiss AI Initiative (EPFL, ETH Zurich, CSCS). It is released in 8B and 70B sizes under Apache 2.0, trained on text in 1,800+ languages, and designed for EU AI Act compliance.

    At a Glance

    Pricing
    Open Source

    Fully open foundation model released under Apache License 2.0. Free to download, use, fine-tune, redistribute, and deploy. Cloud-hosted API pricing is set independently by third-party providers such as Swisscom, Infomaniak, Phoenix Technologies, AWS, and Azure.

    Available On

    Web
    API
    Windows
    macOS
    Linux

    Resources

    Website · Docs · GitHub · llms.txt

    Topics

    Local Inference · AI Infrastructure · Academic Research

    Alternatives

    Tilde Open LLM · SambaNova · llama.cpp

    Developer

    Swiss AI Initiative · Zurich and Lausanne, Switzerland · Est. 2023

    Listed Apr 2026

    About Apertus

    Apertus is a fully open-source large language model developed by the Swiss AI Initiative — a public research collaboration between EPFL, ETH Zurich, and the Swiss National Supercomputing Centre (CSCS). Released on September 2, 2025 under the Apache 2.0 license, Apertus was trained on the Alps supercomputer in Lugano using 15 trillion tokens across more than 1,800 languages (roughly 40% non-English content), making it one of the most transparent and multilingual open models ever released by a public institution.

    Unlike closed-weight models, Apertus releases the entire development pipeline: model weights, training data recipes, training code, intermediate checkpoints, evaluation scripts, and a full technical report. Every part of the model is reproducible and auditable. Apertus is also built to comply with the EU AI Act — it respects content-owner opt-outs (even retroactively), filters personally identifiable information from training data, and uses the Goldfish training objective to suppress verbatim memorization of training data.
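
The Goldfish objective mentioned above can be illustrated concretely: during training, a deterministic pseudo-random subset of tokens is excluded from the next-token loss, so no exact passage ever receives a full gradient signal. Below is a minimal sketch in plain Python — the hash-based masking rule, the drop rate `k=4`, and the 13-token context window are illustrative assumptions, not the exact Apertus training configuration:

```python
import hashlib

def goldfish_mask(tokens, k=4, context=13):
    """Return a per-token mask: False means the token is dropped
    from the training loss. The decision hashes the preceding
    `context` tokens, so the same passage always drops the same
    positions, which suppresses verbatim memorization."""
    mask = []
    for i in range(len(tokens)):
        window = tuple(tokens[max(0, i - context):i])
        digest = hashlib.sha256(repr(window).encode()).digest()
        # Drop roughly 1 in k tokens, deterministically per context.
        mask.append(digest[0] % k != 0)
    return mask

tokens = ["the", "quick", "brown", "fox", "jumps",
          "over", "the", "lazy", "dog"]
mask = goldfish_mask(tokens)
print(mask)  # True = token contributes to the loss
```

Because the mask depends only on the preceding context, a duplicated document drops the same token positions every time it is seen, so the model can never be pushed toward reproducing that passage verbatim.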

    Model Variants:

    • Apertus 8B and Apertus 70B — Base models intended for research and custom fine-tuning.
    • Apertus 8B Instruct and Apertus 70B Instruct — Chat-tuned versions for conversational use.

    How to Use Apertus:

    • Try it free in the browser — The Public AI chat interface offers a free ChatGPT-style frontend powered by Apertus.
    • Download the weights — Grab models directly from the swiss-ai Hugging Face collection: for example swiss-ai/Apertus-8B-Instruct-2509 or swiss-ai/Apertus-70B-2509.
    • Run locally (desktop) — Use LM Studio on Mac, Linux, or Windows for quick inference on your own hardware. Community MLX and GGUF builds are also available.
    • Run a self-hosted inference server — Use vLLM (recommended) or SGLang for production deployments.
    • Use a hosted API — Production API access is available from Swisscom (strategic partner), Infomaniak, Phoenix Technologies, Public AI, AWS SageMaker, and Microsoft Azure.
    • Fine-tune the model — The apertus-finetuning-recipes repo provides LoRA and full-parameter training scripts. LoRA fine-tuning of the 8B model fits on a single 40 GB GPU; the 70B model requires a multi-GPU setup.
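
The 40 GB single-GPU claim in the last bullet can be sanity-checked with rough arithmetic. The sketch below uses assumed numbers (bf16 weights, Adam with fp32 states and master weights, rank-16 adapters on four projection matrices per layer, hidden size 4096 over 32 layers) rather than Apertus's exact architecture:

```python
# Rough memory estimate: why LoRA on an 8B model fits in 40 GB
# while full fine-tuning does not. All sizes are approximations.

GB = 1024**3
params = 8e9

# Full fine-tuning: bf16 weights (2 B) + fp32 Adam m and v (4 B each)
# + fp32 master weights (4 B) per parameter.
full = params * (2 + 4 + 4 + 4) / GB
print(f"full fine-tune: ~{full:.0f} GB")

# LoRA: frozen bf16 base model + tiny trainable adapters.
hidden, layers, rank = 4096, 32, 16          # assumed architecture
# Two rank-r matrices per adapted projection, 4 projections per layer.
lora_params = layers * 4 * 2 * hidden * rank
lora = (params * 2 + lora_params * (2 + 4 + 4 + 4)) / GB
print(f"LoRA fine-tune: ~{lora:.0f} GB (+ activations)")
```

Under these assumptions full fine-tuning needs well over 100 GB, while LoRA keeps the base model frozen and only holds optimizer state for a few million adapter parameters, landing comfortably under 40 GB even with activation overhead.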

    Quickstart (Python + Transformers):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "swiss-ai/Apertus-8B-Instruct-2509"
    device = "cuda" if torch.cuda.is_available() else "cpu"

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

    # Format the conversation with the model's chat template.
    prompt = "Give me a brief explanation of gravity."
    messages = [{"role": "user", "content": prompt}]
    text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

    inputs = tokenizer([text], return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    

    Requires transformers >= 4.56.0.
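
For the self-hosted route, vLLM exposes an OpenAI-compatible /v1/chat/completions endpoint. A stdlib-only client sketch follows; the localhost:8000 URL is an assumption for a locally started server (e.g. `vllm serve swiss-ai/Apertus-8B-Instruct-2509`):

```python
import json
import urllib.request

def build_chat_request(model, prompt, max_tokens=256):
    """Build an OpenAI-compatible chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("swiss-ai/Apertus-8B-Instruct-2509",
                             "Give me a brief explanation of gravity.")

# Sending requires a running server, started with e.g.:
#   vllm serve swiss-ai/Apertus-8B-Instruct-2509
req = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# resp = urllib.request.urlopen(req)  # uncomment with a live server
print(json.dumps(payload, indent=2))
```

The same payload works against any of the hosted providers listed above that speak the OpenAI wire format, with only the base URL and an API-key header changed.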

    About the Swiss AI Initiative:

    Launched in December 2023 by EPFL and ETH Zurich, the Swiss AI Initiative is a public research collaboration involving more than 10 Swiss academic institutions, over 800 researchers, and 20+ million yearly GPU hours on CSCS's Alps supercomputer. It is funded by the ETH Board and operates as one of the world's largest open-science efforts dedicated to AI foundation models. Swisscom is the initiative's strategic commercial partner. Apertus is the initiative's flagship release and is intended as public-good infrastructure for chatbots, translation systems, educational tools, and domain-specific fine-tuned applications.


    Pricing

    Open Source

    • Open weights (Apache 2.0)
    • Open training data recipes
    • Open training and evaluation code
    • Published technical report
    • 8B and 70B parameter variants

    Capabilities

    Key Features

    • Fully open large language model (weights, data recipes, code, evaluations)
    • Apache 2.0 license
    • 8B and 70B parameter variants
    • Base and Instruct versions for each size
    • Trained on 15 trillion tokens
    • Multilingual support for 1,800+ languages
    • ~40% non-English training data
    • EU AI Act compliance
    • Goldfish objective reduces verbatim memorization
    • Respects data opt-outs (retroactively)
    • Filters personally identifiable information
    • Hugging Face distribution
    • Runs locally via LM Studio (Mac, Linux, Windows)
    • Self-hosted inference via vLLM or SGLang
    • Hosted API options via Swisscom, Infomaniak, AWS, Azure, and others
    • Fine-tuning recipes (LoRA and full-parameter)
    • Intermediate training checkpoints published

    Integrations

    Hugging Face
    Transformers
    vLLM
    SGLang
    LM Studio
    Swisscom Swiss AI Platform
    Infomaniak AI
    Phoenix Technologies
    Public AI
    AWS SageMaker
    Microsoft Azure
    PEFT
    TRL
    Accelerate
    Megatron-LM
    CSCS Alps supercomputer


    Developer

    Swiss AI Initiative

    The Swiss AI Initiative is a public research collaboration launched in December 2023 by EPFL and ETH Zurich, with the Swiss National Supercomputing Centre (CSCS) providing training infrastructure on the Alps supercomputer. Backed by the ETH Board and involving over 800 researchers across 10+ Swiss academic institutions, it is one of the world's largest open-science efforts dedicated to AI foundation models. Apertus is the initiative's flagship release, published as open weights, open data recipes, and open code under Apache 2.0. Swisscom is the strategic commercial partner.

    Founded 2023
    Zurich and Lausanne, Switzerland
    60 employees

    Used by

    Swisscom
    Associated Press
    Stanford HAI

    Similar Tools


    Tilde Open LLM

    Tilde Open LLM is a multilingual large language model with strong support for Baltic and other European languages, designed for open and commercial use.


    SambaNova

    AI infrastructure platform delivering fast inference on large open-source models with custom dataflow technology and energy-efficient RDU chips.


    llama.cpp

    LLM inference in C/C++ enabling efficient local execution of large language models across various hardware platforms.


    Related Topics

    Local Inference

    Tools and platforms for running AI inference locally without cloud dependence.

    82 tools

    AI Infrastructure

    Infrastructure designed for deploying and running AI models.

    195 tools

    Academic Research

    AI tools designed specifically for academic and scientific research.

    31 tools
    With AI, Everyone is a Dev. EveryDev.ai © 2026