Apertus
Apertus is a fully open large language model from the Swiss AI Initiative (EPFL, ETH Zurich, CSCS). It is released in 8B and 70B sizes under Apache 2.0, was trained on more than 1,800 languages, and is designed for EU AI Act compliance.
At a Glance
Fully open foundation model released under Apache License 2.0. Free to download, use, fine-tune, redistribute, and deploy. Cloud-hosted API pricing is set independently by third-party providers such as Swisscom, Infomaniak, Phoenix Technologies, AWS, and Azure.
Listed Apr 2026
About Apertus
Apertus is a fully open-source large language model developed by the Swiss AI Initiative — a public research collaboration between EPFL, ETH Zurich, and the Swiss National Supercomputing Centre (CSCS). Released on September 2, 2025 under the Apache 2.0 license, Apertus was trained on the Alps supercomputer in Lugano using 15 trillion tokens across more than 1,800 languages (roughly 40% non-English content), making it one of the most transparent and multilingual open models ever released by a public institution.
Unlike closed-weight models, Apertus releases the entire development pipeline: model weights, training data recipes, training code, intermediate checkpoints, evaluation scripts, and a full technical report. Every part of the model is reproducible and auditable. Apertus is also built to comply with the EU AI Act — it respects content-owner opt-outs (even retroactively), filters personally identifiable information from training data, and uses the Goldfish training objective to suppress verbatim memorization of training data.
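The Goldfish objective mentioned above works by deterministically excluding a small fraction of token positions from the training loss, so the model can never be supervised on any sequence verbatim end to end. A minimal sketch of the idea (the hash function, drop rate k, and context width h here are illustrative assumptions, not Apertus's actual configuration):

```python
import hashlib

def goldfish_mask(token_ids, k=4, h=13):
    """Sketch of a goldfish-style loss mask: deterministically drop ~1/k of
    token positions from the training loss, keyed on a hash of the preceding
    h-token context, so no sequence can be memorized verbatim end to end."""
    mask = []
    for i in range(len(token_ids)):
        window = tuple(token_ids[max(0, i - h + 1): i + 1])
        digest = int(hashlib.sha256(repr(window).encode()).hexdigest(), 16)
        mask.append(0 if digest % k == 0 else 1)  # 0 = excluded from the loss
    return mask

tokens = list(range(100))
m = goldfish_mask(tokens)
print(sum(1 for b in m if b == 0))  # count of positions excluded from the loss
```

Because the mask is keyed on content rather than random draws, the same passage is dropped at the same positions every time it recurs in the corpus, which is what prevents memorization.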
Model Variants:
- Apertus 8B and Apertus 70B — Base models intended for research and custom fine-tuning.
- Apertus 8B Instruct and Apertus 70B Instruct — Chat-tuned versions for conversational use.
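The four variants map onto Hugging Face repo ids following a consistent pattern. The helper below is hypothetical and merely illustrates the naming scheme inferred from the two ids named on this page ("-2509" being the release tag); verify each id on the swiss-ai Hugging Face collection before use.

```python
# Hypothetical helper illustrating the swiss-ai repo naming pattern.
def apertus_repo(size_b, instruct=False, tag="2509"):
    suffix = "-Instruct" if instruct else ""
    return f"swiss-ai/Apertus-{size_b}B{suffix}-{tag}"

print(apertus_repo(8, instruct=True))  # swiss-ai/Apertus-8B-Instruct-2509
print(apertus_repo(70))                # swiss-ai/Apertus-70B-2509
```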
How to Use Apertus:
- Try it free in the browser — The Public AI chat interface offers a free ChatGPT-style frontend powered by Apertus.
- Download the weights — Grab models directly from the swiss-ai Hugging Face collection: for example swiss-ai/Apertus-8B-Instruct-2509 or swiss-ai/Apertus-70B-2509.
- Run locally (desktop) — Use LM Studio on Mac, Linux, or Windows for quick inference on your own hardware. Community MLX and GGUF builds are also available.
- Run a self-hosted inference server — Use vLLM (recommended) or SGLang for production deployments.
- Use a hosted API — Production API access is available from Swisscom (strategic partner), Infomaniak, Phoenix Technologies, Public AI, AWS SageMaker, and Microsoft Azure.
- Fine-tune the model — The apertus-finetuning-recipes repo provides LoRA and full-parameter training scripts. LoRA fine-tuning of the 8B model fits on a single 40 GB GPU; the 70B model requires a multi-GPU setup.
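The reason LoRA fine-tuning of the 8B model fits on a single 40 GB GPU is that LoRA trains only two small low-rank factors per weight matrix instead of the full matrix. A back-of-envelope illustration (the dimensions and rank below are illustrative assumptions, not Apertus's actual configuration):

```python
# Why LoRA is so much lighter than full fine-tuning: parameter counts
# for one projection matrix, using illustrative dimensions.
d_in = d_out = 4096   # width of one projection matrix (assumption)
rank = 16             # typical LoRA rank (assumption)

full_update = d_in * d_out            # trainable params, full fine-tuning
lora_update = rank * (d_in + d_out)   # trainable params, LoRA factors A and B

print(full_update)                  # 16777216
print(lora_update)                  # 131072
print(full_update // lora_update)   # 128x fewer trainable parameters
```

Optimizer state and gradients shrink by the same factor, which is what brings the memory footprint down to a single-GPU budget.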
Quickstart (Python + Transformers):
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "swiss-ai/Apertus-8B-Instruct-2509"

# Load the tokenizer and model, placing the model on the GPU
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to("cuda")

# Wrap the prompt in the chat format the Instruct model expects
prompt = "Give me a brief explanation of gravity."
messages = [{"role": "user", "content": prompt}]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Tokenize, generate up to 256 new tokens, and decode the result
inputs = tokenizer([text], return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
Requires transformers >= 4.56.0.
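If you instead go the self-hosted vLLM route mentioned above, the server exposes an OpenAI-compatible HTTP API. A minimal request-payload sketch (the localhost URL assumes vLLM's default port for a local deployment; adjust for yours):

```python
import json

# Sketch of a chat request against a self-hosted vLLM server, which serves
# an OpenAI-compatible API.
url = "http://localhost:8000/v1/chat/completions"  # assumption: default local vLLM
payload = {
    "model": "swiss-ai/Apertus-8B-Instruct-2509",
    "messages": [{"role": "user", "content": "Give me a brief explanation of gravity."}],
    "max_tokens": 256,
}
body = json.dumps(payload)
print(body)
# Send with e.g. requests.post(url, data=body,
#                              headers={"Content-Type": "application/json"})
```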
About the Swiss AI Initiative:
Launched in December 2023 by EPFL and ETH Zurich, the Swiss AI Initiative is a public research collaboration involving more than 10 Swiss academic institutions, over 800 researchers, and 20+ million yearly GPU hours on CSCS's Alps supercomputer. It is funded by the ETH Board and operates as one of the world's largest open-science efforts dedicated to AI foundation models. Swisscom is the initiative's strategic commercial partner. Apertus is the initiative's flagship release and is intended as public-good infrastructure for chatbots, translation systems, educational tools, and domain-specific fine-tuned applications.
Pricing
Open Source
- Open weights (Apache 2.0)
- Open training data recipes
- Open training and evaluation code
- Published technical report
- 8B and 70B parameter variants
Capabilities
Key Features
- Fully open large language model (weights, data recipes, code, evaluations)
- Apache 2.0 license
- 8B and 70B parameter variants
- Base and Instruct versions for each size
- Trained on 15 trillion tokens
- Multilingual support for 1,800+ languages
- ~40% non-English training data
- EU AI Act compliance
- Goldfish objective reduces verbatim memorization
- Respects data opt-outs (retroactively)
- Filters personally identifiable information
- Hugging Face distribution
- Runs locally via LM Studio (Mac, Linux, Windows)
- Self-hosted inference via vLLM or SGLang
- Hosted API options via Swisscom, Infomaniak, AWS, Azure, and others
- Fine-tuning recipes (LoRA and full-parameter)
- Intermediate training checkpoints published
