Switzerland Joins the AI Race with an Open and Multilingual Model

With capabilities in more than a thousand languages, the Swiss model prioritizes inclusion and transparency as cornerstones for trustworthy AI.
By ETH Zurich

Aug 27, 2025


In a move that could redefine the global artificial intelligence landscape, researchers from ETH Zurich and EPFL, in partnership with the Swiss National Supercomputing Centre (CSCS), have announced the development of one of the world’s largest open-source language models. Trained on the “Alps” supercomputer, the project stands out for its transparency, multilingual capacity, and accessibility.

The Swiss model represents a break from the current AI paradigm. While giants such as OpenAI and Google operate with proprietary systems, the Swiss initiative will release its source code, model weights, and training data fully open under the Apache 2.0 license. This transparency will enable independent audits and help ensure compliance with the strict privacy regulations of both the European Union and Switzerland.

One of the most remarkable features is its support for more than 1,000 languages, with 40% of the training data dedicated to non-English languages. “Open models are essential for building trust and advancing AI risk research,” explains Imanol Schlag, project lead at ETH Zurich. The multilingual approach from the early stages aims to reduce the linguistic bias commonly found in commercial AI systems.

The release will include two versions: one with 8 billion parameters for lighter applications and another with 70 billion, placing it among the most powerful open models in the world. Training consumed 15 trillion tokens, with a focus on data quality and diversity. Preliminary studies show that excluding web-scraped data did not significantly impact performance, reinforcing its ethical stance on information handling.
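To put the two announced sizes in perspective, a rough back-of-the-envelope sketch of the memory needed just to hold each model's weights. The 16-bit (2-byte) precision is an assumption for illustration, and the estimate ignores activation memory and runtime overhead:

```python
# Back-of-the-envelope memory footprint for the two announced model sizes.
# Assumption: 16-bit (2-byte) weights; activation memory, KV cache, and
# runtime overhead are deliberately left out of this rough estimate.

def weight_memory_gib(num_params: int, bytes_per_param: int = 2) -> float:
    """Approximate GiB required to store the model weights alone."""
    return num_params * bytes_per_param / 1024**3

for name, params in [("8B", 8_000_000_000), ("70B", 70_000_000_000)]:
    print(f"{name}: ~{weight_memory_gib(params):.0f} GiB of weights at 16-bit precision")
```

By this estimate, the 8-billion-parameter version fits comfortably on a single high-end GPU, while the 70-billion-parameter version requires a multi-GPU setup, which is consistent with the article's framing of the smaller model as suited to "lighter applications."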

The infrastructure behind the project is equally impressive. The "Alps" supercomputer, equipped with 10,000 NVIDIA chips and powered entirely by carbon-neutral energy, was essential to the effort. This computational capacity puts Switzerland at the forefront of AI research, supporting the world's largest open-source initiative of its kind, with 800 researchers and 20 million GPU hours allocated annually.

“We are investing in sovereign infrastructure to foster innovation not only in Switzerland but globally,” says Thomas Schulthess, director of CSCS. The model, expected to be available by the end of the European summer, is anticipated to be adopted by governments, universities, and companies, reducing reliance on proprietary systems.

“We want to democratize AI,” summarizes Martin Jaggi, professor at EPFL. “Openness attracts talent and accelerates discoveries.” Funded by the ETH Board for the 2025–2028 period and linked to the European ELLIS network, the initiative strengthens Europe’s role in developing trustworthy and inclusive AI.

This breakthrough comes at a time when some Chinese companies are also releasing open-source LLMs, such as DeepSeek and Z.ai’s GLM-4.5, further expanding the open AI ecosystem and diversifying options beyond proprietary models.