Foxconn unveils first large language model

Published by Global Banking & Finance Review

Posted on March 10, 2025

Last updated: January 24, 2026

TAIPEI (Reuters) - Taiwan’s Foxconn said on Monday it has launched its first large language model and plans to use the technology to improve manufacturing and supply chain management.

The model, named “FoxBrain,” was trained using 120 of Nvidia’s H100 GPUs and completed in about four weeks, the world's largest contract electronics manufacturer said in a statement.

The company, which assembles iPhones for Apple and also produces Nvidia's artificial intelligence servers, said the model is based on Meta’s Llama 3.1 architecture.

It is Taiwan's first large language model with reasoning capabilities that is optimised for traditional Chinese and Taiwanese language styles, it said.

Foxconn said that although there is a slight performance gap compared with the distillation model from China's DeepSeek, its overall performance is very close to world-class standards.

Initially designed for internal applications, FoxBrain covers data analysis, decision support, document collaboration, mathematics, reasoning and problem-solving, and code generation.

Foxconn said it plans to collaborate with technology partners to expand the model’s applications, share its open-source information, and promote AI in manufacturing, supply chain management, and intelligent decision-making.

Nvidia provided support through its Taiwan-based supercomputer “Taipei-1” and offered technical consulting during the model’s training, Foxconn said.

Taipei-1, the largest supercomputer in Taiwan, is owned and operated by Nvidia in Kaohsiung, a southern city on the island.

Foxconn will announce further details about the model during Nvidia’s GTC developer conference in mid-March.

(Reporting by Wen-Yee Lee; Editing by Shri Navaratnam)

Key Takeaways

  • Foxconn launches its first large language model, FoxBrain.
  • FoxBrain is based on Meta’s Llama 3.1 architecture.
  • The model is optimised for traditional Chinese and Taiwanese language styles.
  • Foxconn plans to expand AI applications in manufacturing.
  • Nvidia supported the model's training with its Taipei-1 supercomputer.

Frequently Asked Questions

What is the name of Foxconn's large language model?
The model is named “FoxBrain” and is designed to improve manufacturing and supply chain management.
What technology did Foxconn use to train FoxBrain?
FoxBrain was trained using 120 of Nvidia's H100 GPUs and completed in about four weeks.
What are some capabilities of FoxBrain?
FoxBrain covers data analysis, decision support, document collaboration, mathematics, reasoning, problem-solving, and code generation.
What are Foxconn's future plans for FoxBrain?
Foxconn plans to collaborate with technology partners to expand the model’s applications and promote AI in manufacturing and supply chain management.
Where will Foxconn announce further details about FoxBrain?
Further details about the model will be announced during Nvidia’s GTC developer conference in mid-March.
