
100 million token context window: revolution in AI development?

Posted: Sat Feb 01, 2025 3:44 am
by Reddi2
Magic has made the AI industry sit up and take notice with the announcement of a new model that supports a massive context window of 100 million tokens. While the company announced the LTM-1 model with a 5-million-token context window over a year ago, it is now unveiling an even more ambitious model, LTM-2-mini, aimed squarely at software development.

LTM-2-mini: A quantum leap in AI
Magic recently presented the LTM-2-mini model, which offers a context window of 100 million tokens. This could fundamentally change the way AI is used in software development. With this model, it becomes possible to process huge amounts of code or text at once - for example, around 10 million lines of code or roughly 750 books. This capacity makes it possible to analyze and understand complex software projects or extensive documentation in a single pass, which was previously unthinkable.
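To make the 100-million-token figure concrete, here is a rough back-of-envelope check. The per-line and per-book token averages are my own assumptions, not Magic's published numbers, but they recover the quoted equivalents:

```python
# Rough sanity check of the 100M-token context claim.
# Assumed averages (not from Magic): ~10 tokens per line of code,
# ~133,000 tokens per book (~100k words at ~1.33 tokens/word).

CONTEXT_TOKENS = 100_000_000

tokens_per_code_line = 10
tokens_per_book = 133_000

print(CONTEXT_TOKENS // tokens_per_code_line)  # ~10,000,000 lines of code
print(CONTEXT_TOKENS // tokens_per_book)       # ~750 books
```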

In addition to the impressive technology, Magic has also made significant strides in efficiency. The LTM-2-mini model is designed to be not only more powerful but also more cost-effective than comparable models. These innovations are supported by strong partnerships with Google Cloud and NVIDIA, allowing Magic to significantly expand its computing infrastructure. With new supercomputers such as Magic-G4 and Magic-G5, the latter built on NVIDIA GB200 NVL72 systems, Magic will be able to train and scale the next generation of AI models.

Magic's recent progress is also reflected in the development of new evaluation methods, such as its HashHop benchmark. HashHop tests long-context recall by filling the context with pairs of random hashes and asking the model to follow chains of them, so the model cannot fall back on semantic hints the way it can in simpler needle-in-a-haystack tests. This method sets a new standard for model evaluation and is another example of Magic's innovative strength.
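For illustration, here is a minimal sketch of how a HashHop-style prompt could be constructed, based only on Magic's public description of the idea. The function names and prompt wording are my own, not Magic's code:

```python
# Sketch of a HashHop-style evaluation prompt (illustrative, not Magic's implementation).
# The context is filled with random hash pairs; the model must "hop" along the chain
# without any semantic cues to lean on.
import random
import string

def random_hash(length: int = 8) -> str:
    """Random, meaningless identifier so recall cannot rely on semantics."""
    return "".join(random.choices(string.ascii_lowercase + string.digits, k=length))

def build_hashhop_prompt(num_pairs: int = 1000, hops: int = 3):
    hashes = [random_hash() for _ in range(num_pairs + 1)]
    pairs = [f"{hashes[i]} -> {hashes[i + 1]}" for i in range(num_pairs)]
    random.shuffle(pairs)  # chained pairs should not sit next to each other in the context

    start = hashes[0]
    answer = hashes[hops]  # hash reached after following `hops` links from the start
    prompt = "\n".join(pairs) + f"\n\nStarting from {start}, hop {hops} times. Final hash?"
    return prompt, answer

prompt, expected = build_hashhop_prompt()
print(expected)  # the hash a model with reliable long-context recall should return
```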