Nota AI Enhances Memory Efficiency of Solar LLM by 72%
Nota AI, a tech company specializing in AI optimization, introduced an innovative quantization method known as "Nota AI MoE Quantization" that reduces the memory usage of Upstage's Solar LLM by 72%. This advance maintains model accuracy and improves processing speed, leading to reduced inference costs.
Developed under South Korea's "Sovereign AI Foundation Model Project", the new technology addresses quantization challenges specific to Mixture of Experts (MoE) architectures. It selectively preserves precision in the most sensitive parts of the model, achieving substantial memory compression without significant performance loss.
Applying this approach to the Solar 100B model cut its memory footprint from 191.2 GB to 51.9 GB. Performance remained stable, with a perplexity (PPL) score close to that of the original model. The technology is also poised to enable deployment in fields such as robotics and automotive systems, even on limited GPU infrastructure.
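For intuition, here is a minimal back-of-the-envelope sketch, in Python, of the memory arithmetic behind mixed-precision quantization of this kind. The parameter count, the byte widths, and the 3% share of weights kept at full precision are illustrative assumptions, not figures disclosed by Nota AI.

    # Illustrative sketch: memory arithmetic for mixed-precision MoE quantization.
    # All splits and byte widths below are assumptions for illustration only;
    # they are not Nota AI's published method.

    BYTES_FP16 = 2.0   # bytes per parameter at 16-bit precision
    BYTES_INT4 = 0.5   # bytes per parameter at 4-bit precision

    # ~95.6B parameters implied by a 191.2 GB footprint at FP16 (2 bytes/param).
    total_params_b = 191.2 / BYTES_FP16

    def memory_gb(params_b: float, bytes_per_param: float) -> float:
        """Memory footprint in GB for params_b billion parameters."""
        return params_b * bytes_per_param

    baseline_gb = memory_gb(total_params_b, BYTES_FP16)  # 191.2 GB

    # Assume ~3% of weights (routers and other precision-sensitive layers)
    # stay at FP16 while the remaining ~97% (mostly expert FFNs) drop to 4-bit.
    critical_frac = 0.03
    quantized_gb = (memory_gb(total_params_b * critical_frac, BYTES_FP16)
                    + memory_gb(total_params_b * (1 - critical_frac), BYTES_INT4))

    reduction = 1 - quantized_gb / baseline_gb
    print(f"baseline {baseline_gb:.1f} GB -> quantized {quantized_gb:.1f} GB "
          f"({reduction:.0%} reduction)")
    # baseline 191.2 GB -> quantized 52.1 GB (73% reduction), in the same
    # ballpark as the reported 191.2 GB -> 51.9 GB (72%).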
R. P.
Copyright © 2026 FinanzWire. All rights of reproduction and representation reserved.
Disclaimer: although drawn from the best sources, the information and analyses distributed by FinanzWire are provided for guidance only and do not in any way constitute an invitation to take positions on the financial markets.
Click here to view the press release on which this brief is based