We have launched Mu, our 0.3B ‘micro-size’ language model, built for blazing-fast on-device inference and already powering native agentic experiences on Windows devices. 🚀