Goodbye and good riddance to ChatGPT. A decent-quality LLM now fits on your phone.

The pace of AI innovation has been impressive lately. Microsoft increasingly looks like it is going in reverse compared with the smaller, faster, and less expensive options already available.

Cerebras and Opentensor are pleased to announce BTLM-3B-8K (Bittensor Language Model), a new state-of-the-art 3 billion parameter open-source language model that achieves breakthrough accuracy across a dozen AI benchmarks. Given that the most popular model on Hugging Face today is 7B, we believe compacting 7B performance into 3B is an important milestone in enabling AI access on mobile and edge devices. Unlike large models like GPT-3 that run from the cloud, BTLM fits on mobile and edge devices with as little as 3GB of memory, helping democratize AI access for billions of devices worldwide.
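
The memory figure is plausible on simple arithmetic: 3 billion parameters at 8 bits per weight come to roughly 3GB for the weights alone, and 4-bit quantization halves that again. For the curious, here is a minimal sketch of trying BTLM on commodity hardware with the Hugging Face transformers library, assuming the checkpoint is published under the cerebras/btlm-3b-8k-base repository name and that its custom modeling code requires trust_remote_code=True:

```python
# Minimal sketch: load and sample from BTLM-3B-8K via Hugging Face transformers.
# The repo name and trust_remote_code requirement are assumptions based on the
# public model card; adjust if the model is republished elsewhere.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cerebras/btlm-3b-8k-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,   # BTLM ships custom modeling code
    torch_dtype="auto",       # use the checkpoint's native precision
)

prompt = "Open-source language models matter because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```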

This is a repeat of distributed-computing history, when the open-standards community developed a more efficient kernel (Linux) and innovated far faster than Microsoft. Open-source AI has lately been developing solutions faster than proprietary, closed versions, but small groups of focused closed-source innovators in trusted AI could be next to push meaningful innovations ahead of the giant brands known to extract and exploit your data.

Thus, most importantly, a rapid shift toward distributed, low-cost, high-freedom compute such as your phone means you don't have to sacrifice the confidentiality and integrity of your data (handing control over to dubious agents of Microsoft) in order to use AI.
