Microsoft researchers have developed BitNet b1.58 2B4T, the first 1-bit artificial intelligence (AI) model at the 2-billion-parameter scale capable of running on ordinary CPUs such as Apple's M2. According to Tom's Hardware, the model has been released under the open MIT license and is available on the Hugging Face platform. Its key distinction is the use of ternary weights (values -1, 0, +1), roughly 1.58 bits per weight (hence the "b1.58" in the name), which drastically reduces memory and compute requirements compared with traditional models that store weights in 16- or 32-bit formats.
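As a rough illustration of where the savings come from: 2 billion weights at about 1.58 bits each pack into roughly 0.4 GB, versus roughly 4 GB at 16 bits per weight. The sketch below works through that arithmetic and shows a simplified "absmean"-style ternary rounding in the spirit of the BitNet papers; it is an approximation for illustration, not the model's actual quantization kernel.

```python
import numpy as np

# Back-of-the-envelope weight-storage arithmetic (weights only; the real
# model also keeps embeddings and activations in higher precision).
params = 2_000_000_000
fp16_gb = params * 16 / 8 / 1e9        # ~4.0 GB at 16 bits per weight
ternary_gb = params * 1.58 / 8 / 1e9   # ~0.4 GB at ~1.58 bits per weight
print(f"FP16 weights:    ~{fp16_gb:.1f} GB")
print(f"Ternary weights: ~{ternary_gb:.2f} GB")

def ternarize(w: np.ndarray) -> np.ndarray:
    """Scale by the mean absolute weight, then round and clip to {-1, 0, +1}."""
    scale = np.abs(w).mean() + 1e-8    # per-tensor scaling factor
    return np.clip(np.round(w / scale), -1, 1)

w = np.random.randn(4, 4).astype(np.float32)
print(ternarize(w))  # every entry is -1.0, 0.0, or +1.0
```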
BitNet was trained on 4 trillion tokens (units of text) and performs comparably to models such as Google's Gemma 3 1B while consuming only about 400 MB of memory, dramatically less than its competitors. This makes it well suited to devices with limited resources and reduces reliance on powerful GPUs or neural processing units (NPUs). For optimal efficiency, however, it requires a specialized inference framework, bitnet.cpp, which currently does not support GPUs.
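For experimentation, the model can also be loaded through the standard Hugging Face `transformers` API, though that path does not deliver the memory and speed benefits described above; those require bitnet.cpp. The snippet below is a hypothetical sketch: the repository id, required `transformers` version, and any extra flags are assumptions, so check the model card before use.

```python
# Hypothetical loading sketch via Hugging Face transformers.
# The repo id below is an assumption; the model card lists the exact id
# and any version or custom-code requirements. Efficient CPU inference
# needs the separate bitnet.cpp framework instead.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/bitnet-b1.58-2B-4T"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("1-bit models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```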