melvincarvalho on Nostr: "Micro$oft BitNet a blazing-fast 1-bit LLM inference framework that runs directly on ...
"Micro$oft BitNet is a blazing-fast 1-bit LLM inference framework that runs directly on CPUs.
You can now run 100B parameter models on local devices with up to 6x speed improvements and 82% less energy consumption—all without a GPU!"
https://github.com/microsoft/BitNet
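The speed and energy gains come from BitNet b1.58's ternary weights: each weight is constrained to {-1, 0, 1}, so matrix multiplies reduce to additions and subtractions. A minimal sketch of the absmean quantization idea from the BitNet b1.58 paper, assuming a plain Python list of weights (the function name `absmean_ternary_quantize` is hypothetical, not from the repo):

```python
def absmean_ternary_quantize(weights, eps=1e-6):
    # Hypothetical helper illustrating BitNet b1.58-style absmean quantization.
    # Scale each weight by the mean absolute value, then round and clip
    # to the ternary set {-1, 0, 1}.
    scale = sum(abs(w) for w in weights) / len(weights)
    quantized = [max(-1, min(1, round(w / (scale + eps)))) for w in weights]
    return quantized, scale

weights = [0.8, -1.3, 0.05, 2.1, -0.4]
q, s = absmean_ternary_quantize(weights)
print(q)  # [1, -1, 0, 1, 0]
```

At inference time the per-tensor scale `s` is reapplied once per matrix product, so the inner loop never needs floating-point multiplies, which is what makes CPU-only execution practical.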