Anders Thoresson on Nostr:
I just got #Llama 3.2 Vision running locally, using #Msty, on my MacBook Pro M1. It's not as fast, and perhaps not as good, as doing the same with one of the big cloud services. But my gut feeling is that it's more resource-efficient.
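One way to turn that gut feeling into a number: time a single generation against the local endpoint while watching the power draw that macOS reports via `sudo powermetrics --samplers cpu_power` in another terminal, then multiply average watts by elapsed seconds to get joules per response. A minimal sketch, assuming an Ollama-compatible API on the default port (Msty bundles its own local engine, so the endpoint and the model tag may differ in your setup):

```python
# Rough energy estimate for one local generation.
# Run `sudo powermetrics --samplers cpu_power -i 1000` in a second terminal
# and note the average combined power (mW) it reports during the request.
import json
import time
import urllib.request

ENDPOINT = "http://localhost:11434/api/generate"  # default Ollama port (assumption for Msty)
payload = json.dumps({
    "model": "llama3.2-vision",  # tag as published on Ollama; your local tag may differ
    "prompt": "Summarize the energy trade-offs of local vs hosted inference.",
    "stream": False,
}).encode()

start = time.monotonic()
req = urllib.request.Request(ENDPOINT, data=payload,
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    answer = json.load(resp)
elapsed = time.monotonic() - start

avg_watts = 25.0  # placeholder: fill in the average power powermetrics showed
print(f"{elapsed:.1f} s at ~{avg_watts:.0f} W = {elapsed * avg_watts:.0f} J per response")
print(answer.get("response", "")[:200])
```

That only covers the device-side cost, of course; a fair comparison with hosted inference would also need per-query figures for the data center side, which is exactly what I'm hoping published research can provide.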
So, looking for research comparing local vs hosted LLMs from an energy consumption perspective. Anyone with a link or two to share?
#ai #energy #climate
