kravietz 🦇 on Nostr:
Tried llamafile, which is a clever way of bundling various LLM models into a single polyglot binary that runs on nearly any operating system.
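For anyone curious what actually talking to it looks like: once a llamafile is started in server mode it exposes the llama.cpp server's OpenAI-compatible chat endpoint locally. A minimal sketch, assuming the default localhost:8080 port and that the particular llamafile and flags match your local setup:

```python
# Minimal sketch: query a locally running llamafile via its
# OpenAI-compatible chat endpoint (default port 8080).
# The port, model name and prompt here are assumptions; adjust
# them to whatever llamafile you actually launched.
import json
import urllib.request

payload = {
    "model": "local",  # llamafile serves a single local model
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "temperature": 0.7,
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    answer = json.load(resp)
    print(answer["choices"][0]["message"]["content"])
```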
I’m traveling now, so I’m running it on a laptop, and my first observation is: do not even touch it without a good GPU 🤷 It does answer, but it’s like talking to a BBS over a 300 bps modem 😉