SUPERMAX on Nostr: I prefer to run LLMs locally. The best way to get started is to run the Llama or Mistral 7B/8B ...
I prefer to run LLMs locally. The best way to get started is to run the Llama or Mistral 7B/8B models (pick a size that fits your RAM; a quantized 8B-parameter model can run on a device with ~8 GB of RAM).

If you want something easier, the free version of Claude is probably best right now, in my opinion, but the landscape changes fast; literally day by day.
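The "8B model on ~8 GB of RAM" rule of thumb follows from a simple estimate: memory footprint is roughly parameter count times bytes per parameter, plus some overhead for the KV cache and activations. The sketch below (illustrative numbers and an assumed overhead factor, not figures from the post) shows why this only works for quantized weights:

```python
def model_memory_gb(params_billions: float, bits_per_param: float,
                    overhead: float = 1.2) -> float:
    """Rough estimate of RAM needed to load an LLM.

    params_billions: parameter count in billions (e.g. 8 for an 8B model)
    bits_per_param:  weight precision (16 for fp16, 4 for 4-bit quantization)
    overhead:        multiplier for KV cache, activations, etc. (assumption)
    """
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total * overhead / 1e9  # convert bytes to gigabytes

# An 8B model in fp16 needs far more than 8 GB, but 4-bit quantized
# it comfortably fits on an ~8 GB RAM device.
print(f"8B fp16: {model_memory_gb(8, 16):.1f} GB")  # ~19.2 GB
print(f"8B Q4:   {model_memory_gb(8, 4):.1f} GB")   # ~4.8 GB
```

In practice, tools like Ollama or llama.cpp ship 7B/8B models quantized to around 4 bits by default, which is what makes the RAM-matching advice above workable.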
Published at 2024-12-06 18:47:07

Event JSON
{
  "id": "cad841493b5341684b8dda41cf4757830b144ef4dd55bfefef87268423453729",
  "pubkey": "ae1008d23930b776c18092f6eab41e4b09fcf3f03f3641b1b4e6ee3aa166d760",
  "created_at": 1733510827,
  "kind": 1,
  "tags": [
    [
      "e",
      "14e06514f67c9c8d96c0152109ae940ea572463d5621086bda48108fc7100c12",
      "",
      "root"
    ],
    [
      "e",
      "928160327de2d8943c538f95346a4482db59ce2377328d487d1ab5dff7d4d69e",
      "",
      "reply"
    ],
    [
      "p",
      "ae1008d23930b776c18092f6eab41e4b09fcf3f03f3641b1b4e6ee3aa166d760"
    ],
    [
      "p",
      "056c6abe7d3b12d770e2dbf0a1ff7d737de5d1856821742e5442d94008bac24d"
    ]
  ],
  "content": "I prefer to run LLM's locally. Best to get started is run the Llama or Minstral 7b/8b models (at whatever size your RAM is, so 8b parameter model can be run on ~8GB RAM device)\n\nIf wanting something easier, free version of Claude is probably best rn imo, but the landscape changes so fast; literally day-by-day",
  "sig": "b58325d2521b1607302832ca1a2c10741c215b79425b1242fbf9f38f4c27f33b635f0fb167b4b1692b89643c02870c03868ff2fbc4198a3bec52157c73cc2401"
}