mpls on Nostr:

Which model/setup do you use? In terms of local models, I've found TheBloke's stuff on huggingface easiest to get working with llama.cpp.
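For context, a typical invocation of a TheBloke GGUF quantization with llama.cpp (as of late 2023) looked roughly like this; the repo, file name, and quantization level are illustrative examples, not taken from the post:

```shell
# Fetch one quantized GGUF file from a TheBloke Hugging Face repo
# (repo and file names are illustrative, not from the post)
huggingface-cli download TheBloke/Mistral-7B-Instruct-v0.2-GGUF \
  mistral-7b-instruct-v0.2.Q4_K_M.gguf --local-dir ./models

# Run an interactive generation with llama.cpp's CLI binary
# (called ./main in llama.cpp builds of this era)
./main -m ./models/mistral-7b-instruct-v0.2.Q4_K_M.gguf \
  -p "Hello" -n 128
```
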
Published at 2023-12-14 06:54:07

Event JSON
{
"id": "247fa67949fec169f3b5487f2fbf067a6edea44b0a595bff2fb7e8d993a467e0",
"pubkey": "962e1e6f9de15b9a06772d3be2c1799e2041ce7d89f46021355ebea0badef4a0",
"created_at": 1702536847,
"kind": 1,
"tags": [
[
"e",
"e405668ca6ae7b7badfe87c276e933345aa9f83d4249d322284311b397078ef8",
"",
"root"
],
[
"p",
"bdb96ad31ac6af123c7683c55775ee2138da0f8f011e3994d56a27270e692575"
],
[
"p",
"813fce4c4e76f1e7b4f4697bf1030a90f1a0b783f187d329800a4dd8697f9759"
]
],
"content": "Which model/setup do you use? In terms of local models, I've found TheBloke's stuff on huggingface easiest to get working with llamacpp.",
"sig": "5466ac0890b324c89d5858ddb6c5bc554d68df46e83e9caab219fa1e27345d391783da58229a2cc9a770aacc98d7dfb9f85b4e586a2c5aa2b7741dad837f44f5"
}
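As a sketch of how the `id` field in the event above is derived, NIP-01 defines it as the SHA-256 of the canonical JSON serialization `[0, pubkey, created_at, kind, tags, content]` with no whitespace and unescaped UTF-8:

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Compute a Nostr event id per NIP-01: the SHA-256 hex digest of
    the canonical JSON array [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),  # no whitespace in the canonical form
        ensure_ascii=False,     # raw UTF-8, not \uXXXX escapes
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Recompute the id from the fields of the event above:
event_id = nostr_event_id(
    "962e1e6f9de15b9a06772d3be2c1799e2041ce7d89f46021355ebea0badef4a0",
    1702536847,
    1,
    [
        ["e", "e405668ca6ae7b7badfe87c276e933345aa9f83d4249d322284311b397078ef8",
         "", "root"],
        ["p", "bdb96ad31ac6af123c7683c55775ee2138da0f8f011e3994d56a27270e692575"],
        ["p", "813fce4c4e76f1e7b4f4697bf1030a90f1a0b783f187d329800a4dd8697f9759"],
    ],
    "Which model/setup do you use? In terms of local models, I've found "
    "TheBloke's stuff on huggingface easiest to get working with llamacpp.",
)
print(event_id)
```

The `sig` field is a Schnorr signature over this same id, so a relay or client can verify both the hash and the signature independently.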