ciori on Nostr
It's an alternative to llama.cpp (the server running the actual models); it should be more user-friendly and "plug and play" compared to llama.cpp.
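For context on "the server running the actual models": llama.cpp itself ships an HTTP server (llama-server) that exposes an OpenAI-compatible chat endpoint. The snippet below is a minimal sketch, assuming a llama-server instance is already running locally on its default port 8080 with a model loaded; the prompt and parameters are illustrative only.

import json
import urllib.request

# Assumption: a local llama.cpp server ("llama-server") is running on its
# default port 8080 and exposes an OpenAI-compatible /v1/chat/completions
# endpoint for whatever model it was started with.
URL = "http://127.0.0.1:8080/v1/chat/completions"

payload = {
    # llama-server serves the model loaded at startup; the name here is
    # mostly informational but the field is part of the API shape.
    "model": "local",
    "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
    "temperature": 0.7,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# OpenAI-style response: print the first choice's message content.
print(body["choices"][0]["message"]["content"])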
Published at 2024-12-02 22:00:01
Event JSON
{
  "id": "03ccfb246637d2624cfc0ba84a9afa99def8ec24b351b851629a9811490a5d5d",
  "pubkey": "bf03bdf659e463e31574aff7698cf83b4cd81ab17829c22f7d5ccf76faacdbbd",
  "created_at": 1733176801,
  "kind": 1,
  "tags": [
    ["e", "5b45958a0ed9f2c516eeb2bb517ea4e89d9e32a0d3ed8e898dcf3e74d863e813", "", "root"],
    ["e", "d87fbeddfb1f827c54a96985b56337696eebaca808a9025d355ff6783057cd62"],
    ["e", "4f99e6aaaf7e214a1bc60f9f1a4e594ee049e7f313dbeecc0b0401f0b476ee2a", "", "reply"],
    ["p", "bf03bdf659e463e31574aff7698cf83b4cd81ab17829c22f7d5ccf76faacdbbd"],
    ["p", "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245"],
    ["r", "llama.cpp"],
    ["r", "llama.cpp."]
  ],
  "content": "It's an alternative to llama.cpp, the server running the actual models, it should be more user friendly and \"plug and play\" wrt llama.cpp.",
  "sig": "076684ae8669163f2c3e814e3170a73967de129097ba3f46ef41aa333f162cf1bc42afcc8784ff6b25c5baa9482a78ddd43e15f150aeb340cc7237296de3c643"
}
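For reference, the id and sig fields above follow the standard Nostr event scheme (NIP-01): the id is the SHA-256 hash of a canonical serialization of the event. The sketch below recomputes the id for this event under the usual [0, pubkey, created_at, kind, tags, content] compact serialization; it is illustrative only and does not verify the Schnorr signature.

import hashlib
import json

# Fields copied from the event JSON above.
pubkey = "bf03bdf659e463e31574aff7698cf83b4cd81ab17829c22f7d5ccf76faacdbbd"
created_at = 1733176801
kind = 1
tags = [
    ["e", "5b45958a0ed9f2c516eeb2bb517ea4e89d9e32a0d3ed8e898dcf3e74d863e813", "", "root"],
    ["e", "d87fbeddfb1f827c54a96985b56337696eebaca808a9025d355ff6783057cd62"],
    ["e", "4f99e6aaaf7e214a1bc60f9f1a4e594ee049e7f313dbeecc0b0401f0b476ee2a", "", "reply"],
    ["p", "bf03bdf659e463e31574aff7698cf83b4cd81ab17829c22f7d5ccf76faacdbbd"],
    ["p", "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245"],
    ["r", "llama.cpp"],
    ["r", "llama.cpp."],
]
content = (
    "It's an alternative to llama.cpp, the server running the actual models, "
    'it should be more user friendly and "plug and play" wrt llama.cpp.'
)

# NIP-01: the event id is the SHA-256 of the compact JSON serialization of
# [0, pubkey, created_at, kind, tags, content], encoded as lowercase hex.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()

print(event_id)  # expected to match the "id" field shown above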