Raul007 on Nostr:
You could give GPT4ALL a try. It has a built-in plugin that can reference local docs. I find it does a good job summarizing concepts, but is not so great at pulling out specific information.
24 GB is sufficient to run 13B models at 4- or 8-bit quantization, and some will fit at 16-bit 👍
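The 24 GB figure follows from a back-of-envelope weight-memory calculation: parameter count times bits per parameter. A minimal sketch (the `model_weight_gb` helper is illustrative, not from any library; it counts weights only — activations and KV cache need extra headroom):

```python
# Rough VRAM needed just to hold a model's weights at a given
# quantization level. Weights only: runtime overhead is extra.
def model_weight_gb(params_billion: float, bits_per_param: int) -> float:
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # decimal gigabytes

for bits in (4, 8, 16):
    print(f"13B @ {bits:2d}-bit: ~{model_weight_gb(13, bits):.1f} GB")
```

This gives roughly 6.5 GB at 4-bit and 13 GB at 8-bit, both comfortably under 24 GB; a 13B model at 16-bit lands around 26 GB, which is why only somewhat smaller models fit unquantized.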
Published at 2023-08-27 11:04:31

Event JSON
{
  "id": "3f88995e143ba7f1c578bd5481e4b555dc4493fbc87fb2ba631526963d604b85",
  "pubkey": "fca3f1847bdda5b29e631edfb8ce0991af688041051dfb6d3bf236880afd1678",
  "created_at": 1693134271,
  "kind": 1,
  "tags": [
    ["e", "902d6215967ce8b9583a78ccbf3f7862e9ed9dcd475ec15941ae4cfead06abde", "", "root"],
    ["e", "d6d2fe74604930746ce6fd98070176eb3cd663022c43c67b06a0b233c8dcfed0", "", "reply"],
    ["p", "fca3f1847bdda5b29e631edfb8ce0991af688041051dfb6d3bf236880afd1678"],
    ["p", "fca3f1847bdda5b29e631edfb8ce0991af688041051dfb6d3bf236880afd1678"],
    ["p", "23ecb8cd6b05c795717047f19a8444df75ef8626e6fed70ad648db60ca575e80"]
  ],
  "content": "You could give GPT4ALL a try. It has a built in plugin that can reference local docs. I find it does a good job summarizes concepts, but not so great at pulling out specific information.\n\n24gb is sufficient to run 13B models at 4 or 8 bit quantization, and some will fit at 16bit 👍",
  "sig": "50fec09e69bd579212d8cc7339c37c2b1c233ab0537f03fbd7388f3d5a0aa593d16c521968f32904b84b2a4e706f414c40f525f5b19547cedccac1bba2ae5f43"
}