Kajoozie Maflingo on Nostr: If you had a $300 budget for a GPU or any PCI-E coprocessor SPECIFICALLY FOR #AI / ...
If you had a $300 budget for a GPU or any PCI-E coprocessor SPECIFICALLY FOR #AI / machine learning, no gaming, new or used, what would you buy? My RTX 3060 Ti with its paltry 8GB VRAM just doesn't cut the mustard for Stable Diffusion / llama.cpp
#machinelearning #deeplearning #stablediffusion #llama #chatgpt
Published at 2024-02-08 18:22:42

Event JSON
{
  "id": "a79288936b6d1d7e602a16bbea7651f99f880c8ac999acabc2e58de1359eb52b",
  "pubkey": "341db5a7e3a931f49095d82a4acc939cf8a67293b1e4179fd4b5c0544c4fc2ef",
  "created_at": 1707416562,
  "kind": 1,
  "tags": [
    ["t", "ai"],
    ["t", "machinelearning"],
    ["t", "deeplearning"],
    ["t", "stablediffusion"],
    ["t", "llama"],
    ["t", "chatgpt"]
  ],
  "content": "If you had a $300 budget for a GPU or any PCI-E coprocessor SPECIFICALLY FOR #AI / machine learning, no gaming, new or used, what would you buy? My RTX 3060 ti with it's paltry 8gb vram just doesn't cut the mustard for Stable Diffusion / llama.cpp\n\n#machinelearning #deeplearning #stablediffusion #llama #chatgpt ",
  "sig": "a2ad210a2be25e831a3252e02fb587ed7473e817ad9b3f8c0776cec7801d93a4e4147fb6d6cf34ca64efb79f2fefca75d0454c6d1120d5b4d041a079e89aeba1"
}