privatize_universe on Nostr: 8GB is not much. Do you have a reason for using an old Tesla rather than a current-gen ...
8GB is not much. Do you have a reason for using an old Tesla rather than a current-gen consumer GPU?
I use AMD's RX 7700 XT, and it looks like the weak point is the 12GB of memory, not the chip itself. At least for llama.
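To make the "memory, not chip" point concrete, here is a rough back-of-envelope sketch of what fits in 12GB. The function, its default shapes, and the 13B example are illustrative assumptions on my part, not numbers from the post:

```python
# Hypothetical helper: rough VRAM estimate for a quantized LLM,
# counting weights plus KV cache and ignoring activation overhead.

def vram_gb(n_params_b: float, bits_per_weight: float,
            ctx_len: int = 4096, n_layers: int = 40,
            hidden: int = 5120, kv_bytes: int = 2) -> float:
    weights = n_params_b * 1e9 * bits_per_weight / 8       # quantized weights
    kv_cache = 2 * n_layers * ctx_len * hidden * kv_bytes  # K and V per layer
    return (weights + kv_cache) / 1e9

# A 13B model at 4-bit quantization with a 4k context:
print(round(vram_gb(13, 4), 1))  # → 9.9
```

Under these assumptions a 4-bit 13B model squeezes into 12GB with little headroom, while anything larger (or a longer context) spills over, which is consistent with memory rather than compute being the ceiling.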
Published at
2025-02-16 23:02:18

Event JSON
{
  "id": "4fb3c1da44768bfb60300681d59fbfe73aab2b95c9834fd86eb82e54d6da1228",
  "pubkey": "2c4388224fdcd50f24e53cab9baa4ab605f087e82c4587035072690f333310a9",
  "created_at": 1739746938,
  "kind": 1,
  "tags": [
    [
      "e",
      "52858e12e49374aabd75b8bc8c7a6496d1f1cbb0a15cd212a6a7853c1d61dcf6",
      "",
      "root"
    ],
    [
      "p",
      "0461fcbecc4c3374439932d6b8f11269ccdb7cc973ad7a50ae362db135a474dd"
    ]
  ],
  "content": "8GB is not much. Do you have reason for using old tesla rather then current gen consumer GPU?\nI use amd's rx 7700xt and it's look like weak point is 12GB memory and not chip itself. At least for llama",
  "sig": "5121f471165525642674948196ffbe67d018154c5a32529b41098d464e1cc9ce23ebece19bb50a6c5167ce2e805dc0ac3f4470eaaa6369f9de692b0a03460685"
}
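For context, the `id` field of a Nostr event is defined by NIP-01 as the SHA-256 hash of a compact JSON serialization of `[0, pubkey, created_at, kind, tags, content]`. A minimal sketch, using this event's fields and my reading of the NIP-01 serialization rules:

```python
import hashlib
import json

def event_id(pubkey: str, created_at: int, kind: int,
             tags: list, content: str) -> str:
    """Compute a Nostr event id per NIP-01: sha256 over the compact
    JSON array [0, pubkey, created_at, kind, tags, content]."""
    payload = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),  # no whitespace in the serialization
        ensure_ascii=False,     # keep UTF-8 characters unescaped
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Fields taken from the event above:
eid = event_id(
    "2c4388224fdcd50f24e53cab9baa4ab605f087e82c4587035072690f333310a9",
    1739746938,
    1,
    [
        ["e", "52858e12e49374aabd75b8bc8c7a6496d1f1cbb0a15cd212a6a7853c1d61dcf6", "", "root"],
        ["p", "0461fcbecc4c3374439932d6b8f11269ccdb7cc973ad7a50ae362db135a474dd"],
    ],
    "8GB is not much. Do you have reason for using old tesla rather then current gen consumer GPU?\nI use amd's rx 7700xt and it's look like weak point is 12GB memory and not chip itself. At least for llama",
)
print(eid)  # for a valid event this reproduces the "id" field
```

The `sig` field is a BIP-340 Schnorr signature over that id, so it cannot be recomputed here; clients verify it against the `pubkey`.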