wobweger :verified: on Nostr:
Earlier today,
there was talk about an on-premises #AI hardware setup.
I've known all along it's a gigantic waste of resources: to run a #LLM you "only" need a GPU with 512 GB of VRAM, which costs about 40k. Well, I hope the demonstration was played back in really slow motion.
#salami, a hype I refuse to fall for