npub1yk…d2jl0 on Nostr:
npub1l3gpk6vrudg8r67swqlex5alv9ch59s4lw46kk6hekuxe2n3aczsyqvu48 (npub1l3g…vu48) Not at all, if you ask me - there's alternative approaches that work better, but OpenAI happens to _have_ tons of compute, storage, and RAM, and so their methodology is staying relatively simple :P
There are LLMs that can run on garbage hardware. Someone ported llama to run on a Thinkpad under Plan 9 lmao.
I suspect that OpenAI is going to get out-competed in the next few years unless their strategy to get the government to grant them a monopoly pans out.
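To make the "garbage hardware" point concrete: a 4-bit quantized model can run entirely on CPU through llama.cpp. The sketch below uses the llama-cpp-python bindings; the model path, context size, and thread count are illustrative assumptions, not details from the post.

# Minimal sketch: running a quantized LLM on CPU-only hardware with llama-cpp-python.
# The GGUF path and parameters are placeholders, not anything referenced above.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-7b-q4_0.gguf",  # hypothetical local path to a 4-bit GGUF model
    n_ctx=512,      # small context window keeps RAM usage modest
    n_threads=4,    # CPU threads; tune to the machine at hand
)

out = llm("Explain Plan 9 in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])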
Published at 2024-03-19 06:09:04

Event JSON
{
  "id": "9a7f72b5f43ff495c83796828f1277d8b996768e0ceccfef94326f3541565eb8",
  "pubkey": "25a9298ce9ed607f7539fb27d94418161476e7284b523577ce2e878fcf1852fd",
  "created_at": 1710828544,
  "kind": 1,
  "tags": [
    [
      "p",
      "fc501b6983e35071ebd0703f9353bf61717a1615fbabab5b57cdb86caa71ee05",
      "wss://relay.mostr.pub"
    ],
    [
      "p",
      "c95a71dfb95245bfb233a5ccea6dbcb03adda051221413144a54a206165aa52f",
      "wss://relay.mostr.pub"
    ],
    [
      "e",
      "f33bf8ee85b69ec47a2103f87b602a1e285e02ec52b844568213f9ec811a03eb",
      "wss://relay.mostr.pub",
      "reply"
    ],
    [
      "proxy",
      "https://merveilles.town/users/pixx/statuses/112120859517004734",
      "activitypub"
    ]
  ],
  "content": "nostr:npub1l3gpk6vrudg8r67swqlex5alv9ch59s4lw46kk6hekuxe2n3aczsyqvu48 Not at all, if you ask me - there's alternative approaches that work better, but OpenAI happens to _have_ tons of compute, storage, and RAM, and so their methodology is staying relatively simple :P\n\nThere are LLMs that can run on garbage hardware. Someone ported llama to run on a Thinkpad under Plan 9 lmao.\n\nI suspect that OpenAI is going to get out-competed in the next few years unless their strategy to get the government to grant them a monopoly pans out.",
  "sig": "6eaed2b22949da8c921c77e711932afe0554d9bf5051d9cbdf423bdc0d0b1774630be719cadc8300826edfcfe70bdfcc5bcc229d443f28f27e4e310fa2195268"
}