Anthropy on Nostr
I wish there were an LLM with more output tokens. 8192 is a lot, but it's still very limiting: it's nice that I can feed a model 1.5 million tokens, but if I can't also receive 1.5 million tokens back, it won't be able to actually write a complex thing even if it understands complex things.
Having something like 32k output tokens would already go a long way, but ideally, honestly, the output limit would just be equal to the input token limit.
Published at 2024-04-16 19:28:35
Event JSON
{
  "id": "9df971734d5214790f71d9fb15d073c66cb48218758cfa50edf9654d9ad44655",
  "pubkey": "68ae7dcdff5b7de4805977bb547a0126cfb8cab62e9f8abdee6760bbb3ff6ead",
  "created_at": 1713295715,
  "kind": 1,
  "tags": [
    [
      "content-warning",
      "ML, AI, lack of output tokens in LLMs"
    ],
    [
      "proxy",
      "https://mastodon.derg.nz/users/anthropy/statuses/112282547986822991",
      "activitypub"
    ]
  ],
  "content": "I wish there was an LLM with more output tokens. 8192 is a bunch but still very limiting, and it's nice if I can feed it 1.5 million tokens, but if I can't also receive 1.5 million tokens then it won't be able to actually write a complex thing even if it understands complex things.\n\nhaving like 32k output tokens would come a long way already, but ideally, honestly, it would just be equal to the input token limit.",
  "sig": "720170ceaef2574c0f54624b57fe942f559230906aa874f716ec4e914bce9153800f9069bc4a7d4470338fec3bfe012144c0b208c3fc7508fcc884c73e9761fc"
}
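
For reference, the "id" field above is, per Nostr's NIP-01 spec, the lowercase hex SHA-256 of a canonical serialization of the event. Below is a minimal Python sketch of that computation; it assumes the usual compact-JSON serialization of [0, pubkey, created_at, kind, tags, content], and NIP-01's exact string-escaping rules may differ slightly from Python's defaults.

import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    # NIP-01 serializes the event as the JSON array
    # [0, pubkey, created_at, kind, tags, content]
    # with no extra whitespace, then hashes it with SHA-256.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

Passing in the pubkey, created_at, kind, tags, and content from the event above should reproduce the id shown (modulo those escaping details), and the sig is a Schnorr signature over that id made with the key behind pubkey.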