iefan 🕊️ on Nostr
2 million context tokens should be enough for 5,000 pages. It should need around 1.5 million tokens, leaving a lot of extra context window. Then you basically save that model state, and my app turns that into a personality. After that, you can use that personality as often as needed.
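The arithmetic in the note can be sanity-checked with a quick sketch. The tokens-per-page figure below is an assumption implied by the note's numbers (1.5M tokens / 5,000 pages = 300 tokens per page), not a measured value:

```python
# Back-of-the-envelope check of the note's context-budget estimate.
context_window = 2_000_000   # 2 million token context window
pages = 5_000
tokens_per_page = 300        # assumption: consistent with the note's 1.5M figure

needed = pages * tokens_per_page      # tokens consumed by the 5,000 pages
spare = context_window - needed       # context left over after loading them

print(f"needed={needed:,} spare={spare:,}")
```

So under these assumptions, loading all 5,000 pages uses 1.5 million tokens and leaves roughly 500,000 tokens of headroom.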
Published at 2025-02-23 14:16:09

Event JSON
{
  "id": "34732a4b57f7e63090b96c297e1ed11b722f01eee9d0fa5108309a72f85327f0",
  "pubkey": "c6f7077f1699d50cf92a9652bfebffac05fc6842b9ee391089d959b8ad5d48fd",
  "created_at": 1740320169,
  "kind": 1,
  "tags": [
    ["e", "7e28819e49eda192ada6d2662ead14f41227009a8c80f228a7447a127e72fd94", "", "root"],
    ["e", "3ff5589ebc3dcb9fd61ae17823d662331ed649f152ef5bd4dfe4776eaad72993", "", "reply"],
    ["p", "b7ed68b062de6b4a12e51fd5285c1e1e0ed0e5128cda93ab11b4150b55ed32fc"],
    ["p", "26cc4fcabfc7a1ce757bc0bfacaede8c222f9c5f0bc18381ff8e59a995ff4d54"]
  ],
  "content": "2 million context token should be enough for 5,000 pages. It should need around 1.5 million tokens, leaving a lot of extra context window. Then you will basically save that model state, my app turns that into personality. After that, you can use that personality as often as needed.",
  "sig": "e960d74596b193c7ec513bcff4191a2eaa523731da88c745d1a003671e3b1ae078f3c3c1052b7b145a86246ec44aeb598d19b1493c324a7647faf5fbfff64fc3"
}
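The `id` field above is not arbitrary: per the Nostr NIP-01 specification, it is the SHA-256 hash of the event's canonical serialization, the JSON array `[0, pubkey, created_at, kind, tags, content]` with no extra whitespace. A minimal sketch of that check, using only the Python standard library and the field values copied verbatim from the event above:

```python
import hashlib
import json

# Event fields copied from the Event JSON above.
pubkey = "c6f7077f1699d50cf92a9652bfebffac05fc6842b9ee391089d959b8ad5d48fd"
created_at = 1740320169
kind = 1
tags = [
    ["e", "7e28819e49eda192ada6d2662ead14f41227009a8c80f228a7447a127e72fd94", "", "root"],
    ["e", "3ff5589ebc3dcb9fd61ae17823d662331ed649f152ef5bd4dfe4776eaad72993", "", "reply"],
    ["p", "b7ed68b062de6b4a12e51fd5285c1e1e0ed0e5128cda93ab11b4150b55ed32fc"],
    ["p", "26cc4fcabfc7a1ce757bc0bfacaede8c222f9c5f0bc18381ff8e59a995ff4d54"],
]
content = "2 million context token should be enough for 5,000 pages. It should need around 1.5 million tokens, leaving a lot of extra context window. Then you will basically save that model state, my app turns that into personality. After that, you can use that personality as often as needed."

def nip01_event_id(pubkey, created_at, kind, tags, content):
    # NIP-01: id = SHA-256 of the UTF-8 bytes of the compact JSON serialization
    # of [0, pubkey, created_at, kind, tags, content].
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

print(nip01_event_id(pubkey, created_at, kind, tags, content))
```

If the printed digest matches the event's `id` field, the event record is internally consistent. (The `sig` field is a Schnorr signature over that id and would need a secp256k1 library to verify, which is out of scope for this sketch.)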