Guy on Nostr:
Generative Pre-trained Transformer, but your point remains.
And also to your point, anything it generates depends on its pre-training. So the more content that gets created with GPTs, the more later GPTs are pre-trained on GPT output, and a lot of noise results.
Published at 2024-08-12 20:39:45

Event JSON
{
  "id": "00000088a9aeb0af40ae0cf99929121411db2bb45725232d5298f79c482d36c3",
  "pubkey": "772f954551fd8660907f3d4ec2db65f573cfcbe6c8fa34e620fb7b705c93249a",
  "created_at": 1723495185,
  "kind": 1,
  "tags": [
    [
      "p",
      "4c800257a588a82849d049817c2bdaad984b25a45ad9f6dad66e47d3b47e3b2f"
    ],
    [
      "e",
      "3bd269153080e8fea0f0cd32520502a0c548f5a43a9a59cb405d34c8f6b5fa3e",
      "wss://a.nos.lol/",
      "root"
    ],
    [
      "e",
      "e9684eae9459afed8d727f9547fd8591c8387de08f9d57c4b0fc53ad5ea173c6",
      "wss://a.nos.lol/",
      "reply"
    ],
    [
      "nonce",
      "13835058055282466856",
      "23"
    ]
  ],
  "content": "Generative Pre-trained Transformer, but your point remains.\n\nAnd also to your point, anything it generates depends on its pre-training. So the more content that gets created with GPTs the more pre-training that happens with later GPTs, and a lot of noise results.",
  "sig": "7c8098808e5dd4717a705bc12e992c0756a319f3eaf85c69b03a3760e637523079c0fbec00bd0ad3fab21e515bd3f76b1ba7aabb1786dbf50230aea922487a4c"
}