johny on Nostr: It’s crazy cause LLMs are just “predicting the most likely next word” based on ...
It’s crazy cause LLMs are just “predicting the most likely next word” based on text it has seen in training
The real power will be in leveraging them to create commands for plug-ins.
Published at
2023-07-07 19:31:03
Event JSON
{
  "id": "f936ded80205651e8db30160cb8b96514d4060b5317356ab162c71f936e05fb2",
  "pubkey": "b9a1608d4ad164cb115a1d40ff36efd12b93c097cd2a3bf82a58c32534488893",
  "created_at": 1688758263,
  "kind": 1,
  "tags": [
    [
      "e",
      "0477c5b73350bb615504a70ea736125fc7036faf4165bb64a2c46d49edafb9bb"
    ]
  ],
  "content": "It’s crazy cause LLMs are just “predicting the most likely next word” based on text it has seen in training\n\nThe real power will be in leveraging them to create commands for plug-ins.",
  "sig": "34acacf801838231a0961d783e19bfcfaa895a16cba43f8d449113cc7532a12435914380ed95424a347571ffd8d2e48e38a796025c46d4d5cd96ce4ef0f26a90"
}
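For reference, the "id" field of an event like the one above is defined by NIP-01 as the SHA-256 hash of the canonical JSON serialization of [0, pubkey, created_at, kind, tags, content]. The sketch below (Python, not part of the original note) illustrates that derivation; it uses json.dumps as an approximation of NIP-01's exact escaping rules, so treat it as illustrative rather than a reference implementation.

import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int, tags: list, content: str) -> str:
    # NIP-01: serialize [0, pubkey, created_at, kind, tags, content] as compact JSON
    # (no extra whitespace, UTF-8, minimal escaping) and take its SHA-256 hex digest.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Fields copied from the event above.
event_id = nostr_event_id(
    pubkey="b9a1608d4ad164cb115a1d40ff36efd12b93c097cd2a3bf82a58c32534488893",
    created_at=1688758263,
    kind=1,
    tags=[["e", "0477c5b73350bb615504a70ea736125fc7036faf4165bb64a2c46d49edafb9bb"]],
    content="It’s crazy cause LLMs are just “predicting the most likely next word” based on text it has seen in training\n\nThe real power will be in leveraging them to create commands for plug-ins.",
)
print(event_id)  # should reproduce the "id" field if the serialization matches the signer's

The "sig" field is a Schnorr signature over that same id under the event's pubkey; verifying it requires a secp256k1 library and is not shown here.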