Pre
npub1pkh…ln8r
2024-04-29 10:28:50
in reply to nevent1q…m7p5

Like the LLM machines, they are stacks of transformer blocks: layers of feed-forward networks sandwiched between layers of attention.
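
For the curious, here is a rough sketch of one such block, assuming PyTorch; the name ToyBlock and the sizes are made up for illustration and not taken from any particular model:

import torch
import torch.nn as nn

class ToyBlock(nn.Module):
    # One GPT-style block: attention over the token sequence, then a feed-forward
    # layer, each wrapped in a residual connection with layer normalization.
    def __init__(self, d_model=256, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # Causal mask: each position may only attend to earlier positions.
        n = x.size(1)
        mask = torch.triu(torch.ones(n, n, device=x.device), diagonal=1).bool()
        h = self.norm1(x)
        a, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + a
        x = x + self.ff(self.norm2(x))
        return x

A whole model is just many of these stacked on top of each other, fed embeddings of whatever tokens the domain uses.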

Unlike LLMs, the music machines are trained on music, not language.

The phrase that describes both/all of them is Generative Pre-trained Transformer, GPT, even though OpenAI's own "GPT" is only an LLM, and there are many other kinds of pre-trained transformers that are not trained on language.
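
And the "generative pre-training" part is the same idea regardless of domain: predict the next token in a sequence, whatever those tokens happen to encode. A rough sketch of that shared objective, again assuming PyTorch and reusing the toy block above, with a made-up vocabulary size (word pieces for an LLM, discrete audio or MIDI codes for a music model):

import torch
import torch.nn as nn

vocab_size, d_model = 1024, 256
embed = nn.Embedding(vocab_size, d_model)
blocks = nn.Sequential(*[ToyBlock(d_model) for _ in range(6)])   # ToyBlock from the sketch above
head = nn.Linear(d_model, vocab_size)

tokens = torch.randint(0, vocab_size, (1, 129))         # one training sequence of token ids
inputs, targets = tokens[:, :-1], tokens[:, 1:]         # shift by one: predict the next token
logits = head(blocks(embed(inputs)))
loss = nn.functional.cross_entropy(logits.reshape(-1, vocab_size), targets.reshape(-1))
loss.backward()                                         # gradients for an optimizer step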

Computer jargon is always stupid.
Author Public Key
npub1pkhzm90462y3cfat45m25nmz6p9ku0lw0ak4fqjttpy8qvfhrymqjlln8r