Ben Evans on Nostr:
A: Adding more parameters to an LLM makes it "better".
B: Continued ingestion of the general Internet corpus "improves" general models.
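(For scale, hypothesis A is the bet behind the empirical scaling laws, e.g. roughly the Chinchilla form reported by Hoffmann et al. (2022), sketched here with the fitted constants renamed a, b, E so they don't clash with the hypothesis labels above:

L(N, D) = E + a/N^α + b/D^β

where L is the model's loss, N the parameter count, and D the number of training tokens. In these terms, ~A below claims the a/N^α term has hit diminishing returns, and ~B attacks the quality of the D tokens themselves.)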
Opposition: What if neither of those hypotheses is true? What if neither A nor B comes to pass?
What if:
~A: Adding more parameters adds nothing in terms of utility;
~B: Increased output of LLMs pollutes the general input sets;
and, as a consequence,
C: Subsequent LLMs are even worse than the current generation.
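The ~B → C loop can be made concrete with a toy simulation (not from the post; a minimal sketch, with every number invented for illustration): each model generation is fit only to data sampled from the previous generation's output, standing in for a corpus increasingly made of LLM text. This is the "model collapse" effect described by Shumailov et al. (2023).

```python
import random
import statistics

# Toy simulation of the ~B -> C feedback loop: each model "generation"
# is fit only to data sampled from the previous generation's model,
# standing in for an Internet corpus increasingly made of LLM output.
# All numbers here are invented for illustration.
random.seed(0)

mu, sigma = 0.0, 1.0      # generation 0: the human-written distribution
corpus_size = 200         # training examples per generation

for gen in range(1, 21):
    # The next corpus is drawn from the current model, not from humans.
    corpus = [random.gauss(mu, sigma) for _ in range(corpus_size)]
    # Refit the model to that synthetic corpus alone.
    mu = statistics.fmean(corpus)
    sigma = statistics.stdev(corpus)
    print(f"gen {gen:2d}: mu = {mu:+.3f}, sigma = {sigma:.3f}")

# Estimation error compounds across generations: mu wanders and sigma
# performs a downward-biased random walk, so run long enough the fitted
# distribution narrows and the tails disappear.
```

Each individual fit looks locally fine; the damage only shows up across generations, which is what would make ~B hard to detect from inside any single training run.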