Simon Willison on Nostr, replying to nprofile1qy2hwumn8ghj7un9d3shjtnddaehgu3wwp6kyqpqe066xta2y4zem54lp00xsqrzk2us4vw34gc7vmr0td2zm87qsqpqtmca30: "Everything Willison writes here actually just validates people’s suspicions of LLM-generated code!"
That was my intent with the piece
I want people to understand that blindly trusting code written by LLMs is a huge mistake: not because they might invent a method, but because even if the code compiles, that doesn't mean there aren't other, much more important (and less obvious) problems.