vnprc on Nostr: This is what I want! The most time consuming part of coding with an LLM is collecting ...
This is what I want!
The most time consuming part of coding with an LLM is collecting the right context to put in the prompt. But too much context can be just as bad. I will look into replit. 🙏
Published at
2024-11-08 18:41:35

Event JSON
{
  "id": "f438c00af729cc0652380c1fcf4e89cc023491c6c14b8e159ae56f1d3f1b5b7e",
  "pubkey": "d3052ca3e3d523b1ec80671eb1bba0517a2f522e195778dc83dd03a8d84a170e",
  "created_at": 1731091295,
  "kind": 1,
  "tags": [
    ["e", "3890b7bf2d154957863a2b425733737b1878b169c8da1cdc929da42822106a3a", "", "root"],
    ["e", "3826fb103631dd2d11b4e98e4bae1739ea07c9772a39340fe35d149ed63965a9", "", "reply"],
    ["p", "d3052ca3e3d523b1ec80671eb1bba0517a2f522e195778dc83dd03a8d84a170e"],
    ["p", "c4f5e7a75a8ce3683d529cff06368439c529e5243c6b125ba68789198856cac7"]
  ],
  "content": "This is what I want!\n\nThe most time consuming part of coding with an LLM is collecting the right context to put in the prompt. But too much context can be just as bad. I will look into replit. 🙏 ",
  "sig": "86699e93949d41e46aefda6f99231dec2ec5d04682f4a781e41ad096e71e7452c42dade1a50c4b2ed009e803c9d0984018813c5c99b5ad827f49e87bb699c792"
}
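For context, the `id` field in the event above is not arbitrary: under NIP-01, it is the SHA-256 hash of a canonical JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]`. A minimal Python sketch of that derivation follows; note that NIP-01 specifies exact escaping rules for the content string, and this sketch relies on `json.dumps` matching them, which may not hold for every edge case:

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    # NIP-01: the event id is the SHA-256 digest (hex) of the JSON
    # serialization [0, pubkey, created_at, kind, tags, content],
    # with no extra whitespace and non-ASCII characters kept as-is.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Illustrative call with placeholder values (not a recomputation
# of the event above):
event_id = nostr_event_id(
    pubkey="d3052ca3e3d523b1ec80671eb1bba0517a2f522e195778dc83dd03a8d84a170e",
    created_at=1731091295,
    kind=1,
    tags=[["e", "3890b7bf2d154957863a2b425733737b1878b169c8da1cdc929da42822106a3a", "", "root"]],
    content="This is what I want!",
)
print(len(event_id))  # 64 hex characters
```

The `sig` field is then a Schnorr signature over this 32-byte id, made with the key behind `pubkey`, which is how relays and clients verify the note without trusting each other.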