buttercat1791 on Nostr:
I think we should include optional data for LLM context. It could be additional tags, or, more likely, a link to a different event type.
Gherkin and LLMs are 🤝
You might generate Gherkin specs from a prompt as a way of documenting your code for others to validate or test. Or you might write test cases from a human-readable spec. LLMs could do that nicely, and when they do, we should have a way of providing the prompt and model that was used so other people can attempt to recreate it, or at least get a better understanding of where the Gherkin scenarios come from.
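A minimal sketch of what that could look like on the event level. Everything here is hypothetical: the tag names (`model`, the `"prompt"` marker on the `e` tag), the `kind` number, and the helper function are illustrative placeholders, not part of any published NIP.

```python
import json

def attach_llm_context(event, model, prompt_event_id):
    """Attach optional LLM-provenance tags to a Gherkin-spec event:
    the model that generated the scenarios, plus an 'e' tag linking
    to a separate event that holds the full prompt."""
    event["tags"].extend([
        ["model", model],                      # which LLM was used
        ["e", prompt_event_id, "", "prompt"],  # link to the prompt event
    ])
    return event

# Hypothetical Gherkin-spec event before tagging
spec_event = {
    "kind": 30000,  # placeholder kind, not a registered one
    "content": "Feature: Login\n  Scenario: Valid credentials",
    "tags": [],
}

tagged = attach_llm_context(spec_event, "gpt-4o", "a1b2c3")
print(json.dumps(tagged["tags"]))
```

Linking to a separate prompt event (rather than inlining the prompt as a tag) keeps large prompts out of the spec event itself while still letting others fetch the exact input and attempt to reproduce the scenarios.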