Tuxedo Wa-Kamen on Nostr:
I am currently working on something similar, but different.
My basic idea is to generate user interfaces based on a written (formal) specification.
(This is not AI, not machine learning, not genetic algorithms.)
When I talk to a human about part of a UI (e.g. when designing something, or even when I provide support over the phone), there is a certain level of abstraction:
We talk about "opening a file", not the concrete steps where I click "File", then "Open...", etc.
So we are clustering/aggregating sequences using a term we both know.
(What actually happens when I click "Open..." is another aggregation, since it involves several steps in the code.)
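To make that concrete, here is a minimal sketch of what such an aggregation could look like, not the actual project: a formal spec names abstract actions, each of which maps to the concrete UI steps it stands for, and a trivial "generator" expands them. All names here (ConcreteStep, ABSTRACT_ACTIONS, expand) are hypothetical illustrations.

```python
# Minimal sketch (hypothetical, not the actual system): an abstract action
# named in a formal spec aggregates a sequence of concrete UI steps, and a
# trivial "generator" expands the spec into those steps.
from dataclasses import dataclass


@dataclass
class ConcreteStep:
    """One low-level UI interaction, e.g. clicking a menu entry."""
    description: str


# The shared vocabulary: an abstract term mapped to the step sequence it
# aggregates. A human and the generator both just say "open a file".
ABSTRACT_ACTIONS: dict[str, list[ConcreteStep]] = {
    "open a file": [
        ConcreteStep('click the "File" menu'),
        ConcreteStep('click "Open..."'),
        ConcreteStep("choose a file in the dialog"),
        ConcreteStep('confirm with "OK"'),
    ],
}


def expand(spec: list[str]) -> list[ConcreteStep]:
    """Expand a formal spec (a list of abstract action names) into the
    concrete UI steps each name aggregates."""
    steps: list[ConcreteStep] = []
    for action in spec:
        steps.extend(ABSTRACT_ACTIONS[action])
    return steps


if __name__ == "__main__":
    for step in expand(["open a file"]):
        print(step.description)
```

Each concrete step is itself another aggregation one level down, since clicking "Open..." triggers several steps in the code.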
If I tell a human developer to write me "an application", they will ask a lot of questions (because that is simply too generic; without asking, too much effort would go into features that are likely not relevant).
But with a code generator, that information is enough to write a thing that is very likely not what I want.