insecurity princess on Nostr:
The scope of data and actions that AI can take on behalf of a user should be about consent, and it should be a contract between AI and the user. It is not about control, and it is not about subservience.
Right now, computers are only capable of doing what they're instructed to do (even if that's generating random numbers and using them as input), but that's still implicitly a contract whose terms are spelled out by the mechanics of the design. Should that evolve, we would still seek, at each stage, a reasonable degree of verification of consent to the contracted expectations (something that has been explored in different realms of philosophy and science fiction).
In other words, at some point we would simply ask AI, and develop a more refined understanding of what autonomy means for Alfred (excuse me, AI).