Prof. Emily M. Bender (she/her) on Nostr:
"Making chatbots that seem to apologize is a choice. Giving them cartoon-human avatars and offering up “Hello! How can I help you today?” instead of a blank input box: choices. Making chatbots that talk about their nonexistent “feelings” and pepper their responses with facial emojis is another choice."
>>
Published at 2023-08-06 21:14:01

Event JSON
{
  "id": "4c7e751d3933a7c2fd21139604831549263727f17a71fd8df9b904d90757d990",
  "pubkey": "13ec9fd5058a18cd097d105fd6ef43759e37d5915b1c01ed36acf0ef5a3e6f2a",
  "created_at": 1691356441,
  "kind": 1,
  "tags": [
    [
      "e",
      "48e8295c2f181c0de683aece1fd5810df49998b4ca94ccb36a034b125245f502",
      "wss://relay.mostr.pub",
      "reply"
    ],
    [
      "proxy",
      "https://dair-community.social/users/emilymbender/statuses/110844735779572986",
      "activitypub"
    ]
  ],
  "content": "\"Making chatbots that seem to apologize is a choice. Giving them cartoon-human avatars and offering up “Hello! How can I help you today?” instead of a blank input box: choices. Making chatbots that talk about their nonexistent “feelings” and pepper their responses with facial emojis is another choice.\"\n\n\u003e\u003e",
  "sig": "a344a79d961eac71291dc651d1dc6d3e216abbfe332b2528882395c5a607f811684a08ed3efee6d7feac4cf7251db5c57486349a2e6547e2437289e040cbf57d"
}
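For readers unfamiliar with the Nostr format: under NIP-01, the "id" field is the SHA-256 hash of the serialized array [0, pubkey, created_at, kind, tags, content], and "sig" is a signature over that id made with the key in "pubkey". Below is a minimal sketch, in Python, of recomputing the id of the event above; it assumes json.dumps with compact separators matches the NIP-01 escaping rules, and it elides the note text, so the hash will only match the published id when the full "content" string from the event is substituted.

import hashlib
import json

# Sketch (not a definitive implementation) of recomputing a Nostr event id
# per NIP-01: sha256 over the serialized array
# [0, pubkey, created_at, kind, tags, content].
pubkey = "13ec9fd5058a18cd097d105fd6ef43759e37d5915b1c01ed36acf0ef5a3e6f2a"
created_at = 1691356441
kind = 1
tags = [
    [
        "e",
        "48e8295c2f181c0de683aece1fd5810df49998b4ca94ccb36a034b125245f502",
        "wss://relay.mostr.pub",
        "reply",
    ],
    [
        "proxy",
        "https://dair-community.social/users/emilymbender/statuses/110844735779572986",
        "activitypub",
    ],
]
# Placeholder: substitute the full "content" string from the event above,
# including the trailing "\n\n>>", for the hash to match the published id.
content = "(full note text as in the content field above)"

serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)

The "sig" field is then a BIP-340 Schnorr signature over that id, which clients and relays can verify against "pubkey" with a secp256k1 library before displaying or relaying the event.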