Catherine Flick on Nostr: I'm really hoping you don't need me to tell you that using machine learning ("AI") ...
I'm really hoping you don't need me to tell you that using machine learning ("AI") systems to identify potential targets and then to suggest that they be targeted when at home with their family members, including children, is one of the most abhorrent, unethical, inhumane things I've ever seen. There is absolutely no excuse for developing these systems. Technology is never neutral.
https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes

Published at 2024-04-03 20:09:46

Event JSON
{
  "id": "ec26d2fcdb213945e7593a9a9818872fd59395e6c17bd72937f60fec861c6d32",
  "pubkey": "4c2ef938dac87e4fffd2dc8db9980de50d0a54af21220556598ccc8336cd2504",
  "created_at": 1712174986,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://mastodon.me.uk/users/CatherineFlick/statuses/112209099907056132",
      "activitypub"
    ]
  ],
  "content": "I'm really hoping you don't need me to tell you that using machine learning (\"AI\") systems to identify potential targets and then to suggest that they be targeted when at home with their family members, including children, is one of the most abhorrent, unethical, inhumane things I've ever seen. There is absolutely no excuse for developing these systems. Technology is never neutral. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes",
  "sig": "366188ca55874e964f481385a5dd00cacf67275ec4da3a7aa7253343ce9759e8759cdda19ccd3813555ff89aa1ef447a90b672f9bd46dd3efd9b3631706a93be"
}
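
For readers unfamiliar with the Nostr event format above: under NIP-01, the "id" field is the SHA-256 hash of a canonical serialization of the other event fields, so anyone can recompute it and check that the event has not been tampered with using nothing but a JSON library. A minimal Python sketch follows; the filename "event.json" is illustrative (any saved copy of the JSON object above works), and Python's compact json.dumps output matches NIP-01's canonical form for typical content like this.

import hashlib
import json

# Load the event shown above (the filename "event.json" is illustrative).
with open("event.json", encoding="utf-8") as f:
    event = json.load(f)

# NIP-01 defines the event id as the SHA-256 of the UTF-8 serialization of
# [0, pubkey, created_at, kind, tags, content] with no extra whitespace.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"],
     event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)
computed_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()

print("computed id:", computed_id)
print("matches    :", computed_id == event["id"])

Verifying the "sig" field as well would additionally require a BIP-340 Schnorr signature check of the id against the "pubkey" using a secp256k1 library, which is beyond this standard-library sketch.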