What is Nostr?
lucash.dev
npub1stt…67hq
2023-05-30 18:41:59

“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks, such as pandemics and nuclear war,”

Starting to think that this whole evil-AI thing is mostly another attempt at justifying globalism based on made-up existential threats.

Yes, AI can be dangerous — though that's more because humans are dumb than because machines are intelligent.

The more I hear “existential threat” thrown around, the more I think there actually isn’t any existential threat to mankind.

Not AI. Not viruses. Not nuclear war. Not even evil totalitarians.

None of that has any chance of ending human life — much less life in general.

Just as people invoking the “greater good” are usually advocating for something evil, those claiming “existential threats to mankind” are just trying to get away with threatening *you* without sounding evil.
Author Public Key
npub1sttsl959a2lvyufqrwkdrlqeg85ks65m72mgdsup5kmx9asqq2csaw67hq