Per Axbom on Nostr:
«[AGI] interprets the Turing Test as an engineering prediction, arguing that the machine “learning” algorithms of today will naturally evolve as they increase in power to think subjectively like humans, including emotion, social skills, consciousness and so on. The claims that increasing computer power will eventually result in fundamental change are hard to justify on technical grounds, and some say this is like arguing that if we make aeroplanes fly fast enough, eventually one will lay an egg.»
– Alan Blackwell in "Moral Codes"
https://moralcodes.pubpub.org

Published at 2024-07-20 11:30:28

Event JSON
{
  "id": "d274816aca5bbaccba9c7c44bd2d17fbac0bc16c35c3e5764ca0ae9c17134097",
  "pubkey": "a10318eb7d626f60c3cf54c89176dc43fdb58815bfd94c49e7aefb791e071ca0",
  "created_at": 1721475028,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://axbom.me/objects/1bc5a949-9595-424d-9f95-606de0825a29",
      "activitypub"
    ]
  ],
  "content": "«[AGI] interprets the Turing Test as an engineering prediction, arguing that the machine “learning” algorithms of today will naturally evolve as they increase in power to think subjectively like humans, including emotion, social skills, consciousness and so on. The claims that increasing computer power will eventually result in fundamental change are hard to justify on technical grounds, and some say this is like arguing that if we make aeroplanes fly fast enough, eventually one will lay an egg.»\n\n– Alan Blackwell in \"Moral Codes\"\nhttps://moralcodes.pubpub.org",
  "sig": "a49dc25ee49650009e0931d57dedab1d0d0586cb83b06672b7622baac3abc67bad875133b541084581017169f5fb3d5ff96d659734f8095b5f271b85397bae2c"
}
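As a sketch of where the `id` field above comes from: under Nostr's NIP-01 specification, an event id is the SHA-256 digest of a canonical JSON serialization of `[0, pubkey, created_at, kind, tags, content]` with no extra whitespace. A minimal Python illustration, using the fields from this event (the `content` string is abbreviated here, so the printed digest will not match the id above unless the full byte-exact content is supplied):

```python
import hashlib
import json

# Per NIP-01, a Nostr event id is the sha256 of the UTF-8 JSON
# serialization of [0, pubkey, created_at, kind, tags, content],
# serialized with no extra whitespace.
def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Fields from the event above; content is abbreviated, so this digest
# is illustrative only and will differ from the real id.
event_id = nostr_event_id(
    pubkey="a10318eb7d626f60c3cf54c89176dc43fdb58815bfd94c49e7aefb791e071ca0",
    created_at=1721475028,
    kind=1,
    tags=[["proxy",
           "https://axbom.me/objects/1bc5a949-9595-424d-9f95-606de0825a29",
           "activitypub"]],
    content="«[AGI] interprets the Turing Test as an engineering prediction, ...»",
)
print(event_id)  # a 64-character lowercase hex digest
```

The `sig` field is then a Schnorr signature over this id, which relays verify before accepting the event.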