Peter Butler on Nostr:
>> While Hagendorff notes in his more recent paper that the issue of LLM deception and lying is confounded by AI's inability to have any sort of human-like "intention" in the human sense
How can LLMs “lie” without intent?
It seems weird to base that sort of assertion around research into playing Diplomacy, where deception is a part of winning the game … I am confused
Published at 2024-06-10 15:10:22

Event JSON
{
  "id": "3b2ae992d669e718d23044c197b618167d2a47333c07b504738c1de60e33fff9",
  "pubkey": "2175d86ae3c844a0fb8fc07232d652865c08cb9f94cbe3b3f0e6b41290af5db0",
  "created_at": 1718032222,
  "kind": 1,
  "tags": [
    ["proxy", "https://mas.to/@peterbutler/112592959746171629", "web"],
    ["p", "402ae9f88b21098ac55e14135f105378c994639a4d6f57662ab1fd8444fe86fb"],
    ["e", "b14a3d2d004160b87a0c9d21294c472a2cc691de744bbfc736d91727fa12cf87", "", "root"],
    ["proxy", "https://mas.to/users/peterbutler/statuses/112592959746171629", "activitypub"],
    ["L", "pink.momostr"],
    ["l", "pink.momostr.activitypub:https://mas.to/users/peterbutler/statuses/112592959746171629", "pink.momostr"]
  ],
  "content": "\u003e\u003e While Hagendorff notes in his more recent paper that the issue of LLM deception and lying is confounded by AI's inability to have any sort of human-like \"intention\" in the human sense\n\nHow can LLMs “lie” without intent?\n\nIt seems weird to base that sort of assertion around research into playing Diplomacy, where deception is a part of winning the game … I am confused",
  "sig": "2e6ffb1c7499a8cac3b6d540f1596a61d3a5919c47a7a07f93d962220f10dcd0e58f1a4c087d40fa2e7e9947ba7e85d5d6bef82d778ceed6913ac2b1ba0854ec"
}
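
For context on the fields above: under Nostr's NIP-01, the "id" is the SHA-256 hash of a canonical serialization of the other event fields, and "sig" is a BIP-340 Schnorr signature over that id made with the key behind "pubkey". The Python sketch below recomputes the id under that assumption; it is illustrative only, since json.dumps only approximates NIP-01's escaping rules and the content must match the signed event byte for byte.

# Minimal sketch: recompute a Nostr event id per NIP-01, assuming the
# canonical form is the compact JSON array [0, pubkey, created_at, kind,
# tags, content] hashed with SHA-256. Exact equality with the "id" field
# above depends on matching NIP-01's escaping rules precisely.
import hashlib
import json

pubkey = "2175d86ae3c844a0fb8fc07232d652865c08cb9f94cbe3b3f0e6b41290af5db0"
created_at = 1718032222
kind = 1
tags = [
    ["proxy", "https://mas.to/@peterbutler/112592959746171629", "web"],
    ["p", "402ae9f88b21098ac55e14135f105378c994639a4d6f57662ab1fd8444fe86fb"],
    ["e", "b14a3d2d004160b87a0c9d21294c472a2cc691de744bbfc736d91727fa12cf87", "", "root"],
    ["proxy", "https://mas.to/users/peterbutler/statuses/112592959746171629", "activitypub"],
    ["L", "pink.momostr"],
    ["l", "pink.momostr.activitypub:https://mas.to/users/peterbutler/statuses/112592959746171629", "pink.momostr"],
]
content = (
    ">> While Hagendorff notes in his more recent paper that the issue of LLM "
    "deception and lying is confounded by AI's inability to have any sort of "
    'human-like "intention" in the human sense\n\n'
    "How can LLMs “lie” without intent?\n\n"
    "It seems weird to base that sort of assertion around research into playing "
    "Diplomacy, where deception is a part of winning the game … I am confused"
)

# NIP-01 canonical form: a JSON array with no extra whitespace, UTF-8 encoded.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # expected to equal the "id" field above if the serialization matches exactly

Verifying the "sig" field would additionally require BIP-340 Schnorr verification of the id against the pubkey, which is out of scope for this sketch.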