❽❶ on Nostr: ...
Published at 2024-12-23 03:53:31
Event JSON
{
"id": "60f37ce455512ac62c1337fe9524964ae6f29eedd640430e0bfbcba4ca73535d",
"pubkey": "0425ef020d398f2e7a7493364d1dbb72b59664cc75ba43e226cc2f5b5b0b3550",
"created_at": 1734926011,
"kind": 1,
"tags": [],
"content": "ランダムな文字列で質問し続けるとAIから有害な回答を引き出せるという攻撃手法「Best-of-N Jailbreaking」が開発される、GPT-4oを89%の確率で突破可能\nhttps://gigazine.net/news/20241223-ai-best-of-n-jailbreaking/",
"sig": "a8580796ad3c85143a24775e14498a8efbaf2b8b4cd19b4095201960d0a269aa8598f5458c90cd74383b5665878d1360087e6b8c1085691cb55f10a98c3f753b"
}
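Per NIP-01, the "id" field above is the SHA-256 hash of a canonical serialization of the event, and "sig" is a Schnorr signature over that hash by "pubkey". Below is a minimal sketch of the id check in Python; the event.json filename is a placeholder, the hash commits to the exact bytes as published (the original Japanese content, not the English rendering above), and a full verifier would also check the signature with a secp256k1 library.

import hashlib
import json

def compute_event_id(event: dict) -> str:
    # NIP-01: the id is the SHA-256 of the UTF-8 JSON serialization of
    # [0, pubkey, created_at, kind, tags, content], with no extra
    # whitespace and non-ASCII characters left unescaped.
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),
        ensure_ascii=False,
    ).encode("utf-8")
    return hashlib.sha256(serialized).hexdigest()

# Example: recompute the id for an event saved exactly as published.
with open("event.json") as f:  # placeholder filename
    event = json.load(f)
print(compute_event_id(event) == event["id"])  # True for an intact event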