Lukasz Olejnik on Nostr:
The AI Overview in the core, critical, Google search product underwent some form of risk assessment. So they knew about these issues, deciding to accept the risk. No sane person would jump off a bridge because a search engine advised it, right? As Google said, such quirky advice is given in a small number of cases. Perhaps those few individuals who might follow the strange advice (glue, bridge, rocks, etc.) were deemed an acceptable risk, a business decision that someone had to approve?
Published at 2024-05-26 08:46:40

Event JSON
{
  "id": "ae06bfba5295284414605dd4401be67bff1c3271978d88d44088f1adeedba6a8",
  "pubkey": "b400fa7aeef902979f3076a5e121644bebdc8d0a23ffb0eb724dab671d4b6493",
  "created_at": 1716713200,
  "kind": 1,
  "tags": [
    ["proxy", "https://mastodon.social/@LukaszOlejnik/112506516311759453", "web"],
    ["proxy", "https://mastodon.social/users/LukaszOlejnik/statuses/112506516311759453", "activitypub"],
    ["L", "pink.momostr"],
    ["l", "pink.momostr.activitypub:https://mastodon.social/users/LukaszOlejnik/statuses/112506516311759453", "pink.momostr"]
  ],
  "content": "The AI Overview in the core, critical, Google search product underwent some form of risk assessment. So they knew about these issues, deciding to accept the risk. No sane person would jump off a bridge because a search engine advised it, right? As Google said, such quirky advice is given in a small number of cases. Perhaps those few individuals who might follow the strange advice (glue, bridge, rocks, etc.) were deemed an acceptable risk, a business decision that someone had to approve?",
  "sig": "bce6e1fd8ffafb651dc79c733ce249f591161ebaac36ea003fe854a3a6ebd021f532c3085cb7c2e0ec4632647ce110155fd615779194d6081e5ebd9b69d9b83a"
}
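The `id` field above is not arbitrary: under the Nostr protocol (NIP-01), it is the SHA-256 digest of a canonical JSON serialization of the array `[0, pubkey, created_at, kind, tags, content]`, with compact separators and non-ASCII characters left unescaped. A minimal Python sketch of that derivation, using the fields from the event above (it should reproduce the event's `id` provided the serialization matches NIP-01 exactly):

```python
import hashlib
import json

def nostr_event_id(pubkey, created_at, kind, tags, content):
    # NIP-01: id = sha256 over the UTF-8 bytes of the JSON array
    # [0, pubkey, created_at, kind, tags, content], serialized with
    # no extra whitespace and without escaping non-ASCII characters.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Fields copied from the Event JSON above.
tags = [
    ["proxy", "https://mastodon.social/@LukaszOlejnik/112506516311759453", "web"],
    ["proxy", "https://mastodon.social/users/LukaszOlejnik/statuses/112506516311759453", "activitypub"],
    ["L", "pink.momostr"],
    ["l", "pink.momostr.activitypub:https://mastodon.social/users/LukaszOlejnik/statuses/112506516311759453", "pink.momostr"],
]
content = "The AI Overview in the core, critical, Google search product underwent some form of risk assessment. So they knew about these issues, deciding to accept the risk. No sane person would jump off a bridge because a search engine advised it, right? As Google said, such quirky advice is given in a small number of cases. Perhaps those few individuals who might follow the strange advice (glue, bridge, rocks, etc.) were deemed an acceptable risk, a business decision that someone had to approve?"

event_id = nostr_event_id(
    "b400fa7aeef902979f3076a5e121644bebdc8d0a23ffb0eb724dab671d4b6493",
    1716713200,
    1,
    tags,
    content,
)
print(event_id)  # 64-char hex digest; should equal the event's "id" field
```

The `sig` field is then a Schnorr signature over this id, made with the key behind `pubkey`, which is what lets relays verify the event without trusting its source.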