Event JSON
{
  "id": "fe0f7386fa13eccc9ad6751d31d058b16f5c5b98eae1d35cb5cb5472349610e4",
  "pubkey": "0a384e4c563f1bf3d99639808ef67e02bbeba3e1885a11c1a38379166d2d7207",
  "created_at": 1693248824,
  "kind": 1,
  "tags": [
    [
      "p",
      "13457137ae72711251f2b32c16e55737e62e0e26e4a1dc859d3b43d60d7610a6",
      "wss://relay.mostr.pub"
    ],
    [
      "p",
      "7e5f6075ae409059a0e6a3cff82cfcc2c6fdb30cc48b5dd029620459f1da9ad5",
      "wss://relay.mostr.pub"
    ],
    [
      "e",
      "778da1ac9367937ee948a4eb0efd4e93d8ff2327310530b127fbcc569b005d03",
      "wss://relay.mostr.pub",
      "reply"
    ],
    [
      "proxy",
      "https://mastodon.social/users/nande/statuses/110968754992701440",
      "activitypub"
    ]
  ],
  "content": "nostr:npub1zdzhzdawwfc3y50jkvkpde2hxlnzur3xujsaepva8dpavrtkzznqjvtl8u It's not just folklore. For the time being, the \"hard\" information AI services give should not be trusted, as they are mostly \"autocorrect on steroids\" (heard a scientist use this expression and really liked it). A few months ago I was doing research for a program (I'm a TV writer), tried to use ChatGPT to save me some googling and asked it for some articles on my subject. It gave me some articles, sure, but it made them all up!",
  "sig": "68e5041c8833d923fa9c2de5624fd1e6504e2b6f1de0854759c40825a084ca9b40f081f74251c2e716b043d9a655315c14de6cf0eba935c1aaa9d3c1c29ccf28"
}
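
The "id" field of an event like this is defined by NIP-01 as the SHA-256 hash of the compact JSON serialization of [0, pubkey, created_at, kind, tags, content]. The following Python sketch shows how a client could recompute that hash and check it against the "id" shown above; the function name compute_event_id and the variable raw_event_json are illustrative, not part of any particular library.

import hashlib
import json

def compute_event_id(event: dict) -> str:
    # NIP-01 serialization: a JSON array with no extra whitespace,
    # UTF-8 encoded, then hashed with SHA-256 and hex-encoded.
    serialized = json.dumps(
        [
            0,
            event["pubkey"],
            event["created_at"],
            event["kind"],
            event["tags"],
            event["content"],
        ],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Usage: load the event JSON above into a dict and compare.
# event = json.loads(raw_event_json)
# assert compute_event_id(event) == event["id"]

Verifying the "sig" field additionally requires a Schnorr signature check over the id with the event's pubkey (e.g. via a secp256k1 library), which is omitted here.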