Event JSON
{
  "id": "a41275cff850ccdccdbd27833bcd118e10176172d7c7da2e80915f006d545f20",
  "pubkey": "d011432e09e02b7500b12b15a7cc6a27e5f155d1fed4c159313d8c710963a65b",
  "created_at": 1688574866,
  "kind": 1,
  "tags": [
    [
      "p",
      "25b5e014ee0aa07ec0710b596db77bbc4d5a6b491529ba0fec7438f6cb4a8794",
      "wss://relay.mostr.pub"
    ],
    [
      "p",
      "a86b3b2a8b37dfacd5b31841e8a1e2f4a037897fbaa361a7d17f48d54a06e909",
      "wss://relay.mostr.pub"
    ],
    [
      "e",
      "6a9091fe5cae36601b655e738ca02d971fb9d6e8060b3af47eabc6383c83d47a",
      "wss://relay.mostr.pub",
      "reply"
    ],
    [
      "mostr",
      "https://mastodon.social/users/AnthonyBaker/statuses/110662442472714698"
    ]
  ],
  "content": "nostr:npub1yk67q98wp2s8asr3pdvkmdmmh3x4566fz55m5rlvwsu0dj62s72q804zz4 Outside of blocking bots via those that honor robots.txt, isn’t Google ALREADY data scraping for the purposes of SEO indexing? And in support of this, we have LD+JSON and Schema.org being used by publishers and sites to more CLEARLY represent what their content is in a structured data format for bots. I mean, publishers rolled out RED CARPETS for years using these tools — and they care about getting it right. LLM/AI apps are piggybacking on all of this.",
  "sig": "e0025fc05c6617a2da6421dba3cfa7b31cae62b6683eeccb0c79e24e2ab0f8432a60ffa87db12906ec22e5033527502b4a6cb5307780986ebf463a1a7fa11890"
}
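
The "id" field above is derived, not arbitrary: per NIP-01, it is the lowercase-hex SHA-256 of a compact JSON serialization of [0, pubkey, created_at, kind, tags, content], and "sig" is a BIP-340 Schnorr signature over that id made with "pubkey". A minimal sketch of the id computation in Python (the compute_event_id name is illustrative; it assumes a well-formed event dict and relies on json.dumps matching NIP-01's escaping, which holds for typical content like this event):

```python
import hashlib
import json

def compute_event_id(evt: dict) -> str:
    # NIP-01: serialize [0, pubkey, created_at, kind, tags, content]
    # with no extra whitespace, then hash with SHA-256.
    serialized = json.dumps(
        [0, evt["pubkey"], evt["created_at"], evt["kind"], evt["tags"], evt["content"]],
        separators=(",", ":"),  # compact form, no whitespace
        ensure_ascii=False,     # raw UTF-8 bytes, not \uXXXX escapes
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()
```

Running compute_event_id on the parsed event should reproduce its "id"; checking "sig" additionally requires a BIP-340 Schnorr verifier, which the standard library does not provide.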