What is Nostr?
Kee Hinckley /
npub1m6p…3u3l
2024-01-25 14:42:04


I’ve been reading a lot of non-computer-related informational and how-to sites the past few months, and I’m starting to realize why LLM generators have such a verbose and roundabout description style. They didn’t make it up, that’s become the voice of the web and they adopted it.

The other day I was looking for tips on reducing back pain while washing dishes and the site went on for pages before saying “use a step stool”.

The old-style web, like old-style newspaper articles, got to the point immediately and then provided increasing levels of detail. The reader could learn what they needed to know right away, then keep reading for more depth if they wanted it.

The new style is to pad the article to boost SEO, and to bury the answer at the end to increase advertising revenue.

I hate it.

Although at least nobody will ever suspect my writing of being generated by a generic LLM.

There’s a great printer review on the web where the author was recommending the best laser printer for a home office. The answer was clearly (and I happen to agree) a Brother laser printer. The author didn’t care about the additional ad revenue, but they couldn’t ignore the SEO issue. So they answered the question in the first paragraph, told the reader to stop reading, and then let an LLM generate filler for the rest. (Read it, it’s amusing, in a dystopian way. https://www.theverge.com/23642073/best-printer-2023-brother-laser-wi-fi-its-fine).

The downside of LLM-generated content, of course, is that now every time you see something odd in an article, you start to wonder if it’s generated and shouldn’t be trusted. And while there’s something to be said for trusting stuff less, this isn’t the right path.

Today I was reading a helpful article about plumbing S-traps vs. P-traps and I hit this gem.

> Do you live in a very dry climate?
>
> Then it would be great to check the level of water within the trap.
>
> There’s a chance all the water from there will evaporate.
>
> If this happens, flush a large amount of money through the line and refill the trap.

Human mistake? Easter egg? LLM. I don’t know, but I laughed at least.

Random thought: If you want to make something useful from “AI”, make a browser plugin that reformats articles to work the old way.

#LLM #AI
Author Public Key
npub1m6pdyhqp73w7dv5997lqcxf4cfhfkcrg8zgwv80ql0x8xczyf7lsaq3u3l