翠星石 (npub1x2v…ar98)
2024-10-28 15:41:28
in reply to nevent1q…230u


That Would Be Telling (npub1kky…mqcg): Hello, proprietary master lover.

>White line illegal
No, it wasn't.

Making a well-formed request to download files from a server for academic research purposes is not illegal, nor should it be.

>Was done at MIT via breaking into a machine room he wasn't authorized to be in
He didn't break anything in the room; he just used the internet connection that was there and, seemingly, a negligible amount of power.

If you don't break anything in a room, steal anything, or get in anyone's way, why does it matter that your access to that room is technically not authorized?

>His bot hammered JSTOR to the point the latter had to shut off MIT's access
If his file downloader had really hammered JSTOR that hard, they would have just IP-blocked MIT. It appears that JSTOR didn't even notice his downloading at first, due to the trivial amount of bandwidth used, although they did eventually notice the unwanted academic research and put a stop to it.

>"35 years" was the prosecutor's opening gambit, not was he was really facing
A legitimate prosecutor would announce the actual sentence they are seeking, in order to prevent a future wrong, not some ridiculous sentence meant to punish him for daring to attempt useful academic research.

>Self-admitted mentally ill people shouldn't be doing shit like this
Why shouldn't mentally ill people be allowed to do academic research?

>What "AI" companies are doing, if it's plagiarization, is very much up in the air legally (fair use is a thing) as well as morally.
What proprietary language model companies are doing is copyright infringement for profit, as well as taking things that were free and making them proprietary, which is morally unacceptable. Too bad big businesses are often allowed to get away with taking freedom and infringing copyright as a treat.

Fair use depends on a use actually being fair, and it cannot result in a profit.

>It's certainly more akin to what human authors do every day than what Swartz did.
Proprietary language model companies run scrapers 24/7 that hammer servers far harder than Swartz ever could, downloading as much data as possible.

Human authors do not ingest books, mix them together randomly, and vomit out blended sentences with a few changes.