Carlos Guerreiro on Nostr
npub129hy4ks3znggpc53w2pc972pmjsh9yftesua6nudx7plfzyyeugsmny2hs
I was there 3000 years ago... I mean, in the 80s and 90s, writing software.
Indeed, we didn't often think code might last for decades.
Honestly, I don't think that had much to do with expecting AI to be around the corner.
It had a lot more to do with the pace at which we saw computer architectures and platforms come and go, and with the immaturity of the software industry.
I started programming as a kid on an 8-bit ZX Spectrum clone, in BASIC and Z80 assembler. By the time I was out of university and programming professionally, I had seen 16-bit computers (Atari ST, then Commodore Amiga) with their bespoke OSs and 68k assembler, and eventually Linux (and Windows) on commoditized PC architectures.
Every few years there was a big break and most code didn't carry over.
Things settled down not long before Y2K, and they've changed much more slowly since.