liminal 🦠
npub1m3x…a5sf
2024-08-12 18:44:50


I keep thinking about the paper "How Learning Can Guide Evolution" (Hinton & Nowlan), a seminal paper on AI and evolution. Some background that I hope doesn't muddy up the original topic: picture a complex 'fitness landscape' with various hills, and agents whose genetics place them at different starting points on that landscape. During its lifetime, each agent uses "learning" to climb: learning uses local information to move uphill toward higher fitness, but only within a single lifetime, and whatever was learned is forgotten at death. Evolution ends up selecting the agents that learned and climbed the most, because they started in the positions best suited for climbing. The collective effect is that, over generations, the population climbs higher and higher because of this selection pressure.
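For the curious, here is a minimal Python sketch of that dynamic. It is not Hinton & Nowlan's actual model, just a toy illustration under my own assumptions: every name and parameter here (the bumpy `fitness` function, the `learn` hill-climber, the `evolve` loop, step sizes, population size) is made up. Agents inherit only a genetic starting point, hill-climb ("learn") during a lifetime, and selection favours whoever ends up highest, while the learned position itself is never inherited.

```python
# Toy sketch of learning guiding evolution (Baldwin-effect style).
# Not the paper's model: agents inherit only a genetic starting point,
# hill-climb ("learn") during a lifetime, and selection favours those
# that reach the highest fitness. Learned gains die with the agent.
import math
import random

def fitness(x):
    # A bumpy 1-D landscape with several hills of different heights (assumed).
    return math.sin(3 * x) + 0.5 * math.sin(7 * x) + 0.1 * x

def learn(x, steps=50, step_size=0.05):
    # Lifetime learning: greedy local hill climbing from the genetic start x.
    for _ in range(steps):
        candidate = x + random.uniform(-step_size, step_size)
        if fitness(candidate) > fitness(x):
            x = candidate
    return x  # position reached by the end of the lifetime

def evolve(pop_size=100, generations=50, mutation=0.05):
    # Genes are just starting positions; learned positions are never inherited.
    genes = [random.uniform(0.0, 3.0) for _ in range(pop_size)]
    for gen in range(generations):
        # Fitness *after* learning decides who reproduces...
        scores = [fitness(learn(g)) for g in genes]
        ranked = sorted(zip(scores, genes), reverse=True)
        parents = [g for _, g in ranked[: pop_size // 2]]
        # ...but offspring inherit only the parent's genetic start (plus noise).
        genes = [random.choice(parents) + random.gauss(0, mutation)
                 for _ in range(pop_size)]
        mean_genetic = sum(fitness(g) for g in genes) / pop_size
        print(f"gen {gen:2d}: mean genetic fitness = {mean_genetic:.3f}")

if __name__ == "__main__":
    evolve()
```

Run it and the mean fitness of the *genetic* starting points drifts upward over generations, even though nothing learned is ever passed on, which is the point of the paper in miniature.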

How this relates: all individuals collectively move toward better situations over generations. In practice, is the divide growing between populations at the top of the hill and those near the bottom? Sure, but we are all going through the same process of hill climbing. What is the goal? Meaning maximization. We use sub-goals to frame where we want to go: "be successful", "have a family", "become a teacher", "be the next Steve Jobs". These goals of success are all in service of meaning maximization. The starting conditions are different for everyone, and so are the end goals, because what the end looks like depends on each individual's path, experience, and the needs of the situation they arrive at. Regardless, we collectively try to move in that direction because we see it as a better situation than now.

So, just to sum up - we define end goals (like success), but they are always in service of meaning maximization. If you know the direction of meaning maximization, the exact end goal, say "be the next Steve Jobs", doesn't matter much, because the outcome will always look different; as long as you are maximizing your meaning, what you end up doing will be the most meaningful thing for you. Hope I didn't get too off track 😅