mleku on Nostr: i've been thinking about data compression optimization for especially follow events ...
i've been thinking for some time about data compression optimization, especially for follow events
i am going to look at my event store code and ponder me a way to do this
being able to flatten these heinously large event kinds down to a list of monotonic indexes would enable me to do things like spidering the network for these events, because the bigger cost is the database storage; the network bandwidth isn't as big a deal
having the relay proactively gather as many profile, follow and mute lists as possible (not extremely aggressive, but fairly aggressive) would be a nice feature for enhancing discoverability, and a good benefit to paid relay users, who could then often get this basic data about network participants without a heinous data storage cost
👀 looking
quoting nevent1q…38yt
and yeah, such a relay you probably would want to devise an npub compression scheme where it flattens the lists down by using a monotonic index number for each pubkey instead of storing them over and over and over again, and uses a variable length encoding so the actual size of follow events it stores is tiny
sounds like a fun project but it would take me a week or two to do it in parallel with my main paid gig