mleku on Nostr:
protobuf ... it's been some time since i used it, but fortunately binary encoding of events requires little more than allocating the struct itself, which is mostly just a series of byte slice headers...
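roughly what that struct ends up looking like (a sketch only, the names here are illustrative rather than the actual types):

package event

// Tag is one event tag: a list of byte-slice elements.
type Tag [][]byte

// Event is the runtime form of a nostr event: everything is a byte slice
// except the timestamp and the kind.
type Event struct {
	ID        []byte // 32-byte event id, raw binary rather than hex
	Pubkey    []byte // 32-byte public key
	CreatedAt int64  // unix timestamp, kept in its native form
	Kind      int32  // the one field that needs copying, into an int32
	Tags      []Tag
	Content   []byte
	Sig       []byte // 64-byte signature
}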
the timestamp can even stay in its native form, more or less; only the kind needs to be copied into an int32... i already took care to make EVERYTHING into byte slices, even the fixed-size fields, because comparing the actual binary data in a search is roughly 2x faster than comparing hex strings (half the bytes), so the filters already decode the fields that should be binary (id, pubkey, sig) and run the comparisons against binary, and everything else is bytes just because ... well
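and the payoff in the filters looks something like this (again just a sketch, the function names are illustrative, not the real filter code): the hex id from the JSON filter gets decoded once, and every match after that is a straight 32-byte compare.

package filter

import (
	"bytes"
	"encoding/hex"
)

// decodeID converts a 64-character hex id from a JSON filter into its
// 32-byte binary form, done once when the filter is parsed.
func decodeID(h string) ([]byte, error) {
	return hex.DecodeString(h)
}

// matchesID reports whether a raw binary event id matches any of the
// already-decoded filter ids, a plain byte compare instead of a hex compare.
func matchesID(eventID []byte, filterIDs [][]byte) bool {
	for _, id := range filterIDs {
		if bytes.Equal(eventID, id) {
			return true
		}
	}
	return false
}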
this keeps the door open for the easy optimizations i tried to do with my binary encoder but never got right, especially the tags... i think there are some glitches in the content fields as well... i know the round trip from JSON to the runtime format that does the comparisons and back to JSON is 100% fine, it's just that the binary encoding breaks on some special cases, mostly e and p and a tags
anyhow, fuck it, i want this working 100% this weekend, so i will swap the binary encoder over to this, then run my categorizer and see if ANYTHING fails once the fancy optimizations are gone
probably none lol