jb55 on Nostr:
It becomes an issue when you want to do more ambitious things. Current clients are memory-constrained. Not needing to worry about memory, and letting the OS page things in and out for any size of DB when doing algo queries and other kinds of local processing, is quite nice.
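Roughly what that buys you, as a sketch: map the whole event store into the address space and let the kernel decide what stays resident. This uses the memmap2 crate and a made-up fixed-size record layout, not the actual nostrdb format.

```rust
// Sketch: memory-map a big event log and scan it without loading it all
// into RAM. The OS pages chunks in and out on demand, so the same code
// works whether the file is 10 MB or 50 GB.
// Assumes the memmap2 crate; the fixed-size record layout is made up.
use std::fs::File;
use memmap2::Mmap;

const RECORD_SIZE: usize = 256; // hypothetical on-disk event record size

fn count_kind(path: &str, kind: u8) -> std::io::Result<usize> {
    let file = File::open(path)?;
    // Safety: the file must not be truncated while it is mapped.
    let mmap = unsafe { Mmap::map(&file)? };
    // Iterating touches pages lazily; untouched regions never take up
    // physical memory, so the working set stays small even on a huge DB.
    Ok(mmap
        .chunks_exact(RECORD_SIZE)
        .filter(|rec| rec[0] == kind)
        .count())
}
```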
Local-first also enables lots of use cases: using the app offline and having everything resync via negentropy when you’re back online, fast local fulltext search that never leaves the device, p2p and local-network syncing…
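A minimal sketch of that local-first write/resync path. All the names here are placeholders rather than any real client’s API, and a real implementation would reconcile with negentropy (NIP-77) instead of the naive id-diff shown.

```rust
// Sketch: writes always land in the local store first, so the app works
// offline; a background pass reconciles with remote relays once we're
// back online. `LocalStore`, `Relay`, and `Note` are hypothetical.
#[derive(Clone)]
struct Note { id: [u8; 32], json: String }

trait LocalStore {
    fn insert(&mut self, note: Note);
    fn all_ids(&self) -> Vec<[u8; 32]>;
    fn get(&self, id: &[u8; 32]) -> Option<Note>;
}

trait Relay {
    fn have_ids(&self) -> Vec<[u8; 32]>;
    fn send(&mut self, note: Note);
    fn fetch(&mut self, id: &[u8; 32]) -> Option<Note>;
}

// Publishing never blocks on the network: local write first, remote later.
fn publish(store: &mut impl LocalStore, note: Note) {
    store.insert(note);
}

// Called when connectivity returns: push what the relay is missing,
// pull what we are missing. A real client would do this set
// reconciliation with negentropy instead of shipping full id lists.
fn resync(store: &mut impl LocalStore, relay: &mut impl Relay) {
    let remote = relay.have_ids();
    for id in store.all_ids() {
        if !remote.contains(&id) {
            if let Some(note) = store.get(&id) {
                relay.send(note);
            }
        }
    }
    let local = store.all_ids();
    for id in remote {
        if !local.contains(&id) {
            if let Some(note) = relay.fetch(&id) {
                store.insert(note);
            }
        }
    }
}
```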
The local algo thing is cool because the wasm algo API can have full query capability, assuming a local DB is exposed as an API, so local algos will be much faster. Coding everything against an in-process local relay enables lots more use cases and increases reliability in the presence of rogue relays in the outbox model… I need to write this up 😅
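And the in-process relay idea, as a rough sketch: algos (wasm or otherwise) are written against one relay-shaped query interface, and the local DB simply implements it. Every name here (`RelayLike`, `LocalRelay`, `Filter`, `Event`) is made up, and the wasm host-binding plumbing is left out.

```rust
// Sketch: client code and sandboxed algos are all written against one
// relay-shaped query interface; whether it's backed by the embedded DB
// or a network relay is an implementation detail. Local queries never
// leave the process, so an algo's results can't be skewed by a rogue
// remote relay. All names here are hypothetical.
#[derive(Clone)]
struct Event { id: String, kind: u16, pubkey: String, content: String }

struct Filter { kinds: Vec<u16>, authors: Vec<String> }

trait RelayLike {
    fn query(&self, filter: &Filter) -> Vec<Event>;
}

// In-process "relay" backed by the local store.
struct LocalRelay { events: Vec<Event> }

impl RelayLike for LocalRelay {
    fn query(&self, filter: &Filter) -> Vec<Event> {
        self.events
            .iter()
            .filter(|e| filter.kinds.is_empty() || filter.kinds.contains(&e.kind))
            .filter(|e| filter.authors.is_empty() || filter.authors.contains(&e.pubkey))
            .cloned()
            .collect()
    }
}

// An "algo" is then just a function over the query interface; run
// against LocalRelay it's a fast in-memory scan, no network involved.
fn most_recent_note(relay: &impl RelayLike, author: &str) -> Option<Event> {
    relay
        .query(&Filter { kinds: vec![1], authors: vec![author.to_string()] })
        .into_iter()
        .last()
}
```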