Thought experiment: A microservice should almost never allocate memory that survives longer than the servicing of a single request. But almost all modern languages ignore this, instead spamming stuff on heaps and then GC’ing them.
Languages have internal bump allocators (“nurseries”) and do minor GCs when these fill up. Wouldn’t it make *much* more sense to size the nursery to cover the 99.999th percentile of per-request allocation, and just zap the nursery at the end of each request?