MachuPikacchu / Machu Pikacchu
npub1r6g…gmmd
2024-09-03 13:31:15

There’s a new alternative to DiLoCo [1] for training large-scale AI models over the internet, called DisTrO [2]. It enables low-latency training over low-bandwidth communication channels (i.e. slow internet).
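One common way to train over low-bandwidth links is to shrink what each step transmits, e.g. by sending only the largest-magnitude gradient entries. This is a generic illustration of that idea, not DisTrO’s actual algorithm; the function names, the choice of `k`, and the sparse (index, value) wire format are all assumptions here.

```python
import numpy as np

def compress_topk(grad: np.ndarray, k: int):
    """Keep the k largest-magnitude entries; return (indices, values)."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def decompress(idx: np.ndarray, vals: np.ndarray, shape) -> np.ndarray:
    """Peer-side reconstruction: dense gradient, zeros elsewhere."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)

grad = np.array([[0.1, -2.0, 0.03],
                 [1.5, -0.2, 0.0]])
idx, vals = compress_topk(grad, k=2)          # transmit 2 of 6 entries
restored = decompress(idx, vals, grad.shape)  # lossy but sparse on the wire
```

Sending 2 of 6 entries here is a 3x reduction in communication per step, at the cost of dropping the small gradient components (real systems typically accumulate the dropped residual locally).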

Methods like these are a crucial component of any decentralized AI system that could rival big tech companies and nation-state actors.

The next step is to figure out monetary rewards for contributing to training and inference. The tricky part is to weed out bad training data in a decentralized way. Perhaps we could use something like a “mempool” for training data batches?
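The mempool idea above could work roughly like Bitcoin’s: peers submit candidate batches, validators attest to their quality, and a batch only becomes eligible for training once enough independent peers have vouched for it. This is a hypothetical sketch of that flow; the class, the quorum threshold, and the attestation scheme are all invented for illustration.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class BatchMempool:
    """Hypothetical mempool: batches wait for a quorum of attestations."""
    quorum: int = 3
    pending: dict = field(default_factory=dict)       # batch_id -> payload
    attestations: dict = field(default_factory=dict)  # batch_id -> peer ids

    def submit(self, payload: bytes) -> str:
        # Content-address the batch so duplicates collapse to one entry.
        batch_id = hashlib.sha256(payload).hexdigest()
        self.pending[batch_id] = payload
        self.attestations.setdefault(batch_id, set())
        return batch_id

    def attest(self, batch_id: str, peer_id: str) -> None:
        # A peer vouches that the batch passed its local quality checks.
        if batch_id in self.pending:
            self.attestations[batch_id].add(peer_id)

    def accepted(self) -> list:
        """Batches with enough independent attestations to train on."""
        return [b for b, peers in self.attestations.items()
                if len(peers) >= self.quorum]

pool = BatchMempool(quorum=2)
bid = pool.submit(b"tokenized training batch ...")
pool.attest(bid, "peer-a")
pool.attest(bid, "peer-b")   # quorum reached; batch is now trainable
```

The hard part the post points at remains open: making the attestations themselves trustworthy (e.g. tying them to stake or reputation) so that colluding peers can’t push bad data through the quorum.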


1. https://arxiv.org/abs/2311.08105

2. (PDF) https://github.com/NousResearch/DisTrO/blob/main/A_Preliminary_Report_on_DisTrO.pdf

#ai #llm
Author Public Key
npub1r6ggl0qazvwp02rlxgrf75lkfazuwhu35tmdg0u25eqsjax6243qh4gmmd