lhl on Nostr: #llm Yesterday, something incredibly cool was released. The first (afaik) open LoRA ...
Published at 2023-08-24 04:46:41
Event JSON
{
"id": "93287da8bb821cc071dcf35f4a1f086ba099d07b95cc547788ff9102ccd43c10",
"pubkey": "d51a9f9d82c96abc102f130164175a04b52fa1a3b77cbd35502fb6c1b24b72f3",
"created_at": 1692852401,
"kind": 1,
"tags": [
[
"t",
"llm"
],
[
"proxy",
"https://fediverse.randomfoo.net/objects/e03a4f75-f606-4b71-81d8-8fb38b4a1dc1",
"activitypub"
]
],
"content": "#llm Yesterday, something incredibly cool was released. The first (afaik) open LoRA MoE proof of concept - it lets you run 6 llama2-7B experts on a single 24GB GPU. Here’s my setup notes: https://llm-tracker.info/books/howto-guides/page/airoboros-lmoe",
"sig": "e2512e61e12baf0d16f3d081610b3808472622743135ccc5099bed6a985cc225de237dd0f8d3ce6958b9c7ffbf25fc973b925803cb387c6fe42a97b30b8d8e9c"
}
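The note's linked setup guide covers airoboros-lmoe, a LoRA-based mixture-of-experts: one frozen llama2-7B base is loaded once, and the "experts" are lightweight LoRA adapters that are swapped in per request, which is why several of them fit on a single 24GB GPU. As a rough illustration of that idea (not the guide's actual code), here is a minimal sketch using transformers and peft; the adapter repo names and expert labels are hypothetical placeholders.

```python
# Minimal sketch of a LoRA "mixture of experts" on one GPU: a single
# llama2-7B base is loaded once (4-bit to fit comfortably in 24 GB) and
# several LoRA adapters ("experts") are attached and switched per request.
# Adapter repo names and expert labels below are hypothetical.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
    device_map="auto",
)

# Attach the first expert, then register the rest under distinct names.
experts = ["creative", "coding", "reasoning"]  # hypothetical expert names
model = PeftModel.from_pretrained(
    base, f"my-org/lmoe-{experts[0]}", adapter_name=experts[0]
)
for name in experts[1:]:
    model.load_adapter(f"my-org/lmoe-{name}", adapter_name=name)

def generate(prompt: str, expert: str, max_new_tokens: int = 128) -> str:
    # A real router would pick the expert automatically (e.g. by embedding
    # similarity to each expert's training data); here the caller chooses.
    model.set_adapter(expert)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)

print(generate("Write a haiku about GPUs.", expert="creative"))
```

The memory math only works because the base weights are shared: each LoRA adapter adds only its small low-rank matrices, so even a handful of experts costs little beyond the single quantized 7B model.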