John Spurlock on Nostr
one of the slickest run-llms-locally-in-the-browser demos I've seen
private mistral / llama 3 / others run great in a browser tab on a macbook air (leverages webgpu)
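The WebGPU dependency mentioned above is detectable at runtime; a minimal sketch of the feature check such a page could run before loading a model (the `NavigatorLike` type and `supportsWebGPU` helper are illustrative, not from the demo's source):

```typescript
// WebGPU is exposed to pages as `navigator.gpu`; the property is simply
// absent on browsers without WebGPU support. `nav` stands in for the
// browser's global `navigator` object so the check is testable anywhere.
type NavigatorLike = { gpu?: unknown };

function supportsWebGPU(nav: NavigatorLike): boolean {
  return nav.gpu !== undefined;
}
```

In a real page this would gate the model download, falling back to a WASM backend or an error message when the check returns false.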
https://secretllama.com/

Published at 2024-05-05 22:45:28

Event JSON
{
  "id": "8eee5abdab337608999e5bb8b656fb6e1f0b041f40bfdbc92132c1849f8fb6fe",
  "pubkey": "4299afec593da6053a215436638da2c07fc0229beb90909d314defc9a34c2f97",
  "created_at": 1714949128,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://podcastindex.social/@js/112390906087582345",
      "web"
    ],
    [
      "proxy",
      "https://podcastindex.social/users/js/statuses/112390906087582345",
      "activitypub"
    ],
    [
      "L",
      "pink.momostr"
    ],
    [
      "l",
      "pink.momostr.activitypub:https://podcastindex.social/users/js/statuses/112390906087582345",
      "pink.momostr"
    ]
  ],
  "content": "one of the slickest run-llms-locally-in-the-browser demos I've seen\n\nprivate mistral / llama 3 / others run great in a browser tab on a macbook air (leverages webgpu)\n\nhttps://secretllama.com/",
  "sig": "5ed9edbeeae742b1b3b76f49310e394d5b38154c2548ff22aa251a08781858327ccce644c718eb1a05c03090be5f063e2c5cf7a29efb697b53f111e04bb8dd5c"
}
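The `id` field above is not arbitrary: per NIP-01, it is the SHA-256 of a canonical serialization of the event's other fields. A sketch of that computation (the `computeEventId` helper name is illustrative; the serialization rule is from NIP-01):

```typescript
import { createHash } from "node:crypto";

// Per NIP-01, a Nostr event id is the lowercase hex SHA-256 of the UTF-8
// JSON serialization of [0, pubkey, created_at, kind, tags, content],
// with no extra whitespace (JSON.stringify's default compact form).
interface NostrEvent {
  pubkey: string;
  created_at: number;
  kind: number;
  tags: string[][];
  content: string;
}

function computeEventId(ev: NostrEvent): string {
  const serialized = JSON.stringify([
    0, ev.pubkey, ev.created_at, ev.kind, ev.tags, ev.content,
  ]);
  return createHash("sha256").update(serialized, "utf8").digest("hex");
}
```

Relays and clients recompute this hash to verify that the `id` matches the event body before checking the Schnorr signature in `sig` against `pubkey`.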