iefan 🕊️ on Nostr:
Remember, this doesn’t use any OpenAI API; you’re literally running the entire model.
Moreover, any developer can use my Colab notebook shortcuts to deploy this Llama language model for free and offer it with their apps.
It’s even easier than running your own relay. How does this make any sense?
quoting note1rtk…djvy
Today many people are talking about AI. Remember, you can self-host this Llama 13-billion-parameter model in just one click, for free.
Open the link, click the Play icon, and it will only take a few minutes. If it fails the first time, simply click the Play icon again, and it will work.
I will also create a shortcut for Stable Diffusion, so you can self-host a decent instance of it in one click for free.
Remember, you are not using some OpenAI API; you are literally running the real model.
One click self-host link: https://colab.research.google.com/github/realiefan/NostrAi/blob/main/llama-2-13b-chat.ipynb
Demo: https://labs.perplexity.ai/
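For developers curious what "running the real model" roughly looks like, here is a minimal sketch of loading the Llama 2 13B chat weights with Hugging Face transformers. This is an illustration, not the exact contents of the Colab notebook; it assumes you have access to the gated "meta-llama/Llama-2-13b-chat-hf" repository and a GPU with enough memory (the notebook takes care of all of this for you).

```python
# Minimal sketch: load and query Llama 2 13B chat locally.
# Assumptions: access to the gated meta-llama/Llama-2-13b-chat-hf weights
# on Hugging Face and a GPU large enough to hold the model in fp16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-13b-chat-hf"  # assumed model checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on a single GPU
    device_map="auto",          # place layers on available devices automatically
)

prompt = "Explain Nostr in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The point of the one-click notebook is that you never have to write this yourself: the weights are downloaded and served for you, and no request ever leaves the machine for a third-party API.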