Chris Trottier on Nostr: It was only a matter of time before decent LLMs would be able to run locally on your ...
It was only a matter of time before decent LLMs would be able to run locally on your PC, and now it's happening.
Nvidia just announced that "Chat with RTX" will be able to run on any PC with an RTX 30-series card or newer. Since I already own an RTX 3090 Ti, I'm very much looking forward to using this on my own machine. As a nice bonus, this is much more environmentally friendly than using ChatGPT's cloud systems.
(And no, I don't think chatbots are intrinsically evil.)
https://www.theverge.com/2024/2/13/24071645/nvidia-ai-chatbot-chat-with-rtx-tech-demo-hands-on