Forbes on Nostr: The NewReality: Fast Inference Processing For 90% Less?
The NewReality: Fast Inference Processing For 90% Less?
==========
An Israeli startup called NeuReality has developed a new approach to AI hardware that optimizes the entire server architecture for AI inference workloads. Rather than relying solely on expensive accelerator chips such as Nvidia's GPUs, NeuReality's NR1 chip combines the functions of a network interface card (NIC), pre-processor, and post-processor, which the company says yields significant cost savings. It claims the NR1 outperforms a CPU-driven Nvidia system by more than 2X and delivers roughly 90% cost savings compared to a DGX-H100 server. The company's strategic roadmap includes support for small Large Language Models (LLMs) and other generative AI pipelines.
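As a rough illustration of how the article's two headline multipliers compose, here is a minimal back-of-envelope sketch in Python. The 2X speedup and 90% cost reduction are the article's claims; the baseline values are arbitrary placeholders, and because the two figures come from different comparisons in the article, the result only shows how such multipliers combine, not a measured benchmark.

    # Back-of-envelope only: the 2X and 90% figures are the article's claims,
    # the baseline numbers are illustrative placeholders.
    baseline_throughput = 1.0   # normalized throughput of the CPU-driven Nvidia baseline
    baseline_cost = 1.0         # normalized cost of the DGX-H100 baseline

    nr1_throughput = 2.0 * baseline_throughput   # "outperforms ... by over 2X"
    nr1_cost = 0.10 * baseline_cost              # "90% cost savings"

    gain = (nr1_throughput / nr1_cost) / (baseline_throughput / baseline_cost)
    print(f"Implied performance-per-dollar improvement: {gain:.0f}x")  # ~20x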
#AiHardware #Neureality #Nr1Chip #AcceleratorChips #Nvidia #Dla #AiWorkloads
https://www.forbes.com/sites/karlfreund/2024/06/18/the-newreality--fast-inference-processing-for-90-less/