Vecow News
Vecow, Neuchips Partner to Accelerate Gen AI Deployments
The partnership integrates Vecow's Edge AI Computing Systems with the Neuchips Viper series, accelerating Gen AI with high performance, rugged design, and enterprise-focused AI inference at the Edge.

Vecow announced its partnership with Neuchips, an AI ASIC company focused on the LLM and generative AI domain. The two companies are collaborating on a high-productivity solution that integrates the Vecow Edge AI Computing System with the Neuchips Viper series, an enterprise-focused offline AI solution featuring the Raptor N3000 LLM accelerator. This strategic partnership offers a high-performance, trusted solution to accelerate Gen AI development and win-win business deployments.
The Vecow ECX-3100 RAG features a workstation-grade Intel® Core™ i9/i7/i5/i3 processor with the Intel® R680E PCH and up to 96GB of DDR5 memory. Designed for AI inference at the Edge, it supports high-speed 10G USB vision, 2.5GigE vision, and multiple wireless data transfer options including 5G/WiFi/BT/4G/LTE/GPRS/UMTS. Its rugged design, DC 12V to 50V wide-range power input, and ignition power control make it ideal for in-vehicle computing and industrial AI applications.
The Neuchips Viper AI accelerator card represents a breakthrough in efficient AI processing, powering Golden Smart Home (GSH) Technology Corp.'s ShareGuru QA 2.0 solution to deliver enterprise-grade language model capabilities in a remarkably power-efficient package. Through this integration, ShareGuru QA 2.0 harnesses Mistral-Nemo, a 12B-parameter model, running efficiently on a single Viper card while consuming just 45W of power, enabling secure, on-premises AI processing without the complexity and cost of traditional infrastructure.
The synergy between GSH's natural language processing platform and the Viper's native BF16 structured language model support creates a powerful solution for organizations seeking to implement AI-driven database analysis while maintaining data security and reducing operational costs. This hardware-software integration balances power efficiency, processing capability, and security, making it particularly well suited to industrial and enterprise applications where local processing and energy efficiency are paramount.
"As on premise generative AI applications expand, the demand for multimodal large language models (LLM) is rapidly growing," said Joseph Huang, Executive Vice President at Vecow. “At Vecow, we are partnering with Neuchips to develop cutting-edge RAG-based LLM solutions, enabling users to access the latest data without training model, thereby delivering more relevant and high-quality results. It is essential for our customers who seek a cost-effective, compact and low-power AI workstation that outperforms traditional cloud-based GPU solutions”
"At Embedded World 2025, visitors to Vecow's booth will experience how our Viper AI accelerator card's unique capabilities - including 12B parameter model support at just 45W power consumption - complement Vecow's robust industrial Edge AI Computing Systems and GSH's ShareGuru SLM solutions. This powerful combination delivers secure, efficient AI processing that meets the demanding requirements of modern industrial environments," said Ken Lau, CEO of Neuchips.
Join us at Embedded World 2025 in Nuremberg to experience the next generation of industrial AI processing. Visit Vecow at booth #3-449 to see our innovative Edge solution in action.
www.vecow.com