Meet SmartCow’s Apollo Development Kit for Conversational AI Capabilities

NVIDIA's Jetson Xavier NX brings supercomputer-class performance to edge locations, delivering up to 21 TOPS to run neural networks in parallel and process data from multiple high-resolution sensors. To take advantage of the module, many manufacturers have developed custom hardware for AI and NLP applications deployed in edge settings. SmartCow, a Maltese AI manufacturer known for video analytics and AIoT devices, has announced Apollo, a new audio-visual development kit built around the NVIDIA Jetson Xavier NX that lets developers create applications with conversational AI capabilities.

The Apollo development kit offers rich onboard sensor functionality, including four microphones, two speaker terminals, two 3.5mm audio jacks, an 8MP IMX179 camera module, and an OLED display. For storage it carries a 128GB NVMe SSD, and it ships with the NVIDIA DeepStream and Riva software development kits. The base frame also allows the hardware to stand upright, making it easy to work with and to deploy applications.

“Traditional development kits are geared toward beginner-level developers working with general-purpose use cases with AI vision widely used across applications. We recognize the breadth and depth of developers out there who want a dev kit that enables them to go deeper into their research and development, including the ability to implement conversational AI and NLP,” said Ravi Kiran, Founder and CEO of SmartCow. “Apollo is a specialized dev kit created to meet higher-level developers’ needs and give them a way to get straight to more conversational applications.”

SmartCow Apollo Development Kit Specs

SmartCow has provided a list of example use cases for the Apollo development kit, including text-independent speaker recognition, speech-to-text and sentiment analysis, language translation and speaker diarization, and abnormal-sound detection for surveillance. Many vision-based applications can also be implemented by leveraging the onboard camera sensor, which provides a real-time video stream for data analysis and visualization.
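To make the abnormal-sound use case concrete, here is a minimal sketch of a naive energy-threshold detector: it splits a mono audio signal into frames and flags frames whose RMS energy far exceeds the typical (median) level. This is a generic illustration of the idea, not SmartCow's or NVIDIA's implementation; the frame size and threshold ratio are arbitrary assumptions, and a real deployment would use trained models via the bundled SDKs.

```python
import math

def frame_rms(samples, frame_size=1024):
    """Split a mono signal into fixed-size frames and return each frame's RMS energy."""
    return [
        math.sqrt(sum(s * s for s in samples[i:i + frame_size]) / frame_size)
        for i in range(0, len(samples) - frame_size + 1, frame_size)
    ]

def abnormal_frames(samples, frame_size=1024, threshold_ratio=3.0):
    """Return indices of frames whose RMS exceeds threshold_ratio times the median RMS."""
    energies = frame_rms(samples, frame_size)
    median = sorted(energies)[len(energies) // 2]
    return [i for i, e in enumerate(energies) if e > threshold_ratio * median]

# Example: mostly quiet signal with one loud burst in the middle
signal = [0.01] * 4096 + [0.9] * 1024 + [0.01] * 4096
print(abnormal_frames(signal))  # the frame covering the loud burst is flagged
```

On real hardware, `samples` would come from one of Apollo's four onboard microphones rather than a synthetic list.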

“Edge computing technology has revolutionized the way people work, live and travel,” continued Kiran. “Apollo enables developers to continue to create and build applications that transform everyday life, such as digital transformation of conference rooms, airport self-service counters, facility management, and more. By including advanced NLP in our development kits, Apollo addresses these growing needs, enabling users to create conversational edge computing applications.”

To get an Apollo development kit, fill out the Google form with your information. Pricing has not yet been announced, but shipments are expected to begin on April 8, 2022. More details can be found on the documentation page.

About Abhishek Jadhav

Abhishek Jadhav is an engineering student, RISC-V ambassador and a freelance technology and science writer with bylines at Wevolver, Electromaker, Embedded Computing Design, Electronics-Lab, Hackster, and EdgeIR.
