Next-generation AI experiences in consumer electronics devices

Author: Hussein Osman | Marketing Director | Lattice Semiconductor

01 December 2021

Lattice sensAI solution stack v4.1 helps developers use sensors & AI/ML inferencing to provide new & improved user experiences for client compute devices

The AI/ML revolution continues to gain traction across many use cases, particularly edge applications. As Hussein Osman, Marketing Director at low-power FPGA specialist Lattice Semiconductor, explains here, client compute devices operating at the edge, like security cameras, robots and industrial equipment, and consumer electronics devices, such as smart TVs, laptops, tablets and even toys, can now support AI/ML capabilities that give users new features and experiences.

This article was originally featured in the December 2021 issue of EPDT magazine.

According to industry analyst firm, ABI Research, the edge AI chipset market “is expected to continue to grow to US$71 billion by 2024, with a CAGR of 31% between 2019 and 2024. Such strong growth is propelled by the migration of AI inference workloads to the edge, particularly in the smartphone, smart home, automotive, wearables and robotics industries.”

However, making client compute devices “smart” adds new challenges to product design. AI/ML (artificial intelligence/machine learning) is an emerging technology, and many OEMs don’t have the in-house experience or time needed to design an AI/ML solution from scratch. The algorithms used to train client compute devices are evolving at a rapid pace, so developers are also looking for AI/ML solutions that are field upgradable. But the most crucial question for many edge AI/ML application developers is how to deliver the processing performance an AI/ML application needs in a device that runs on batteries.

To address these issues, application developers and OEMs need access to flexible hardware and software solutions that make those AI/ML-enabled experiences possible at low power. Since 2018, the Lattice sensAI™ solution stack has helped Lattice customers add AI/ML capabilities to new and existing product designs.

The latest version of the sensAI solution stack (v4.1) now includes a roadmap of user experience reference designs to help bring AI/ML capabilities to client compute devices like laptops. The COVID-19 pandemic created a large uptick in the number of people using video conferencing applications to stay connected with work, friends and family. The reference designs included in the latest sensAI stack release leverage the vision and sound sensors in client computing devices to provide value-added user experiences around instant-on, presence detection, attention tracking, privacy and video conferencing, while keeping power consumption low in order to maximise battery life.

Lattice sensAI solution stack v4.1 helps developers use sensors & AI/ML inferencing to provide new & improved user experiences for client compute devices.

Demand for more responsive, context-aware user experiences and for high-quality video conferencing and collaboration applications on client compute devices is rising – accelerated by the widespread shift to such tools and to working from home during the coronavirus pandemic and lockdowns. Lattice Nexus FPGAs and the sensAI solution stack offer a compelling platform for developing computer vision and sensor fusion applications that improve engagement, privacy and collaboration for users. For example, a client device can leverage image data from its camera to determine if someone is standing too close behind the user and blur the screen for privacy, or lengthen battery life by dimming the device’s display when it “sees” that the user’s attention is focused elsewhere. Client compute AI experience reference designs include the following (a brief sketch of how a host system might act on such inference results appears after the list):

•  User presence detection to automatically power on/off client devices as a user approaches or departs.

•  Attention tracking to lower a device’s screen brightness to conserve battery life when the user isn’t looking at the screen.

•  Face framing to improve the video experience in video conferencing applications.

•  Onlooker detection to recognise when someone is standing behind a device and blur the screen to maintain data privacy.
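To make the split between inference and system response more concrete, below is a minimal host-side sketch of how a client device might act on flags produced by an always-on, FPGA-based inference engine. It is illustrative only and written in Python for brevity: read_inference_flags(), set_display_power(), dim_display() and blur_screen() are hypothetical placeholders, not part of the Lattice sensAI API, and in a real design the flags would arrive from the FPGA over an interface such as I2C, SPI or GPIO.

```python
import time
from dataclasses import dataclass


@dataclass
class InferenceFlags:
    """Per-frame results from a hypothetical always-on vision engine."""
    user_present: bool
    user_attentive: bool
    onlooker_detected: bool


def read_inference_flags() -> InferenceFlags:
    # Placeholder: a real design would read these flags from the FPGA,
    # not hard-code them here.
    return InferenceFlags(user_present=True, user_attentive=False,
                          onlooker_detected=False)


def set_display_power(on: bool) -> None:
    print(f"display power -> {'on' if on else 'off'}")  # placeholder action


def dim_display(dim: bool) -> None:
    print(f"display dimmed -> {dim}")  # placeholder action


def blur_screen(blur: bool) -> None:
    print(f"screen blurred -> {blur}")  # placeholder action


def policy_loop(poll_interval_s: float = 0.5, iterations: int = 5) -> None:
    """Map inference flags to the user experiences listed above."""
    for _ in range(iterations):
        flags = read_inference_flags()
        set_display_power(flags.user_present)                         # user presence detection
        dim_display(flags.user_present and not flags.user_attentive)  # attention tracking
        blur_screen(flags.onlooker_detected)                          # onlooker detection
        time.sleep(poll_interval_s)


if __name__ == "__main__":
    policy_loop()
```

Because the image capture and ML inferencing stay on the low-power FPGA, the host CPU only has to run a lightweight policy loop like this one – which is the offloading that underpins the battery-life benefits described below.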

OEMs can benefit in multiple ways by adding AI/ML support to their device designs with the Lattice sensAI stack, including:

•  Up to a 28% increase in battery life compared to client compute devices that use their CPUs to power AI applications.

The latest version of the award-winning Lattice sensAI stack includes hardware & software updates that make it easier to add AI/ML support to more edge devices (new features highlighted in diagram above)

•  Support for in-field software updates to keep pace with evolving AI technologies.

•  Scalability to run multiple use cases at low power by offloading AI data processing from the CPU.

•  Broad support for popular sensor and SoC technologies.

Lattice also plans to add more experiences to its client compute AI roadmap in future releases of the sensAI stack.

With support for the latest addition to the Lattice Nexus™ FPGA lineup, Lattice CertusPro™-NX, the stack can also deliver the performance and accuracy gains required by the highly accurate object and defect detection applications used in automated industrial systems. To facilitate the design of voice- and vision-based AI/ML applications for client compute devices, the stack supports a new hardware platform featuring an onboard image sensor, two I2S microphones and expansion connectors for adding additional sensors.

As for software updates to the stack, Lattice includes an updated neural network compiler and support for Lattice sensAI Studio, a GUI-based tool with a library of AI models that can be configured and trained for popular use cases. sensAI Studio now supports AutoML features to enable creation of ML models based on application and dataset targets. Several of the models, based on the MobileNet architecture, are specifically optimised for running on CertusPro-NX FPGAs. The stack is also compatible with other widely used ML platforms, including the latest versions of Caffe, Keras, TensorFlow and TensorFlow Lite.
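As a rough illustration of the kind of model the stack is designed to ingest, the sketch below builds a compact MobileNetV2-based classifier in Keras and exports it to TensorFlow Lite, one of the supported frameworks. The input resolution, class count and use of post-training optimisation here are illustrative assumptions; mapping the model onto the FPGA is handled by the Lattice neural network compiler and sensAI Studio, not by this snippet.

```python
import tensorflow as tf

# Build a small MobileNetV2-based classifier (illustrative settings:
# 96x96 RGB input, two output classes such as "user present" / "absent").
model = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3),
    weights=None,   # train from scratch on your own dataset
    classes=2,
)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# ... model.fit(train_images, train_labels, epochs=...) would go here ...

# Export to TensorFlow Lite with default post-training optimisation
# (dynamic-range quantisation), producing the kind of compact model
# typically handed off to an edge inferencing toolchain.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("presence_mobilenetv2.tflite", "wb") as f:
    f.write(tflite_model)
```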

