Reality AI Demo of Explainable AI at the Edge

Embedded AI using accelerometer and vibration data

How did we build this AI demo?

The AI demo uses real-time streaming accelerometer data collected while the machine is in a variety of states and its filter is at different levels of remaining useful life. Because we run this demo live at conferences and expos, we also recorded vibration data with the unit placed on different surfaces and in different environments, so that the Reality AI algorithm could learn to screen out background variation.
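To make the data-collection side concrete, here is a minimal sketch of how firmware might buffer streaming 3-axis samples into fixed windows for analysis. The driver stub, sample rate, and window length are illustrative assumptions, not the demo's actual code:

```c
#include <stdint.h>
#include <stddef.h>

#define SAMPLE_RATE_HZ 1000u   /* assumed sampling rate */
#define WINDOW_LEN     512u    /* assumed analysis window length */

typedef struct {
    int16_t x, y, z;           /* raw 3-axis accelerometer counts */
} accel_sample_t;

static accel_sample_t window[WINDOW_LEN];
static size_t write_idx = 0u;

/* Stub standing in for the board's accelerometer driver (hypothetical). */
static accel_sample_t read_accel_sample(void)
{
    accel_sample_t s = { 0, 0, 0 };
    return s;
}

/* Called at SAMPLE_RATE_HZ from a timer interrupt or polling loop.
 * Returns 1 when a full window is ready to hand to the models. */
int accel_collect(void)
{
    window[write_idx++] = read_accel_sample();
    if (write_idx == WINDOW_LEN) {
        write_idx = 0u;
        return 1;
    }
    return 0;
}
```

Windowing like this is what lets a streaming signal be treated as a sequence of fixed-size inputs for feature extraction and inference.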

We then used Reality AI Tools® in the cloud to train three machine learning models: a condition monitoring classifier that identifies the machine's state, a prediction model for the remaining useful life of the filter, and an anomaly detection model that flags unknown conditions for which we have no specific training data.
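The three model types can be illustrated with tiny generic stand-ins. The sketch below is not what Reality AI Tools generates; it simply shows, under invented features and parameters, what a classifier, a remaining-useful-life regressor, and an anomaly detector each compute:

```c
#include <stddef.h>
#include <math.h>

#define N_CLASSES  3   /* assumed number of machine states */
#define N_FEATURES 4   /* assumed feature vector length */

/* Nearest-centroid classifier: pick the trained class centroid
 * closest to the current feature vector. */
int classify_state(const float f[N_FEATURES],
                   const float centroids[N_CLASSES][N_FEATURES])
{
    int best = 0;
    float best_d = INFINITY;
    for (int c = 0; c < N_CLASSES; c++) {
        float d = 0.0f;
        for (int i = 0; i < N_FEATURES; i++) {
            float diff = f[i] - centroids[c][i];
            d += diff * diff;
        }
        if (d < best_d) { best_d = d; best = c; }
    }
    return best;
}

/* Linear-regression stand-in for remaining-useful-life prediction,
 * e.g. returning percent of filter life remaining. */
float predict_rul(const float f[N_FEATURES],
                  const float w[N_FEATURES], float bias)
{
    float y = bias;
    for (int i = 0; i < N_FEATURES; i++)
        y += w[i] * f[i];
    return y;
}

/* Distance-from-training-data anomaly score: large values mean the
 * input looks unlike any condition seen during training. */
float anomaly_score(const float f[N_FEATURES],
                    const float mean[N_FEATURES],
                    const float inv_std[N_FEATURES])
{
    float d = 0.0f;
    for (int i = 0; i < N_FEATURES; i++) {
        float z = (f[i] - mean[i]) * inv_std[i];
        d += z * z;
    }
    return sqrtf(d);
}
```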

Code for the three models was then compiled for the i.MX RT and flashed into firmware, where it occupies approximately 15% of the microcontroller's overall capacity. The models generate result messages, which other software on the board relays to a listening app that displays them on a computer screen.
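As a rough sketch of that last step, firmware might pack the three model outputs into a small framed message for the listening app. The frame layout, the packed struct, and uart_write() are assumptions for illustration; the demo's actual on-board protocol is not published here:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Stub transport: replace with the board's UART/USB driver. */
static void uart_write(const uint8_t *buf, size_t len)
{
    (void)buf;
    (void)len;
}

/* Hypothetical result payload (GCC packed layout, 4 bytes). */
typedef struct {
    uint8_t  state;          /* condition monitoring class index */
    uint8_t  rul_percent;    /* remaining useful life, 0..100 */
    uint16_t anomaly_milli;  /* anomaly score scaled by 1000 */
} __attribute__((packed)) result_msg_t;

void send_results(uint8_t state, uint8_t rul, uint16_t anomaly_milli)
{
    uint8_t frame[2 + sizeof(result_msg_t) + 1];
    result_msg_t msg = { state, rul, anomaly_milli };
    uint8_t checksum = 0;

    frame[0] = 0xAA;                   /* start-of-frame marker */
    frame[1] = sizeof(result_msg_t);   /* payload length */
    memcpy(&frame[2], &msg, sizeof(msg));
    for (size_t i = 0; i < 2 + sizeof(msg); i++)
        checksum ^= frame[i];          /* XOR checksum over header+payload */
    frame[2 + sizeof(msg)] = checksum;

    uart_write(frame, sizeof(frame));
}
```

A fixed start marker, a length byte, and a checksum are a common minimal framing choice on serial links, letting the listening app resynchronize if bytes are dropped.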

Building AI at the Edge

Signature Recognition

Leverage Reality AI's powerful signature extraction and recognition capabilities. We hold 12 US patents in the areas of feature discovery and machine learning as applied to sensor and signal problems.

Easily Embeddable

This AI demo shows condition monitoring, remaining-useful-life prediction, and anomaly detection in real time, using a 3-axis accelerometer and running on an NXP i.MX RT board.

