Our emotion AI uses the device’s camera as a sensor, not as a video recorder, and works strictly on the user’s device. No private data is recorded or sent to a server to carry out the analysis: the user’s facial data remains on their device and is never shared with any third party, which neutralises the risk of it being accessed or used without the user’s consent. Furthermore, the system is specifically designed not to recognise or identify individuals. We use diverse and representative data to train and evaluate our emotion AI to ensure that it is not biased against particular groups or individuals.
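The on-device flow described above can be sketched as follows. This is a minimal illustration, not the real implementation: `detect_emotion` stands in for a hypothetical local model, and `EmotionEstimate` is an assumed result type. The point is structural — the raw camera frame never leaves the function, no network call is made, and only a derived, non-identifying estimate is returned.

```python
from dataclasses import dataclass
from typing import Callable, Tuple


@dataclass
class EmotionEstimate:
    label: str         # e.g. "happiness" — a category, never an identity
    confidence: float  # 0.0 to 1.0


def analyse_frame(
    frame: bytes,
    detect_emotion: Callable[[bytes], Tuple[str, float]],
) -> EmotionEstimate:
    """Run a local model on a camera frame and return only the derived
    estimate. The raw frame is discarded when this function returns;
    nothing is uploaded and nothing identifying the person is kept."""
    label, confidence = detect_emotion(frame)  # on-device inference only
    return EmotionEstimate(label, confidence)  # frame itself goes no further
```

Because the function's only output is the estimate, persisting or transmitting the frame would require code that simply does not exist in the pipeline — privacy by design rather than by policy.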
Client-side AI processing allows content to be adapted in real time based on the emotions and interactions detected. Because the data is collected and processed on the user’s device, there is no need to wait for a server to process it and send a response. This removes network latency: the delay caused by sending data to a server, waiting for it to be processed, and receiving a reply. Our solution has a consistent 30 millisecond delay, versus a minimum server-side delay of over 100 milliseconds, and this makes all the difference for an authentic human experience. Human interpretation of facial expressions of emotion has been found to be a fast-acting process, occurring within 23 to 28 milliseconds of looking at a face. Happiness is recognised fastest; neutral, disgust, fear and surprise form a second group requiring additional time; and sadness and anger take the longest to recognise – about ten times slower than happiness. With no server round trip before instructions reach our media player, we can create hyper-personalised user experiences, enabling immediate actions based on learning engagement.
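The immediate-adaptation step can be illustrated with a small sketch. The emotion names, action names and confidence threshold below are illustrative assumptions, not the product’s actual API; what matters is that a locally detected emotion maps straight to a player action with no server round trip in between.

```python
# Hypothetical mapping from a detected emotion to a media-player action.
ACTIONS = {
    "confusion": "replay_segment",
    "boredom": "offer_interactive_content",
    "happiness": "continue_playback",
}


def adapt(emotion: str, confidence: float, threshold: float = 0.6) -> str:
    """Choose the next player action for a locally detected emotion.
    Low-confidence or unrecognised detections fall back to normal
    playback rather than triggering an adaptation."""
    if confidence < threshold:
        return "continue_playback"
    return ACTIONS.get(emotion, "continue_playback")
```

Because this lookup runs on the same device as the detection, the decision arrives within the local processing budget rather than after a network round trip.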
Another key benefit is scalability. Our AI engine works on the client side, using the resources of each individual device. Processing on the client adds negligible latency, whereas cloud-based servers become bottlenecks as user numbers grow. Because the emotional data is processed locally, it can trigger immediate actions that depend on the emotional state and engagement of the user.
This design also makes our AI very cost-effective at high usage volumes: it can handle large numbers of users or requests without a significant increase in cost, because the system does not incur cloud computing charges, which can be a significant expense. With minimal cloud computing costs, we can offer our service at a price up to a thousand times lower than that of our competitors. Our video storage system also provides a highly efficient form of playback for learning content.
Low carbon footprint
We are committed to reducing our carbon footprint and doing our part to combat climate change. We understand the importance of taking action on this global issue and are dedicated to implementing sustainable practices throughout our operations and product development. Adaptive Media software is energy-efficient because it uses infrastructure that already exists: rather than running on servers, it uses the CPU in the client device, which massively reduces the carbon footprint. The user’s device is already powered on for many other uses, and our processing typically occupies only around 8% of its CPU, so the additional energy consumption is negligible compared with a server. Servers, by contrast, must remain available 24/7 regardless of demand.
A-dapt advocates a responsible approach to the development and use of AI/ML technologies. We highlight the importance of obtaining explicit and informed consent from individuals before collecting or processing their emotional data. This includes making sure that individuals understand the purpose of the emotion AI, how their data will be used, and any potential risks or benefits of participating. Individuals should also be able to opt in or opt out of data collection and processing at any time. Explicit and informed consent matters because it ensures that individuals know how their emotional data is being used and can make an informed decision about whether to participate.
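The opt-in/opt-out principle above can be sketched as a simple gate. This is an illustrative pattern, not the product’s actual consent implementation: no camera access or analysis starts until the user has explicitly opted in, and opting out stops collection immediately.

```python
class ConsentGate:
    """Blocks camera access and emotion analysis until the user has
    explicitly opted in; opting out takes effect immediately."""

    def __init__(self) -> None:
        self.granted = False  # default is no consent, no collection

    def opt_in(self) -> None:
        self.granted = True

    def opt_out(self) -> None:
        self.granted = False

    def start_analysis(self, start_pipeline) -> bool:
        """Start the on-device pipeline only with explicit consent.
        Returns True if the pipeline was started, False otherwise."""
        if not self.granted:
            return False  # nothing is collected without opt-in
        start_pipeline()
        return True
```

Defaulting `granted` to `False` makes collection opt-in rather than opt-out, so a user who never interacts with the consent prompt is never analysed.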