KInterACT - AI Platform for Interactive Services in Future Environments

The spaces of the future are becoming smart environments: each room understands the situation and the people present, and controls intelligent services. With the help of artificial intelligence and sensor technology, human-space interaction is raised to a new level of quality. The KInterACT project is dedicated to this vision.

The main objective of the project is to create a new quality of interaction between people and service technology (consumer electronics, building services, cloud services, initiation of assistance). Based on 3dvisionlabs’ 3D sensor technology and Inferics’ artificial intelligence (AI) scene evaluation, a universal platform is being realized for local and higher-level identification of people’s needs and for deciding on actions.

The analysis yields the perceived situation of the persons and objects in a room. Interpreting this situation with AI methods enables purposeful interaction with those persons and objects by controlling the available services. With KInterACT, for example, a selected window can be opened by means of a pointing gesture and a voice command.
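To make the idea concrete, the gesture-plus-voice interaction described above can be sketched as two steps: resolve which tracked object a pointing ray is aimed at, then fuse that target with the spoken intent into a service call. This is a minimal illustrative sketch only; all names, data structures, and thresholds are assumptions, not the actual KInterACT platform API.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedObject:
    # Hypothetical representation of an object perceived by the 3D sensor.
    name: str                       # e.g. "window_south"
    position: tuple                 # (x, y, z) in room coordinates

def resolve_pointing(origin, direction, objects, min_cos=0.95):
    """Return the object best aligned with the pointing ray, if any.

    min_cos is an assumed angular tolerance: cos(angle) must exceed it.
    """
    best, best_cos = None, min_cos
    for obj in objects:
        # Vector from the pointing hand to the candidate object.
        v = tuple(o - p for o, p in zip(obj.position, origin))
        norm_v = math.sqrt(sum(c * c for c in v))
        norm_d = math.sqrt(sum(c * c for c in direction))
        if norm_v == 0 or norm_d == 0:
            continue
        cos = sum(a * b for a, b in zip(v, direction)) / (norm_v * norm_d)
        if cos > best_cos:          # smaller angle than the current best
            best, best_cos = obj, cos
    return best

def fuse_command(pointed_at, voice_intent, services):
    """Combine the gesture target and the spoken intent into a service call."""
    if pointed_at is not None and voice_intent in services:
        return services[voice_intent](pointed_at.name)
    return None

# Example: an "open" voice command plus a gesture toward the south window.
objects = [TrackedObject("window_south", (4.0, 0.0, 1.5)),
           TrackedObject("lamp_desk", (1.0, 3.0, 1.0))]
services = {"open": lambda target: f"opening {target}"}  # stand-in service

target = resolve_pointing(origin=(0.0, 0.0, 1.5),
                          direction=(1.0, 0.0, 0.0),
                          objects=objects)
result = fuse_command(target, "open", services)
```

In a real deployment, the pointing ray and object positions would come from the depth-camera scene analysis, and the intent from a speech recognizer; the fusion step would dispatch to actual building services rather than a lambda.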

KInterACT is the first AI-based sensor and inference platform and the basis for new service models. In the future, an entire room may be operated intuitively while respecting privacy, achieving a new quality of comfort.

Project Page (German): https://www.technik-zum-menschen-bringen.de/projekte/kinteract

About 3dvisionlabs:

3dvisionlabs is a Germany-based depth-camera startup built around a novel extreme wide-angle vision technology called HemiStereo. Founded in August 2017 by three researchers from the Chemnitz University of Technology, 3dvisionlabs is pushing the boundaries of conventional depth-sensing technologies. For additional information, visit www.3dvisionlabs.com.

Press contact:

Michel Findeisen
3dvisionlabs GmbH
Phone: +49 371-33716555
Email: findeisen@3dvisionlabs.com

Image Credit: ©visivasnc – stock.adobe.com