RadarCat isn’t a real cat with a radar transmitter strapped to its back. It’s a sensor that connects to any computer or smartphone to recognize and identify objects, both organic and non-organic. Researchers at the University of St Andrews in Scotland recently developed the program and sensor, which they call RadarCat, short for Radar Categorization for Input and Interaction. As the name implies, the technology uses radar to identify and categorize objects. RadarCat was created by the university’s Computer Human Interaction research team, but the radar-based sensor itself comes from the Project Soli alpha developer kit built by Google’s Advanced Technology and Projects (ATAP) program. Google originally designed the sensor to detect the slightest finger movements; the RadarCat group, however, saw an even bigger potential for the Soli chip.
The St Andrews research team realized that the Soli miniature radar could open far more avenues toward “touchless” interaction. They also saw that, if eventually deployed in products, RadarCat could revolutionize how people interact with computing devices and serve as a means of identifying and categorizing objects in offices, warehouses, supermarkets, and the retail industry. The Google Soli chip is smaller than a quarter, measuring around 8 mm x 10 mm, and carries both the sensor and the antenna array. The chip broadcasts a wide beam of electromagnetic waves, and the energy is scattered in a specific way depending on the object being “sensed.” The sensor can therefore extract specific data from the energy pattern to read an object’s shape, size, orientation, material, and even what it contains.
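As a rough illustration of the sensing idea (not the team’s actual pipeline), each burst of scattered radar energy can be reduced to a small feature vector whose values differ by material. The signal format and feature names below are hypothetical:

```python
def radar_features(samples):
    """Reduce a burst of raw radar amplitude samples (hypothetical
    format) to a small feature vector: mean, energy, and variance.
    Different materials scatter the beam differently, so these
    summary statistics differ from object to object."""
    n = len(samples)
    mean = sum(samples) / n
    energy = sum(s * s for s in samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    return (mean, energy, variance)

# Two made-up bursts: a strongly reflecting object vs. a weak one.
metal_burst = [0.9, 1.1, 1.0, 0.95, 1.05]
foam_burst = [0.1, 0.15, 0.05, 0.1, 0.1]
print(radar_features(metal_burst))
print(radar_features(foam_burst))
```

In a real system the feature vector would be much richer, but the principle is the same: the classifier never sees the object, only the fingerprint its material leaves on the reflected signal.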
The team built on Soli’s ability to track and recognize dynamic gestures from a single chip by developing their own radar sensing paradigm with dedicated hardware, software, and algorithms. The final sensor device connects over USB to any computer, laptop, tablet, or smartphone. When an object is placed on top of the sensor, the program plots the raw radar signals and identifies the object. A machine learning algorithm also allows the program to “learn,” so it can correctly identify objects it previously could not recognize. In tests conducted by the team, the RadarCat system could not only identify specific objects but also tell front from back: when a smartphone was placed on the sensor, RadarCat correctly identified whether it was showing its front or its back, and it did the same with a tablet. It could tell whether a glass was empty or filled with water. When a fruit was placed on the sensor, the system identified the fruit and displayed its nutritional information. It could even name the identified objects in other languages.
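The learning behaviour described above can be sketched with a minimal nearest-neighbour classifier. This is a stand-in, not the team’s published algorithm, and the feature vectors and labels are invented for illustration; the point is that teaching the system a new object amounts to storing one labelled example:

```python
import math

class RadarCatSketch:
    """Toy 1-nearest-neighbour classifier over radar feature
    vectors -- a hypothetical stand-in for RadarCat's real model.
    'Teaching' it a new object is just storing a labelled example."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, label) pairs

    def learn(self, features, label):
        # One labelled burst is enough to make the object known.
        self.examples.append((tuple(features), label))

    def identify(self, features):
        # Return the label of the closest stored example.
        return min(
            self.examples,
            key=lambda ex: math.dist(ex[0], features),
        )[1]

cat = RadarCatSketch()
cat.learn((1.0, 0.9), "smartphone front")  # made-up feature vectors
cat.learn((0.4, 0.2), "smartphone back")
cat.learn((0.1, 0.8), "glass of water")

print(cat.identify((0.95, 0.85)))  # prints "smartphone front"
```

A new reading that falls nearest a stored fingerprint gets that fingerprint’s label, which is why the demo could distinguish a phone’s front from its back: the two sides scatter the beam differently enough to produce separable features.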