fresh bytes

MIT’s augmented reality room shows what robots are thinking


Most of the time, most of us have absolutely no idea what robots are thinking. Someone who builds and programs a robot has some idea of how that robot is supposed to act given certain inputs, but as sensors become more ubiquitous, and as the software that manages them and synthesizes their data into decisions grows more complex, it becomes increasingly difficult to tell what's actually going on. Researchers at MIT are trying to address that problem, and they're using augmented reality to do it.

In an experiment, the researchers used their AR system to place obstacles—like human pedestrians—in the path of robots, which had to navigate through a virtual city. The robots had to detect the obstacles and then compute the optimal route to avoid running into them. As the robots did that, a projection system displayed their “thoughts” on the ground, so researchers could visualize them in real time. The “thoughts” consisted of colored lines and dots—representing obstacles, possible paths, and the optimal route—that were constantly changing as the robots and pedestrians moved.
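The article doesn't describe how the planner itself works, but the basic idea is easy to sketch. Below is a toy illustration in Python, not MIT's actual code: a robot on a small made-up grid runs a standard A* search around a few "pedestrian" obstacles, then prints its internal state, the detected obstacles, the cells it considered, and the route it chose, in the spirit of the colored lines and dots the AR room projects on the floor. The grid size, obstacle positions, and function names here are all invented for the example.

# A minimal sketch (not MIT's system): plan a route around obstacles,
# then "project" the robot's thoughts as a text-mode floor display.
from heapq import heappush, heappop

GRID_W, GRID_H = 12, 6
START, GOAL = (0, 2), (11, 2)
OBSTACLES = {(4, 1), (4, 2), (4, 3), (8, 2), (8, 3)}  # stand-in pedestrians

def neighbors(cell):
    x, y = cell
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < GRID_W and 0 <= ny < GRID_H and (nx, ny) not in OBSTACLES:
            yield (nx, ny)

def shortest_path(start, goal):
    """A* search; returns the optimal route and every cell the planner weighed."""
    frontier = [(0, start, [start])]
    explored = set()
    while frontier:
        cost, cell, path = heappop(frontier)
        if cell == goal:
            return path, explored
        if cell in explored:
            continue
        explored.add(cell)
        for nxt in neighbors(cell):
            # priority = steps taken so far + Manhattan distance to the goal
            h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])
            heappush(frontier, (len(path) + h, nxt, path + [nxt]))
    return [], explored

def project(path, explored):
    """Render the robot's 'thoughts' the way the AR room paints the floor."""
    for y in range(GRID_H):
        row = ""
        for x in range(GRID_W):
            if (x, y) in OBSTACLES:
                row += "X"   # detected obstacle
            elif (x, y) == START:
                row += "S"
            elif (x, y) == GOAL:
                row += "G"
            elif (x, y) in path:
                row += "*"   # chosen optimal route
            elif (x, y) in explored:
                row += "+"   # candidate cells the planner considered
            else:
                row += "."
        print(row)

path, explored = shortest_path(START, GOAL)
project(path, explored)

Running it prints an ASCII "floor projection": X marks the obstacles the robot detected, + marks the cells its planner considered, and * marks the route it settled on. Move the obstacles and the picture reshuffles, which is essentially what the researchers watch happen in real time on the lab floor.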
via IEEE Spectrum


Image: MIT

