A guided tour of the new MIT Museum

LIGO prototype

Developed by Professor Emeritus Rainer Weiss ’55, PhD ’62, and his students, this 1970s prototype led to the Laser Interferometer Gravitational-wave Observatory (LIGO), a large-scale physics experiment that was ultimately able to detect the gravitational waves predicted by Einstein’s General Theory of Relativity. The work earned Weiss the 2017 Nobel Prize in physics.

“The experiments that LIGO was able to facilitate felt like magic to me, as a non-physicist,” Nuñez says. “Can you imagine what it was like to be there when they found out it worked? What an amazing moment for humanity!”


Kismet

One of the first social robots designed to simulate social interactions, Kismet was created in the 1990s by Cynthia Breazeal, SM ’93, ScD ’00, who is now MIT’s dean for digital learning and head of the Personal Robots Research Group at the MIT Media Lab. Originally controlled by 15 different computers, Kismet employed 21 motors to create facial expressions and body postures.

“I have a lot of affinity for that particular artifact,” says Nuñez, who studied with Breazeal at the Media Lab. “It’s such a charismatic object; it’s one of the museum’s Instagram moments.”


IRGO

Developed by Julie Shah ’04, SM ’06, PhD ’11, IRGO is an interactive robot that museum visitors can help train through artificial-intelligence demonstrations. “Our visitors are participating in real robotics research,” Nuñez says. “That is such a rare and special opportunity.”

Today Shah is the H.N. Slater Professor in Aeronautics and Astronautics at MIT and head of the Interactive Robotics Group within the Computer Science and Artificial Intelligence Laboratory. She shares her thoughts on AI in a nearby audio gallery. Other alumni featured in that gallery include Professor Rosalind Picard, SM ’86, ScD ’91, director of the Media Lab’s Affective Computing Research Group, and Media Lab PhD students Matt Groh, SM ’19, and Pat Pataranutaporn, SM ’20.

“We want to be able to expose the fact that there are communities of people behind everything you’re seeing,” Nuñez says.

Coded gaze

Visitors to the AI gallery can see the mask used by Joy Buolamwini, SM ’17, PhD ’22, to present a white face—rather than her own Black one—to facial recognition software, which she found was less accurate for people with dark skin. In her doctoral thesis, Buolamwini coined the term “coded gaze” to describe algorithmic bias.
