Reflecting on the 2025 Educators Conference - Understanding Sensors

Robots perceive their environments through sensors, so understanding how they work is essential to successfully coding and troubleshooting a robot. In the GO and 123 AI literacy workshops I taught at the conference this year, we began by thoroughly exploring how the Eye Sensor works, as a step towards understanding the fundamental AI concept of perception.

The “aha” moments that occurred during the Eye Sensor activity were a great reminder to “go slow to go fast”. Participants learned that the Eye Sensor’s readings are affected by the ambient light in its environment, and how to mitigate those effects. They can now pass this knowledge on to their students, helping them use it to their advantage as they learn to code with sensors and to understand how a robot perceives the world around it differently than human eyes do.
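One common way to mitigate ambient-light effects is to calibrate a baseline reading from the room first, then detect objects relative to that baseline rather than against a fixed number. The sketch below illustrates the idea in plain Python; the function names and values are hypothetical, not the actual VEX API.

```python
# Illustrative sketch of ambient-light calibration (not the VEX API).
# Idea: record the room's baseline brightness with nothing in front of
# the sensor, then flag an object only when a reading rises a chosen
# margin above that baseline.

def calibrate_ambient(readings):
    """Average several readings taken with no object present."""
    return sum(readings) / len(readings)

def object_detected(reading, ambient_baseline, margin=15):
    """Report an object only when the reading exceeds the ambient
    baseline by the margin, so the same program works in bright
    and dim classrooms."""
    return reading - ambient_baseline > margin

# The same raw reading of 60 means different things in different rooms:
dim_room = calibrate_ambient([10, 12, 11])      # baseline ~11
bright_room = calibrate_ambient([50, 52, 51])   # baseline ~51

print(object_detected(60, dim_room))     # True: well above dim-room baseline
print(object_detected(60, bright_room))  # False: close to bright-room ambient
```

Students can run the calibration step each time the robot moves to a new spot, which makes the "why did it work at home but not at school?" conversation much easier.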

When we give students of all ages time to move beyond a surface-level understanding of how a sensor works, it saves time and energy in the long run and helps them avoid unnecessary frustration.

I’d love to hear about your experiences teaching with sensors in your classroom - feel free to share and ask questions here! And to read more reflections from the 2025 Educators Conference, check out this Insights Article!

So true @Aimee_DeFoe - thank you for sharing!