Seventh-grade students were introduced to artificial intelligence during a unit in science. To launch the unit, a team from Google’s Creative Lab presented the Pattern Radio: Whale Song project to our students. For this project, Google has been using AI technology to identify humpback whale songs in ocean recordings collected over the past decade. The AI is instrumental because the underwater recordings would take a human more than 19 years to listen to straight through, and they contain more than 8,000 hours of whale song to find and analyze. The technology is able to find patterns and differentiate whale songs from other sounds in the ocean. Students learned that only humpback males sing, and that scientists can track whale migration through songs. The recordings are invaluable because scientists do not need to be constantly present in the ocean to understand what is going on there. Avenues students were given access to the Pattern Radio project, a spectrogram that visualizes the whale songs, even before it was made available to the public, and they had a chance to make observations about the different sounds.
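For readers curious what a spectrogram actually is, the short Python sketch below builds one from a synthetic tone standing in for a whale call. This uses NumPy and SciPy and is purely illustrative; it is not Google’s pipeline, and the sample rate and 400 Hz tone are made-up stand-ins.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthetic stand-in for an ocean recording: a 400 Hz "song" tone plus noise.
fs = 8000  # sample rate in Hz (illustrative, not the real dataset's rate)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
audio = np.sin(2 * np.pi * 400 * t) + 0.5 * rng.standard_normal(t.size)

# A spectrogram slices the audio into short windows and measures how loud
# each frequency is in each window -- the image Pattern Radio lets you browse.
freqs, times, power = spectrogram(audio, fs=fs)

# Averaged over time, the loudest frequency bin sits near the 400 Hz tone,
# which is how a tonal "song" stands out against broadband ocean noise.
peak_hz = freqs[power.mean(axis=1).argmax()]
print(round(peak_hz))
```

The same idea is why patterns pop out visually in Pattern Radio: repeated song units show up as repeated bright shapes against the noise.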
The Pattern Radio project launched to the public this summer. By making the project open, Google hopes that anyone, not just the experts, will be able to explore and help to identify whale sounds and patterns. To learn more about the Pattern Radio project, read their blog post here.
Next, students discussed the meaning of intelligence and what makes humans intelligent, followed by a discussion of what makes technology artificially intelligent. Students then identified AI products in their daily lives: Netflix and Amazon recommendations, the Nest thermostat, the Roomba vacuum cleaner, and the various voice assistants (Siri, Alexa, Google Home, Cortana). Even though there is a wide range of definitions for AI, students generally agreed that AI is a piece of technology that learns and adapts.
The next two classes were devoted to machine learning. Machine learning, a subset of AI, uses statistical methods to enable machines to improve with experience. Students trained, tested, and created machine learning projects with Scratch, a block-based programming language. In their first project, they taught the computer to recognize compliments and insults based on kind and mean adjectives and phrases they had entered. In their second project, students chose to have the computer either recognize their face, similar to facial recognition on smartphones, or recognize one of the four suits in playing cards. They trained and tested their models at Machine Learning for Kids and then used the resulting model in Scratch to make decisions based on the training data.
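Under the hood, a text classifier like the students’ compliment/insult project can be as simple as counting which training words appear in a new phrase. The pure-Python sketch below is a toy illustration, not the model Machine Learning for Kids actually trains (that service builds a cloud-hosted model); all the phrases are made-up examples.

```python
# Toy training set, like the kind and mean phrases the students typed in.
TRAINING = {
    "compliment": ["you are kind", "great job", "what a wonderful idea"],
    "insult": ["you are mean", "that was awful", "what a terrible idea"],
}

def train(examples):
    """Count how often each word appears under each label."""
    counts = {label: {} for label in examples}
    for label, phrases in examples.items():
        for phrase in phrases:
            for word in phrase.lower().split():
                counts[label][word] = counts[label].get(word, 0) + 1
    return counts

def classify(counts, text):
    """Pick the label whose training vocabulary best matches the text."""
    words = text.lower().split()
    scores = {label: sum(seen.get(w, 0) for w in words)
              for label, seen in counts.items()}
    return max(scores, key=scores.get)

model = train(TRAINING)
print(classify(model, "what a great and wonderful job"))  # → compliment
```

This also hints at why the students’ training data mattered so much: the model only knows the words it was shown, so a phrase full of unseen words gets an unreliable answer.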
In their last class, students discussed morals and ethics in AI. They made moral decisions at MIT’s Moral Machine, a site where one judges whether a malfunctioning self-driving car should save the driver and passengers or the pedestrians. The students realized that, based on their personal moral values, they would select different people to “save,” and they soon found it difficult to work together when their moral values differed.
Though it was short, this unit made students aware of all the AI technology they encounter in their daily lives. More importantly, students realized that humans are behind the programming of these AI technologies. Even though AI may seem unbiased, there will always be bias in AI devices and systems as long as humans are programming them.