What is Artificial Intelligence? ‘The Process of Developing AI is Very Much About How We Humans Define What it Is,’ Says Assistant Professor Shiyan Jiang
Technology like artificial intelligence (AI) may seem mystifying to many, but Shiyan Jiang, an assistant professor of learning design and technology in NC State’s College of Education, believes that learning about the technology can help students develop important skills even if they don’t pursue a STEM profession.
AI is often thought of as a sophisticated, high-level technology. However, Jiang, whose research focuses on broadening access to STEM in areas like artificial intelligence, data science and computing, says it is really very human-oriented.
For example, in the subfield of artificial intelligence known as machine learning, a computer learns patterns from training data collected and generated by humans and then uses those patterns to make predictions about new data.
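To make that idea concrete, here is a minimal sketch of supervised machine learning, assuming the scikit-learn library and an invented, human-labeled toy data set; it illustrates the general idea rather than anything from Jiang’s projects.

```python
# Minimal sketch of supervised machine learning (hypothetical toy data).
# Humans collect and label the training data; the model learns patterns
# from it and uses those patterns to predict labels for new examples.
from sklearn.tree import DecisionTreeClassifier

# Human-created training data: each row is [hours_studied, hours_slept],
# and each label is 1 (passed) or 0 (did not pass).
X_train = [[8, 7], [2, 4], [6, 8], [1, 3], [7, 6], [3, 5]]
y_train = [1, 0, 1, 0, 1, 0]

model = DecisionTreeClassifier()
model.fit(X_train, y_train)  # the model learns patterns from the labeled data

# Predictions for new examples are shaped entirely by the data and the
# labeling decisions humans put into the training set.
print(model.predict([[5, 7], [2, 2]]))
```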
“Many of us might think that artificial intelligence is a mystery that can make magic happen, but that’s not the case. The process of developing AI is very much about how we humans define what it is,” she said. “I want students who will go out and play with these technologies, from the very beginning, to not see it as magic or a mystery, but see it as a very human-oriented experience, see the decisions humans need to make, and I want them to even see the biases we might have put into those AI technologies.”
Bringing Diversity into the Field
Biases in AI technology, Jiang said, may not be intentional but can come from a lack of diversity in the field and have negative impacts on those who use the technology.
For example, studies show that facial recognition AI technology on most smartphones works more often and more successfully for white men than it does for people from other racial backgrounds. This is a result of the training data used to develop the AI models, which contained a disproportionate number of images of white men.
“At the beginning, we’re not always aware of these issues when we develop technology that we feel will make our life better but, in the end, we find out that there are some issues because only some people can use the function, not everyone,” Jiang said.
Encouraging students from historically underrepresented backgrounds to consider the possibility of a career in AI is crucial because having a variety of different voices and perspectives can challenge assumptions that might have been made when developing AI technologies. People with different life experiences, she said, might find issues in data sets used to build AI models or issues with variables that someone of a different background may not have considered.
To bring the necessary diversity to the field, Jiang believes it is important to give high school students of all backgrounds an opportunity to see themselves in an AI profession.
“As educators, we want more students to be able to explore their potential and not see some areas as far away from their reach, and at the end, if they don’t become an AI engineer or scientist, they can see ‘I have options here, and this is something I can do,’” she said. “The goal is not to say that I want to prepare everyone to be an AI engineer or an AI scientist, but I want them to see the potential of themselves and, after the inspiration, they can decide what kind of career they want to get into.”
Merging AI and ELA
Even students who don’t express interest in pursuing a career related to artificial intelligence benefit from exposure to the technology, Jiang said, because they can learn skills related to the efficient use of technology and critical thinking about when advanced technologies should and should not be used.
“They have to have a critical perspective about how to use technology in the field they are interested in,” she said.
Jiang is helping to expose high school students to artificial intelligence through her work on the “Narrative Modeling with StoryQ: Integrating Mathematics, Language Arts, and Computing to Create Pathways to Artificial Intelligence Careers” project, which began in 2020.
The StoryQ project allows students to engage in the process of developing text classification models and question how the decisions they make along the way affect what the models look like. They must also think about what goals they have for their models and consider who would be impacted by them.
“We want to open the black box of AI for them to experience it with themselves as decision makers and for them to see the kind of decisions that will affect what the model will look like,” Jiang said.
StoryQ is unique in that it introduces AI through an integration of different disciplines and is intended for use in English Language Arts classrooms. Because English Language Arts classes are mandatory for all students, teaching AI in this discipline means every student will have an opportunity to learn about the technology, which is not the case in computer science classes that are often optional.
Students can write stories using their own narratives, perspectives and experiences as the data set for AI models. Because they are using their own words as a data set, they are able to identify and argue when the AI model is making a mistake by misinterpreting what they intended to say, and then gain an understanding of how to make the model work better.
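As a hedged sketch of what building a text classification model from students’ own sentences might look like, the example below (using scikit-learn, with invented sentences and labels; it is not the StoryQ tool itself) trains a simple bag-of-words classifier and then predicts a label for a new sentence so the writer can judge whether the model understood what they meant.

```python
# Toy text classification sketch (not StoryQ itself; sentences and labels
# are invented). A bag-of-words model learns which words tend to appear
# with which label, then classifies a new, student-written sentence.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

sentences = [
    "We won the soccer game after school",        # labeled "sports"
    "My team practiced free throws all weekend",  # labeled "sports"
    "I baked bread with my grandmother",          # labeled "family"
    "We cooked dinner together on Sunday",        # labeled "family"
]
labels = ["sports", "sports", "family", "family"]

vectorizer = CountVectorizer()           # turn sentences into word counts
X = vectorizer.fit_transform(sentences)

model = MultinomialNB()
model.fit(X, labels)                     # learn word-label patterns

# A new sentence mixes cues from both labels; whatever the model predicts,
# the student who wrote it can argue whether the label matches what they
# actually meant, and adjust the training data if not.
test = ["We played a board game with my family"]
print(model.predict(vectorizer.transform(test)))
```

In a classroom setting, students might then add or relabel sentences and retrain the model to see how their decisions change its behavior.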
The use of different perspectives can also bring more diversity to AI models, because language can be different in different regions, cultures and communities.
“StoryQ naturally attracts students and helps them get access to AI because [through] text or language, we all have something to say. It’s not like math or computer science that often have a barrier,” Jiang said. “We’ve seen that they’re engaged in the process of building text classification models and seeing how language is used in different contexts while also bringing in their own perspectives. Text is a rich space to create this kind of conversation with AI.”
In addition to exposing students to AI and data science concepts in a new context, the use of StoryQ in ELA classrooms, Jiang said, made students more invested in their English class and led them to report seeing their ELA teacher as a role model.
“Learning is not all about content; it is also about relationship building, and we’ve seen that students see that their ELA teacher who loves writing and literacy and poetry can also teach AI in an effective way and find themselves wanting to learn more about the topic,” she said.