Teaching AI

Exploring New Frontiers for Learning

Michelle Zimmerman


About This Book

Get the tools, resources, and insights you need to explore artificial intelligence in the classroom and to discover what students need to know about living in a world with AI. For many, artificial intelligence, or AI, may seem like science fiction or inherently overwhelming. The reality is that AI is already being applied in industry and, for many of us, in our daily lives as well. A better understanding of AI can help you make informed decisions in the classroom that will impact the future of your students.

Drawing from a broad variety of expert voices from countries including Australia, Japan, and South Africa, as well as educators from around the world and underrepresented student voices, this book explores some of the ways AI can improve education. These include educating learners about AI, teaching them about living in a world where they will be surrounded by AI, and helping educators understand how they can use AI to augment human ability.

Each chapter offers activities and questions to help you deepen your understanding, try out new concepts, and reflect on the information presented. Links to media artifacts from trusted sources will help make your learning experience more dynamic while also providing additional resources to use in your classroom. This book:

  • Offers a unique approach to the topic, with chapter opening scenes, case studies, and featured student voices.
  • Discusses a variety of ways to teach students about AI, through design thinking, project-based learning and STEM connections.
  • Includes lesson ideas, activities and tools for exploring AI with your students.
  • Includes references to films and other media you can use in class to start discussions on AI or inspire design thinking and STEM projects.


In Teaching AI, you'll learn what AI is, how it works, and how to use it to better prepare students for a world with increased human-computer interaction.

Audience: K-12 educators, tech coordinators, teacher educators


CHAPTER 1

What Is AI?

Questioning Human-Computer Interaction

DRAMATIS PERSONAE

LEILA: Six-year-old girl
CUB: Five-year-old boy

SCENE

Leila and Cub are visiting a middle school learning space with their parents and younger brother.
TIME: April 2018

ACT 1

Scene 1

SETTING: School is not in session. The building is silent, absent the bustle, laughter, and exclamations of students. In an alcove marked by an orange and blue striped area rug, a scattering of lime green chairs and pillows invites visitors into a space designated for collaboration, quiet reading, media creation, and general innovation. A robot, just over three feet tall, stands in the corner.
AT RISE: Leila and Cub enter the alcove and spot the robot.
Figure 1.1 The children stare at what might be eyes or goggles. No one blinks—not Leila, Cub, nor the two plastic discs staring blankly from the robot.
LEILA: Why isn't it talking?
CUB: Is it a good robot or a bad robot?
The children inch toward the robot and proceed to inspect its wires, screws, arms, the wall plug, and the wheels that function as its feet. They want interaction. They want to know if it is friendly. Will it scare them, or will it shake hands with them? They are breathless.
As the two young people—almost the same height as the robot—wait for some sort of interaction, the robot remains lifeless. Made by middle school students, who programmed it to speak a few words and perform simple gestures, the robot does not have a mind of its own. It is not "intelligent." It is not receiving any sensory information. It is not analyzing Leila and Cub, nor is it processing vast amounts of data to read their reactions, find patterns, or adapt its behavior in an attempt to procure a different result.
The children, on the other hand, are taking in data as they study the robot in front of them. Cub attempts a friendly fist-bump in hopes of eliciting a reaction.
CUB: Why isn't it doing anything?
LEILA: Maybe it's out of batteries.
As they poke, prod, and even attempt voice commands, there seems to be a slight movement and sound from the robot. Leila jumps back and curls up under a nearby chair. She covers her face, then peeks out again.
While still uncertain whether the robot poses a threat, the youngsters remain fascinated. They approach it once more to re-examine its combination of screws and plastic wires. Why does it look like a person that should come to life? They continue their attempts to engage it, but the robot fails to react. They search for a reason why.
LEILA: It's a screw! It's missing a screw! I can fix this!
The two children begin a thorough investigation, searching the area for missing screws. They're not yet aware of a more likely explanation: the robot simply has not been programmed by humans to function the way the young people hope it will.

UNDERSTANDING AI

Leila and Cub hypothesized that getting the robot to interact with them might be as simple as finding a loose or missing screw. Leila believed that if she just found that missing piece, the robot would respond as if it were human. Throughout the history of AI, people working to solve its grand problems have been looking for the loose screw—the key to replicating human intelligence in a machine. Each time they think they have found it, the challenge turns out to be more complex than originally anticipated (Brooks, 2018). To date, human cognition and intelligence remain superior to the capabilities of a computer (Allen, 2011). No machine has successfully passed the Turing Test by fooling people into believing it is human.
When their search fails to produce results, Leila and Cub start listing robots from popular culture, categorizing which are "good" and which are "bad." The examples they draw upon, from Wall-E and Eva to C-3PO and R2-D2, all depict robots that display human-like intelligence and emotion—or at least evoke an emotional response from us. By exploring what they already know about AI, the children are looking for patterns that might help them predict the robot's behavior.

End Scene 1

REFLECTING ON AI

■ What were Leila and Cub expecting from this robot?
■ What caused them frustration, and what did they do to mitigate it?
■ What prior knowledge did they have, and how did they apply it?
■ What are some common perceptions about AI we get from media and films, and how might these have influenced the children's behavior and confusion?
■ If young children like Leila and Cub now expect intelligent interaction from a plastic robot that was only programmed to do very simple tasks, what will they expect as high school students?
■ Will they have the same fears previous generations had about interacting with machines, or do those fears come with age, cultural influences (Nasir, Rosebery, Warren, & Lee, 2006), and biases instilled by fears evoked by science fiction or cautionary dystopian storytelling?

What AI Is—and Isn't

In the scene above, Leila and Cub expected the robot to interact with them in an intelligent way—the way popular culture and science fiction have shown us robots should respond to humans. But they were interacting with a robot, not AI.
It's a common misconception that AI and robotics are the same. This may stem from science-fiction films that often depict AI housed within the form of a robot, or a body that appears human but is built out of wires and materials different from flesh and bone. Although the development of AI and robotics may go hand-in-hand—Rodney Brooks suggests in his paper "Intelligence without Reason" (1991) that work in mobile robotics has helped advance approaches to AI—they are two different technologies. While it is common to confuse a robot's capabilities with AI, such false impressions can lead to confusion, fear, or warnings of robots becoming intelligent enough to take our jobs and replace human interactions with robotic ones.
What many people don't realize is that not all robots are driven by AI—and while AI can be placed within the shell of a robot, it can also exist in a form that doesn't resemble any living creature at all. According to Microsoft's First Steps into Artificial Intelligence course (tiny.cc/a3y7vy), AI is a broad umbrella term for a type of tool that helps people work better and do more (J. Zimmerman, personal communication, May 9, 2018). There's a good chance you have experienced or used AI more than you realize. Personal assistants, chatbots, language translators, video games, smart cars, and facial recognition are all examples of AI. Purchase prediction by retailers like Target and Amazon, fraud detection in banks, online customer support, news generation for simple sports statistics and financial reports, security surveillance, music and movie recommendation services like Spotify and Netflix, and smart home devices all use AI (Albright, 2016).
This chapter and the next will explore the various technologies that fall under the umbrella term of AI, as well as the components that are necessary for AI to work—such as machine learning, perception, and machine problem-solving.

PERSONAL ASSISTANTS

Personal Assistants, or personal digital assistants, such as Siri, Google Now, Cortana, and Alexa, can listen to your voice and respond to between 40% and 80% of questions asked. Some assistants are built-in features of mobile devices, laptops, or smart speakers.
As these applications improve, their capabilities and usage are expanding rapidly, with new products entering the market on a regular basis. An online poll in May 2017 found the most widely used personal assistants in the U.S. included Apple's Siri (34%), Google Assistant (19%), Amazon Alexa (6%), and Microsoft Cortana (4%) (Graham, 2017). Apple and Google have amassed significant user bases on smartphones, as has Microsoft on Windows personal computers, smartphones, and smart speakers. Amazon Alexa also has a substantial user base for smart speakers.
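For readers who want to see the basic idea in code, the short Python sketch below shows one simplified way an assistant might map a transcribed question to an "intent" by counting keyword overlaps. It is a toy illustration only; the intents, keywords, and scoring approach are invented here and are not how Siri, Cortana, Alexa, or Google Assistant actually work.
```python
# Toy intent matching: score a transcribed question against keyword lists.
# All intents and keywords below are invented for illustration.
INTENTS = {
    "weather": {"weather", "rain", "temperature", "forecast"},
    "timer": {"timer", "alarm", "remind", "minutes"},
    "music": {"play", "song", "music", "album"},
}

def match_intent(utterance: str) -> str:
    """Pick the intent whose keywords overlap most with the spoken words."""
    words = set(utterance.lower().replace("?", "").split())
    scores = {name: len(words & keywords) for name, keywords in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(match_intent("What is the weather forecast for tomorrow?"))  # -> weather
print(match_intent("Set a timer for ten minutes"))                 # -> timer
```
Even this toy version hints at why real assistants sometimes mishear or misroute a request: everything depends on how well the recognized words line up with the patterns the system has learned.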
Personal assistants have the potential to significantly impact which skills students need to learn in school. As these tools become capable of taking on more tasks, some of the skills students are learning today will become obsolete. Coding is one example.
Wolfram|Alpha, the computational knowledge engine that powers many of Siri's factual and computational answers, aims to make all systematic knowledge immediately computable and accessible to everyone (m.wolframalpha.com). To that end, the company has developed a programming language that leverages AI to automatically generate low-level code, allowing programmers to operate at a higher level. According to founder Stephen Wolfram:
One of our great goals with the Wolfram Language is to automate the process of coding as much as possible so people can concentrate on pure computational thinking. When one is using lower-level languages, like C++ and Java, there's no choice but to be involved with the detailed mechanics of coding. But with the Wolfram Language the exciting thing is that it's possible to teach pure high-level computational thinking, without being forced to deal with the low-level mechanics of coding. (Wolfram, 2017)
Students can experiment with this advanced programming language and practice their computational thinking skills using the Wolfram Programming Lab (wolfram.com/programming-lab).
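Wolfram's point about working at a higher level of abstraction can be illustrated in any language. The sketch below uses Python rather than the Wolfram Language; the task (average word length in a sentence) is just a placeholder chosen to contrast hand-managed mechanics with a single expression that states the intent directly.
```python
# Two ways to compute the average word length in a sentence.

sentence = "Students can focus on the question rather than the mechanics"
words = sentence.split()

# Low-level style: manage counters and accumulation by hand.
total = 0
count = 0
for w in words:
    total += len(w)
    count += 1
low_level_result = total / count

# High-level style: state the computation directly and let the
# language handle the mechanics.
high_level_result = sum(len(w) for w in words) / len(words)

assert low_level_result == high_level_result
print(high_level_result)
```
The two versions compute the same answer; the difference is how much of the student's attention goes to bookkeeping rather than to the question being asked.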
While virtual assistant technology is developing quickly, it still has some limitations, such as failing to answer, mishearing questions, and a lack of support for languages other than English (Dellinger, 2018). Understandably, there is also some controversy regarding devices that are always listening, but that constant listening is also what helps build the repository of data from which machines can learn. There may also be ethical concerns regarding disclosure of whether users are talking to a bot or a real person. When Google recently offered a preview of its Google Assistant feature Duplex, the software's command of natural language "was so masterful that it was apparent the person on the other end had no idea he or she was talking to a machine" (Pachal, 2018).

CHATBOTS

Chatbots attempt to replicate human conversation through text chats, voice commands, or both. Machine learning, combined with a technology known as natural language processing (NLP), makes this an element of AI. Chatbots can mimic human conversation by recognizing the cadence of conversations, storing the patterns, and drawing on those patterns to imitate human behavior. This is one example of machine learning algorithms at work.
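To make the request-and-response loop concrete, here is a minimal Python sketch of a rule-based chatbot. Real chatbots learn their patterns from conversation data using machine learning and NLP; the patterns and replies below are hard-coded and invented purely for illustration.
```python
import re

# A toy chatbot: a few hand-written patterns mapped to canned replies.
# Real chatbots learn these associations from data rather than hard-coding them.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.IGNORECASE),
     "Hello! How can I help you today?"),
    (re.compile(r"\bhours?\b", re.IGNORECASE),
     "We are open from 8 a.m. to 4 p.m. on school days."),
    (re.compile(r"\b(thanks|thank you)\b", re.IGNORECASE),
     "You're welcome!"),
]

def reply(message: str) -> str:
    """Return the reply for the first pattern that matches, or a fallback."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "I'm not sure I understand. Could you rephrase that?"

print(reply("Hi there!"))                    # greeting rule
print(reply("What are your hours today?"))   # hours rule
print(reply("Can you help with homework?"))  # no match -> fallback
```
The fallback line is where the toy version gives up; a learning system would instead use its trained language model to produce a more flexible response.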

LANGUAGE TRANSLATORS

Most of us can speak faster than we type. Natural language processing allows the computer to translate what you speak into text, enabling language translators to do more than just translate from one language to another. Dictate (dictate.ms), a new project released through Microsoft Garage (microsoft.com/en-us/garage), is one example. There is also a language translator that will produce live subtitles, translated as you speak. The extension works with Outlook, Word, and PowerPoint for Windows, converting speech to text using the state-of-the-art speech recognition and AI imbued with Microsoft C...
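To picture how such a live-subtitle tool is structured, here is a minimal sketch of a speech-to-text-plus-translation pipeline. The recognize_speech and translate functions are hypothetical placeholders standing in for real speech-recognition and machine-translation services; they are not actual Microsoft APIs, and the canned text is invented.
```python
# A sketch of a live-subtitle pipeline: audio -> text -> translated text.
# recognize_speech() and translate() are hypothetical placeholders for
# real speech-recognition and translation services.

def recognize_speech(audio_chunk: bytes) -> str:
    """Placeholder: a real system would send audio to a speech-to-text service."""
    return "welcome to our classroom"

def translate(text: str, target_language: str) -> str:
    """Placeholder: a real system would call a machine-translation service."""
    canned = {"es": "bienvenidos a nuestra clase"}
    return canned.get(target_language, text)

def live_subtitles(audio_chunks, target_language: str):
    """Yield a translated caption as each chunk of speech arrives."""
    for chunk in audio_chunks:
        text = recognize_speech(chunk)
        yield translate(text, target_language)

for caption in live_subtitles([b"fake-audio"], "es"):
    print(caption)
```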
