UEN Homeroom

AI in Education

Episode Summary

In this episode of UEN Homeroom, hosts Dani and Matt are joined by three education experts to discuss the growing use of artificial intelligence (AI) in the classroom. Tune in for an insightful conversation about the benefits as well as some of the challenges of incorporating AI into education.

Episode Notes

In this episode of UEN Homeroom, hosts Dani and Matt are joined by three education experts to discuss the growing use of artificial intelligence (AI) in the classroom. Tune in for an insightful conversation about the benefits as well as some of the challenges of incorporating AI into education.

Subscribe to the UEN Professional Development Newsletter here!

Stay connected with UEN 🤳

Facebook

Instagram

Pinterest

Episode Transcription

[MUSIC PLAYING] 

Hey, Matt.

 

Hey, Dani.

 

All right, today, I'm excited. We're going to be talking about AI.

 

It's super cool. Artificial intelligence is so hot right now as a topic in education and everywhere else, but also, I think for a lot of people, absolutely terrifying.

 

Oh, my gosh, totally. Right now, my son thinks everything's AI. Anything that does anything, he's like, oh, it's AI, mom. I'm like, no, that's computer generated-- fine. Like just everything is AI to him, but I'm curious. Have you used AI?

 

I have used AI. I've pushed myself away from using certain AIs, because I don't want to become reliant on them. But I love playing around with image generators.

 

Ooh.

 

I think they're super fun.

 

I love trying to play, like, spot the AI kookiness in some of those images, because there's always something that's just a little bit off.

 

Yeah, absolutely, and today, we're lucky. We have three great guests to talk about AI from three very different perspectives in education. We have a teacher. We have a tech leader in a district, and we have the statewide data privacy person as well.

 

It'll be super interesting. Let's listen in.

 

[MUSIC PLAYING]

 

Utah Education Network is excited to share that our professional development team is now sharing a weekly newsletter with material to support your classroom. Whether you are looking for videos and podcasts to further your learning, UEN courses to support your goals as an educator, or professional development across the state to attend, UEN's professional development newsletter supports you.

 

We also include a resource spotlight every week on topics that are important to Utah educators, like artificial intelligence, podcasting, and more. Learn more at www.uen.org/development/newsletter, and subscribe to get the newsletter in your email every Thursday. Check out the show notes for the link to subscribe.

 

[MUSIC PLAYING]

 

Before we get into the tough questions, would you all take a moment, and introduce yourself, and what your current role in education is? And I will start with Emma, if you'd go ahead and introduce yourself, and then pass it off to someone else.

 

Perfect. Thanks, Dani. My name is Emma Moss, and I'm excited to be here. I am a digital literacy teacher in Canyon School District. I currently work with eighth graders and absolutely love empowering students with technology and bringing innovative ideas to my classroom. I'm going to pass it off to Dr. Cox.

 

Fancy. So I'm Suzy Cox. I'm director of innovative learning for Provo City School District. That means I'm responsible for all educational technology, computer science, and STEM education.

 

I am John Lyman. I am the student data privacy specialist at the Utah State Board of Education. My primary roles are to be a liaison with all 156 LEAs in the state of Utah to help them coordinate their efforts with Utah Law 53E-9-309, which states that any time that an LEA shares student data with a third party, there needs to be certain provisions in place between the two entities. I help monitor that, and I also do a lot of the training throughout the state in regard to all of the student data privacy practices. And I'm an edtech futurist, and I love AI.

 

Awesome. Thank you guys so much. We have kind of, like, a power trio here to talk to us about artificial intelligence. So let's start off with how is artificial intelligence being utilized in the field of education, and what's maybe an example you have of its practical application in a classroom or an online learning environment? And Suzy, let's start with you.

 

Yeah, it's an interesting question, because we've really had AI in our classroom for decades. But it's coming to the forefront in this new kind of iteration of it. So if we think about what it's been like for the last while, you know, we've had adaptive testing and adaptive learning platforms. Those are AI based, even things as simple as, like, autocomplete, autocorrect, suggestions in Google Docs. Like I have not written a paper by myself in Google Docs in eons, ever since those suggestions.

 

You mean the robots have already taken over?

 

Yeah, I hate to break it to all of us, but yes. But also, a lot of our assistive technologies, the things that we really try to promote among kids, like using text to speech, translation services, plagiarism detection-- I mean, all of these things that we've been using for a long time are forms of AI in education. So it's really not anything particularly new. I think what we're talking about here is this new generative AI, which has taken those utilities up a notch, shall we say.

 

I am not in the classroom. I haven't been in a classroom with students since before the pandemic, so I am not familiar with the current landscape of what it's like to have every student in the state having their own device. And the LEAs throughout the state really-- you know, there is this new environment that we have, where everybody is connected, and we're still trying to mitigate this digital divide. And we've got all these different socioeconomic components that come into it and which kids have access to which technology.

 

As far as AI is concerned, I mean, you're correct about all those different things. Like I hadn't thought about how-- I mean, I was trying to remember back to when I was in fourth grade, when you're typing on something, and there was no such thing as spell check. I mean, there are very few of us that remember a time before spell check, the little, red line on the document. But I know I'm using it in my professional life, and I'm sure we'll be talking about that more.

 

I think from the classroom perspective, it's interesting, because like Dr. Cox mentioned, these technologies have been there. But they're suddenly in the forefront and the limelight, even for our students. So I don't think I've gone a day since that original version of Chat GPT came out that I haven't had a student come up and be like, do you know about Chat GPT? Can you tell me about whatever? Or as it's released new versions, have you seen what it can do? And even things like-- there's a tool with Canva, where you can generate images.

 

Like you can say, I want you to make me a rainbow cow, and looking at it, I can see how it would be hard, from the perspective of a child, like a second or third grader, to decipher, is that a real image or not? So I think that's what we're seeing now in public education: even though these technologies have been present, it's really in everyone's mind, and they're seeing the impact of it. And they see more of the generative application of it-- I can take this, and it's accessible, and it can bring me all this information. So what do we do with that in education, I think, is really the question that we're going to be kind of tackling here today.

 

Well, I think that leads us nicely into our next question, which is-- we're seeing all this new generative AI, like you're talking about, Suzy. We've had AI for a really long time, but this new generation over the last few months, the last year, has really heightened our understanding of what artificial intelligence is. What are some of the potential benefits of incorporating these new generative AI tools into educational settings, and how might they support things like personalizing learning, or student engagement, or even facilitating kind of the clerical and day-to-day activities that a teacher has to do as just a teacher?

 

Emma, do you want to start us off on that one?

 

I was going to say, I'm just waiting to see if I should jump in. But I feel like for me, from my perspective in the classroom and working with students, it really has the power to help our students have a jump off point. I had a chance to go to a conference this past March, the USET, and I heard Eric Hertz speak. And he talked about how the most powerful teachers are not those that can just use technology or just teach, but those that can use them in combination to empower our students. And I think that's really what we need to do with AI.

 

Obviously, there are some drawbacks. But if students can use it as a jumping point to have AI generate something and then learn to be more critical about it-- analyzing, using those skills that are at a greater depth of knowledge, using it as a creative outlet-- I mean, it's amazing that you can take some words from your mind about something and have a computer generate what that image might look like, as well as seeing it from a view of having constructive feedback. I've seen an English teacher that had it write them an essay. And then they said, OK, what's great about this essay? What's wrong with this essay? How could we improve this essay?

 

Whereas before, you'd have students just starting with zero. Now, they're looking at that and saying, well, this is good, and this isn't good, and this is how I'd tweak that. And suddenly, you have students that can do so much more, instead of just starting from zero.

 

I was being the research nerd that I am, reading an article the other day. It's by Ouyang and Jiao. It's talking about these three paradigms of AI in education. I think it's important for us to kind of frame it and think about it in this way, because there's AI directed education, which is more of those, like, adaptive learning platforms. Then there's AI supported and AI empowered learning, and those become really different things as you think about how could I support my learning by sitting down and kind of using this generative AI or a different form of AI as a tutor or a support to what I'm trying to learn.

 

And I think Emma's talking about that a little bit, and then this AI empowered learning is really taking on some personalization, pursuing the topics you're interested in, and creating things that weren't possible before. Like we really do have these very different ways of using AI, and I think we're just at the tip of exploring what those are going to look like with these new technologies that have been brought into our classrooms. My brother, who's a sixth grade teacher in the district, was telling me that two kids, last week, brought in essays that were written by Chat GPT. And he realized very quickly that they were easy for him to catch, because our sweet kids have not yet learned to say, in the voice and with the vocabulary of a sixth grader, write an essay about x.

 

So they brought these essays that were like, in my professional opinion and in the context of-- you know, it's just ridiculous. And he literally started reading it, looked at the kid and was like, no. So it really does, like Emma was saying, become this new skill set of, how do I define problems? How do I construct queries, and then how do I embrace a new form of literacy that is reading this response that it generates, identifying what you like and don't like about it, probing the software to take it in a different direction? So it becomes this new skill set that, I don't think, many of us have been able to develop yet as educators.

 

That's so funny that you mention that these sixth graders wrote in the voice of a college student. I remember, in November, when, all of a sudden, all of the college professors on my feed were beginning to freak out, because it was coming to finals. And they didn't know what to do, and it was a very interesting time. And funny enough, I've seen some memes, where it was like, college professors hate generative AI, until they have to write recommendation letters.

 

Bingo.

 

I haven't done that.

 

Oh, it works, Suzy.

 

It comes to this piece where we're talking about saving teachers' time, and day-to-day activities, and clerical things, some things that I've worked with some of my colleagues and friends on, like when it first came out. And we're just like, try this, try this, try this, and write a lesson plan for eighth grade about Edgar Allan Poe's "The Raven," and focus on these things, and you hit Enter. And the teacher's job is essentially done as far as that piece of instruction. And of course, you want to go through and do your stuff, but as far as saving time and doing something that you'd otherwise put a lot of thought into during those times after school and on weekends--

 

The other thing, when it first came out-- a friend of mine who is a special ed teacher at a charter school was like, let's see if it'll do an IEP. So we went through a couple of iterations and eventually got to a prompt that was sophisticated enough that it cranked out, in seconds, an IEP with interventions on it that this teacher said she would have never thought of. So I mean, when we're talking about potential benefits from the educator's standpoint and from the day-to-day kind of menial tasks of getting things done, I think that this could be something that is very helpful.

 

I believe that is why a lot of the tech directors across the state have all of their teachers knocking down their doors, asking for access to this technology. And I think that that's kind of why we're talking today. It's really to discuss. Because we were talking about those three types of AI and maybe how far we are along with those and just really kind of, like, defining what it is that we're talking about, so that we can get-- I mean, that's what these questions are coming up with. It's like, what's next? What are we going to do with this? So I don't know. That's where my head's at.

 

But teacher efficiency is one of the conversations. Emma and I both had a little bit of like, ugh, when we were talking about writing a lesson plan. Because it brings up a question that I've been having conversations about with fellow educators, which is, what is our role as teachers?

 

Is our role to sit down and come up with that lesson plan from scratch? Is our role as teachers to free up time, so that we're with our kids? And maybe it's both, and I don't know. But it's a conversation, I think, that's rich and that needs to be had to say, what is our fundamental purpose as educators? And how can Chat GPT support us and empower us or any other AI support us and empower us in those roles that we feel are essential to what we do, so that we can be better?

 

As we were talking, Dr. Cox mentioned that we both kind of made this face, which I know that the listeners can't see. But in defining that essential role, you have to understand that, I think, at the end of the day, you still need an expert person to look at it with a critical lens, with those paradigms, and ask how can we use it to empower rather than just having some information that's there? I mean, admittedly, I went in and said write me a lesson plan for this just to see what it would do.

 

And being an educator who has a lot of experience in training, there are things where I was like, that section is really great, but this may not work as well, or that part doesn't even make sense for helping them achieve the learning objective. So looking at it from a critical lens, are there times that it's been helpful? Like, I'm really struggling to come up with ideas for how to teach this concept-- can I have 15 of them? And then I can kind of sort through those.

 

But I think there still has to be a teacher and an educator with that training, and I think, sometimes, that's why people are afraid of AI in some senses. It's because they're like, well, it can do all these things. Yes, it can. It can do a lot of things, but you still need that heart of an educator behind it. I think that is what makes education so wonderful is that it is working from person to person, right?

 

You're working with students, and it's that creativity and that personal nature that, right now-- I'm not saying it won't ever-- but at this point, AI doesn't bring. And I think you still need that in a classroom, so I don't know. That's kind of why I was making that face. It's because it was like, well, yes, but-- and I think that's where people are trying to define the line as to where you should use it, and maybe where it's still learning and we're still learning how to use it in a way that's effective.

 

I love this idea of using AI as a jumping off point, so we're not staring at that white screen of death, or that you can use it for something to analyze and take back into your practice or into the piece of writing that you're working on. But one of my favorite tweets that I've seen on AI is a college English professor, and she just tweeted like, you guys, I know you have not mastered the semicolon in the matter of a week since Chat GPT has been released. Like I know that this is not you, and we as teachers, we have, like Emma was saying, that teacher heart and that teacher mind. And we know you. We know that you did not do this.

 

Moving on to some of the ethical considerations related to AI, we're wondering your thoughts on things like data privacy, bias, and fairness. How can we ensure responsible and equitable use of AI? Let's maybe start with John on this one.

 

Is that because data privacy is in my title? Is that what you're saying?

 

Just a little bit, yeah.

 

Well, I mean, I guess this is where we're at, right? This is why we're here is to talk about this very kind of-- when it comes to privacy, privacy is not, like, a tangible thing. It is something that is very personal to people. What is private to you may not be private to me. It's very hard to define.

 

However, I think that we have things that we've put in place to kind of-- like guardrails, guardrails that we've put in place. So I would like to take this opportunity to kind of help some of the educators in the state understand what kinds of guardrails in regard to privacy are already in place and then what kinds of things we are talking about in regard to this kind of technology going forward. So essentially, where I'll start is that in the state of Utah, if you click I agree to terms of service or you log into an app, you are essentially signing an agreement or a contract with that third party under the auspice or under the umbrella of your agency.

 

There are folks at each LEA in the state that are called data managers. They are the ones that are in charge of the transmission of data. They are the ones that are in charge of making sure that surveys don't include PII or different questions that may be up against our Utah PPRA law, but I digress. So what I would suggest, in any instance where there is an educator that wants to work with some sort of edtech product-- there are some supporting trainings for this, and I can direct all of you to our supporting trainings on our USBE website and YouTube.

 

But I would say that the first thing that I would encourage all educators in the state to do before you want to have access to a third party application is that you need to ask before you app. You need to ask. You cannot just use them. You cannot put the student information in your roster into it. You don't want to get into one of those trial things. You can do that on your personal stuff, but under the umbrella of what's going on at your LEA, you should be asking before you app. That will help, and this is not a we-want-to-take-your-stuff-away-from-you kind of thing.

 

This is we want to get you to a yes. We want people to be able to use technology. We want you to familiarize yourself with your agency's data privacy practices. Each one of the agencies in the state has what is called an approved applications list or a metadata dictionary. On this list, it has all the listings of all the third parties that your LEA has agreed to terms with, and you should be able to use those things.

 

And then, again, if what you want to use is not on there, you would then go to your data manager and ask them for access to that. But then that's where we get into this issue of privacy and Chat GPT, and I mean, we can get into the back room kind of how the sausage is made when you type in your prompt and what it spits out. But in regards to that process, there is a lot of Personally Identifiable Information that is shared.

 

We don't really have a grasp on a lot of that and where it all comes from. That's why, when you hear about all of these AI art processes, a lot of creators are very angry, because it is stealing their stuff. So a lot of this stuff-- is it plagiarism? Is it crowdsourced? Yes. Is it correct? Who knows? That's for the experts to figure out.

 

But one of the things that I want to make very clear is that, when it comes to the policies, the LEAs are starting to swirl around, because none of us have policies yet. I talk to these data managers. They don't exist, because it's so new. And we're going to-- I know that we'll talk about this. But government will never move as fast as technology, and this technology is not a steady-- this technology is a speed train running down a mountain, and it's going to get faster, and it's not going to stop. And if you want to put some guardrails on it, good luck.

 

I mean, we're talking about what it is now. It's five months old, folks. I mean, it's five months old, and it's learning. And I read a thing last night that says, yeah, you may know about Chat GPT, and we're sitting here, talking about Chat GPT. But there were 1,000 AI apps released yesterday, things that help us with travel, things that will do all sorts of things, because they're using this as a foundational base.

 

So my point is where we're at is that we don't know, and it's very sticky. So a lot of the LEAs are saying, OK, we are going to allow our teachers to use this, because they're adults, and we're going to offer some PD. I know that there are some-- I don't know if-- I heard that UEN was offering some PD through some of the regional service centers. But what I'm saying is that PD around this would involve things, like not putting PII of the kids into the system.

 

If you are creating or generating an IEP, you may want to use it, as we've said, as a jump off point. We're not going to put Susie Q's name in the fourth grade into this system. We're not going to do that. We are going to refer to all of the training that the state has created, and I'm going to pat ourselves on the back. Utah is fortunate in that we are one of a handful of states in this country that has a student data privacy director, the director of privacy.

 

There are not other states in this country that take student data privacy as seriously as the state of Utah. We have a very well regarded reputation nationally. There are states that-- anyway, you understand. It's very archaic, and we are very fortunate to be where we are. However, we are still behind the ball in this, and a lot of other states are kind of looking to us to kind of help figure it out. Because if you Google or Chat GPT, which state has the most children per capita, the answer is going to be Utah.

 

So I've found myself in quite a-- I don't know. --unusual place for me, which is normally, as a teacher, I would likely have been the first person to jump on, and start using it, and be like, everybody use this. And now, I'm an evil administrator, and I'm not allowed to do that anymore. So I've asked the question at pretty much every meeting I've been at about what do we do about the data privacy side of this, because thankfully, OpenAI and others are evolving their privacy policies, literally, day to day. If you check it, there's variation.

 

So it went from nobody under 18 can use it, to you can use it over 13 if you have parent permission, but we're not going to check and see if you're over 13 or have parent permission. Now, they've got some kind of tiered system. But if you want your data protected, you can pay for that. Like it's evolving day to day. So I'm in a position right now where I'm saying exactly what we've been talking about: like, teachers, use this, understand what it can do, have conversations in your class, bring examples to class, those kinds of things.

 

At the moment, I'm not very comfortable saying, students, hop on there, because we know that students are a little more casual with their data than we might like them to be. And they don't think they're revealing personal things, but they are. And those privacy policies as they stand make me really uncomfortable. Now, at the same time, Dani, your initial question was asking about equity, as well, right?

 

So this is where I get all like, oh, my gosh, what do we do? Because I know some kids are not monitored at home and are using it, or have parent permission and are using it at home. And other kids don't know about it, don't have parent permission or guidance, don't have devices at home. So not introducing it at school contributes to digital divide issues, right, to equity issues?

 

We're saying, well, the people who know about it and have parents or access to resources or whatever, they're going to learn how to use it, and it's going to be a tool for them to be successful. Kids who don't have it, know about it, don't have guidance, sorry. I mean, we're really contributing to that divide. So it's kind of the same thing we've been talking about forever with just edtech in general.

 

If we don't do it effectively in the classroom, then we're contributing to digital divide issues. So I'm really struggling personally with kind of managing these facets of what I value: I want to keep that student privacy so close and intact, and I want to make sure that divide's not opening up. But I also want to be experimenting, and exploring, and seeing what we can do.

 

I'm going to just jump in real quick, because you talked about a couple of things that I want to applaud you for. One is, yes, it did go from age 18, which was where the initial policies for schools came into play. We're going to just use it for teachers, because the kids are not allowed on [INAUDIBLE], our stance, and so we're going to block it. And then, a week later, it went to 13.

 

So what happens is you go to Chat GPT, and you say that you want to sign in, and it will ask you for either a Google or a Microsoft instance, which a lot of schools will have something like that. But then the next piece is that it asks for kind of, like, a two factor authentication to where you're going to need to type in your telephone number, so that then assumes that somebody has a phone in the first place. So that gets back to our equity piece, as well, but that then assumes that anybody over the age of 13 has a phone. That's not necessarily parental permission.

 

You got parental permission to have the phone in the first place. And the other piece that I want to talk about is that you said that you were struggling with the data privacy piece. And where this gets kind of back to that sticky piece is the thing that I monitor, that metadata dictionary, the data privacy agreements. So Provo, Clint Smith is your guy, and he goes and obtains these agreements with vendors. And there are all sorts of caveats inside of these agreements, and anybody can go and look at these. They're on the metadata dictionaries of all the schools, and you can take a look at these agreements.

 

But the fact of the matter is that Chat GPT, in order to be used in an LEA, according to our state's law, would have to sign a data privacy agreement with an LEA in the state of Utah. I am confident that, based on their operation model and what they need to function, that Chat GPT will never sign a data privacy agreement not only in Utah, but nationwide. It will not be something that will happen.

 

So our director of privacy and I, we're just kind of-- I mean, we're stuck here. And maybe the middle ground is we get a Chat GPT for kids or something. And that really-- I mean, it's silly. But when it comes to the law that is in place and this new kind of tidal wave of this thing-- this is something that we're wanting to figure out policy-wise and figure out implementation-wise. Like, this is a tool, and how can we get you access to this tool? And right now, the way that the law works, and as far as everything that we're guiding people with, I just don't know if it's something that we can use, to be honest.

 

I think that, just hearing you talk and seeing it from a teacher's perspective, everyone that I've seen using it has not handed it over to their students. So I want to be clear about that. I think teachers are aware that there are privacy issues that are going to arise and that the information that they're putting in is going somewhere. I think that our students-- hearing them have conversations about it, sometimes they are aware, and sometimes they are not. So where it stands right now, I think a really important thing to recognize is when your students ask about it.

 

Like I'm thinking of the teachers in the state, because I'm sure they've had similar conversations. My students ask about it, and I say, you know, you don't want to be sharing all of that information, because it is going somewhere. It's using that information. It's caching that information. It's using that to answer more of your questions. And recognizing that-- I know for myself, when I set it up, I'm not using any of my education accounts to access it. I'm using my personal account.

 

I'm making sure that what I'm putting in there, is that something that I would want somebody to know about me? Because I don't know where this is going to go. My husband, who's also in tech, he's the one that came home first and was like, hey, have you seen this? And he was telling me about how they were putting in, like, having the AI write Google reviews for them, but they were using, like, fake names or fake companies. Because they're so concerned about where is this going, and I don't think that we know yet.

 

And you talked about I don't think that Chat GPT or other tools will sign those agreements, because I don't know that they know where it's going yet either. You talked about it being a freight train, but that freight train is going. It is moving, so I think we have to realize that we're going to have to apply best practices in these cases. How can we evaluate this technology? How can we use this technology, but how can we do that within those laws and privacy agreements that already exist?

 

And I think that that's what you're saying here and where we're drawing the line is like, I mean, I'm not using it with my students, because I don't want their information shared. But they are all very much aware of it. So it reminds me of when-- like, it's definitely slower moving. But when cell phones first came out, people were like, well, do I use that as a tool in the classroom, or do I not have them use it at all? I feel that same sense of like, what do I do with this thing?

 

And I think, as time goes on, you'll see some of that emerge of how can I use this in a way that's both best practice, respects privacy, and helps our students become 21st century learners, so that they're able to use that when they are adults and have that mentality and those paradigms to be able to say, I'm not going to put my address into this thing, or I'm not going to ask it about a health concern that I have, or I'm not going to tell it about xyz, so that they are aware of that? Because they are going to continually encounter this. So I think that's where, right now, as a teacher, where I'm at is, how can I help my students know that there need to be guardrails, even if it is a freight train?

 

I absolutely love that, Emma, and John, I got to go back to a comment that Emma referenced, as well, is this technology with AI, it's going so quickly. And it's been described as a technology that we won't really understand the implications of, until about 400 or 500 years from now, if we survive that long, that sort of thing.

 

Yeah.

 

I was fascinated. A couple of days ago, Greg Brockman, who's the OpenAI co-founder, did a TED Talk that just came out earlier this week, where he showed applications of Chat GPT mixing with things like Shopify, and Twitter, and things like that. And it's incredible to see, in the five months since Chat GPT came out, this growth and the speed of applications, with thousands of apps coming out.

 

So there's a question here for educators who are trying to keep up with this. Where do you go? What have you seen for professional development to keep yourself on top of understanding the implications and best practices for AI in the classroom? I'll send it over to Suzy, first.

 

I think this is a tricky one, right? I was talking to Darren Hudgins, the author of "Digital Detectives," yesterday, the day before kind of about this and how there are ends of the spectrum on this idea. So I'm sorry. I'm not going to give you an easy answer, but on the one end of the spectrum are books about AI and its impact on technology that have been researched for a year and are just about to come out versus Chat GPT came out on November 20 whatever of 2022.

 

And by December 1, we had websites, and blog posts, and things about, oh, this is how we'll use it in education, this is how you'll use it in education. So personally, I haven't found a single really great, reliable reference. I have a few places that I've pulled some ideas from and things like that. I've looked at some things. We've had some discussions at ed camps. I've looked at some materials that were produced by Ditch That Textbook pretty early on, a few other things like that, but I think we're actually in a space where we're having to craft these materials ourselves.

 

As far as, like, what [INAUDIBLE] was talking about, I do feel like, at times, I'm like, OK, where is my footing? You asked about places that have good resources. UNESCO actually has a lot on artificial intelligence. They have a guide that they published on ethics that is long but fascinating-- I'm not going to sugarcoat that. It's a lot of pages, but it's good at kind of helping you visualize those guardrails.

 

They have a lot of resources there, especially for educators, which I really liked. I also find myself frequently on Twitter just scrolling for things that I can find and grab. So I think I'm leaning towards some of those research studies as more of like, here's a foundation for me, here are some guiding principles, and then kind of reaching out to my PLN on Twitter. Here's how that's evolving. How does that fit into that scope of guiding principles that I know are sound and research based, even if it's not necessarily looking from a lens of the brand new?

 

They are providing that footing for me. So I can say, like the privacy laws, I know those exist. I'm very well aware of those. So when it first came out, it was like, I don't think I want to put that in there, because I know about these. So I'm trying to use it from that perspective and use an application, instead of looking for something that's consistently analyzing what's new, and just using what I know as an educator to look at those things.

 

That's such a good point, Emma, because I think that really guides the majority of our edtech decisions, right? Like when any new technology comes out, we don't immediately have the guidebook that teaches us, oh, this is exactly how we should use this in education. But we have an understanding of how kids learn.

 

We have an understanding of development and what's appropriate at different ages. We have an understanding of data privacy. Like, we have these building blocks at our disposal, and those are the ones that help us make those decisions. So the more we can, I think, talk about it within frameworks of effective pedagogy, and what's good for kids, and these data privacy laws, that's going to be what drives us to the right kinds of solutions, instead of, ooh, a new, shiny thing-- and especially a new, shiny thing that wasn't intended for children, that we're now kind of adapting and saying, OK, how do we prepare children for a world in which this exists?

 

Aside from my position, and that's kind of why I found myself in this position is that my degrees are in adolescent development and those kind of like-- and I've always been very fascinated with this juxtaposition of traditional human development, which we've all experienced over the course of however many years, we all decide that we've been around. And now, we are essentially cave men and women with these tools in our hands that are so powerful. I mean, we always kind of make the connection of like, yeah, my phone is more powerful than the moon landing rocket, you know? Like, Buzz Aldrin couldn't fit in my phone.

 

But, I mean, with that being said, my interest has always been piqued, and as an educator, it was always very important to me. I worked in after school programs for 25 years with eight to 14-year-olds, and I did not have to necessarily work with-- I got to do a lot of what I wanted. And we talked a lot about the future. I study the future. It is important for me to know that these youth are in a place generationally where the technology that they use is going to grow exponentially fast.

 

Our technology has grown more in the last 100 years than it has over the course of human history, and it's going to continue to get faster. And for those that are in the futurist world, there are certain dates that we kind of, like, dance around. There's a thing called Moore's law, which talks about how fast microchips get. And the concept is that, by the year 2029, the computing power of artificial intelligence-- this kind of thing that we're talking about-- will be that of a human being. So that's six years away.

 

That's pretty exponentially fast. The next date that they kind of throw out there-- and this is not necessarily conspiracy theory stuff. It's very, like, scientific. These are trends and things that people pay attention to. But by the time we get to the year 2045, that is when this computing power gets to be about as smart as all humans. So it's called the technological singularity.

 

There are people like Ray Kurzweil and different folks out there. They've been talking about this, and this is not just a possibility. This is a probability. It's the train. It's not going to stop.

 

We're not going to stop it. So we have these youth. What kind of a world are we really preparing them for? Are we going to prepare them for a world of jobs that don't exist yet? Are we preparing them for a world that may not have jobs?

 

What does that look like? What does it look like for my five-year-old daughter that wants to grow up and become a pharmacist? Is that something that people will be doing when she's 25?

 

Will a surgeon go into a room and actually cut open a body? No, it's going to be somebody that's really good at video games, sitting in another room, while that other person is in a sterile environment, you know? Because the robot is going to get in, and they're going to be able to get in so close that they can do something. And then we talk about what happens when autonomous cars happen, and three million people in this country are no longer employed. Because drivers of Ubers, and buses, and taxis, and all of that stuff no longer have jobs.

 

They're not going to become coders. They might, some of them, but what happens? What happens when all of this technology gets so fast and our medicine gets to the point?

 

I had one student, years ago-- we were talking about what happens when medicine gets so good that maybe we don't die. And he looks at me for a second, and he says, we're going to run out of water. It's just these kinds of conceptual things. And Matt, you say we're not going to understand this for 400 years-- like, that's not hyperbole.

 

I don't know. I'll tell you this. When November and December hit and this thing came out, based on all of the things that I've been paying attention to for the last 25 years, and generational studies, and technology, and on, and on, and on, we're at a point where this is faster than where I thought. We're further along than where I thought we'd be by now. I don't know if that answers the question, but--

 

Oh, it terrifies the crud out of all of us, I think, but no, that was perfect, John. We're going to jump to our teacher question now.

 

Hi, I'm Stephanie Adams from Utah Online School in Washington County School District, and I was just wondering, how can AI be used to support students with diverse learning needs, such as students with disabilities, English language learners, or students who require additional support? And what are some best practices for leveraging AI to promote inclusive education?

 

As [INAUDIBLE], I think this is one of the ways that I'm most excited for it to come into education. I think, sometimes, teachers are like, but I don't have time to make x, y, and z to support that student. And I feel like there are ways that this technology can do that.

 

I can use it to create a separate thing and have it translate something into Spanish, or have it provide extra context, or-- we talked about personalized learning earlier-- have it break something down for me and then say, yes, that's correct, and use it that way. I think that's a very basic way of using it. I could also see it going further-- while I'm hoping that these technologies come forward, at this point, I can't hand it over to a student. But I'm hoping that, in some future world, there are privacy laws that allow a student to be able to have this in their hands.

 

But I could see this being, like, a personal tutor. Like in a perfect world, I'm imagining these students-- having a student sit there, and not understand what's going on, and being able to ask this Chat GPT or this AI, and having it explain it at the level that they understand, and then being able to connect, and collaborate, and work with other people. It reminds me of when Google Translate came out, and you could speak into it, and it would speak the sentence translated into the language.

 

And I ended up going to another country, and I could sit there and have a conversation with a person on the bus. And it was like, this is so cool. And that's what I'm hoping comes into education. Right now, I feel like, practically, it's at that very base level of how do I provide scaffolds and support for those multilingual learners and those that may need additional help in whatever way in the classroom. I'm hoping tools will develop that can provide context, and help, and tutoring, and support, so that all of our learners are able to engage with the content that we're presenting.

 

Yeah, I mean, to reiterate some of those ideas: the ability of a teacher to say, even something as simple as, I've got this group of kids in my class who love baseball, another group who can't stand baseball but love dance, and another group that won't stop playing Minecraft-- help me get ideas about how to teach all three of those groups the same lesson, because I'm stuck, you know? And it can generate those ideas for you.

 

So from a teacher productivity side, like Emma was talking about, being able to just say, hey, help me out here, because I can't come up with all these ideas on my own of how to customize these lesson plans. Again, there are things like text to speech, voice typing, translation services, that kids are using, and that I hope will continue-- I actually kind of encourage their use. Because those simple versions of AI are things that kids can use right now and that are super powerful for empowering them to say, you know what? I might have some learning differences. I might struggle with reading, or I might need something in a different language or whatever. But I now know that there are tools that can help me.

 

So I'm not stuck. I don't have to wait for somebody to come and support me. And yeah, hopefully, those tutoring capabilities come into play. Chat GPT is not the best tutor ever, because it lies.

 

But didn't we all feel that way about Wikipedia, too, right?

 

Yeah, right? So that's got to be monitored, and we maybe need to start creating some similar generative AI solutions that draw from a more precise and accurate base of data. That would be awesome, and then we can use those as tutors. To some extent, that's already happening in some of the products that we all use. So I think that really is coming along the pipeline: opportunities for kids to take advantage of tools we already have, opportunities for teachers to create these customized approaches, and then let's see what's coming, really.

 

And just generally, in education, as we look at the power that might come through things like data and analytics, and being able to predict where a kid's going or what they're going to need two lessons from now so I can be ready, that's going to be massive. Just so many possibilities as far as where it could take some of the load off of us to be with our kids in a more supportive role. And maybe that's actually the most powerful thing: if I can free up some of my mental space and emotional space, and give over a few of these things, and become more efficient in my planning, then I can be more present with my kids. And that's the best differentiation tool I can give a kid.

 

I was going to say, Dr. Cox, as you were talking, I just feel like that's where I see it right now. I talk about these future possibilities, but right now, it really is helping give that support, so that I can build those connections with my students, so I'm not spending hours.

 

You gave that example of trying to find a personalized Minecraft example. I can say, hey-- I saw a teacher in a math classroom who actually had it write word problems based on what the students liked. So then they were really engaged, and it was really fun to watch. Whereas instead, I'd be spending time trying to be like, OK, I don't know anything about Minecraft, because-- I mean, I know a little bit. But I don't know as much about Minecraft as my students do. How do I write this?

 

Even though the database isn't exactly precise right now, it can help provide that engagement piece. That math teacher, I remember, when I was talking to her, she was like, it was so fun to walk around. Because then they were like, hey, you knew about this, and they were explaining things about-- in her example, it was a student that really liked fish. So I guess it used some sort of tropical fish in the example. So then they were telling her about it, and she was able to make that connection, and I can see that from that lens.

 

So for me, it's just seeing how teachers can be empowered by it to then help build those connections with their students and be able to, again, bring that personalized self to education. I think, sometimes, people are so afraid of AI, because they're like, oh, it's going to take away. But it's really not. It's just another technology tool that's coming, and we've seen this, at a slower rate, happen again and again with new technologies.

 

I mean, when we brought in one-to-one devices, people were nervous. When we had cell phones, people were nervous. I'm not this old, but my mom talks about when Google came out. People were nervous. So things like that-- just realizing that this is not a new feeling, to be nervous about a new technology, and that we've gotten through it before, and we figured it out with education. We always find a way to figure it out. So for me, that helps me have perspective that we're going to figure this out, even if it is a freight train versus, like, a horse-drawn carriage.

 

I remember how nervous I was when the abacus came out. Oh, my gosh.

 

Suzy, you're dating yourself.

 

I wanted to just touch on a couple of things on this piece. The thing that kind of got me started in education was working with youth with autism as a paraprofessional, and this was in 1998, when, for anybody who knew about autism, the only reference was Dustin Hoffman in Rain Man. It was not something that was very ubiquitous at all, and I remember talking to people about this. And now, it's something that is very much part of our culture.

 

But with that experience that I had in special ed for so many years, what I see with this is it just is going to eliminate a lot of barriers-- barriers, like we've said, with language. You know, how real is that Star Trek thing, where you can talk to anybody? And there's really just like-- there's just not that barrier. Like how in human history have we ever experienced something like that? That's just amazing to me.

 

How does it impact students who are blind or deaf? I know that there is a lot of-- I mean, you talk, Emma, about bringing in cell phones. Like Rick Gaisford, who a lot of us know, mentioned at one of our last conferences comparing Chat GPT to another thing that was kind of seismic, which was, like, bringing tablets or iPads into the classroom. Like, how are we going to use these things?

 

You've got these things that are cool, and what are we going to have them do? And I know that, in a lot of that special ed space, that those students and those iPads can be very-- well, they're good friends, let's say. I mean, one thing that we could experience right now on this call is auto captioning. I know that there are folks that I work with at the state level that are hard of hearing, that are deaf, and they are probably much more equipped at participating in meetings now than ever before, right?

 

Being in a big room that's echoey or somebody's-- I mean, we're really breaking down barriers with this stuff. And the other thing is I did-- you know, here's a plug for the University of Utah School of Instructional Design and Educational Technology. But one of the things that made me go and be a part of that program was the last kind of thing that was this new thing that was going to come on and take over, which is virtual reality and how that can impact the classroom. And I think that maybe that is kind of a sister conversation to this one.

 

I know that there are a lot of folks in my cohort, one in particular, that was very adamant that there will be a time in schools where we will have not necessarily just computer labs-- we probably wouldn't have a computer lab by that point anyway-- but, like, a holodeck, a virtual reality space, where you can go on a field trip to anything, anywhere. And you can do that now. You can do that with Google Expeditions, and you can do those kinds of things.

 

And one of the things that, I think, these kinds of technologies-- VR-- are amazing at is that it can explain empathy in a way that humans haven't really ever been able to experience. An example would be, like, I used to have a refugee student during my summer program, and one summer, we had these little virtual reality things. We put the cell phones in them, and I put him on the front row of the NBA Finals. And LeBron James was dunking in his face, and his body was moving.

 

And if you've ever put a virtual reality headset on somebody, no matter their age, station, or whatever, after they get over kind of the wooziness, their first reaction is whoa. It's just that kind of remarkable, so it's that empathy transfer. And then me having my affluent students go and spend a little bit of time at a refugee camp through their VR goggles. And this kind of brings me back to autism spectrum disorder: there are experiences where you can be in the VR space and experience what it is like to have autism, experience what it is like to have overstimulation or sensory overload, and all of the things that we've learned in the last 20 years.

 

So I mean, I think that it can be used for diverse learning needs, students with disabilities, and English language learners, and all this additional support. I mean, it will be transformative, and like, Emma, you said, it's going to happen. And we're going to figure it out, but it's going to take-- how are we going to slow down the train? I don't know. We're just going to jump on as it's going, so let's all just jump on together.

 

But that reminds me of just even a really simple, newer use, like as different products and different companies have picked up the use of AI and partnered with it. We talked about Canva as being one of those, to generate images. Duolingo has done the same to start modeling more natural conversations, because-- I don't know about you, but I know how to say I need bread in a variety of languages thanks to Duolingo, but I don't know how to hold a conversation with somebody in any of those languages, right? Because we were kind of walking through the preprogrammed instruction.

 

So that's one way that they've incorporated it. And as we think about our students with special needs and our kids on the autism spectrum, and those kinds of kids and adults who might need some interactivity and some ability to converse in a safe space, that's a pretty incredible opportunity and a space that I'm excited to see open up even more than it already is for them to be able to go in, and really hold a conversation, and see what our response should be like, and how it causes some response. I think that's a pretty neat way that this is moving.

 

Yeah, there are a lot of opportunities for practice without pain in the way. Like they utilize this with firefighters and all sorts of things, where you're not going to-- yeah, it's fascinating.

 

This has been an excellent conversation, and I love that we've kind of gotten to this hopeful place, where, yes, there are concerns that we're thinking of, and we need all of our great minds in it together to tackle these kinds of concerns. But I think we've gotten to a really hopeful place for the possibilities of AI in education, and you guys are amazing. Thank you so much for being here to talk with us about all of this.

 

Thank you so much.

 

Thanks, everybody.

 

Thank you.

 

[MUSIC PLAYING]

 

So, Dani, there was a lot in that conversation.

 

Oh, my gosh, we kept hearing the freight train metaphor, and I feel like that conversation was quite the roller coaster.

 

It was.

 

Like there were some high highs and high hopes, and there were some scary lows.

 

Absolutely. I do want to-- we kind of have a confession to start off with though.

 

Ooh, we do have a confession.

 

How did we write our questions for this, Dani?

 

What percentage of our questions were made by AI? Make a guess. Do you have the number in your head? OK, 100% of our questions were written by AI, but we didn't just read them verbatim. We went in. We analyzed the questions. We talked about where we wanted this conversation to go. We edited them. We moved them around a little bit, but yeah, this conversation was powered by AI.

 

And that's a great example of what you could do with AI, based off of our conversation with these wonderful teachers: use it to help you write, develop ideas, and then revise--

 

Exactly.

 

--and put your own spin on it. So at this point, I do want to bring up that UEN is going to be jumping into doing some professional development on AI.

 

Tell us about it.

 

So very soon, we're going to be launching a course called the AI Frontier. It'll probably be coming out around when this episode comes out, but keep an eye on our UEN course catalog. And you can sign up for this course as a Utah teacher to learn about AI, how it's developed over time, some of the ethical implications, and then how you might be able to use it in your classroom.

 

And I know it's an online course, but are there any due dates or anything like that?

 

Nope, you can sign up at any time and finish by the end of the calendar year.

 

All right, that's called a MOOC-style course, and we're excited to see it. Thanks, everyone, for joining us today.

 

We'll see you next time.

 

[MUSIC PLAYING]