The Cult of Pedagogy Podcast, Episode 99 Transcript
Jennifer Gonzalez, host
In schools we use more tech tools every year. We also have very little time to vet them for quality. Do the math and you have a formula for some tech choices that may not be serving our students as well, or as equitably, as they should be.
It’s easy to dismiss this as no big deal. So what if we occasionally adopt something that isn’t the very best choice? The answer to that depends on a couple of factors: Are we spending a lot of money on the tool? Is it going to replace other learning experiences? Will it be time-consuming to adopt? Are we expecting it to close gaps and provide remediation? If the answer to any of these is yes, then it would definitely be a big deal if our chosen tool didn’t actually do what we thought it did. It would be an even bigger deal if that tool ended up widening the very gaps we were trying to close.
This is not to say that schools are just going about their tech decisions all willy-nilly. Surely everyone is acting in good faith. But when all the tools seem ideal, when they all promise to solve some of our most persistent problems, it’s pretty hard to figure out which one to pick. What we need is a framework for making these decisions, a set of practices that can help us determine which tool is really going to deliver on its promises.
With this episode, I hope to contribute something to that framework. I’ll be talking with Rupa Chandra Gupta, founder and CEO of Sown to Grow, an online platform that helps students set and measure goals. As a former school administrator and the head of an ed tech company, Gupta has been both a consumer and a producer of ed tech; this has raised her awareness of the interplay between equity and technology. Now she wants to hold herself and her peers to a higher standard when it comes to designing tools that meet the needs of more students.
Our conversation starts with an exploration of the problems that can arise when a tool isn’t carefully scrutinized. Then Gupta shares seven strategies educators can use to deeply assess a tool for its impact. I think you’ll come away with some concrete practices you can put into place right now to make sure your technology is actually serving its intended purpose.
Before we get started, I’d like to thank our sponsor, Peergrade. Peergrade is a platform that makes it easy to facilitate peer review in your classroom. Students review each other’s work, while Peergrade takes care of anonymously assigning reviewers and delivering all the relevant insights to teachers. With Peergrade, students learn to think critically and take ownership of their learning. They also learn to write kind and useful feedback for their peers. Peergrade is free to use for teachers and students. And now, Cult of Pedagogy listeners can get 3 months of Peergrade Pro free of charge! Just sign up for a free 30-day trial, then redeem the code CULT to extend that free trial to 3 months. To learn more about Peergrade visit cultofpedagogy.com/peergrade.
Support for this episode also comes from Microsoft Hacking STEM. To engage the leaders of tomorrow, we need to give our students hands-on practice with science, technology, engineering, and math: experiences that reflect the academic standards and bring real-world scenarios into the classroom. Microsoft’s Hacking STEM is a resource devoted to helping teachers enhance and democratize their current STEM curriculum through inquiry and project-based lesson plans, aligned to middle-school standards. Using computational and design thinking, students build affordable scientific instruments that visualize real-time data in Microsoft Excel. All these resources are absolutely free and can be accessed by going to cultofpedagogy.com/hackingstem.
I also want to thank you for the reviews you’ve left for this podcast on iTunes. These reviews really help draw more people in and get them listening, so if you’ve been enjoying the podcast and you think other people should listen, too, take a few minutes today to go over to iTunes and leave a review. Thanks so much.
Now here’s my interview with Rupa Gupta.
GONZALEZ: I would like to welcome Rupa Gupta to the podcast. Rupa, welcome.
GUPTA: Thank you, nice to be here.
GONZALEZ: I’m really glad that we could talk about this. You approached me about this issue a while back in an email, and I think it’s so important that we bring this to the attention of administrators and teachers, really, because teachers are sometimes on the front line of looking at the tools we use and deciding whether they’re really working. So let’s start off by just giving my listeners a little bit of background on who you are and the work that you do in relation to what we’re going to talk about.
GUPTA: Great. So I have had a few different roles in the last several years. In my last district role I was the administrator of redesign at a comprehensive middle school where we were doing a lot of school transformation work. So I was leading the implementation of that, and our school served a predominantly low-income, predominantly Hispanic community here in California. I left that role a couple of years ago to actually found an education technology platform of my own based on some of the work that we did at our school, and it really focuses on cycles of goal setting, self-monitoring, and reflection for students.
GONZALEZ: Okay. We can go ahead and say what that is, we can name it, yeah. Yeah, tell us what it is.
GUPTA: Yeah, it’s called Sown to Grow, and it’s based on, actually, some of the kind of hacks that I was building at my last school and the goal-setting reflection journey was one of the things that was most empowering and impactful for students as they transitioned to kind of student-led learning. So I decided to try and kind of build something that could be hopefully useful and applicable in a lot of different contexts. But, you know, it’s been so interesting because in those two roles, the last several years I’ve been both the implementor as well as the creator or builder of these tools, and it’s really kind of helped me think about different ways around how equity and technology come together and interplay and frankly it’s just really highlighted for me how we have to be thoughtful about our design and implementation to avoid kind of unintended consequences and pitfalls.
GONZALEZ: Yes. So this is almost like a podcast for the tech designers themselves, to give them an idea of some things that maybe they haven’t been thinking through all the way, but then also for the schools themselves, for them to be thinking about what they should be looking for.
GONZALEZ: Okay. So tell me a little bit about what is this problem that we’re going to be talking about today, and how did you get to the point where you became aware that there was a problem?
GUPTA: Yeah, so I mean, I’ve been working in kind of the areas of technology in education for several years now, and I think we all know that it requires a lot of investment of time, of resources. There’s, you know, messiness with implementation, and I think there’s a lot of excitement around the possibilities around individualized instruction and personalized learning and kind of what that can do for motivation and for learning for students. But as I was, you know, responsible for implementing, and now I’m designing, I think I uncovered some unexpected kind of gaps, and more than anything, I think all of us are in education because we want to be closing gaps, not widening them, right?
GUPTA: Like at the end of the day, that’s something that’s, I think, core to most educators I know, right? We want to be giving opportunity to everyone in an equitable way, and ideally we’re kind of accelerating those who are behind, not leaving them further behind. And I think one of the interesting things about technology is it sort of amplifies whatever is happening. You know, so if we’re in a situation where we are widening a gap, it can be amplified by technology, and it happens faster, and it happens sometimes under the radar, because you know teachers and students might not be having every interaction in person anymore.
GONZALEZ: Right, right.
GUPTA: Right? So I think for me, kind of seeing that, and frankly just learning some lessons the hard way where we saw some of that happen, it really highlighted, like, wow, we need to be paying attention to this.
GUPTA: And probably have some strategies that we use when we’re paying attention, and then frankly just like up-level the conversation so it’s something that we’re talking about as opposed to something that we’re kind of crossing our fingers on and years from now we realize, like, oh shoot, that actually didn’t do what we expected, and now we have even kind of wider gaps in student learning than we had hoped.
GONZALEZ: Right. And I know that there’s no one out there who would want to adopt a new tool only to find that it has made some of their problems even worse. And so you are going to sort of go through for us some of the lessons that you’ve learned in your experience about, you know, some of the things that maybe tech companies need to be looking for or problems that can exist with certain tools being adopted. And then what we’re going to do is share some strategies that educators can use to make sure that the tools they choose are good for all of their students.
GUPTA: Yeah, and I really would say that I think the role of teachers and administrators is so critical, because based on how you kind of vote with your feet on what tools you use, the questions you ask of designers and builders, the expectations you have around transparency, around how data gets shared, around all these things, that truly is the game-changer for how this industry evolves.
GONZALEZ: Yeah, yeah.
GUPTA: It truly is. So I do think that power is really in the hands of teachers and administrators and kind of folks on the front line who can see, you know, what’s going on and help us change it. I do think though it is something that we have to be kind of consciously looking for, and hopefully, you know, some of the things that we talk about today will seed ideas on how you can take that approach when you’re adopting something new and make sure you consciously look for some of these things.
GONZALEZ: Yeah, and something that you just said, it reminds me, working sort of on the sidelines of tech myself for the past couple of years, I’ve really been struck by how some companies really are seeking feedback from teachers and from users. They want to improve their products. And I don’t think that a lot of teachers realize that, that they really can effect change in how these tools operate and the features that they have and how they can change. And so I don’t know, I think that’s an important message to get out there too that if you’re working with the right company, they want to hear what your experience is and how they can improve things.
GUPTA: Yeah, I totally agree, and we definitely take that mentality at Sown to Grow, and I know a lot of colleagues who I think have a similar mindset, so I totally agree.
GONZALEZ: Yeah. Okay, so tell us a little bit about some of the lessons that you have learned as somebody who’s been implementing tech.
GUPTA: Yeah, so my last school, like I said, I was kind of the administrator responsible for the implementation of a personalized learning model. And I think there’s a few things that bubbled up in different aspects of that implementation. So first was sort of just the reinforcement of the mantra “first do no harm.” So here, the example I’ll use, we were kind of in our second year of the implementation of a variety of, you know, intended to be thoughtful implementations of technology in the classroom, and we were selected and funded to implement a new comprehensive, personalized learning platform. So it’s one of these platforms that has a lot of components around kind of self-directed learning and there’s playlists and kids can pace themselves through their playlists.
GUPTA: So instead of, you know, every student working on the same thing every day, there’s a lot of kind of autonomy for kids to think about what they’d be working on. So we, like I said earlier, invested a ton of time and professional development, weeks of professional development over the summers, as part of the implementation. We fundamentally changed the core of our instructional model. It was a middle school, and we did it at one grade level, so about 300 sixth-graders and the 12 teachers on that team all implemented this program. And since my role was especially assigned to implementation and monitoring, I was able to kind of pay attention and do a lot of data analysis as we progressed through the school year. And you know, when we first pulled the numbers, we saw—we were using NWEA as one of our benchmark assessments—we saw pretty significant growth of students overall, kind of if you looked at the average scores on those benchmark assessments, when we went from fall to winter. Great, right? Everyone’s excited.
GUPTA: But as part of my work, and something I paid close attention to, especially given the school context and the students we were serving, where about 60 percent of students were behind grade level in reading and in math, I disaggregated the data. And what we actually found was our students who were entering sixth grade on or above grade level were soaring. They were doing incredibly well in that self-directed learning environment. But our students who were coming in behind grade level, which like I said, was more than half of our students, were actually falling further behind. So not just, you know, moving forward at a slower pace or even staying flat, they were falling further behind.
GUPTA: And, you know, I know how the first year of implementation of anything can be challenging, and there might be some, you know, growing pains with that, but we were basically comparing directly to our previous year, which had been our first year of kind of personalized learning using our own tools.
GUPTA: And even compared to that, our students were, like, significantly underperforming.
GONZALEZ: Oh gosh.
GUPTA: Yeah, and you know it was, I will say it was a pretty eye-opening, I mean I just felt so lucky to be able to have a team that we could, you know, I could just share this with. It wasn’t news that I wanted to share, especially given the investment of time and energy and I mean everybody rewrote their curriculum and you know, like all the work that went in. But I think that fundamentally the team, and kind of the admin team and our staff, knew that this was not something that we were comfortable with.
GUPTA: We just couldn’t be taking on a system that we felt like was not designed to close equity gaps and it was widening them.
GUPTA: So we, you know, we decided after a year that, and we saw that trend continue for the rest of the year. We committed for the year, but we decided after that year to basically pause on that implementation, kind of step back and frankly go back to the drawing board. Because, you know, in our minds, well, we gotta find a system that helps all students soar.
GUPTA: We just, we have to design with that in mind and think about that, and that’s where we left it.
GONZALEZ: Right. So you opted not to replace it with something else, just to stop, because this was obviously making things worse?
GUPTA: Yeah. I think the data was so clear —
GUPTA: — I guess I can say. You know, if it was a little fuzzy, I think there might have been some room to tweak and kind of modify, but it was, the disparity was so wide that it was clear that we had to just stop, we had to kind of rethink, yeah.
GUPTA: So that was, I mean I think it was just a good reinforcement. I’m all for innovation, like literally my title was administrator of redesign. You know, I’m all for innovation and whatnot, but we do have to remember and kind of figure out when we gotta pause.
GONZALEZ: Right, right. So that was sort of the first lesson or that was one of the ones that kind of woke you up to this being a problem. So then what was another one?
GUPTA: Another lesson was really about how important design of tools is, and how it can really kind of differentially impact certain populations of students.
GUPTA: So in this example, we had done a bunch of research, our math team had done a bunch of research, on different math tools that were out there. And, you know, as part of that research, they signed up for free trials and demo accounts and whatever and played around with tools and whatnot. And it bubbled up from the team that the tool that they were most excited about had really strong, rigorous instructional components, a lot of support for students, everyone was excited. I will say that it was a demo version, so we weren’t able to use the full math platform as part of the trial. But, you know, we did a bunch of research. It wasn’t like it was an uninformed decision.
GUPTA: But I think once we, you know so we purchased the licenses school-wide and kind of off to the races with the math team. Within, I think it was probably two weeks, it was quick —
GUPTA: — the team sort of raised their hand, and they were like uh oh, I don’t think this is going to work. And there were a couple of things that we realized were fundamentally, I will use the words “design flaws,” flaws for our context, I would say.
GUPTA: So as an example, and I observed this, you know, I was in classrooms while this was happening: the language used in this math program, and we were teaching sixth- through eighth-graders, was kind of on or above grade level throughout. So as an example, if there’s a word problem, the problem would say something like, “Imagine two adjacent households and the amount of energy they expend,” or something like that.
GUPTA: Instead of, yeah, instead of like, “Two neighbors get their electricity bill.” Right?
GUPTA: Like just the language was so complex, and you know, we served a significant population of English learners, like I said earlier, we had struggling readers. Like, they couldn’t even access the questions.
GONZALEZ: Exactly. They couldn’t even do the math because of the reading.
GUPTA: Right. And it was one of those things that so clearly and so quickly when you’re observing the classroom you see kids kind of reading it, looking confused, lots of hand raising and eventually getting pretty frustrated.
GUPTA: I think the second thing was actually around how the system provided feedback. So the way it was designed, it had very, like kind of intentionally rigorous and complex math questions, which is great, but a student would do 10 questions in a row, but not get immediate feedback on whether they got the question right or wrong. They would do all 10 and then they would get like a response of, you know, “You got eight out of 10 right” or “nine out of 10 right,” whatever it was. In our case, since we did have a lot of struggling learners, there were, you know, they’d do 10 problems, and they’d spend like a good, you know, 40 minutes of an 80-minute block doing 10 problems, and then it would just be like, zero out of 10.
GONZALEZ: Oh, yeah.
GUPTA: One out of 10.
GONZALEZ: Yeah, yeah. It’s really discouraging.
GUPTA: So discouraging. Oh my gosh. Kids were like investing the time and trying to, you know, work through it and that delayed feedback after they had put in a lot, it was crushing.
GUPTA: It was the kind of thing where it’s like, this is not motivating for me. And then, you know, they’d get the one out of 10 and then there’d be like, oh, and here’s all these tips and kind of hints and whatever. But then they have to go back one by one. Just the design of that without instant feedback —
GUPTA: — was soul-crushing for our students.
GONZALEZ: Yeah. And so the feedback was just sort of a summary of things that they might want to work on? It didn’t link back to the specific problems so that you could — ?
GUPTA: It did link back where it’s, you know, told you how many you got right, and then you’d have to go back to each problem, and you can kind of uncover some hints and things.
GONZALEZ: Okay, yeah.
GUPTA: But it was sort of just the way they structured it did not end up being a motivating experience —
GONZALEZ: Yeah, yep.
GUPTA: — for students, especially if students are, I mean I could see how if you’re, you know, maybe if students are getting nine out of 10 right or 10 out of 10 it’s like, oh, you know, that’s great.
GONZALEZ: Right, right.
GUPTA: And you know, they could move forward without having some of that emotional kind of frustration. But yeah, for students, it just didn’t seem like the system was designed for students who might be getting questions wrong.
GONZALEZ: You know what kills me about this too, I’m picturing, you know, a teacher with a program like this, and this is just one specific program you’re talking about. There’s probably lots that have little things like this that are wrong, is that some teachers are going to interpret that response from that student as a behavior issue.
GONZALEZ: They’re going to say, “you’re not trying” or “you have a bad attitude” or “you’re just being lazy,” and the kid is, they’re there and they’re ready to give it a try, but they’re wrestling with the way this tool is designed. Very few kids are even going to be able to articulate that that’s what the problem is.
GUPTA: No. Our students, I think that’s actually a really important point, because again, if you’re not paying attention to this, our students couldn’t describe their frustration. They just go like, “I hate this program.”
GONZALEZ: No, exactly.
GUPTA: I mean their response was “I hate this stuff.”
GUPTA: Which, you know, when you kind of watch and see, then you sort of realize, like, oh, I understand why you’re not enjoying this.
GUPTA: But unless you’re watching, I think you’re totally right, right? You’re totally right that it can be interpreted as just like a disengaged student —
GUPTA: Another, you know, a student who’s struggling with motivation issues, where I truly fundamentally think there’s components of design —
GUPTA: — that were influencing that, yeah, I’m with you. I’m totally with you. And you know, we gave feedback to our contacts. But that kind of stuff is not the kind of stuff they can change overnight, obviously.
GUPTA: And for me, I think what it uncovered is, I was curious sort of after the fact, thinking about this, I’m like, I wonder what classrooms they designed in.
GUPTA: Like what were the places and, if anything, it just sort of highlights, like, you really gotta think about where, who you’re designing for and make sure you’re, yeah, kind of proactive about —
GONZALEZ: Actually trying it, right. I mean I don’t even know sometimes if there is even a lot of testing. Sometimes maybe they’re just hiring teachers to write these questions, and they figure they’re experienced and they should work okay, and maybe they aren’t even tested.
GUPTA: Yeah. I mean I, yeah. It’s a good question. I don’t know kind of how this one worked. I mean it was very pervasive at the time.
GONZALEZ: Right, right.
GUPTA: And when you kind of did the research, it was definitely highly regarded, yeah, so. Just again, kind of the focus on how different populations of students can respond differently, because I do think certain students flew with it, right?
GUPTA: But we just have to be aware of like hey, if it’s going to kind of have a negative impact on our students who need the most support, that’s something that’s scary.
GONZALEZ: Yep, yeah.
GUPTA: That’s something that’s scary, and we can’t, you know, we can’t ignore that.
GUPTA: Yeah. So the last lesson I would highlight, you mentioned kind of the couple of lessons.
GUPTA: This one’s a little bit more, I don’t know, I’d love to hear your thoughts on this one, it’s a little more philosophical. I think one thing that I am aware of and think about is whether some of the things that are most widely used are kind of gap widening or gap closing. And I wonder, you know, widespread usage doesn’t necessarily mean it’s good for kids.
GUPTA: I guess is what I’m saying. You know?
GUPTA: Like, a good example, I feel like right now I’ve seen in a lot of elementary classrooms digital portfolio apps that, you know, are beautiful and easy for kids to take pictures of and pictures of their work and send home to parents and all of that. But I wonder, I’m like, you know, is this a way for parents who are already engaged to get more engaged, you know? Or is it really speaking to parents who we’ve been trying to kind of bring into the fold? I don’t know. If parents don’t have smartphones and computers at home, can they access this stuff?
GONZALEZ: Yes. Or even if they’re just not super tech-literate. I mean there are people I know who seem like they should be tech-literate because they’ve got smartphones and everything, but you start talking to them and you realize they don’t understand what a lot of the apps do or the point of them.
GONZALEZ: And so, yeah.
GUPTA: And then obviously if a teacher starts relying on that as their main source of communication, or kind of main pathway of communication, and they’re not making calls anymore, they’re not kind of reaching out via other kind of more traditional methods, then like, oh, are we differentially kind of leaving out a population of parents?
GONZALEZ: Sure, right. Yeah, because if 90 percent of your students’ parents are using a tool, that seems great. But then there’s still that 10 percent. Like, are you doing something to reach that 10 percent or is it like, eh, 90 percent’s good enough?
GUPTA: Yeah. And if you don’t pay attention to it, you’re like oh yeah, it seems like all my kids and all my parents are good with this, great.
GUPTA: Right? But I will say I think it’s probably likely that if there is a, you know, a subset of folks who aren’t able to engage or aren’t able to access, it’s probably folks who we want to make sure we’re not leaving behind, right?
GONZALEZ: Yes. It’s probably most important to be reaching out to those families, and it almost seems like something that could be built into a tool. I’m thinking right now of a parent communication tool that actually has analytics in it, so that the teacher can look at the sort of dashboard and see which parents actually opened the email that they sent out, and then they can sort of —
GUPTA: Oh, interesting.
GONZALEZ: And then they can take some reaction then from that point forward to reach back out to that smaller group of parents.
GUPTA: Oh yeah.
GONZALEZ: So these are features that could be parts of these tools if we’re thinking about them.
GUPTA: Yeah. Language is of course another one, right? Like having, being able to have kind of accessibility in different languages.
GONZALEZ: Yeah, actually that same tool does have a translator too.
GUPTA: Oh, cool.
GONZALEZ: So they’re thinking through these, yeah.
GUPTA: That’s good, that’s great. Yeah, so I think it’s just more of like a reminder that we have to still test for, I guess, kind of the impacts on different student populations and communities, even if something is super widespread.
GUPTA: Because I don’t think that that’s necessarily, you know, I don’t know if that’s always just like our only indicator that it’s great for kids, yeah.
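The open-rate analytics Gonzalez describes above can be sketched in just a few lines. This is a minimal illustration only; the family names and the shape of the message log are invented, not taken from any real communication tool.

```python
# A sketch of the dashboard idea: from a hypothetical log of which
# families were sent messages and which families opened one, surface
# the families the tool never reached, so the teacher can follow up
# by phone or on paper instead of calling 60% "good enough."

def unreached_families(sent_to, opened_by):
    """Families who were sent messages but never opened one."""
    return sorted(set(sent_to) - set(opened_by))

# Invented example data for illustration.
sent_to = ["Alvarez", "Chen", "Diallo", "Nguyen", "Smith"]
opened_by = ["Chen", "Nguyen", "Smith"]

follow_up = unreached_families(sent_to, opened_by)
# follow_up -> ["Alvarez", "Diallo"]: a 60% open rate can look fine
# on a dashboard, but these are the families to reach another way.
```

The design choice here mirrors the conversation: the tool's job is not just to report an aggregate open rate, but to hand the teacher the short list of families who fall outside it.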
GONZALEZ: Yeah, yep. So, okay. So this is raising a lot of awareness of the issues that there could be. So if I am now, you know, looking at tools as a teacher, an administrator, or a tech person from my district, what are some things I can do to ensure that the tools that I’m choosing are good for all of my students?
GUPTA: So I think the first and most important thing is sign in as a student and use the tool.
GUPTA: Go through as much of it as you can.
GUPTA: And this is one of those things that I actually find a little bit frustrating about many tools that I’ve used: they often don’t have easy ways for a teacher to do that. Like, we used to have to come up with all sorts of hacks. I’d create fake student accounts.
GUPTA: And then, you know, we’d use those. Like, it wasn’t super easy always —
GUPTA: — to just sign in as a student and go through the curriculum or go through the content or whatever.
GONZALEZ: Yeah, yeah.
GUPTA: But I would just highly, highly, highly encourage making that a part of any protocol where you’re evaluating something.
GUPTA: And when you do that, I think it’s also important to literally, maybe even if you have two teachers who are going to sign in, assign two teachers to do it, and one should kind of put themselves in the shoes of, basically, almost like role play.
GUPTA: Put yourselves in the shoes of different types of students that you serve. And like imagine, think of, like oh, there’s a student that I’m thinking of specifically, how would he or she react to these different components in this?
GUPTA: I think just by doing that and really kind of role playing it out a little bit, you sort of uncover some of the stuff of like, oh, that would be frustrating. Like, oh this student would probably get this question wrong and oh, that’s how it responds? Wait a minute.
GUPTA: So I do think that’s really important. And then you can kind of think about, like, oh, based on those struggles, is that a deal-breaker for me or not? And if it’s not a deal-breaker, that’s great, right, but there might be things that you proactively do in the classroom to support that student. So even in that 10-question example, the 10 math question thing that I was saying —
GUPTA: — our teachers discovered that the same time our students did.
GUPTA: We didn’t know, we didn’t know that’s how it was going to be. But if they had known that, they could have probably kind of let students know that this was part of the, you know, kind of framed it differently.
GUPTA: So that it didn’t lead to a frustration, that it led to, you know, a different kind of avenue for students to work forward.
GUPTA: So I do think that’s incredibly important, for no matter, anything that you’re using.
GUPTA: The second thing I would say is I am just a huge proponent in general of piloting.
GUPTA: In like a small context. This I learned the hard way, of course. Like it’s very tempting to purchase things school-wide, like oh, we’re going to do this math curriculum, let’s just do it school-wide. Or oh, we’re going to use this formative assessment platform. Let’s just do it school-wide. And you kind of feel like oh, if I rip off the Band-Aid it might be a little harder, but everyone will get there faster.
GUPTA: You don’t get there faster.
GONZALEZ: No, no you create so much more work and more, yeah, more waste.
GUPTA: Yeah, and you don’t uncover some of these issues that you can work through, that you can kind of think about and work through. So I’m just like a big believer of piloting and be thoughtful about who you pilot with. Like you know, if you’re doing a math curriculum, maybe pilot with your algebra class and your, one of your intervention classes.
GUPTA: You know, whatever. Like make sure you’re thinking about it, because specifically let’s pilot with different student groups, and then let’s see and ask students kind of how they’re enjoying the product, whether they feel like they’re becoming better learners as a result of using the tool. Observe them, of course, so you can kind of help uncover issues that they might not be able to articulate. But thoughtfully pilot with a diverse set of students. And just like a rule of thumb on this: pilots don’t have to be a whole year. It doesn’t need to be a whole year, is what I would say.
GONZALEZ: Okay, okay.
GUPTA: Like second semester of, you know, toward the end of the school year, maybe even after testing when you have a few weeks to kind of play around with things and maybe have a little more flexibility, a little less pressure, like a three-, four-week pilot of something is, I think, gives you a lot of information before kind of implementing something.
GUPTA: Yeah. Because I do think piloting for a whole year can feel, that’s also difficult if like one teacher’s doing something really different than everybody, I mean I get it, you know.
GUPTA: Sometimes you just need to move forward as a team.
GUPTA: But I would just encourage, even like three-, four-week pilots.
GUPTA: A single unit or —
GONZALEZ: Give you a lot of information right away, yeah.
GUPTA: Give you a lot of information.
GUPTA: If you do get data, either from the tools you’re using yourself or from benchmark assessments or any other assessments you’re using on your site, break down the results by different student populations. So this goes back to kind of the first example.
GUPTA: If we just looked at our average scores for sixth-graders, we would have, you know, been patting ourselves on the back and been excited to move forward. But it wasn’t until we actually disaggregated the data that we saw the big challenge. So find ways to do it. I will say sometimes it’s not easy to do this.
GUPTA: Sometimes you have to kind of be cutting and pasting from spreadsheets and putting it together in a way that you can do it. So, you know, and I feel like in different schools and districts there’s resources to help with that or people who have more, kind of, [INAUDIBLE] and whatnot. We were lucky at our school. We had some expertise —
GUPTA: — and were able to kind of make that work quickly. But I would just highlight this is something that, especially if you’re at the district level or an administrator, like find, figure out a way where you can disaggregate data.
GONZALEZ: Okay. Yeah, that’s a really important suggestion.
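The disaggregation step Gupta describes can be sketched in a few lines of code. This is a minimal illustration in plain Python, with invented group names and scores: a school-wide average looks fine on its own, and only the per-group breakdown reveals the gap.

```python
# Minimal sketch of disaggregating results by student group.
# All field names and numbers here are hypothetical.
from collections import defaultdict
from statistics import mean

# Hypothetical records: one row per student.
scores = [
    {"group": "ELL", "score": 62},
    {"group": "ELL", "score": 58},
    {"group": "gen_ed", "score": 88},
    {"group": "gen_ed", "score": 91},
    {"group": "IEP", "score": 64},
]

# The school-wide average, which can hide equity gaps.
overall = mean(row["score"] for row in scores)
print(f"Overall average: {overall:.1f}")

# Group the same scores by student population.
by_group = defaultdict(list)
for row in scores:
    by_group[row["group"]].append(row["score"])

# Per-group averages surface what the overall number hides.
for group, vals in sorted(by_group.items()):
    print(f"{group}: {mean(vals):.1f}")
```

In practice the grouped data would come from a student information system export rather than a hard-coded list, but the shape of the analysis is the same: compute the same metric once overall and once per subgroup, then compare.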
GUPTA: Yeah, and then kind of my next set of suggestions is around just sort of asking the right questions. Like I think even asking yourself critical questions around, like, why is this tool, how is this tool fundamentally changing something about teaching and learning?
GUPTA: And then building from there. So, you know, is it going to kind of give students more opportunities for self-direction or is it just an online version of a worksheet?
GUPTA: You know? Like what is it about this that’s kind of innovative or different, because I think when you ask yourself those questions, you can kind of think about how that’ll play out for different groups of students that you work with.
GONZALEZ: Got it, got it. Yeah. My brain is sort of going off in another direction right now, because I just, I gave a presentation about this, about the things that tech should be able to do for you and the other things that really don’t matter that much. And so it’s, you know, can it give a unique experience, can it make something a little bit more efficient, can it, yeah. So anyway.
GUPTA: You know, I think efficiency, if it is about efficiency, like oh, this, you know, this tech tool is basically the same thing that I was doing on paper with my kids, but it makes it way easier for me to grade it or it grades it automatically —
GONZALEZ: And can give them feedback faster or something like that, yeah.
GUPTA: And give feedback faster, that’s great. If that’s the case, just kind of name it —
GUPTA: — as that’s what I’m looking for in this tool, and then I can move forward with that. But if I’m looking for an individualized student experience, and it’s either not happening, or the experience is above level for some students and really below level for others.
GUPTA: Then you’re like, oh, if that’s what I’m looking for, am I getting that? And yeah, paying attention to that. And then asking questions of, this goes back to what we were talking about earlier, asking the builders, people like me now who are building tools like this, about impact, evidence of impact, and their experience working with different types of learners. I think there’s not nearly enough transparent information about this. So even if you just poke around a lot of popular technology tools’ websites, there’s always something on impact that’s like, oh, you know, we help kids read better. I’ve never, or very rarely I should say, seen it disaggregated by different levels of learners.
GONZALEZ: Yeah, yeah.
GUPTA: Or different levels of readers. And I think a lot can get lost in those kind of big, high-level statements that might be impacting your students, you know. So I think the, I would put the burden on people like me now who you should ask. Like, tell me what the difference is between these different types of students I serve. And if you don’t know, how are you going to find out?
GONZALEZ: And so —
GUPTA: Because I don’t want to be the guinea pig who finds out, you know what I mean?
GONZALEZ: If I’m somebody who’s actually got a tool, like what are some of the best ways to show that impact? I mean would it be case studies, would there be testimonials? I mean sometimes those can be pretty easy to sort of drum up also. Do you need to do research or get, are there universities doing research on specific tech tools?
GUPTA: There are. I think the best is doing just authentic research. Because, for example, I actually saw one of these pretty recently. Before my school admin role, I was in the district office, and I used to do evaluations of different initiatives and products that we were using. And I just kind of laugh when I see this stuff. When someone says, oh yeah, kids who use our reading software grow in reading by 1.5 grade levels in a year, or something.
GONZALEZ: Mhmm, yeah.
GUPTA: But then they don’t have a control group, they don’t have a set of students who —
GONZALEZ: Didn’t use that, yeah.
GUPTA: — didn’t use that tool. Right?
GONZALEZ: They might also grow by 1.5 grade levels.
GUPTA: Exactly, right? And to me, the only way I’ve really seen where you can diagnose whether something is having the impact that you want, and that it’s because of the tool, is by doing a thoughtfully designed study with a control group.
GUPTA: I don’t think these studies need to be huge. You should be figuring out ways to measure as you go. And sometimes that means, like, oh, I’m just going to look at two classrooms, 30 kids and 30 kids, my control, and see what’s going on and share that. So they don’t necessarily need to be these huge studies with thousands of students. I just think we should be putting some pressure on ourselves to measure results —
GUPTA: — where, I mean similar to in other industries, like in medicine and other things, right. Like, you need to know whether what you’re doing is working.
GUPTA: And yeah. So I mean I do think as an education kind of, especially in the technology space, as like a sector, this is being, like the thinking is evolving, and hopefully there’ll be more ways for us to share kind of what this looks like.
GUPTA: But I think the disaggregation of it is really important again.
GUPTA: Like saying “on average” doesn’t really help, because I do think we’ve seen averages kind of hide sometimes the equity challenges.
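The small two-classroom comparison Gupta suggests can be sketched like this. It is a rough illustration with invented numbers, not a real study design: the point is to compare score *gains* between a group that used the tool and a control group that did not, rather than reporting the tool group’s gain in isolation.

```python
# Rough sketch of a small treatment-vs-control comparison.
# Gains are post-test minus pre-test; all numbers are invented.
from statistics import mean

treatment_gains = [12, 8, 15, 10, 9]   # students who used the tool
control_gains = [10, 7, 14, 9, 8]      # students who did not

t_avg = mean(treatment_gains)
c_avg = mean(control_gains)

# Reported alone, a 10.8-point average gain looks like the tool's
# doing; the control group shows most of that growth happens anyway.
print(f"Tool group gain:    {t_avg:.1f}")
print(f"Control group gain: {c_avg:.1f}")
print(f"Difference:         {t_avg - c_avg:.1f}")
```

A real evaluation would also check whether the difference is bigger than chance (and, per Gupta’s earlier point, disaggregate it by student group), but even this simple comparison answers the question the 1.5-grade-levels claims skip: compared to what?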
GONZALEZ: Yeah. You know, the other thing that I’ve noticed is that, like, I was trying to sort of poke holes in a particular tool at one point, because it didn’t seem like a good tool. One of my kids was having to use it. And so when I dug into it a little bit, they had their own research, but it was based on their own test that they gave, and then they compared students’ performance on their test to their performance on that same test nine months later. And I thought, well, of course. If it’s your test and if all of your materials are preparing students to do better on that test, of course they’re going to do better.
GONZALEZ: Like it seems like it’s important for them to also do, use some sort of outside metric of some kind.
GUPTA: I’m totally with you. I’m like laughing because that’s the kind of stuff that then they’ll make a bar chart that doesn’t disclose any of that, that’s just like “students who use it and students who don’t,” like big bar, small bar.
GUPTA: You know, and you’re like, what.
GONZALEZ: But it was all in-house, it was all in-house stuff, so it’s like, meh.
GUPTA: Yeah. I think applying a critical eye to any sort of study is always really healthy. Like the one I mentioned already where they didn’t have a control group, you’re like, well, what are we comparing against? So that’s not well-designed. And immediately I’m like, okay, I’m going to just pretend that study doesn’t exist.
GUPTA: Because that’s not, it doesn’t even, it doesn’t tell us any information, right?
GONZALEZ: Yeah, yep.
GUPTA: I’ve seen some where they really make the leap from correlation to causation, in kind of a hysterical way. There was one that was a grammar software, and their internal study found that students who used the grammar software performed better on end-of-year state tests, like SBAC, which is what we use in California. SBAC doesn’t test grammar.
GONZALEZ: Oh geez.
GUPTA: There’s nothing on SBAC about grammar. So you’re like, well I’m guessing the teachers who are using your software are also teaching their kids —
GONZALEZ: Maybe in a way that’s —
GUPTA: — you know what I mean?
GONZALEZ: Yes, yes.
GUPTA: In a way that would be —
GONZALEZ: There’s no relationship.
GUPTA: Like, this is not related, right? Because the test fundamentally doesn’t test grammar, so how can we be making an assumption or saying that that connection is real? It just kind of screams at me: hey, we can’t be conflating correlation and causation.
GUPTA: There’s a whole other set of factors. But again, if you don’t think critically about some of the statements that are being made, it’s easy to conclude, oh, if my kids use this tool, they’ll do better on end-of-year state tests.
GONZALEZ: Yep, right.
GUPTA: Great. I do feel like there’s probably room for some sort of watchdog, I don’t know, do you want to be this watchdog?
GONZALEZ: I would need to hire a whole staff, yeah, right.
GUPTA: Yeah. It would be like correlation without causation. Like, kind of like a fact, you know how they have in political stuff, the fact-checking?
GONZALEZ: Yeah, right.
GUPTA: I feel like we need that. Some sort of research-checking on, like, are you making assumptions here that are not valid or extend this —
GUPTA: — beyond what it was designed to do.
GONZALEZ: There’s room out there for whoever wants to start that up, go do that.
GUPTA: I will gladly support that, because I mean frankly, it’s just one of those things like, you know, your average educator isn’t trained to be critically looking at research studies —
GONZALEZ: No. They don’t have the time either —
GUPTA: — and figuring out what —
GONZALEZ: — to really dig through all of that.
GUPTA: — no, nobody has the time. Yeah. And sometimes this stuff is like really buried, really buried in kind of like the, you know, the assumptions or language. Yeah, so I really do feel like, maybe some of these, I think there’s a few different online programs, or online websites that kind of rate apps and things. Maybe they need to add like a “is the research valid?”
GONZALEZ: Yes, Common Sense Media does, they might be able to add that piece.
GUPTA: Common Sense, yeah, oh.
GONZALEZ: Okay. I’ll get on the phone.
GUPTA: We should totally drop them a line. Honestly.
GUPTA: Because I do think if there’s like a place where you could trust a source of like, hey, that research is —
GUPTA: — you know, pretty legit, not legit at all, or like truly the best in class, independent study, in partnership with a university.
GUPTA: Like if you had some sort of way to assess it, I think that would be —
GONZALEZ: That would be great. At least having some layer of somebody just looking at it a little bit longer than yeah, yeah. That’s a great idea.
GUPTA: I guess the last thing I’m thinking about is just that educator sense that I think all of the folks you interact with have, on whether something is good or bad. I mean, even what you’re describing around the tool that one of your kids was using, where you’re like, the educator in me is feeling like this is not —
GUPTA: — you know, the right path forward or this is not having the impact that it’s supposed to be having. That, I don’t know, I think experienced educators have such an amazing kind of sense for what’s going to work well for, particularly for their students in their context, you know. So kind of trusting your gut a little bit, and it doesn’t necessarily mean that anything that gives you a cause for pause is something that you have to scrap. It’s more like OK, this is making me nervous about X, Y, and Z. What scaffolds am I going to put into place?
GUPTA: Or what instructional strategies will I use, and then if those still don’t work, then maybe let me look at other tools that will be a better fit for me. But yeah, I think even with the math example that I was sharing earlier, initially the math teachers came to me like, “Something doesn’t feel right about this.” And they couldn’t even necessarily articulate some of the things that we eventually identified. They were just like, “Something’s not right.” It took us a while. I had to go and observe, and we kind of spent a couple of weeks just diagnosing. But the reason we did that is because it was flagged, I still remember, by one of our sixth-grade math teachers. Like, this isn’t working the way —
GONZALEZ: So trust that, yeah. And you know, I think kids are actually probably —
GONZALEZ: — the first litmus test of that. I mean this is the reason I started to dig into this one tool, because my son was coming home every day saying how much he hated working on this thing. He’s like, they’re always making us get on this thing, and I’m just thinking, he’ll play Fortnite for four hours. He loves technology. So it’s not the fact that he’s being plugged into some app, it’s what’s going on with that app. And he’s an avid reader, and he likes to be challenged, so I’m thinking, if your kids hate doing it, that’s a sign that maybe something is not right.
GUPTA: Yeah. I think that’s a really good place to start. Because I like fundamentally believe that not only, like, all kids can learn, all kids want to learn.
GONZALEZ: I think so too.
GUPTA: You know? They are wired to want to learn, you know, and if there are things that are getting in the way with that, we gotta kind of re-evaluate those things, right?
GONZALEZ: Yeah, yeah. Yep.
GUPTA: I’m with you. And, I mean at the end of the day, none of this is intended to kind of, you know, suggest that teachers stop using things that they like.
GUPTA: Or, you know, kind of drop it all. That’s not the intent at all. It’s meant more to be consciously thinking about it, and when you are testing new tools and ideas, I would just love for this thinking to be part of the protocol, part of the thing that you report back out to the leadership team with, or whatever.
GUPTA: Because if we can elevate it in the conversation, then I think it’s just more likely that the whole system will adjust to make sure that it’s elevated in importance, right?
GONZALEZ: Yeah, yes. So if people want to find you online, where would they go?
GUPTA: Yeah, so I mean this, I’m obviously personally fired up by all of this, so I would love to engage and talk more and get feedback, frankly, on some of the stuff we’re trying to do.
GUPTA: To address this stuff proactively. So I mean you can always, of course, find me. I’m on Twitter, @rupa_c_g. My email is firstname.lastname@example.org.
GUPTA: One thing I would love for people to go check out: one of the things we fairly recently launched at Sown to Grow is our impact page, where we try to be really transparent about who our tool serves and what the impact is, with different cuts of the data that I wished I’d had with other tools that I used.
GONZALEZ: Yes, yes.
GUPTA: So if you want to go check it out and let us know if you think it, I mean it’s a start. I won’t say it’s perfect by any means but would love your feedback there. So that’s just going to sowntogrow.com/impact.
GUPTA: And send us feedback, send us kind of the questions you have and help us get better, help us kind of make sure that we’re always, always serving all students, and kind of accelerating the growth for our most vulnerable students.
GONZALEZ: Awesome, awesome. And just in case people are in their cars listening and they don’t quite know what Rupa’s saying when she’s saying Sown to Grow, this is “sown” like the gardening, S-O-W-N-T-O-G-R-O-W dot com, and I’m going to have links to all of these things that Rupa’s mentioning over on Cult of Pedagogy too. This is going to be, I believe, episode 99. But I’ll mention that also in the outro, yeah. Thank you so much for raising awareness of this issue. I think this is so important.
GUPTA: Awesome. Thank you so much for engaging, and I think, I mean I don’t think any of us have all the answers —
GONZALEZ: No, yeah.
GUPTA: — on this, but hopefully we can just inspire more conversation and I, I like truly believe in the power of educators who are all, like in my experience, are all kind of wholly committed to doing right by kids.
GUPTA: And if we use that power, we’ll yeah, we’ll solve some of these challenges, there’s no doubt in my mind.
GONZALEZ: I totally agree. Thank you so much, Rupa.
GUPTA: Thank you.
For links to all the resources mentioned in this episode, visit cultofpedagogy.com, click podcast, and choose episode 99. To get a weekly email from me about my newest blog posts, podcast episodes, and products, sign up for my mailing list at cultofpedagogy.com/subscribe. Thanks so much for listening, and have a great day.
This podcast is a proud member of the Education Podcast Network: Podcasts for educators, podcasts by educators. To learn more, visit edupodcastnetwork.com.