Can the “Internet of Things” help teachers observe the hidden dimensions of human learning in real time, without compromising student privacy?
It’s a “moonshot,” said the leaders of a new multidisciplinary research initiative at the University of Southern California. But it’s worth a try.
“We want to provide meaningful data to help teachers make sense of what’s going on internally for students,” said Kenneth Yates, a co-director of USC’s new Center for Human-Applied Reasoning and the Internet of Things, or CHARIOT.
“That, to me, is true personalized learning.”
Launched late last year, CHARIOT is a collaboration of USC’s Rossier School of Education and Viterbi School of Engineering. The project is headed by three people:
- Yates, a professor of clinical education and expert in instructional design, whose work on “cognitive task analysis” (basically, documenting the often-invisible decisions and judgments that experts make when performing their jobs) has been used to inform training simulations for medical schools and the military;
- Bhaskar Krishnamachari, a professor in USC’s schools of engineering and computer science, whose work (including collaborations with large companies such as GM and Bosch) focuses on the development of software and technical protocols that guide the deployment of sensor networks; and
- Rao Machiraju, an executive-in-residence at USC’s education school, who has an extensive background in computer science and starting tech companies.
It’s early days for the initiative. To date, all of CHARIOT’s funding has come from inside the university, and the group’s research agenda is still being mapped out. But the idea is clear: develop and test networks of internet-enabled sensors that can be worn by students and embedded in classrooms.
The Internet of Things is already transforming transportation (think app-driven ride-sharing networks such as Uber and Lyft), building design and construction (think “smart” homes with sensor-based climate control systems), and large-scale supply-chain logistics. And the K-12 sector has started to dip its toes into the water—see, for example, this Education Week profile of AltSchool, a Bay Area start-up that is already developing and deploying extensive sensor networks in its private-school classrooms.
Can sensor networks provide teachers with meaningful information on students’ cognitive and emotional readiness to learn? What would be the benefits—and drawbacks—of doing so? What happens to all the data these sensors would generate? Is this a technology parents are willing to accept?
I sat down with Krishnamachari and Yates last week at their USC offices to discuss these questions. Following is a transcript of our conversation, edited for length and clarity.
This is an unusual partnership between schools of education and engineering. What motivated you to do this now?
Krishnamachari: When we talk about the Internet of Things and its component technologies, from sensing hardware to data processing and machine learning, they have really matured quite a bit over the last 10 to 15 years. From an engineering standpoint, we are at a stage where we want to take these technologies and look at domains where they can have really deep impact. That requires us to go outside of our comfort zone. It’s a moonshot, but the impact it could have is huge.
Yates: I’ve been interested in ed-tech and cognitive science for some time. In many respects, we know how learning occurs in a laboratory. We can put people in fMRI machines and scan their brains as they are learning something. The question now is, how can we move what’s in the lab into a learning environment?
What are you hoping to learn?
Yates: Real personalized learning means finding three elements that you can’t see in real time: students’ cognitive readiness to learn, their emotional readiness to learn, and their prior knowledge, which will affect how much cognitive load they’re under when trying to learn something new. But all the work to track those elements is being done in the lab. That’s where these sensors come in. We want to provide meaningful data to help teachers make sense of what’s going on internally for students. That, to me, is true personalized learning.
Krishnamachari: The moonshot is really getting that level of insight in a classroom environment. What kinds of data are required? What kinds of algorithmic advances and new hardware are required? The goal is amazing, but it’s unclear how you get there, and it’s unclear what the prospects for success are.
Give me an example of where this has worked in other fields. Why does that give you confidence it can work in education?
Krishnamachari: People don’t think of Uber as an Internet of Things company, but it is. It’s tens of thousands of drivers, and you’re constantly keeping track of where they are, of traffic conditions, of pick-up requests, and everything else. All of that runs on a massively distributed sensor network that automates the process of matching riders with rides.
The other big domain, which is still transitioning from research labs to commercial systems, is smart buildings and smart homes. Nest is an early example. It measures the occupancy of a home and users’ desired temperature, and it tries to make the operation of the HVAC system as energy-efficient as possible.
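To make that concrete, here is a minimal Python sketch of the kind of occupancy-aware thermostat logic Krishnamachari is describing. The function name, temperature bands, and thresholds are illustrative assumptions, not Nest’s actual algorithm.

```python
# Minimal sketch (not Nest's actual algorithm) of occupancy-aware HVAC control:
# use occupancy and a desired temperature to decide when the system should run,
# relaxing the target band when nobody is home to save energy.

def hvac_action(occupied: bool, temp_f: float, setpoint_f: float,
                away_band_f: float = 6.0, comfort_band_f: float = 1.0) -> str:
    """Return 'heat', 'cool', or 'off' for one control step."""
    # An empty home tolerates a wider temperature band.
    band = away_band_f if not occupied else comfort_band_f
    if temp_f < setpoint_f - band:
        return "heat"
    if temp_f > setpoint_f + band:
        return "cool"
    return "off"

# Occupied home, 66°F measured, 70°F desired -> "heat";
# the same reading in an empty home stays within the wide band -> "off".
print(hvac_action(True, 66, 70))   # heat
print(hvac_action(False, 66, 70))  # off
```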
What kinds of sensors and data collection are you imagining for K-12 classrooms?
Krishnamachari: Part of it is wearable technologies. We would look at collecting EEG readings, pulse rate, heart rate, skin surface temperature, signs of movement, eye movements, and interactions with others in the classroom. Part of the challenge is to take those kinds of automated sensor measurements and fuse them with more traditional assessments, such as inputs from teachers, quizzes, and student activities, plus maybe even information from outside the classroom, such as how much activity or sleep the student got the night before.
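As a rough illustration of what that fusion step might look like, here is a short Python sketch that summarizes a window of hypothetical wearable readings and merges them with traditional assessment data. Every field name, threshold, and value is an assumption made for illustration; this is not CHARIOT’s design.

```python
# Hypothetical sketch of the "fusion" step: combine windowed wearable-sensor
# readings with traditional assessment data (quiz scores, teacher ratings)
# into one per-student feature vector.

from dataclasses import dataclass
from statistics import mean

@dataclass
class SensorWindow:
    heart_rate_bpm: list[float]     # samples over, say, a five-minute window
    skin_temp_c: list[float]
    movement_events: int
    gaze_on_task_fraction: float    # 0.0-1.0, from eye tracking

@dataclass
class ClassroomRecord:
    quiz_score: float               # 0.0-1.0
    teacher_engagement_rating: int  # 1-5
    hours_slept: float              # self- or parent-reported

def fuse(window: SensorWindow, record: ClassroomRecord) -> dict[str, float]:
    """Summarize raw sensor streams and merge them with assessment data."""
    return {
        "mean_heart_rate": mean(window.heart_rate_bpm),
        "mean_skin_temp": mean(window.skin_temp_c),
        "movement_events": float(window.movement_events),
        "gaze_on_task": window.gaze_on_task_fraction,
        "quiz_score": record.quiz_score,
        "teacher_rating": float(record.teacher_engagement_rating),
        "hours_slept": record.hours_slept,
    }

features = fuse(
    SensorWindow([82, 88, 91], [33.1, 33.4, 33.6], movement_events=4,
                 gaze_on_task_fraction=0.7),
    ClassroomRecord(quiz_score=0.6, teacher_engagement_rating=3, hours_slept=6.5),
)
```

A downstream model or teacher dashboard would consume a vector like this, which is where the open research questions about algorithms and interpretation come in.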
Yates: It’s not more of the same data. It’s different data. Is a student not on task because he is emotionally distressed? Or because he just doesn’t have the prior knowledge to do the task? Once you know that, you can give feedback. The personalization goes back to finding out exactly where the learner is. Teachers don’t have the time to know that kind of information with that kind of granularity.
It’s also data that is very sensitive and that could easily be misused.
Yates: Data security and data privacy are two other components of the moonshot we have to address. These are things that will be tested in a lab, with institutional review, and then gradually tested in other controlled environments. From an engineering research perspective, really the question is: How can you build a system that doesn’t necessarily need to retain a lot of historical information?
Krishnamachari: The privacy aspects of our research will ask those questions: What should the restrictions on data use be? What kinds of data are OK to collect? What kind of anonymization and aggregation needs to be done? There’s a tradeoff between loss of privacy and the benefits of personalization. What’s the point on that curve that is both socially acceptable and still improves learning?
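One way to picture the “aggregation without retention” idea the researchers raise: keep only running, class-level summaries, never store raw per-student readings, and suppress reports for small groups. The sketch below is illustrative only; the minimum group size and the statistics kept are arbitrary assumptions, not a standard the researchers have proposed.

```python
# Illustrative sketch: retain only class-level running aggregates, discard raw
# per-student readings, and report nothing for very small groups.

from collections import defaultdict

MIN_GROUP_SIZE = 10  # assumed threshold to limit re-identification risk

class ClassAggregator:
    def __init__(self):
        # Per-class running sums; raw readings are never stored.
        self._count = defaultdict(int)
        self._sum = defaultdict(float)

    def add_reading(self, class_id: str, value: float) -> None:
        self._count[class_id] += 1
        self._sum[class_id] += value
        # The individual value goes out of scope here; nothing per-student is kept.

    def class_average(self, class_id: str) -> float | None:
        n = self._count[class_id]
        if n < MIN_GROUP_SIZE:
            return None  # suppress aggregates for small groups
        return self._sum[class_id] / n

agg = ClassAggregator()
for hr in [78, 84, 90, 72, 88, 95, 80, 76, 83, 91]:
    agg.add_reading("period-3", hr)
print(agg.class_average("period-3"))  # class average only; no raw data retained
```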
Why not just invest in smaller class sizes, so teachers can have the time to do this for themselves?
Krishnamachari: I don’t know that these have to be at odds with each other. The push for smaller class sizes is well founded. Our question is whether there is value for teachers in observing the unobservable, even in smaller classrooms.
Photo: Bhaskar Krishnamachari, Rao Machiraju and Kenneth Yates. (Courtesy University of Southern California Rossier School of Education.)
Follow @BenjaminBHerold for the latest news on ed-tech policies, practices, and trends.