Editor’s Picks


Learning Analytics and Its Potential to Bridge Education Silos: An Interview with David Niemi

By Henry Kronk
December 04, 2018

The current state and pace of edtech development holds huge potential for educators, administrators, researchers, and, ultimately, learners. With the ability to gather large amounts of data comes the ability to understand the processes of learning in ways not previously possible. But society has yet to realize what this data, and learning analytics as a field, can accomplish. At least, that is the view of researcher and education scientist David Niemi. As VP of measurement and evaluation at Kaplan, Inc., Niemi pours a lot of effort into determining how learners learn. That work, along with that of other researchers in a Bill and Melinda Gates Foundation work group, was published this fall in a new title, Learning Analytics in Education (Information Age Publishing).

Niemi first entered the field of education as a high school English teacher. He has since received his PhD from UCLA, formed a couple of early computer learning companies, chaired the Educational Psychology department at the University of Missouri, Columbia, and designed and investigated numerous educational testing initiatives for private companies and government bodies, all while continuing his research on assessment science.

A few years back, Niemi was contacted by the Gates Foundation to see if he wanted to join a new initiative known as the Learning Analytics Work Group. The group was headed by Stanford's Roy Pea, who has long called for more research into data science in education.

The Promise of Learning Analytics

“[The initiative] was partly inspired by the fact that we now have access to huge amounts of data which we didn’t have before, particularly data from online courses and programs where you might have tens of thousands of students studying the same course at the same time,” Niemi told eLearning Inside. “In principle, it’s possible to track every click, monitor their performance and reactions to assessments and other work you send them along the way, and more. There’s a huge amount of information to look at there.”

Niemi, along with Pea and other members of the group, set out to apply data analytics techniques often used in fields such as international finance and medicine to education.

“We wanted to know: Could those be used to analyze education data? Could we use those data analytics techniques to actually improve learning? To do that, you need to have people involved in the field who actually understand learning and how it happens. Then you need to get them to work together with people who are using the more exotic analytic techniques and so on to try to figure out how we can learn something useful from the data,” Niemi said.

In today’s climate, it might surprise some to learn that a field like learning analytics is not already robust and well-funded. But research on the current scale has only become possible in recent years.

“Other people have kind of tried to do something similar before, but in the past, data was far more sparse,” Niemi said. “You’d have information about students in the principal’s office in the school or something and if anyone wanted to do research on that, they’d have to go collect those folders and manually enter the data. The picture has completely changed now.”

After pouring a few years’ work into the initiative, Niemi figured it was time to put pen to paper. Learning Analytics in Education includes nine chapters, each of which focuses on a specific issue involved in collecting and analyzing student data. Subjects range from using learning analytics to boost student retention and persistence to student data security. Other authors include Roy Pea; Richard Clark, Emeritus Professor of Educational Psychology and Technology at the USC Rossier School of Education; and Bror Saxberg, VP of learning science at the Chan Zuckerberg Initiative.

Silos in Education

The field of education, in its private, public, and academic iterations, tends to be heavily siloed. While many edtech companies have developed (or have the potential to develop) products that can help in the classroom, they don’t always do a good job of testing what they’ve created. Teachers and administrators, on the other hand, tend to be wary of exaggerated sales pitches.

“I think there’s actually good reason for people to be wary of technology because if you go online or if you get contacted by companies that have come up with some new solution, my overwhelming experience in investigating what they’ve got is that it’s almost all hype,” Niemi said. “People make huge claims for what their systems can do without any evidence behind them. They haven’t done any studies or tested the effectiveness of what they’ve got.

“Sometimes users of this stuff unfortunately aren’t sophisticated enough to know what kinds of evidence or data they should be looking for and they’re more prone to buy things on the basis of hype. So I’m all for people being very wary and cautious of what they’re getting into. When somebody talks to me about a new product, the first question I ask is, ‘What evidence do you have that this works?’ If they say they don’t have any, I don’t even really want to hear about what they’ve got. I don’t care if they can’t show that it works.

“I got into education because I was not happy with how I had been taught. I became increasingly aware of that in high school and college and said, ‘Well, let me see if I can do something about that.’

“After I started my teaching career, the first Apple computers were introduced, the old Apple IIs. Apple donated a lot of them to schools with the idea that just putting these boxes in classrooms is going to change everything about education. It didn’t. I was actually involved in evaluating that claim, and we worked really hard to find any positive effect of just sticking the computers in the room. They weren’t being used and they weren’t helping in any educational effort because teachers didn’t really know what to do with them. The students were happy to use them, but they mostly wanted to play games.

“The hype of introducing computer technology into classrooms—it’s very hard to work back. Many studies have been done. It’s very hard to find any evidence that they can make a difference [on their own]. It’s mostly a question of how they’re being used. Rolling stuff into classrooms isn’t going to make any difference. You have to know what to do with them, you have to have training for teachers, and you have to have something that is actually effective.

“So I’m happy when educators are cautious about what they’re getting into and cautious about the claims that technology is going to revolutionize everything.”

On the other hand, many academics continue to research education, but they aren’t always effective at communicating findings that could improve teaching practice. Teachers, in turn, can’t easily dedicate time to reading emerging research.

“There hasn’t been much uptake [of new research],” Niemi said, “and there are a couple of reasons for that. People in academia are often studying very narrow kinds of things or series of things, because that’s a way to land tenure. But they’re not always so concerned with how things might scale up to a big school district like New York or Chicago or Los Angeles.

“There are some people who are starting to turn to that who are asking how they can do research on a large scale.

“But also, educators can’t really read research reports. They’re technical and hard to read and require someone to translate those results into some meaningful language that everyday teachers can understand. There have been some books.”

Niemi is quick to point out the work of Professor Richard Mayer at UC Santa Barbara who has written some texts about new research explicitly for teachers. But Mayer stands as the exception that proves the rule.

“Generally, there hasn’t been much interest from educators to learn more from what research tells them they should be doing, and researchers haven’t paid much attention to how they can make sure their results are useful and interesting to educators,” Niemi said.

“If we can get learning analytics to work correctly, it would mean the research that’s being done on new technology and the data you can get out of new technologies—we have to get those groups to connect with educators to figure out what kinds of important questions and problems can those technologies actually solve.

“If there’s no connection at all on that front, we’ll have learning analytics people doing their work in one silo, and have education continue on as it has.”

This is the first of a two-part interview. The second part, detailing Niemi’s view of the future of learning analytics, will appear next week.

Featured Image: Aron Van de Pol, Unsplash. Other media provided by Kaplan Inc.