At Portland State University, Educators Have Learned Not to Overlook UX Design with Personalized Learning

By Henry Kronk
April 23, 2019

In 2016, Portland State University (PSU), along with six other public institutions of higher ed, received an Adaptive Courseware Grant from the Association of Public and Land-grant Universities (APLU). With $515,000 from the Bill and Melinda Gates Foundation, PSU was presented with the opportunity to “adopt, implement and scale use of adaptive courseware in high-enrollment, blended learning courses in multiple departments and programs to improve student success,” according to a release issued at the time. Earlier this month at the OLC Innovate Conference, members of PSU’s Office of Academic Innovation (UX Designer Kari Goin, Executive Director Johannes De Gruyter, and Project Manager Kevin Berg) presented on some of what they found. Their focus was a subject that doesn’t get much attention in discussions of adaptive and personalized learning: UX design.

Many adaptive edtech solutions bill themselves as customizable. One might hope, of course, that a software solution powered in part by an adaptive algorithm would be flexible to at least some degree. The tools tested in this context were all course supplements that assess students’ abilities in a given field of study and deliver content to help them address areas where they’re struggling. Some allowed instructors to add or change the course materials; others did not.

Testing UX Design in Personalized Learning

The researchers presented on their experience with three of these products: ALEKS, an adaptive math supplement; Realizeit, which was adapted for a statistics class; and CogBooks, which aided a general physics course.

The classes in which they were used at PSU brought together a fairly diverse student body. Class sizes are typically between 200 and 300 students, and the school operates on a 10-week quarter system. As one might expect, retention issues were a primary concern in the investigation. And adaptive algorithms were a potential fix.

As Johannes De Gruyter said, “Where adaptive comes in is this promise or hypothesis that we’ll be able to play around with this system that tailors the content, the assessment, and the sequencing to the needs of the student as they’re being assessed and data is being collected in the beginning and throughout. We would be able to deliver, so to speak, the right lesson to the right student at the right time.”
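To make that “right lesson to the right student at the right time” idea concrete, here is one heavily simplified way such sequencing could look in code. This is a hypothetical sketch of a naive mastery-tracking loop (the class and method names below are illustrative), not the actual algorithm behind ALEKS, Realizeit, or CogBooks.

```python
# Hypothetical sketch only: a naive mastery-based sequencer that picks the
# next lesson from whichever topic a student's assessment data shows the
# weakest estimated mastery in. Not any vendor's actual student model.
from dataclasses import dataclass, field


@dataclass
class StudentModel:
    # Estimated mastery per topic, updated as assessment data arrives.
    mastery: dict[str, float] = field(default_factory=dict)

    def record_result(self, topic: str, correct: bool) -> None:
        # Nudge the running mastery estimate up or down with each answer.
        current = self.mastery.get(topic, 0.5)
        self.mastery[topic] = min(1.0, max(0.0, current + (0.1 if correct else -0.1)))

    def next_topic(self, topics: list[str]) -> str:
        # "Right lesson at the right time": serve the weakest topic so far.
        return min(topics, key=lambda t: self.mastery.get(t, 0.5))


topics = ["fractions", "linear equations", "graphing"]
student = StudentModel()
student.record_result("fractions", True)
student.record_result("graphing", False)
print(student.next_topic(topics))  # -> graphing
```

Real adaptive courseware relies on far more sophisticated student models, but the basic loop of assessing, updating an estimate, and re-sequencing content is the promise De Gruyter describes.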

The researchers remained tool agnostic in this investigation. While some of the software could likely be applied to great success in other contexts, they wanted to see which could be made to work well in their context at PSU, with a focus on UX design. As with any course, the adaptive learning would not be happening in a vacuum. Ideally, it would be incorporated seamlessly with the course’s other aspects. PSU generally uses a flipped learning approach to teaching and likes to incorporate OER materials. As the presenters learned, integrating the course supplements with these elements proved the most difficult part of the UX design process.

The Three Courses Tested

Primary goals in the investigation were providing a seamless onboarding experience, eliminating moments when students needed to ask for help with using the software, and reducing the number of clicks needed to begin learning.

An Algebra course was the first to be examined.

As Project Manager Kevin Berg explained, “Historically, the math department had already implemented ALEKS. When we entered into this, they had just migrated from an emporium style to a flipped model. ALEKS is an off-the-shelf adaptive homework system. They, however, had decided to pare it down, and they removed a lot of the support material. Students were primarily using it for formative assessments with a little bit of feedback.

“The math team had just introduced an OER textbook. From the classroom perspective, it’s mostly lectures with some group work. So the math team had not gone through a course design process to align ALEKS, the textbook, and the activities.”

And from the end-of-course survey, it showed. Roughly 40% of the students who responded believed that the work they did in class and on ALEKS was related. While they liked some aspects of the software, they also felt that the quizzes were too short and that they didn’t get enough opportunity to practice what they learned. Methods for solving problems also varied among in-class instruction, the OER textbook, and ALEKS.

The second course the designers looked at was a run of Intro to Statistics, which used Realizeit, a fully customizable adaptive program. In the past, the course had experienced huge variance in DFW (drop, fail, withdraw) rates. The current instructor was on board with flipped learning and had just created her own OER textbook.

“The material from the textbook, the videos, the problem sets were all built within Realizeit,” Berg said. Complicating the matter, however, were the two other course sections, which were taught by an adjunct professor and a graduate teaching assistant.

“Although the instructors agreed to use Realizeit, there was no agreement about other elements of the course, like in-class examples of problems, exam questions, and so forth,” Berg said. “Plus, there was not complete buy-in on the use of active teaching in the classroom. The adjunct was more comfortable with a lecture-based approach. Our graduate T.A. was amenable to it, but lacked experience prior to the training we gave them.”

Still, about two-thirds of the students surveyed believed the material on Realizeit aligned with the coursework. On the other hand, they were frustrated that Realizeit did not include step-by-step instructions for how to solve problems.

“That was a lesson learned with working with a customizable system,” Berg said. “You have to build it all out. You’re not starting with an ALEKS and curating it down; you’re building it from the ground up.”

The final course investigated was a general physics class, which used CogBooks. Like Realizeit, it was fully customizable. The faculty teaching the course had already employed a flipped model. One concern, however, was that the instructor felt students were not coming to class equipped with the necessary math background.

In the end, CogBooks elicited the most positive student response in terms of course integration: 85% of those surveyed believed it aligned with the course material. At the same time, however, CogBooks introduced a series of substantial software updates during the course, and it often glitched to the point of being unusable throughout the term.

Personalized Learning Doesn’t Easily Translate Everywhere. UX Design Is Key.

As De Gruyter summarized, “The impact of these tools really depends on how you implement them. And the impact, as we’ve shown, even if we have a really solid scenario in chemistry, for example, and we want to apply it to physics, it doesn’t necessarily apply right away. We are very hopeful about adaptive though. We see a lot of potential and our students—we’re just focusing on their satisfaction and feedback, but we have a ton of data we’re collecting also on learning outcomes and DFW drops. We’re very hopeful and positive about this.

“It’s just a matter of keeping up momentum with design. Even if you buy into iteration, you can’t exhaust your faculty to keep iterating. They have tons to do. So we’re trying to find a healthy middle ground with quick wins to keep going and then see how that evolves.”

Featured Image: NeONBrand, Unsplash.