
Op-Ed

Voice Assistants on Campus Need to Solve this Security Problem

By Somen Saha
November 18, 2020

Over the past couple of years, universities have seen a clear shift away from the initial novelty of voice assistants – such as Alexa and Google Home – toward more personalized modalities, such as AI-powered chatbots on students' smartphones. As the technology evolves, student engagement evolves along with it.

There is, however, a significant hurdle as the technology becomes more widespread and offers an influx of personalization tools: student data protection. Having helped pioneer these technologies at universities, one thing has become increasingly apparent to me: voice assistants are good at voice recognition, but they leave a lot to be desired when it comes to voice authentication.

The Main Issue with Voice Assistants: Security

Right now, students and parents alike are cautious about their data. It's easy for colleges and universities to go with a smart speaker; however, privacy is not the top priority for Big Tech companies whose business models thrive on data collection. The sale of data to third parties is growing significantly, helping companies build their marketing models around that information. It is incumbent upon colleges and universities to protect their students' data.

While student engagement remains top-of-mind for colleges and universities, many are reviewing where student data is stored. Today's students recognize that marketing is a powerful tool and that giving companies third-party access to their personal information can be dangerous. By protecting student data and using that data to enhance the student experience, colleges and universities can create a lifelong learning partnership with their most precious users: students.

Privacy Is a Growing Concern with EdTech Products and Services

Unsurprisingly, privacy was a new entrant at number three on Educause's annual list of top 10 IT issues last year. On the 2020 list, which is yet to be published, privacy will take the number two spot.

Moreover, California, Maine, Nevada, New York and other states have all recently introduced privacy protection laws, inspired by the European Union's General Data Protection Regulation (GDPR), in addition to existing Family Educational Rights and Privacy Act (FERPA) guidelines.

Because of these privacy concerns, chatbots are reaching every corner of higher ed, enhancing communication with students on all aspects of college life and creating a virtual "one-stop shop" for student queries managed directly by the educational institution. This new personalized tool caters to today's Netflix-era students, who want campus services at their fingertips. They want it when they want it, and they want it mobile. They don't want to sit at the registrar's office for hours.

Grand Valley State University’s myBlueLaker App

I recently worked with Grand Valley State University (GVSU) to bring their students myBlueLaker, a fully FERPA- and GDPR-compliant voice and AI virtual assistant app. Built in partnership with the university's security team, the app operates in real time and doesn't store student data. When data is transferred to a student, it is transmitted at the university's highest authentication level using a key assigned to each student. And the best part: student data never leaves the university's secure systems.
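The article doesn't specify myBlueLaker's actual protocol, but the per-student key approach described above can be sketched with a common pattern: deriving a key for each student server-side, then signing each payload with an HMAC so the app can verify the data's integrity and origin. All function names here (`issue_student_key`, `sign_payload`, `verify_payload`) are hypothetical, purely for illustration.

```python
import hashlib
import hmac
import json
import time

# Hypothetical sketch — not the actual myBlueLaker implementation.
# The university derives a key per student and signs every response,
# so the client can confirm data came from the university's systems.

def issue_student_key(student_id: str, master_secret: bytes) -> bytes:
    """Derive a per-student key from a server-side master secret."""
    return hmac.new(master_secret, student_id.encode(), hashlib.sha256).digest()

def sign_payload(payload: dict, student_key: bytes) -> dict:
    """Attach a timestamp and an HMAC signature to an outgoing payload."""
    body = dict(payload, ts=int(time.time()))
    msg = json.dumps(body, sort_keys=True).encode()
    body["sig"] = hmac.new(student_key, msg, hashlib.sha256).hexdigest()
    return body

def verify_payload(body: dict, student_key: bytes) -> bool:
    """Client-side check: recompute the HMAC over everything but the signature."""
    received = dict(body)
    sig = received.pop("sig", "")
    msg = json.dumps(received, sort_keys=True).encode()
    expected = hmac.new(student_key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

With a scheme like this, any tampering with the payload in transit invalidates the signature, and because the key derivation stays on the university's servers, no third party ever holds the raw student data or the master secret.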

These chatbots handle an incredible amount of personal information that must be kept private for students. We've seen hackers target colleges and universities before, which builds a compelling case for data protection and privacy. Students are reluctant to add their information to apps, and colleges and universities are beginning to take note as they implement more technological alternatives.

Data, however, remains king. By understanding students' usage data from virtual assistants and treating that data as a trail of breadcrumbs toward better engagement, universities can offer more targeted services and treat their students as loyal customers. With a better understanding of data and security, universities will create a lifelong learning partnership with their students. But a catastrophic data breach could quickly sour that relationship.

Voice Authentication Has a Ways to Go

Back in 2018, we worked with Alexa and Google smart speakers. While voice speakers are getting better at voice recognition, we've realized that voice authentication in a shared environment (such as a dorm room or library) still leaves a lot to be desired. Putting Alexa in every dorm room sounds tempting, but it can deter students, given the widespread skepticism around voice assistants. Since moving to AI-powered chatbots, we've seen that they are ideal for reducing call-center costs while helping students navigate campus on their phones.

With the increased need to engage students during the 2020 COVID-19 pandemic, some of the colleges we work with saw more than 50,000 additional questions from students, so it was critical to have a digital system that manages those questions. An easy-to-use piece of technology solves a problem and increases student engagement by connecting with students remotely. There are also new tools and features that help colleges and universities do contact tracing on campus using a secure network managed directly by the university.

Right now, there is growing demand for colleges and universities to buy and implement AI-powered chatbots, given the shifts of the COVID pandemic and the resulting gaps in student engagement. As long as voice assistant devices haven't moved toward greater transparency in their privacy policies, having educational institutions manage their own student data directly seems like the obvious choice, at least for now.

Somen Saha is the founder and CEO of n-Powered, Inc.

Featured Image: Nicolas J Leclercq, Unsplash.
