A wide array of education technology startups are marketing their digital learning tools to teachers, who are increasingly willing to experiment with adaptive learning products that analyze students’ online activities to personalize their lessons. Writing for The New York Times, Natasha Singer reports that school district technology directors, who are responsible for safeguarding student data, are scrambling to keep track of which companies are collecting their students’ data and how they are using it.
The market for educational software designed for pre-kindergarten through twelfth grade students reached nearly $8.4 billion, up from $7.5 billion in 2010. Yet each of the 14,000 school districts in the United States is confronting issues of privacy and student data security alone. A teacher can sign up for any app or service without the knowledge or approval of anyone else at the school district.
Some school districts have already experienced data breaches involving software they purchased from vendors; in a few cases, student records were leaked publicly online. Many legal experts posit that companies' practice of signing up teachers directly, rather than contracting with the school district, skirts federal privacy laws.
What protections for student privacy are in place?
A federal law called the Family Educational Rights and Privacy Act requires that school districts maintain the confidentiality of student records and retain control of those records even if some school functions are outsourced to vendors. Another federal law, the Children’s Online Privacy Protection Act, enables schools to allow online providers to collect certain personal details from children, but the Federal Trade Commission has recommended that schools not delegate that responsibility to individual teachers.
The Department of Education recently issued best practice recommendations for school administrators evaluating educational services, and the Consortium for School Networking, an association for district technology professionals, offers a tool kit of data security questions that schools should ask service providers. Most recently, a coalition of educational groups published a set of security principles for the use of student data.
All of these are meant to safeguard against security issues like data breaches, identity theft, and student profiling. But student data has fallen prey to all of these, thanks to companies that are often more focused on gaining market share than on handling data securely and ethically. Some sites are unencrypted, store passwords in plain text, or otherwise offer inadequate protection against hackers who could seek access to student data.
How are privacy issues for students unique?
While the average adult app or Internet user can choose the apps and services they want to use (and decide how much research, if any, to conduct on a provider’s data security policies), the security and privacy of children’s data is a more complex subject. Student data collection begins the moment a parent registers a child for school online, sharing sensitive information such as the child’s name, Social Security number, address, birth date, previous schools, and medical and behavioral history. Attendance records and test scores gradually add to this record of data points, and schools often outsource the storage and maintenance of student data to third parties, where privacy advocates worry it could fall into the hands of data marketers, become subject to data breaches, or even contribute to a data trail that follows a student indefinitely.
Additionally, learning sites and apps routinely collect information down to a student’s every keystroke, analyzing data about the student’s actions, level, and preferences in order to personalize lessons. But the protections required of schools and vendors by federal privacy laws don’t, in practice, extend to the providers of free apps and services.
And while more than 100 companies recently signed an industry pledge on student privacy, agreeing to “maintain a comprehensive security program that is reasonably designed to protect the security, privacy, confidentiality and integrity of student personal information against risks — such as unauthorized access or use,” the pledge does not require ed tech vendors to comply with specific basic security measures, like encrypting students’ names, screen names or other personal details. It also does not prohibit companies from using weak security protections, like storing users’ passwords in plain text, a practice that would easily permit hackers to access teacher or student accounts and link students’ names to details about their academic performance.
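To make the distinction concrete: a password stored in plain text can be read by anyone who obtains the database, while a password stored as a salted, slow hash cannot be trivially reversed. The sketch below, using only Python’s standard-library PBKDF2 function, illustrates the basic practice the pledge does not require; the helper names are hypothetical, not drawn from any vendor’s code.

```python
import hashlib
import hmac
import os

# Hypothetical helpers contrasting insecure plain-text storage with
# salted password hashing. Illustrative only, not a vendor's actual code.

def store_password_plaintext(password: str) -> str:
    # Insecure: anyone who reads the database sees the password itself.
    return password

def store_password_hashed(password: str) -> tuple[bytes, bytes]:
    # Safer: a random per-user salt plus a deliberately slow
    # key-derivation function (PBKDF2 with SHA-256).
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Recompute the hash with the stored salt and compare in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

A site following this pattern never stores the password itself; a breach exposes only salts and digests, which an attacker must guess against one slow hash at a time.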
Such security weaknesses are common on consumer sites, but the law has long treated educational information as a category worthy of special protection. Some privacy advocates want the Federal Trade Commission and the Education Department to more strongly enforce student privacy protections, in part because students and teachers have little ability to defend themselves against the current vulnerabilities and abuses.
What makes these data security issues so important?
Singer recently reported that many education sites have what one software engineer characterizes as “glaring” security problems. To make matters worse, there is no real consensus on what “good security” really means for an educational app or website. Some of the companies behind products used by millions of teachers and students, even when alerted to vulnerabilities in their systems, neglect to address the issues.
Experts say that these individual weaknesses are symptomatic of widespread lapses across the education technology industry. Insecure sites, apps and messaging services could potentially expose students — many of whom are under 13 — to hacking, identity theft, cyberbullying, or even unwanted contact from strangers. Some companies that target teachers with learning products are actively irresponsible with student data, exploiting their email addresses for marketing purposes or sharing their details with third parties.
The freemium business model, widely used by companies marketing apps and adaptive learning products to teachers, makes it easy for teachers to try new products — and bypass school district processes for reviewing the effectiveness and data security of software used in the classroom. But teachers aren’t typically able to evaluate the data-handling practices of the startups offering free apps that collect details such as students’ names, birth dates, profile photos, voice recordings, assignments, quiz scores, and grades. Teachers also may not know whether such apps have been shown to achieve the learning objectives they claim.
Though administrators want teachers to have access to the best educational apps, guarding against data breaches, identity theft, and unauthorized student profiling is a formidable challenge. Many school districts are working to put in place a framework for reviewing and vetting the apps and digital learning products that teachers want to use, in many cases compiling lists of approved apps for teachers.
All of the efforts at establishing guidelines for the evaluation of educational apps and services might eventually lead to the development of national standards for the adoption of new tools. Regular audits by security experts could eventually become standard across the education tech industry, and clearer regulations for app developers and service providers themselves could cut down on vulnerabilities and poor practices from the start.