Stéphan Vincent-Lancrin
Senior Analyst, OECD
The emergence of smart education technologies presents many opportunities for education. But how can governments harness the benefits of technology in education while limiting its possible risks?
While they used to rely mainly on diagnosing and assessing students’ knowledge, intelligent tutoring systems increasingly factor in students’ engagement in learning, their metacognition and other behavioural processes. They often draw on sensors, cameras and, in some cases, an analysis of how students approach the task at hand.
Classroom analytics sometimes monitor the entire classroom: they help teachers orchestrate their teaching with real-time feedback or delayed analysis, for example of how they interacted with specific students, where they moved in the classroom and how long they talked.
Questions over privacy
Because analytics require large amounts of education data, some of it personal, concerns commonly arise about the development and use of smart technologies, particularly in relation to data protection and privacy.
They also raise ethical and political concerns. Could (or should) education establishments and systems become a new version of “Big Brother” for the sake of improved learning outcomes? Can governments and other parties be trusted to use this information solely for educational improvement, and to enforce strong data protection regimes?
Most OECD countries have strong data protection regulation that ensures personal education data cannot be shared with (or used by) third parties. Where regulation falls short within a country, ethics in the use of AI in education must fill the gap. For example, algorithms could be biased and have an undesirable social impact on some population groups. They could be flawed or fail to reflect current societal values. The first ethical imperative is therefore to monitor, verify and discuss their effects on different population groups and on educational outcomes.
Transparency in monitoring education
The most advanced applications of learning analytics continuously monitor individuals (e.g. engagement, self-regulation, classroom orchestration, game-based assessments). Do all stakeholders feel comfortable with every aspect of these applications, even when they are legal? Addressing the surveillance regime they may appear to introduce will require some imagination. Deleting data immediately after processing is one technique already used in classroom analytics: teachers still receive feedback on their class, but the personal data no longer exists once it has been processed.
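As a purely illustrative sketch, not drawn from any specific product, a delete-after-processing pipeline might look like the following: raw per-student observations exist only in memory while class-level feedback is computed, and only the aggregate, non-personal figures are retained. All names here (ClassroomObservation, aggregate_feedback, process_lesson) are hypothetical.

```python
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List


@dataclass
class ClassroomObservation:
    # Hypothetical per-student record captured during a lesson.
    student_id: str
    seconds_spoken_to: float
    engagement_score: float  # e.g. 0.0 (disengaged) to 1.0 (engaged)


def aggregate_feedback(observations: List[ClassroomObservation]) -> Dict[str, float]:
    """Reduce personal observations to class-level, non-personal feedback."""
    return {
        "students_observed": len(observations),
        "mean_engagement": mean(o.engagement_score for o in observations),
        "mean_teacher_talk_seconds": mean(o.seconds_spoken_to for o in observations),
    }


def process_lesson(observations: List[ClassroomObservation]) -> Dict[str, float]:
    """Compute feedback for the teacher, then discard the personal data."""
    feedback = aggregate_feedback(observations)
    observations.clear()  # delete-after-processing: raw records are not retained
    return feedback


if __name__ == "__main__":
    raw = [
        ClassroomObservation("s1", 42.0, 0.8),
        ClassroomObservation("s2", 15.0, 0.6),
    ]
    print(process_lesson(raw))  # only aggregate figures survive
    print(raw)                  # [] -- the personal observations are gone
```

In this sketch the teacher only ever sees the aggregate dictionary; whether such in-memory deletion satisfies a given data protection regime would, of course, depend on the jurisdiction and the wider system design.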
Social negotiation and transparency with all stakeholders are also critical. Reaping the benefits of the digitalisation of education to improve educational outcomes will require new discussions, the development and testing of new social practices, and the identification of balanced risk-management approaches. International collaboration will be key to achieving this goal.