High-tech teaching: pitfalls and progress to emerge from the pandemic
The pandemic has created an opportunity for teachers to try technology straight out of a sci-fi movie, with mixed results.
Software can now track students' eye and head movements, along with their mouse and keyboard activity, to help teachers understand how much students are comprehending.
University of Technology Sydney Professor Fang Chen upgraded online learning in search of a reliable way to measure student engagement.
'Online learning has existed for a long time, but it has never been pushed this hard before,' she told ABC News.
'Now we have the ability to make it more efficient. The future is providing a feedback loop of students' learning progress instantly,' she said.
The program developed by her data analytics team tracks student responses during online lectures. This information is converted into a line graph that 'dips up and down, in real time, indicating a level of engagement that can be viewed by teachers'.
'Like in the physical classroom, the teacher can make instant adjustments while delivering the lecture or review the graph later to develop more engaging content,' says Professor Chen.
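Neither the ABC report nor UTS publishes the system's code, but the general pattern it describes, aggregating interaction signals into a score and plotting it as a live line graph, can be sketched briefly. The event types, weights and window size below are illustrative assumptions, not details of Professor Chen's program.

```python
from collections import deque

# Illustrative signal weights, assumed for this sketch rather than
# taken from the UTS system.
EVENT_WEIGHTS = {"gaze_on_screen": 1.0, "keystroke": 0.5, "mouse_move": 0.2}

class EngagementTracker:
    """Maintains a rolling engagement score a dashboard could plot live."""

    def __init__(self, window: int = 30):
        # Keep only the most recent `window` time slices.
        self.scores = deque(maxlen=window)

    def record_slice(self, events: dict) -> float:
        """Turn one time slice of event counts into a score; return the
        smoothed value that would become the next point on the graph."""
        score = sum(EVENT_WEIGHTS.get(name, 0.0) * count
                    for name, count in events.items())
        self.scores.append(score)
        return sum(self.scores) / len(self.scores)

tracker = EngagementTracker()
print(tracker.record_slice({"keystroke": 4, "mouse_move": 10}))  # 4.0
```

Each new value becomes the next point on the teacher's graph, so the line 'dips up and down' as the mix of events changes.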
However, not everyone is confident such tools are safe for students and teachers.
Research professor Kate Crawford recently wrote a piece for Nature saying the pandemic is being used as a pretext to push unproven artificial-intelligence tools into workplaces and schools.
She is concerned by emotion-recognition software being used to monitor workers and children remotely.
Her example is 4 Little Trees, a system used at True Light College in Hong Kong to scrutinise each child's facial expressions through their computers' cameras. The program claims it can assess children's emotions while they do classwork.
'It maps facial features to assign each pupil’s emotional state into a category such as happiness, sadness, anger, disgust, surprise and fear. It also gauges ‘motivation’ and forecasts grades.'
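Crawford's description ends in a classification step: per-category scores reduced to a single label. A minimal sketch of that final step, with invented scores standing in for the output of a trained facial-analysis model, looks like this:

```python
# Hypothetical per-category scores for one pupil. These numbers are
# invented for illustration, not output from 4 Little Trees.
category_scores = {
    "happiness": 0.12, "sadness": 0.05, "anger": 0.08,
    "disgust": 0.03, "surprise": 0.61, "fear": 0.11,
}

# The pupil's 'emotional state' becomes whichever category scores highest.
label = max(category_scores, key=category_scores.get)
print(label)  # surprise
```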
This software is no experimental outlier: the emotion-recognition industry is predicted to grow to US$37 billion by 2026, and Affectiva, an emotion-detection tech start-up, was acquired in May for US$73.5 million.
The Financial Times says the 4 Little Trees algorithm works by 'measuring micro-movements of muscles on the girls' faces'.
'The company says the algorithms generate detailed reports on each student's emotional state for teachers, and can also gauge motivation and focus. It alerts students to "get their attention back when they are off track".'
Its founder says it gets children's feelings right about 85 per cent of the time.
4 Little Trees uses an algorithm evolved from facial-recognition systems that its creators claim can recognise human emotions and states of mind, such as tiredness, stress and anxiety, by analysing facial expressions, micro-gestures, eye tracking and voice tones.
The technology claims not just to understand how someone is feeling in the moment, but also to decode their intentions and predict their personality.
4 Little Trees founder Viola Lam told CNN the system monitors how long students take to answer questions; records their marks and performance history; generates reports on their strengths, weaknesses and motivation levels; and forecasts their grades. The program can adapt to each student, targeting knowledge gaps and offering game-style tests designed to make learning fun.
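CNN does not say how the grade forecasting works, but the inputs it lists (marks and performance history) are consistent with something as simple as a trend line. The sketch below fits a least-squares line to past marks and extrapolates one step; it is a toy stand-in, not 4 Little Trees' actual model.

```python
def forecast_next_mark(marks: list) -> float:
    """Fit a straight line to past marks and extrapolate one assessment
    ahead. A toy illustration, not the product's real model."""
    n = len(marks)
    x_mean = (n - 1) / 2               # mean of the time indices 0..n-1
    y_mean = sum(marks) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in enumerate(marks))
             / sum((x - x_mean) ** 2 for x in range(n)))
    intercept = y_mean - slope * x_mean
    return intercept + slope * n       # predicted mark for the next test

print(forecast_next_mark([62, 68, 71, 75]))  # rising trend -> 79.5
```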
Lam says the technology has been especially useful to teachers during the pandemic because it allows them to remotely monitor their students' emotions as they learn.
She says 4 Little Trees records facial muscle data, but does not record video of students' faces.
Racial bias has previously been flagged as a serious issue for AI: research has shown that emotion-analysis technology struggles to identify the emotions of darker-skinned faces, because the algorithms learn to recognise emotions from mostly white faces.
Assistant Professor Bonnie Stewart, writing for The Conversation, revealed privacy concerns arising from proctoring software used by Canadian universities to examine their students remotely during the pandemic. The for-profit tools monitor students’ laptops, tablets, or phones during an exam. They can monitor eye movements, capture students’ keystrokes, record their screens and track their searches as well as their home environments and physical behaviours.
She says the testing methods 'invade privacy and erode trust' because they flag students who fail to keep their eyes on the screen, even when the reason is autism or disability rather than cheating. She worries they enforce memorisation, put students at risk of data breaches, and cast every student as a potential cheater.
Vice.com says digital proctoring software turns students’ computers into 'powerful invigilators'.
'The software flags any behaviour its algorithm deems suspicious for later viewing by the class instructor.
'If a student looks away from the screen more than their peers taking the same exam, they are flagged for an abnormality. If they look away less often, they are flagged for an abnormality. The same goes for how many keystrokes a student makes while answering a question, how many times they click, and a variety of other metrics. Variation outside the standard deviation results in a flag.'
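That description amounts to a z-score rule: compute the class mean and standard deviation for each metric, and flag anyone who falls outside one deviation in either direction. A minimal version, with the one-deviation threshold taken from the quote above:

```python
from statistics import mean, stdev

def flag_outliers(counts: dict, threshold: float = 1.0) -> list:
    """Flag students whose count sits more than `threshold` standard
    deviations from the class mean, in either direction."""
    mu = mean(counts.values())
    sigma = stdev(counts.values())
    return [student for student, value in counts.items()
            if abs(value - mu) > threshold * sigma]

# Hypothetical counts of glances away from the screen during one exam.
looks_away = {"ana": 4, "ben": 5, "cho": 6, "dia": 19, "eli": 5}
print(flag_outliers(looks_away))  # ['dia']
```

As Stewart notes, a rule like this cannot distinguish cheating from disability; it only measures distance from the class average.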
In Singapore, an online petition was launched on change.org in January to oppose a device management application (DMA) that allows schools to manage students' use of the tablets or laptops they rely on for home-based learning.
Secondary 2 student Ethan Fun, 14, told The Straits Times the Ministry of Education was 'over-controlling students by asking us to install this application on our personal devices. Students also need their own personal privacy and space'.
Not all technological innovations forced by the pandemic have been negative.
Secondary teachers in Canada employed the long-mooted 'flipped classroom', in which students receive recorded lectures or lessons ahead of time.
They are encouraged to do their own research, then come to the classroom ready to practise, discuss or join in group projects.
Student assessments are also being revamped with evaluations that allow students to demonstrate their knowledge through journals, portfolios, discussions, and presentations.
Apps have also rushed to fit the home-learning environment. Vancouver-based reading app Simbi motivates students to read by having them narrate books that other learners across the world can listen to as they follow along with the story.
In the Persian Gulf, the education sector witnessed a 'seismic transformation' during the Covid-19 pandemic as schools quickly introduced technology to adapt to restrictions on in-classroom teaching.
Education ministries learned of the advantages and pitfalls of education technology via their own experiences 'rather than by reading case studies from other countries'.
And learning technology entrepreneur Simon Hay says hybrid learning (mixed online and in-person education) has been accelerated rapidly by the pandemic.
'For many faculty members, it has been a baptism of fire,' he said. 'The majority of faculty members in early learning, as well as higher education in this region, have been resistant to even blended learning. But they had to embrace this change in a matter of a few weeks.
'Increasingly, in public schools and colleges, teachers have also come under stricter scrutiny for the effectiveness of their online teaching. They have had to make choices to sink or swim. And almost all of them have learnt to swim in this new online education environment.'
Emotion-recognition software has been used by Disney to test volunteers' reactions to a range of its films; by car companies to test driver alertness; by marketing firms to gauge how audiences respond to advertisements; and by UK police to identify 'suspicious people'.
Ross Andersen, a deputy editor of The Atlantic, travelled to China and found its regime 'wants to build an all-seeing digital system of social control, patrolled by precog algorithms that identify potential dissenters in real time'.
Intrusive surveillance is being tested on one million mostly Uyghur Muslims held in detention camps in Xinjiang. And Andersen fears the surveillance model will be exported, 'entrenching the power of a whole generation of autocrats'.
Uyghurs are forced by police to install 'nanny apps' on their phones that use algorithms to detect 'ideological viruses' and the use of Arabic. The surveillance is so pervasive that the usual workarounds, such as encrypted messaging software, are impossible to use.
Digital inactivity itself can raise suspicions.
In the western world, there is little agreement over whether emotion-tracking software works.
Suresh Venkatasubramanian, a machine learning scientist at the University of Utah, says machines can’t adjust their behaviour like humans do.
'When you’re interacting with one person and you make a mistake about their feelings, you can get feedback and very quickly adjust your internal model,' he says.
'But a machine can't do that. It is building a model from some data and scaling it to thousands more people; it doesn't have the ability to adjust in the moment if it misreads what you said.'
Lisa Feldman Barrett, a psychologist at Northeastern University, says human emotions are hard to read.
'People, on average, the data show, scowl less than 30 per cent of the time when they’re angry. That means that more than 70 per cent of the time, people do not scowl when they’re angry. And on top of that, they scowl often when they’re not angry.'
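Her numbers make the inference problem concrete. Applying Bayes' rule, with her under-30-per-cent figure plus two values invented here purely for illustration (how often non-angry people scowl, and how often a person is angry at any given moment), shows how weak a scowl is as evidence:

```python
p_scowl_given_angry = 0.30       # Barrett: people scowl <30% of the time when angry
p_scowl_given_not_angry = 0.15   # assumed; Barrett says only that it happens 'often'
p_angry = 0.05                   # assumed base rate of anger at any given moment

# Bayes' rule: P(angry | scowl)
p_scowl = (p_scowl_given_angry * p_angry
           + p_scowl_given_not_angry * (1 - p_angry))
print(round(p_scowl_given_angry * p_angry / p_scowl, 2))  # 0.1
```

Under these assumptions, roughly nine out of ten detected scowls would come from people who are not angry at all.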
Programs like 4 Little Trees are based on the work of psychologist Paul Ekman, who contended there are six universal human emotions that are innate, cross-cultural and consistent regardless of ethnicity. He believes his work is being misinterpreted by software makers and wants laws that prohibit the recording of facial expressions, let alone their interpretation or measurement, without 'informed consent'.
He says faces convey universal emotions, but he has seen no evidence that emotion-tracking technologies work.
Kate Crawford says there is 'deep scientific disagreement' about whether AI can detect emotions.
And there is growing scientific concern about the use and misuse of these technologies.
'It is time for national regulatory agencies to guard against unproven applications, especially those targeting children and other vulnerable populations.'
She likens the software to the polygraph, which persisted from the 1920s until 1998, when the US Supreme Court concluded that 'there was simply no consensus that polygraph evidence is reliable'.
Prof Crawford says Ekman's six emotions were 'standardized and automated at scale' by computer innovators, ignoring more complex factors.
'Ekman sold his system to the US Transportation Security Administration after the 11 September 2001 terrorist attacks, to assess which airline passengers were showing fear or stress, and so might be terrorists. It was strongly criticized for lacking credibility and for being racially biased. However, many of today’s tools, such as 4 Little Trees, are based on Ekman’s six-emotion categorization.'