Measuring Learner Engagement with AI and a Webcam? | EdTechnically

Measuring learner engagement in education is almost like measuring how well a student learns: an engaged student takes in content more easily, understands it better, and retains it longer. So how do you know if a learner is engaged? Teachers who know their students can tell, in a vague sense, how a lesson went over. But distance education institutions might rely exclusively on student feedback or academic performance; in other words, they don't really have a very good idea at all of when a learner is engaged or not. But what if there was another way?

Hi, my name is Henry Kronk. I'm the editor at eLearning Inside, and this week on EdTechnically we're asking: what if you could measure learner engagement with AI and a webcam? With the advent of small, cheap, high-resolution cameras, along with adaptive algorithms, technologists from many fields dream of a way to track user interest. The use of such technology could revolutionize the way people learn. Educators could use it to personalize learning even further. They could identify specifically when a user loses interest in a module, know when they're too tired to work, identify patterns in their learning behavior, and understand when a disruptive life event is preventing them from progressing through class.

Schools are already using AI-equipped camera arrays for a number of purposes. Many in the US are using these setups to enhance school security. In China and elsewhere, numerous schools are also using AI and cameras to track student behavior. A French business school is using AI to determine whether learners are paying attention. Other efforts include tracking what students write, tracking attendance, and even assessing the likelihood of sickness. But all of these initiatives share one thing in common: they track something that is easy to identify. Learner engagement, however, is a different beast.
Imagine a personalized learning curriculum that could track, down to the paragraph, when and where a learner lost interest. It could completely change the way subjects are taught, how educational content is created, and, ultimately, how effectively learners learn. Many believe this outcome is inevitable. We have yet to arrive there as a society, and before we get to issues like privacy, we need to figure out how to do it in the first place.

I'm really sorry if I don't say these names correctly, but researchers Mohammad Soleymani and Marcello Mortillaro from the University of Geneva sought to tackle this issue in a recent study. When they began their work, they weren't incredibly optimistic, because the meager body of existing research on the subject hadn't provided many positive results. Now, as I said earlier, there are a bunch of AI camera systems that already track human behavior, and they can do that partially because humans tend to reflect basic emotions, like happiness and sadness, with more or less universal expressions. But when it comes to learner engagement or interest, it's way more difficult to nail down. A few early studies concluded simply that one cannot measure learner engagement with facial expressions. Others reported accuracies of determination well below 50%; in other words, a system was accurate in identifying learner engagement less than half of the time. Still others have had much better success. One breakthrough came with a 2017 study, in which Mortillaro also participated, which found that tracking the dynamic motions of the face, compared to static images, increased accuracy from 29% to 68%. That's not bad. Still not great.

In an effort to determine learner engagement, Soleymani and Mortillaro used three different methods. First, they mapped the face into nearly 50 different points and tracked their motion in relation to each other.
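As a rough sketch of what tracking landmark motion "in relation to each other" can mean in code (an illustration only, not the authors' pipeline: the coordinates below are made up, and real systems track around 50 points rather than 4):

```python
import math

# Hypothetical example: each frame is a list of (x, y) facial landmark
# coordinates. Real systems use ~50 points; we use 4 to keep this short.
frame_a = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0), (0.5, 2.0)]
frame_b = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.8), (0.5, 2.1)]

def pairwise_distances(landmarks):
    """Distance between every pair of landmarks: a shape vector that is
    relative to the face itself rather than to the camera."""
    return [
        math.dist(landmarks[i], landmarks[j])
        for i in range(len(landmarks))
        for j in range(i + 1, len(landmarks))
    ]

def motion_features(prev, curr):
    """How much each inter-landmark distance changed between two frames."""
    return [b - a for a, b in zip(pairwise_distances(prev),
                                  pairwise_distances(curr))]

features = motion_features(frame_a, frame_b)
print(features)  # one value per landmark pair; nonzero entries mark moving regions
```

Working with distances between points, rather than raw positions, means the features describe how the face deforms (a smile, a raised brow) rather than where the head happens to sit in the frame.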
Second, they looked at eye gaze duration and bodily posture. Third, they also measured galvanic skin response, also known as GSR, or the electrical activity at the level of the skin. This method is used in everything from polygraphs to athletic fitness testing and involves attaching electrodes to numerous points on the body. The researchers showed over 50 participants dozens of images and GIFs (is it "gifs"? Is it "jifs"? I don't know; let's leave that for next time).
Participants were asked to rate each image and video on a scale from 1 to 7 based on their interest. Soleymani and Mortillaro then used a random forest regressor to process the data, comparing their visual findings with the participants' reported interest. On a basic level, the researchers found that they could indeed correlate learner engagement with a set of behaviors.
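A minimal sketch of that fit-and-compare step, using scikit-learn's RandomForestRegressor on made-up stand-in data (the feature values, ratings, and parameters here are assumptions, not the study's):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Made-up stand-in data: one row per stimulus viewing, columns are behavioral
# features (e.g., smile intensity, lean-in distance, gaze activity).
X = rng.normal(size=(200, 3))
# Synthetic self-reported interest on a 1-7 scale, loosely tied to the features.
y = np.clip(4 + 1.5 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=200), 1, 7)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("R^2 on held-out data:", model.score(X_test, y_test))
```

The regressor's predictions on held-out viewings are then compared against the participants' actual 1-to-7 ratings; the closer the match, the better the behavioral features capture interest.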
Interested participants tended to smile, move their heads closer to the screen, and, among many other indications, saccade their eyes. I'm not sure if I'm saying that correctly, but saccading pretty much means moving your eyes rapidly between two points for a longer-than-normal duration. Some motions also indicate disinterest. As the authors write: "All participants leaned toward the screen when a new image appeared, which indicated attention toward a novel stimulus, but only when the stimulus was interesting did they maintain the posture and remain engaged. When the stimulus was not interesting, they would go back to the resting position, distancing themselves from the screen." Also of note: "Micro videos elicited more consistent behavioral patterns across participants, as is observable in the participant-independent results. We believe that these still images could not elicit emotions and reactions as strong as those elicited by moving pictures, and therefore we suggest using videos in future work."
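As an aside on the gaze side of this: saccades in eye-tracking data are commonly detected with a simple velocity threshold (sometimes called I-VT). A minimal sketch, with a made-up sampling rate and threshold rather than anything from the study:

```python
import math

SAMPLE_HZ = 60     # hypothetical eye-tracker sampling rate
THRESHOLD = 100.0  # deg/s; gaze moving faster than this counts as saccadic

def detect_saccades(gaze):
    """gaze: list of (x_deg, y_deg) samples. Returns indices of saccadic samples."""
    saccades = []
    for i in range(1, len(gaze)):
        # Angular speed between consecutive samples, in degrees per second.
        velocity = math.dist(gaze[i - 1], gaze[i]) * SAMPLE_HZ
        if velocity > THRESHOLD:
            saccades.append(i)
    return saccades

# A fixation at (0, 0), a rapid jump to (8, 0), then a fixation there.
samples = [(0.0, 0.0)] * 5 + [(4.0, 0.0)] + [(8.0, 0.0)] * 5
print(detect_saccades(samples))  # -> [5, 6]
```

Everything below the threshold is treated as fixation; the flagged samples in between are the rapid jumps the episode describes.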
Measuring interest and engagement via a learner's face, therefore, may not be as far away as some think. Still, the authors conclude with a warning: "Obviously, using behavioral signals such as facial expression to detect situational interest requires capturing facial images, and we should be aware that users might find that intrusive for at least two reasons. First, users might not want to share information that would make them identifiable with a system. Second, one might not necessarily want to share his or her inner state, such as interest in a given content. Deploying such systems should only be done with the full informed consent of its users, and the users should have full control over how and where the data can be used. Such systems should be designed not to transfer or store identifiable information, in this case, facial images. One existing solution is to execute facial tracking on the user's device and only transfer or store the analysis outcome." Soleymani and Mortillaro: if you develop technology that can be brought to market, good luck with those last few points. Anyway, to answer the question: what if
you could measure learner engagement with AI and a webcam? It would be crazy. It would completely change the way that we educate people, it would allow for so many opportunities in education, and it would raise so, so many red flags regarding privacy and data security. People are trying to make it happen, but the good news is we still have some time to figure things out.
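The on-device mitigation the authors suggest, analyzing frames locally and transferring only the outcome, can be sketched as follows; every name and the scoring rule here are placeholders, not any real product's API:

```python
from dataclasses import dataclass

@dataclass
class EngagementReport:
    """The only data that leaves the device: a summary score, never raw frames."""
    mean_score: float
    n_frames: int

def score_frame(frame) -> float:
    # Placeholder for local analysis (landmark tracking, gaze, posture).
    # Here: a made-up proxy so the sketch runs end to end.
    return min(1.0, sum(frame) / len(frame))

def analyze_locally(frames) -> EngagementReport:
    """Runs entirely on the user's device; facial images are never stored or sent."""
    scores = [score_frame(f) for f in frames]
    return EngagementReport(mean_score=sum(scores) / len(scores),
                            n_frames=len(scores))

# Frames stand in for decoded webcam images (here, tiny lists of pixel values).
report = analyze_locally([[0.2, 0.4], [0.6, 0.8]])
print(report)  # only this summary would ever be transferred off-device
```

The design choice is that the network boundary sits after the analysis: identifiable images stay on the device, and the server only ever sees the derived numbers.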
We're still some ways away from a product that can be brought to market, or so it seems, at least. As a P.S.: that represents the work of academia. There is some indication, however, that proprietary software in the private sector is further ahead. China Global Television Network (CGTN), a government-operated media outlet, reported last summer on the Chinese tutoring company VIPKid. I've heard some people pronounce it "vip-kid," but I'm pretty sure it's "V-I-P Kid." VIPKid is huge: it employs tens of thousands of tutors and teaches millions of learners. Last summer, the company secured a funding round worth $500 million USD and partnered with Microsoft to advance the use of AI in education (those are two different things). CGTN got in touch with, and I'm sorry if I mispronounce this, Yun Jing, who is the VP of Technology for VIPKid. Jing said: "Interactivity and involvement are crucial in online education. We developed a complicated algorithm to analyze students' eyes and how they move, and we trained the model through deep learning. Each student has different ways to express feelings, so the feedback could be very different."
Now, this is a translation into English, but it does appear that VIPKid has similar efforts underway. It might be that, or it might be that they're simply tracking when students are paying attention and when they're not, which is a related issue but not quite the same.

This has been EdTechnically. My name is Henry Kronk, and I work as the editor of eLearning Inside. If you like this podcast, please rate and review. If you want to hear more, please subscribe. Also keep in mind that this show is available as a video on our YouTube channel and as a podcast on iTunes, Stitcher, Google Play, and other places where you get your podcasts. The basic content for this video first appeared as an article on eLearning Inside, and if you want to learn more about online courses, technology in the classroom, and EdTech in general, be sure to check out our site; it's updated daily. If you'd like to get in touch with me, please send an email to [email protected], or you can follow us on Twitter @eLearningInside. Thanks for listening and viewing.
