Engineer Jelissa Kamguem ’23 broke out of her comfort zone this summer, applying data-science skills to develop software that helps those leading virtual meetings
By Bryan Hay
Zoom hosts may have important information to deliver but often have no way of knowing whether those on the receiving end are paying attention or distracted by more tempting material on their tablets and laptops.
Jelissa Kamguem
But software developed by Jelissa Kamguem ’23 (chemical engineering) and a group of students during a summer internship may help presenters see engagement levels in real time during and after their Zoom meeting, allowing them to alter the tempo or intensity of their presentation to reel in their audience.
Kamguem and members of her team in the 2020 Empath summer program at Affectiva, a Boston company on a mission to bring emotional intelligence to the digital world, developed a Zoom plug-in they called Affectobot.
Inspired by the product’s potential, the group entered it in Affectiva’s Makeathon contest, where it took second place among 15 student teams and received a $1,600 prize.
Motivated by her experience last semester in an Introduction to Engineering Design & Data Mining course taught by Christian López, assistant professor of computer science, Kamguem wanted to learn more about interactive technologies and pursued the program at Affectiva.
“Most of the students that get into this program are computer science students, but I really wanted to get exposed to this field. I’m planning to minor in data science,” she says.
Assigned to a team with three other students, all computer science majors, Kamguem found the situation challenging at first. But she dug deep and drew on the Lafayette problem-solving skills she had honed this summer working alongside Michael Senra, assistant professor of chemical and biomolecular engineering, on his research into improving the cold flow properties of biodiesel.
“He taught me some of the critical thinking skills that I used to do some market research for the Zoom plug-in project,” Kamguem says.
“So what my team did was to create a plug-in that uses emotional or facial recognition to give real-time feedback presented during a Zoom presentation and also after the presentation,” she says. “We did technical coding to come up with a prototype that kind of tells the presenter when their meeting guests are distracted from their camera.”
Data delivered through tags or displayed on a concealed screen allow meeting hosts to determine if participants are looking into the camera. Another feature provides a post-meeting summary of the level of audience engagement.
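The plug-in’s two features, a live distraction alert and a post-meeting engagement report, can be sketched in outline. The code below is a hypothetical illustration, not Affectiva’s actual API or the team’s implementation: it assumes a face-analysis model elsewhere supplies a per-participant attention score between 0 and 1, and simply aggregates those scores for the host.

```python
# Hypothetical sketch of engagement aggregation for a Zoom plug-in.
# Assumes an upstream facial-analysis model reports, for each participant,
# an attention score (1.0 = looking at the camera, 0.0 = fully distracted).

from collections import defaultdict


class EngagementTracker:
    def __init__(self, alert_threshold=0.5):
        # Fraction of attentive participants below which the host is alerted
        self.alert_threshold = alert_threshold
        # participant -> list of attention samples over the meeting
        self.samples = defaultdict(list)

    def record(self, participant, attention_score):
        """Store one attention sample (0.0-1.0) for a participant."""
        self.samples[participant].append(attention_score)

    def live_alert(self):
        """True if the most recent samples suggest the audience is distracted."""
        latest = [scores[-1] for scores in self.samples.values() if scores]
        if not latest:
            return False
        attentive = sum(1 for score in latest if score >= 0.5)
        return attentive / len(latest) < self.alert_threshold

    def summary(self):
        """Post-meeting report: average engagement per participant."""
        return {p: sum(s) / len(s) for p, s in self.samples.items()}
```

In this sketch the live alert drives the in-meeting feedback the article describes, while `summary()` corresponds to the post-meeting engagement report; the threshold and scoring scheme are illustrative assumptions.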
Kamguem credits López for inspiring her to learn more about computer science and the value of data mining, skills that can apply to many disciplines. “After the course with Prof. López, I wanted to know more,” she says.
“Jelissa’s experience shows that if you get out of your comfort zone and put in the time and effort, you can achieve anything,” López says.