An increasing number of us depend on digital assistants to get us through the day. Whether it’s Siri, Alexa, or some other form of this technology, we’re growing used to having these systems do our daily tasks for us. Now, buying groceries, playing music, turning on a light, or even hearing a joke is only an ask away. Still, there’s a lot of room for interpretation – or misinterpretation – by our machines. In fact, a single search online will surface plenty of these “fails.” But, how might technology respond if it had the ability to understand how we’re feeling?
That’s just what Taniya Mishra is hoping to answer. A speech and artificial intelligence scientist for Affectiva, Mishra uses “machine learning to build automatic systems that can recognize emotions.” In other words? She teaches robots how to be a bit more human.
“I see the lack of emotional understanding in systems – and the challenges of little humans trying to interact with them,” Mishra, a mother of three, explains. “One of my children would actually cry. She would repeat her request with increasing frustration, but the system doesn’t acknowledge that.” It’s not that the system is insensitive or purposely trying to upset the child; it just isn’t built to process emotions yet. “They understand the text of what you’re saying and do an excellent job of trying to respond to that, but as humans, our emotions and intentions are combined.”
As Mishra works to make the connection between artificial intelligence and emotional intelligence, new possibilities for this evolving technology are on the horizon. For example, your car might tell you to pull over when you’re noticeably tired, or maybe your child’s video game will encourage them to take a break when they’re starting to get upset. By making these machines aware of – and responsive to – human emotions and feelings, many accidents, failures, or breakdowns in communication could potentially be avoided.
But as systems grow smarter, they may actually be able to teach us a thing or two about how we interact with one another. Within the past decade, smartphones, search engines, and social media have permeated our everyday lives, causing us to stare at our screens instead of making eye contact with each other. While there’s plenty to gain from the internet’s connectivity and convenience, there may be just as much to lose. After all, if we’re distracted by devices during a conversation, what might we miss if we’re hearing someone’s words but not truly listening to them? We might comprehend the text of what they’re saying, but fail to pick up on their full range of emotion, and therefore miss their overall intention. Sound familiar? As we invite technology into our lives, we still must maintain what makes us human, Mishra says.
Mishra says this realization has made her more intentional in every aspect of her life, but especially as a parent. “Children are perceptive. Their behavior isn’t just informed by what you tell them, but also, by what they see,” she explains. So, if we want them to listen to us, we must truly listen to them. And that means giving them our full attention. “I try not to multitask, which is really hard,” Mishra says. “In today’s society, we’re constantly on call in some way or another. We have so many distractors in our lives.” That’s why Mishra and her husband have a “phone jail” until their children go to bed. “We intentionally put our phones and computers away,” she explains. That way, there are no distractions in sight when she’s spending time with her kids.
But, listening is only part of the solution. Parents should also be aware of other verbal and nonverbal cues they may observe, and teach their children to do the same. “When my kids are telling me about an interaction they had at school, I ask them, ‘What did she look like? What did she sound like? Do you think she was happy or upset?’” Now, her children provide these details without even being prompted, Mishra says.
“As the interactions between humans and machines grow and evolve, there are many things we’ll accomplish with our systems in partnership,” Mishra says. These “emotionally aware devices” won’t be able to solve all of our problems, but by understanding our feelings, they might be able to help us find a solution. And that alone is quite a big accomplishment.
As a scientist, Mishra’s goal is “to take frustration out of our interactions with systems and replace it with empathy and trust.” And as a parent, her hope is that her children will grow up to be “ethical people who care about the world around them.” Lucky for her, those two objectives are a perfect match.
To learn more about Mishra and her work, check out this video on NBC News Learn. It’s part of the new video series, Discovering You: Engineering Your World, which profiles a diverse group of engineers and their career paths.