Emotional Intelligence & Artificial Intelligence: EQ vs. AI?

by Ed Kang

This morning, I asked myself how more artificial intelligence (AI) would affect our need for emotional intelligence (EQ).

My first instinctive answer is that EQ will become more important, because AI, at least for the foreseeable future, cannot replicate empathy. And empathy is a vital part of the human-to-human experience that we can never (completely) replace with technology. Yes, technology can facilitate empathetic exchanges between people, but empathy can never be "artificially" created.

But then I got to wondering: Do we, as people, have some sort of core emotional need for AI? To explore this, I broke it down based on what I know about EQ and our brains.

4 Emotional Questions Our Brains Ask

At any given moment, our brain asks four questions based on what we are experiencing:

  1. Do I have joy right now?

  2. Is this good, bad or scary?

  3. Can I relate to this/them?

  4. What does it mean to be me here?

All these questions boil down to two emotional needs: Identity and Belonging.

What does this have to do with artificial intelligence? Well, my theory right now is that deep down, we are building AI, and will use it extensively, to help us answer the above questions. I might be getting way too meta here, but stick with me.

Attempting to Use AI to Replicate the Joy Experience

Our brains are wired to seek joy. And when we don't get the joy we need from people, we will attempt to replicate it from other sources. These sources are:

  • behaviours

  • events

  • experiences

  • people (co-dependent or shallow transactional relationships)

  • substances

AI will be used as an artificial "stimulant" for the first three. Think of movies like Her, where the main character falls in love with an AI operating system. Other movies, like Blade Runner 2049, are more overt, assuming that we will all choose AI companions in the future. Interacting with AI will change our behaviours while providing events and experiences to give us the joy we are not getting elsewhere.

There are obvious dangers here, but it makes sense. Loneliness and social isolation are rising to what some are calling epidemic levels. As a darker example: where certain parts of society once turned to prostitution, there are now brothels with "sex dolls." No humans needed. I won't go any further with the implications here.

AI Helps Us Decide the Good, Bad & Scary

AI can process data and make decisions in fractions of a second, far faster than our brains could work through the same volume of information. In many ways, AI can tell us ahead of time what we should consider good, bad or scary. Generally speaking, I think this can be a positive.

It can also become a negative when there is bias in the system. After all, AI is programmed by people, so we will need a way to keep everything accountable, and dare I say, regulated.
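To make the bias point concrete, here is a minimal toy sketch (all names and training examples are my own illustrative assumptions, not any real system): a tiny word-counting classifier that labels text "good," "bad" or "scary." Its verdicts are nothing more than an echo of the examples someone chose to train it on.

```python
from collections import Counter, defaultdict

def train(examples):
    """Count word frequencies per label from (text, label) pairs."""
    counts = defaultdict(Counter)
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label whose training vocabulary best matches the text."""
    words = text.lower().split()
    return max(counts, key=lambda label: sum(counts[label][w] for w in words))

# The training set IS the bias: whoever selects these examples
# decides what the system will call good, bad or scary.
examples = [
    ("sunny walk in the park", "good"),
    ("missed the train again", "bad"),
    ("strange noise in the dark", "scary"),
]
model = train(examples)
print(classify(model, "a walk on a sunny day"))  # "good"
print(classify(model, "dark noise at night"))    # "scary"
```

Swap in a skewed training set and the same code confidently mislabels the same experiences, which is why the accountability question above matters.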

AI Can Help us Relate

As I stated before, I think AI has a place in helping facilitate relationships. The common example is AI recommending matches on dating apps. But what if AI could naturally learn our patterns and quickly help us engage with others through recommendations and even role play? This could be interesting.

Again, the AI would have to be programmed with human-to-human connection as the end objective. If AI is designed to replace humans, well, we are back to artificial empathy, joy and robot brothels.

The Biggest Danger - AI Tells Us Who We Are

This is where my spidey senses go off, reinforcing my belief in how critical EQ will be. At the center of all EQ skills is self-awareness, the skill necessary to begin forming a sense of identity. The highest levels of our cognitive capacity go towards answering the questions, "Who am I?" "Who do I want to be?" and "Who do I want to be me with?" These are questions that should never be entrusted to AI. But my fear is this is exactly where we are going.

Consider this: an AI engine determines what you see on your social media feeds. It decides what you like and who you should get information from. You don't have control over it at all except through interaction. Every decision you make online, every click and how long you spend on any content, teaches the engine to predict what will keep you glued to the application. Facebook assumes it knows this better than you do, because it has the data and the technology to process that data (regardless of how you feel about anything).
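At its core, the mechanism described above can be sketched in a few lines. This is a deliberately simplified hypothetical, not any platform's actual algorithm: every click and every second of dwell time nudges up a predicted-engagement score, and the feed is simply content sorted by that score.

```python
from collections import defaultdict

class FeedRanker:
    """Toy engagement-optimized feed: learns only what holds attention."""

    def __init__(self):
        self.score = defaultdict(float)  # predicted engagement per topic

    def record(self, topic, clicked, seconds_viewed):
        # Every interaction trains the engine: clicks and dwell time
        # raise the topic's score (weights here are arbitrary).
        self.score[topic] += (2.0 if clicked else 0.0) + 0.1 * seconds_viewed

    def rank(self, candidates):
        # What you see next is just content sorted by predicted engagement.
        return sorted(candidates, key=lambda t: self.score[t], reverse=True)

ranker = FeedRanker()
ranker.record("outrage", clicked=True, seconds_viewed=45)
ranker.record("friends", clicked=False, seconds_viewed=5)
print(ranker.rank(["friends", "outrage", "news"]))
# ['outrage', 'friends', 'news']
```

Notice what the objective function is: attention, not well-being. Nothing in the loop asks how the content makes you feel, which is exactly the identity problem discussed next.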

The problem is, platforms like Facebook are beginning to define people's identities. I know this from research and anecdotal experience. Clients have confessed that looking at their social media feeds makes them feel insecure and even ashamed that their lives don't compare. This is an identity problem.

Even the term "influencer" is becoming entrenched in people's identities. Individuals are expecting special treatment (some call this entitlement) because of how many followers they have on social media (a number which can also be artificially inflated). Don't get me started on the outrage culture on platforms like Twitter. Just watch what is happening in US elections for a front-row seat.

Where am I going with all this? This is where I landed.

EQ Will Balance & Help Integrate AI

The more EQ we have, the more AI will be an asset. The less EQ we have, the more liabilities abound. If we are able to become intelligent about our emotions, we can develop healthy boundaries and integrate AI for human benefit.

For example, if we have the EQ skills of self-regulation and social awareness, AI can help us with administering relationships. We will be able to use Facebook and Twitter for their connective benefits instead of to fill a void of joy. If we don't have those skills, we will rely on AI to define our sense of identity and belonging. In other words, we won't develop useful intelligence around our emotions, because AI doesn't have emotion to begin with.

Let me put it bluntly. If we have EQ, we can have authentic relationships with real people, which is the way it is supposed to be. If we don't have EQ, we won't have the skills to address our emotional needs through authentically joyful and empathetic relationships, and companies will be happy to offer an AI replacement as the answer.

EQ & the Future of Work with AI

To go even further, I believe that training workers in EQ will help them work better with AI. How do we expect employees to embrace the future of work if they are in constant fear that a robot will replace them? The fact is, the more robots we have, the more people we will need for things like development, maintenance, quality control, and for that moment when someone wants to "talk to a real person."

My prediction is that as AI takes care of the mundane functions of business, we will rely even more on people with specific EQ-based skills to solve more complex challenges: everything from unique customer-service issues that require deeper empathy to strategic decision-making against competitors who are using AI like everyone else.

To me, the future is bright if we begin to prioritize EQ in proportion to the advancement of AI. At the same time, I can see dystopian possibilities as well. The most important part, however, is that right now we still have the choice to make. Let's choose wisely.