Unless you’ve been living under a rock (maybe not the worst of ideas these days), you’ve likely seen recent news bits in which Elon Musk states AI will be “smarter than any one human” by 2025.

I found this statement really ridiculous. And just plain wrong.

What Musk is really talking about here is what we colloquially think of as “IQ.” 0s and 1s, Einstein meets da Vinci meets Kasparov, challenges as practical as energy storage or as arcane as the Riemann hypothesis.

But IQ is not human intelligence. It’s a part of it, of course, but it’s not the sum of it. Human intelligence, even that IQ bit, does not exist in a vacuum: it’s inherently and constitutionally tied to the world around us. And the world around us, as you may have noticed, is also filled with people.

What Musk failed to account for is what we commonly term “EQ.”

Sure, we may better human lives and accrue outsized economic returns by curing autoimmune diseases or commercializing quantum computing.

But if we can’t get along? And work together productively? If we trend towards social destruction, rather than social connection?

It doesn’t take a genius to answer those questions.

And outsized economic returns don’t belong to IQ alone. EQ delivers those returns too, arguably on a more sustainable footing.

A case in point: the companies routinely ranked among the ‘best to work for’ typically beat market returns by 3x or more. This isn’t a fluke. It’s a trend that has persisted across decades and markets.

I spend a lot of time thinking about emotion. The work I do at Tikel is largely about taking this complex and undefined science, the science of emotion, and translating it into a practical platform with real-world impact.

And emotion is far more complex than most people imagine.

We can be happy and sad at the exact same time. What we say we feel, and what we actually feel, are often not the same thing. And what we do feel at any given time can only be fully understood and qualified within the singular context of our lived experience: physical, mental and so on.

An example. I’m currently single, and one of my friends recently got engaged. I’m happy for her, of course. But I’m also sad for myself and envious of her engagement and self-conscious about my single state. I’m also happy I’m not in a bad relationship and comfortable being alone and fond of the absurdity that is dating in 2024. All of these emotions, all at once, with dynamically shifting priority and pertinence.

And if it were my sister who had gotten engaged instead? Those emotions wouldn’t simply copy-and-paste: they’d be particular to that relationship. And we haven’t even touched upon the societal-cum-cultural strictures and expectations and beliefs and myths that have shaped the ways in which I see (and feel) the world.

In many ways, the concept of emotion or “EQ” makes the Riemann hypothesis look like a walk in the park.

But go ahead, Elon. Show us AI that’s smarter than people about, well, people.
