Intuitive AI That Attempts To Mimic The Human Psyche

Next frontier: automate intuition?
Can artificial intelligence eventually mimic human intuition? And is that a good thing?
Intuition has fueled many a business or personal decision, and there is plenty of evidence to suggest that it's a fairly powerful and accurate tool. It taps into and selects from a vast wellspring of information in one's brain.
As this recent podcast with neuroscientist Joel Pearson illustrates, intuition involves more than just "tapping into any unconscious information. It's the learned information. So when we go about our lives, our brains are processing thousands of things, and we're only conscious of a tiny bit of that. We have no idea what our brains are processing most of the time."
With the advent of machine learning and generative AI, there's been excitement about the technology's productivity potential. Intuitive AI – which can sense and respond to many seen and unseen factors – may represent its next phase.
The next frontier of AI may be what Ruchir Puri, chief scientist at IBM Research and IBM Fellow, describes as "emotional AI." Human intelligence, he noted, "encompasses multiple dimensions – IQ or intelligence quotient, EQ or emotional quotient, and RQ or relational quotient. So far, AI has primarily only mastered IQ."
“EQ helps humans understand and manage emotions, while RQ shapes how we build relationships,” Puri explained. “These are the next frontiers for AI development – systems that recognize, interpret and respond to human emotions beyond just sentiment analysis.”
Emotional AI may even "become one of the most significant cultural turning points of our time," he continued. "Machines capable of understanding, responding to and generating emotions will reshape how society and businesses function, with AI working alongside humans in a profoundly integrated way."
The IQ of AI will definitely keep growing as well, and "we'll soon see AI with an IQ of 1,000,000," as described by Emmy Award-winning producer Ryan Elam, founder and CEO of LocalEyes Video Production.
“At some point, AI will reach a level of intelligence so far beyond human cognition that it will no longer be comprehensible to us,” Elam predicted. “A machine with an IQ of 1,000,000 wouldn’t just solve problems faster; it would perceive and define reality differently. These ultra-intelligent AIs may discover scientific laws we don’t even have the cognitive framework to understand, essentially operating as alien minds among us. The challenge won’t be building them—it will be figuring out how to interpret their insights.”
Wrap this into a future in which “our most intimate signals — heart rate, body temperature, microexpressions, and subtle voice shifts — are openly accessible,” said Dr. Zulfikar Ramzan, chief technology officer at Point Wild. “In this world, AI, once celebrated for mastering highly analytical domains like Chess, Go, and even protein folding, can elevate – or wreak havoc upon — the concept of emotional intelligence.”
Most of the required technology already exists, Ramzan continued. "High-resolution and high-frame-rate cameras, remote photoplethysmography, thermal imaging, radar-based skin conductivity sensing, and sensitive microphones can capture signals that we once thought private: real-time pupil size, subtle color changes in skin caused by blood flow, microexpressions, skin temperature, sweat gland activity from a distance, voice prosody."
AI can merge these data streams, “and analyze video, images, and speech to transform ostensibly hidden signals into a cogent narrative about the inner workings of the people around us. We can literally read the room.”
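To make the idea of merging those data streams concrete, here is a minimal, purely illustrative sketch. The feature names, normalization, and weights are all invented for demonstration – they are not drawn from any product or research described above – and a real system would combine far richer time-series models rather than a simple weighted sum.

```python
# Illustrative sketch only: feature names, weights, and values are invented
# for demonstration and do not reflect any real product described above.
from dataclasses import dataclass


@dataclass
class Frame:
    """Normalized (0-1) readings from hypothetical remote sensors."""
    pupil_dilation: float      # e.g. from a high-frame-rate camera
    heart_rate_delta: float    # e.g. from remote photoplethysmography
    skin_conductance: float    # e.g. from radar-based sensing
    voice_arousal: float       # e.g. from voice prosody analysis


def engagement_score(frame: Frame) -> float:
    """Fuse the streams into one 0-1 'engagement' estimate (toy weights)."""
    weights = {
        "pupil_dilation": 0.35,
        "heart_rate_delta": 0.25,
        "skin_conductance": 0.20,
        "voice_arousal": 0.20,
    }
    return sum(getattr(frame, name) * w for name, w in weights.items())


if __name__ == "__main__":
    # One instant in a hypothetical negotiation: pupils widen sharply.
    frame = Frame(pupil_dilation=0.9, heart_rate_delta=0.6,
                  skin_conductance=0.4, voice_arousal=0.5)
    print(f"engagement: {engagement_score(frame):.2f}")
```

In practice the hard part is not the arithmetic but calibrating each channel per person and smoothing estimates over time; the sketch only shows the fusion step in its simplest possible form.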
Ramzan illustrates how this could work in business settings. "Imagine negotiating a deal when AI notes your counterpart's pupils widen at a specific price point — signaling non-verbal interest that could pivot the conversation," he said. "Picture delivering a presentation, but getting instant feedback on audience engagement. Suddenly, those who persistently struggle to interpret non-verbal cues are on nearly equal footing with the most preternaturally gifted empathetic, charismatic social chameleons."
Getting to more intuitive or emotional AI requires a more fluid user interface – one so seamless that people may forget they're talking to machines, though, hopefully, they will still be aware that they are. "Too often, AI impresses in carefully curated demos or cherry-picked case studies, but struggles in real-world use," said Anastasia Georgievskaya, founder and CEO of Haut.AI. "People end up spending 15 to 20 minutes trying to make it work or even an hour refining prompts just to get a decent result."
This frustration, she continued, “comes from a fundamental limitation: we’re trying to communicate highly complex, contextual thoughts through simple text prompts, which just isn’t efficient. Our thoughts are richer, more layered than what we can type out, and that gap between what we mean and what AI understands leads to underwhelming results.”
Once we move beyond prompting, "the real innovation will happen," said Georgievskaya. "I see a future where we can leverage neurotechnology to express intent without language – AI that doesn't wait for us to spell things out, but instead picks up on our thoughts, emotions, and context directly, making interactions far more intuitive."
"Take skincare recommendations. Instead of typing, 'I want something lightweight with vitamin C,' AI could already know," said Georgievskaya. "It could sense your emotional reactions, subconscious preferences, even remember which influencer's review you engaged with. It might recognize that you're drawn to certain textures or packaging – without you needing to say a word. Within a few years, AI may no longer ask what we want – it will simply understand."