Many developers have built AI emotion detectors, and these systems appear to be better at detecting emotions than humans are. Part of their advantage may come from fast cameras: AIs can catch facial expressions that last only a fraction of a second, too brief for most humans to register. In addition, AIs do not (yet) have emotions of their own, which lets them read emotions in facial expressions, language, body position, and vocal timbre and volume without any overlay of their own feelings to distract them and cloud their observations. Eventually, AIs may become the perfect therapist.
Most people do not have frequent contact with a therapist, so why do we care if AIs can detect emotions? What purpose can this AI skill serve?
Recently my wife and I received a call from a close friend who was clearly under considerable stress. Stress often comes from one of the “Four D’s”: divorce, disability, debt, or job loss (“dejobbed”). Various addictions will eventually cause at least one of the Four D’s, as will troubles with a spouse or children.
For this friend, the immediate cause of the stress was being deep in debt. You can guess the solution the friend proposed to us. Because this is a dear friend, we internalized the high stress level and, feeling the pain, offered to wire money, though much less than the friend requested. This is an old story, perhaps one you recognize all too well. You may be feeling some emotional stress yourself just from replaying your own memories.
The wire did not go through. Our bank called us but gave only cryptic explanations of the problem. We immediately conjured up a long list of fears for our friend, which only compounded our own stress. When we called our friend to relay the problem, our friend’s stress level blew through the roof, and possible solutions poured out, each more creative and fanciful than the last.
We called the bank again, and confirmed the recipient information and our desire to complete the transfer. The bank accepted our confirmation, the wire went through, and stress levels dropped rapidly, even if our worries about our friend’s health and safety remained for much longer.
We have done many wire transfers, and this was the first one that ran into trouble. After a few hours, I had settled down enough to wonder why this particular transfer got flagged. Consider these markers for this wire transfer:
- The amount was a round number.
- The amount was a significant portion of the money in the account.
- The wire was scheduled for the same day, not even the next day.
- The recipient was an individual, not a business.
- This was our first wire transfer to that person.
- We are both old enough to collect Social Security.
What do these markers indicate to you? As I replayed the banker’s words about the problem, my perspective suddenly shifted. The AI fraud detector was not concerned about our friend’s finances or health. It flagged the transfer because it was concerned about us, its customers. It was not programmed to care about the recipient of the wire transfer; it was programmed to consider only whether we had been subjected to emotional pressure to make this transfer. Well, duh!
AI for Fraud Detection, or Is It Emotion Detection?
While ChatGPT has captured all the AI buzz recently, AI applications have been working hard for many years to protect your financial assets from fraud. AI fraud detectors check every credit card transaction, wire transfer, and even old-fashioned check for various kinds of fraud, spotting and flagging bad transactions before you ever see them.
It is likely, almost certain in my opinion, that an AI flagged this transaction for possible fraud. The AI fraud detector may not label emotional content per se. Yet in the case of this transfer, and likely many similar wire transfers, the AI accurately identified the high emotional content of the transfer using only objective facts about the transaction and the parties.
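As an illustration only: my bank’s actual detector is proprietary and almost certainly a trained statistical model rather than hand-written rules. Still, here is a minimal sketch of how objective markers like the ones listed above could be combined into a risk score that triggers a hold on a wire transfer. Every field name, weight, and threshold below is hypothetical.

```python
from dataclasses import dataclass


@dataclass
class WireTransfer:
    """Objective facts about a wire transfer (hypothetical fields)."""
    amount: float
    account_balance: float
    same_day: bool
    recipient_is_individual: bool
    first_transfer_to_recipient: bool
    sender_age: int


def risk_score(t: WireTransfer) -> float:
    """Add a hypothetical weight for each marker that is present."""
    score = 0.0
    if t.amount == round(t.amount, -2):        # round number, e.g. $5,000
        score += 1.0
    if t.amount > 0.5 * t.account_balance:     # large share of the account
        score += 2.0
    if t.same_day:                             # urgency is a classic pressure sign
        score += 1.5
    if t.recipient_is_individual:              # individual, not a business
        score += 1.0
    if t.first_transfer_to_recipient:          # no payment history with this recipient
        score += 1.5
    if t.sender_age >= 62:                     # common elder-fraud heuristic
        score += 1.0
    return score


def should_hold(t: WireTransfer, threshold: float = 5.0) -> bool:
    """Hold the wire for a human call-back if the score crosses the threshold."""
    return risk_score(t) >= threshold


if __name__ == "__main__":
    transfer = WireTransfer(
        amount=5000.0,
        account_balance=8000.0,
        same_day=True,
        recipient_is_individual=True,
        first_transfer_to_recipient=True,
        sender_age=68,
    )
    # Prints 8.0 True: the transfer is held and the bank calls the customer.
    print(risk_score(transfer), should_hold(transfer))
```

The point of the sketch is that none of these inputs is an emotion. Each is a dry, objective fact about the transaction, yet taken together they are a serviceable proxy for “someone is under emotional pressure to send money right now.”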
As humans, our motives and eventual actions are inextricably entwined with our emotions. Our emotions are often so powerful that they demand immediate action to lower the stress. Emotions are pretty clever: as this post describes, they create high stress levels precisely to motivate immediate action. What better way for someone else to move us to action than to manipulate our emotions? As AIs continue to serve and protect humans, it is no surprise that they will learn to protect us from the weaknesses of our emotions.
Think of the Children
Altruism is often used as an emotional lever to move us to action; “Do it for the children” is my favorite example of an overused emotional motivator. In this situation, with its emotional appeal to altruism, I needed to think much more clearly about the other person, the recipient. Giving money did not fix the problem; it only prolonged the agony for the recipient. It did, however, relieve our immediate emotional pain, even while causing some short-term financial stress for us.
There will be a next time. Thanks to the AI fraud detector, I will be more prepared, and the AI will be there to slow down the transfer and give me time to reconsider.