If Facebook and the past election taught us anything, it is that artificial intelligence (AI) can readily harm our democracy and way of life through automated fake news and spam. Businesses should take heed. Treating people like if-then statements in a software program, filling their websites and social media feeds with what they like while blocking what they dislike, can have unintended consequences.
So, what went wrong? Miscommunication. It really is that simple. Websites and social media are designed to talk at users. When this communication is a one-way street, whether through feeds that serve up only the things you like and agree with, or through a commercial website that talks at its customers without engaging them, businesses lose customers and people lose touch. Our society also loses as we sink deeper into our siloed, collective echo chambers.
How can we turn this trend around? We can build our emotional intelligence, starting with improving our listening skills. It sounds hokey, but as AI grows in speed, power, and reach, listening is increasingly critical not only to our customer base but to our human race. Being emotionally intelligent and available, our "humanness," differentiates us from AI and smart machines; machines may be able to think, but they cannot feel or express feelings. Consider Blade Runner or any number of other science fiction films.
What is Emotional Intelligence?
Travis Bradberry and Jean Greaves, authors of Emotional Intelligence 2.0, say that "Emotional Intelligence is the effective communication of the emotional and rational brain."1 At its core, emotional intelligence means improving one's communication skills by acknowledging emotions and using the rational brain to manage people, build relationships, improve leadership, and make well-rounded decisions. The foundation of all of this is listening.
AI uses logic and algorithms to analyze us, but is it really listening to us? What does listening look like in an automated world, and what does it mean? Sure, Amazon can make suggestions based on purchase histories, and it now offers listening (I mean, search) devices that help us navigate our physical wants and consumer needs. But is that listening? Moreover, how do you build a relationship with a website, a social media feed, or a business that doesn't talk back? It reacts and responds based on history and past actions. Are those reactions and responses satisfying, even when they are well informed? Let's say you were in the mood for Wisconsin fried butter one day. Not saying it's bad or good, but let's just say you went on a binge. That is what you wanted at the time, but you've since regretted the purchase. Why? Because AI analyzed your behavior and now predicts you would like fried butter, or perhaps fried cheese or fried yogurt (is that a thing?). These suggestions keep popping up, filling your websites and feeds with fried dairy products. Even AI-predicted suggestions can get annoying, no matter how good they once were.
Okay, this all sounds wonderful, but how does it apply to data science and marketing? Can you build an authentic relationship with data? How do you share emotions, acknowledge vulnerabilities, and build trust with a bunch of 1s and 0s and if-then statements? Didn't you just say people are more complicated than that?
Yes, I did. If we listen and pay attention to people, we can accommodate, compromise, and begin to earn their trust. A straightforward way to connect with someone is to use their language. As an Asian American living in the whitest state in the nation, I have learned to mirror people's words, mannerisms, and tone. I can 'Ayuh' with the best of them. When I lived in Austin, Texas, I even picked up saying y'all. Why? Using familiar language and tone puts people at ease. If listening skills are the building blocks of communication, shouldn't we let customers use the language they feel most comfortable with when conversing with a vendor? Anticipating someone's emotional needs is how you stay emotionally intelligent in an automated world. Look for opportunities to allay people's fears or to bolster their hopes and aspirations. You guessed it: AI can help, but it takes emotionally and socially aware technicians to connect the emotional world with the automated one.
A straightforward way to create an authentic relationship through data science is to translate website or social media content into the language each user is most comfortable with. It sounds simple, but it's complicated.
According to the Harvard Business Review article "Global Business Speaks English," "The fastest-spreading language in human history, English is spoken at a useful level by some 1.75 billion people worldwide – that's one in every four of us."2 The majority of commercial websites are in English, and this makes sense: English is the universal language of business and science, and it is inefficient to hold business meetings in numerous languages. The problem is the assumption that everyone is proficient in English. Only one in four multicultural, multilingual businesspeople feels confident with or looks forward to speaking English.3
This is significant: if most businesspeople are more comfortable in their native language, and our Internet shopping experiences are private and individual, shouldn't websites reflect people's language needs?
Another significant fact we overlook is that the US is more multicultural than ever. More than half of all children in the US under the age of 18 are multicultural,4 and most multicultural, multilingual populations live in large US cities. All ten of the largest metropolitan areas in the US are majority multicultural:5
1. New York–Northern New Jersey–Long Island, NY–NJ–PA MSA 67% multicultural
2. Los Angeles–Long Beach–Santa Ana, CA MSA 72% multicultural
3. Chicago–Joliet–Naperville, IL–IN–WI MSA 68% multicultural
4. Dallas–Fort Worth–Arlington, TX MSA 71% multicultural
5. Houston–Sugar Land–Baytown, TX MSA 75% multicultural
6. Washington–Arlington–Alexandria, DC–VA–MD–WV MSA 64% multicultural
7. San Francisco–Oakland–Fremont, CA MSA 69% multicultural
8. Philadelphia–Camden–Wilmington, PA–NJ–DE–MD MSA 65% multicultural
9. Boston–Cambridge–Quincy, MA–NH MSA 65% multicultural
10. Atlanta–Sandy Springs–Marietta, GA MSA 62% multicultural
More than half the population of each of these metro areas comes from a multicultural, multilingual background. "Just 51 percent of New Yorkers speak only English at home, according to recent data from the Census Bureau's American Community Survey. As for the other 49 percent, well, the languages span the globe."6 Economically, we can't afford to ignore this fact. These ten metro areas alone contribute 5.4 trillion dollars to the nation's economy, which by itself would be the 4th-largest economy in the world. If just one-fifth of the population of these cities speaks another language, the potential market is 1.08 trillion dollars.
1. New York–Northern New Jersey–Long Island, NY–NJ–PA MSA 1.4 trillion
2. Los Angeles–Long Beach–Santa Ana, CA MSA 800 billion
3. Chicago–Joliet–Naperville, IL–IN–WI MSA 557 billion
4. Dallas–Fort Worth–Arlington, TX MSA 460 billion
5. Houston–Sugar Land–Baytown, TX MSA 450 billion
6. Washington–Arlington–Alexandria, DC–VA–MD–WV MSA 435 billion
7. San Francisco–Oakland–Fremont, CA MSA 370 billion
8. Philadelphia–Camden–Wilmington, PA–NJ–DE–MD MSA 358 billion
9. Boston–Cambridge–Quincy, MA–NH MSA 353 billion
10. Atlanta–Sandy Springs–Marietta, GA MSA 298 billion
The total, 5.4 trillion dollars, equals the 4th-largest economy in the world. If each of these ten metro areas were a nation, all would rank within the top 40 largest economies; New York alone would be the 14th-largest, as large as South Korea.7
Unfortunately, the more diverse a city is, the more segregated it tends to be.8 Consider, for example, the racial dot map of New York City, one of the most diverse cities in the US, developed by the University of Virginia. Every person is represented by a colored dot: blue for white residents, green for African American, red for Asian American, and orange for Hispanic American.
Racially and ethnically segregated cities like New York also have segregated census tracts, and data science can put that fact to work: where someone lives, down to the census tract, helps predict which language they are likely to speak.
This is an opportunity to use AI and data science to create targeted translations for micro-segments of the population based on US Census tracts, making websites more user-friendly. Aren't websites already translated? Don't we need just one language in addition to English, namely Spanish? Yes, but there are many dialects within Spanish, and many other languages besides. Not surprisingly, "Spanish (and Spanish Creole) speakers make up the bulk of the non-English speaking population, accounting for about 25 percent of New York City (or 1.87 million) residents. Chinese came in second, with around 419,000 speakers in the Big Apple. There are around 106,000 people who chat in French Creole, another 81,000 who speak French, and 186,000 Russian speakers. The American Community Survey also found 85,000 Yiddish speakers, while 47,000 speak Hebrew at home. Nearly 200,000 New Yorkers converse in an Indic language like Hindi, Urdu or Gujarati. And 53,000 communicate in Arabic."9
Not every visitor to a website will want translated content. But if people can choose which language they want, and if the website can predict and offer a more suitable, comfortable language based on census-tract data and where they live, we bring emotional intelligence to the automated world, connecting the emotional and the rational. This is one example of how emotional intelligence can combine with AI to enhance and humanize the automated world.
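The logic described above can be sketched in a few lines. This is a hypothetical illustration, not a production system: the tract IDs and language shares are invented, and a real implementation would load them from American Community Survey tables and use proper geolocation consent.

```python
# Hypothetical sketch: predict a visitor's likely preferred language
# from census-tract data, but always let an explicit user choice win.
# Tract IDs and language shares below are invented for illustration.

TRACT_LANGUAGE_SHARES = {
    # tract ID -> share of residents speaking each language at home
    "36061021800": {"en": 0.30, "es": 0.40, "zh": 0.20, "ru": 0.10},
    "36047052300": {"en": 0.30, "yi": 0.40, "es": 0.20, "he": 0.10},
}

DEFAULT_LANGUAGE = "en"

def choose_language(tract_id, user_choice=None):
    """Return the language code to serve content in.

    An explicit user choice always overrides the prediction; otherwise
    fall back to the most common home language in the visitor's census
    tract, and to English when the tract is unknown.
    """
    if user_choice:
        return user_choice
    shares = TRACT_LANGUAGE_SHARES.get(tract_id)
    if not shares:
        return DEFAULT_LANGUAGE
    return max(shares, key=shares.get)

print(choose_language("36061021800"))        # prints "es"
print(choose_language("36061021800", "fr"))  # prints "fr": user wins
print(choose_language("no-such-tract"))      # prints "en": fallback
```

The key design choice mirrors the point of the paragraph: the prediction only sets a default, and the human's own choice always takes precedence.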
1 Bradberry and Greaves, Emotional Intelligence 2.0, p. 7.
4 2016 US Census projections.
5 2014 US Census.