Jeffrey Saltzman's Blog

Enhancing Organizational Performance

Posts Tagged ‘System 1’

Privacy, Persuasion and Fundamental Rights


Perhaps not surprisingly, it started with a lie. In 1957, on a hot summer day in a Fort Lee, NJ movie theater, James Vicary claimed to have run an experiment in which he inserted frames into a movie, flashing the words “Eat Popcorn” and “Drink Coca Cola” on the screen. He claimed this subliminal (meaning literally below threshold) advertising resulted in huge increases in the sales of popcorn (up 58%) and Coca Cola (up 18%). Vicary stated that subliminal communication was so powerful, and had such potentially dangerous uses, that he suggested warning the public when subliminal techniques were in use, and he even seemed to think that some sort of governmental regulation might be needed.

Congress held hearings and legislation was proposed, but it was not passed. The public felt they were being manipulated. Norman Cousins, editor of The Saturday Review, warned his readers about subliminal communications. Among the uses his article warned about was the potential to manipulate voting patterns for political candidates and influence the outcome of an election.

On the fifth anniversary of his “experiment,” Vicary admitted that it was a hoax, a ruse, and that his goal had been to revive his failing consulting practice (Advertising Age, Sept. 17, 1962). Apparently, his thinking was that it did not matter whether his findings were real, only that his potential clients believed they were. By this time he was the director of survey research for Dun & Bradstreet as he attempted to resurrect his career as a psychologist. Some question whether the insertion of the words ever took place.

So, is subliminal perception, and its supposed ability to influence people, pure bunk? Thijs Verwijmeren et al. (Journal of Consumer Psychology, April 2011) concluded that subliminal advertising can have some limited effect, but that it is not all that powerful. Subliminal ads, for instance, can’t make you do something you don’t want to do. Others question the very concept: if something can’t be perceived, because it is subliminal, how can it possibly affect behavior? The notion is that you can affect the subconscious mind without the conscious mind being aware of the effect. Regardless of the efficacy of this particular technique, the temptation to influence people and change their behavior continues through various other avenues and methods.

Classical economic theory states that humans are rational thinkers who make decisions in their best economic interest after considering all the facts. Much economic policy over the years has been based on this concept, and of course it is wrong, for humans are anything but cool, rational thinkers as they work through their decisions. We all take short-cuts in our decision-making, using biases, heuristics, or rules of thumb to get through the day (these are decisions or judgements we make without necessarily being conscious that we are making them). Without them, the number of decisions you would be required to make would simply paralyze you. Daniel Kahneman and Amos Tversky, two psychologists, drove those points home with a substantial body of research that gave rise to the field of Behavioral Economics. One cornerstone of their work was their definition and description of the System 1 and System 2 types of thinking that humans use to make decisions.

System 1 thinking is automatic decision-making. It is quick and easy, requiring little to no effort when making a decision or passing judgement. System 1 thinking speeds decision-making and allows you to make thousands of unthinking decisions each and every day. The path you will drive to work, what you are likely to order in your morning coffee, whether you put butter or cream cheese on your English muffin are all quick, ponderless System 1 decisions you make. System 2 thinking is when you use deliberative thought to make a decision. For instance, if you are ordering a new PC or Mac and are like most people, you methodically work through your options and the associated costs before deciding what equipment to buy and how to configure it. You may check with friends and read reviews; part of your decision may be based on brand loyalty or a “coolness” factor that you perhaps can’t quite articulate; or one of our many human biases, such as WYSIATI, which stands for “What you see is all there is,” may come into play, meaning you choose from the options before you and tend not to look for less obvious or unseen options. And you can be assured that the manufacturers of these devices are doing their best to influence your decision.

The speed limit sign says “30 MPH,” the sign on the escalator says “Stand Right, Walk Left,” in the parking garage there are signs that say “small cars only,” and the express checkout line at the grocery store says “8 items or less.” We are informed that, in order to enroll our kindergarten-aged child in school, we have to show proof of vaccination. All around us, every day, there are attempts to influence our behavior, to modify what we are doing, or to inform us what is allowable and what is not. While not perfectly so, these rules tend to be imposed when your behavior has the potential to negatively impact others around you, either directly or indirectly. The reason you are not simply told to “use your best judgement on what your speed should be” is that: 1. you may not be familiar with the road you are on; 2. if left to their own devices, some people’s judgement (especially that of younger or inexperienced drivers) might not be that good; and 3. there is tremendous potential for harm to others if you make a poor decision. So a sign is posted that informs you of an appropriate speed for that road (and there are consequences for violating that sign’s speed limit). Your child’s ability to spread disease and contribute to epidemics (not in a good way) is the reason for the vaccination requirement (and it does not matter if what you personally believe about vaccinations and autism goes against all the known science – no vaccination, no school – another penalty).

Writing in Scientific American (March 30, 2018), Marcello Ienca and Effy Vayena describe cognitive liberty as “the freedom to control one’s own cognitive dimension (including preferences, choices and beliefs) and to be protected from manipulative strategies that are designed to bypass one’s cognitive defenses.” This was written in response to the Cambridge Analytica scandal, in which, in an attempt to influence the last presidential election, at least 87 million Facebook users, unbeknownst to them, were targeted with customized digital ads and other manipulative information in a manner that “circumvents user’s awareness of such influence.” And that is the key difference between a speed limit sign or a vaccination requirement and subliminally trying to get you to drink more Coca Cola. One approach is direct, in-your-face, and transparent, while the other tries to influence you without you realizing you are being influenced. They continue, “most of the current online ecosystem is an arms race to the unconscious mind: notifications, microtargeted ads, autoplay plugins, are all strategies to induce addictive behavior, hence to manipulate.”

The Cambridge Analytica CEO, in an undercover interview with Channel 4 News in the UK, stated that it did not matter whether something was real and factual, only that people believed it was. People, of course, are more inclined to believe information that supports their existing viewpoint, whether it is real or not. Remember Pizzagate – the falsehood, spread during the election by the alt-right and by certain websites that specialize in spreading falsehoods, that Hillary Clinton was running a child sex ring in the basement of a pizza parlor. Except she wasn’t, and the accused pizza parlor did not even have a basement. Nevertheless, a true believer went to that pizza parlor with his gun and started firing, in a System 1 thinking pattern. He never paused to consider, in a rational manner, the information he was receiving. The shooter viewed himself as a “good guy with a gun” going to stop bad people, except it was all a delusion meant to influence behavior. A delusion crafted by the inappropriate use of big data to find people’s fears, to manipulate them, and to capitalize on them. So here you have potentially deadly consequences of falsehoods spread on social media, for which there is no penalty.

It wasn’t the first time, and it certainly won’t be the last, that this occurs. And to underscore that this wasn’t some accident, during the last presidential election you also had Republican operatives making statements in TV interviews such as “we are not going to let facts determine the outcome of this election,” or claiming to present “alternative facts.” In other words, just like James Vicary’s “Drink Coca Cola” ruse, it did not matter whether it was real, only that a population critical to your success (potential customers, or in the case of an election, potential voters) believed that it was real. And because of that orientation, and the lack of regulation or penalties around it, the spread of disinformation, enabled by social media and by Russia attacking our democracy, reached unprecedented levels.

Our technology, once again, is much more advanced than the social structures we surround it with, at least at first. Partly that is due to the pace of innovation being quite a bit faster than the pace of social structure change. But this is nothing new. For instance, when we, as a species, started writing down our stories, our gurus at that time were those who could read. They assumed a special, elevated status within society, and because of their skill set they were the bearers of the “word,” able to manipulate and persuade the masses, for who could argue with someone about a text that you could not read? The priests of today are those who can create and harness the technology that can influence the masses, and those who can build smart systems to enable that to happen more effectively. In a special section on AI appearing in The Economist (March 31st-April 6th, 2018), it was estimated that each capable AI tech person in a company today increases the value of that company by 5 to 10 million dollars, so it is no wonder that sophisticated AI talent today draws 6- and 7-figure salaries.

Sander van der Linden, in Psychological Weapons of Mass Persuasion (Scientific American, April 10, 2018), quotes a study covering 3.5 million people which “found that psychologically tailored advertising, i.e. matching the content of a persuasive message to an individual’s broad psychographic profile, resulted in 40% more clicks and 50% more online purchases than mismatched or unpersonalized messages.” We have come a long way, with the help of our technology, from flashing “Eat Popcorn” and “Drink Coca Cola” on a screen. More importantly, he states that when these messages are carried over into a political environment, they can either suppress voting for the candidate they are targeted against or swing some potential voters to switch candidates. When elections are often decided by a percentage point or two, that small effect can have a large impact. In addition to the USA presidential election, it now appears that Britain’s EU exit vote was influenced using the same techniques.

So yes, we are at risk. In an unregulated, wild-west technology world, elections can be affected; cognitive liberty and our very democracy can be undermined; and autocrats, dictators, or would-be tin-pot dictators with effective social media disinformation and targeted voting campaigns can be voted in. What can we do?

Tim Cook, the CEO of Apple, stated in a recent town hall meeting in Chicago that “privacy is a human right.” That aligns him with his predecessor, Steve Jobs, who in 2010 stated, “Privacy means people know what they’re signing up for, in plain English and repeatedly. I believe people are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data.”

The European Union has just enacted the GDPR, or General Data Protection Regulation, which does align with the sentiments of both Steve Jobs and Tim Cook. In a nutshell, among the GDPR requirements are that individuals give permission for their data to be collected; that they be informed about what data is being collected, how it will be used, and how and for how long it will be stored; and that at any time they can see what data you have on them and demand to be erased from your systems. There are many other requirements as well, and violations result in large fines. In an editorial, The Economist recently called for the USA to adopt the EU data protection regulations.
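To make those requirements a bit more concrete, here is a minimal, hypothetical sketch of a user-data store that honors consent, access, and erasure. It is an illustration only, not a compliance implementation, and every class, method, and field name here is my own invention:

```python
# Hypothetical sketch of the data-subject rights listed above:
# recorded consent and purpose, access on request, and erasure on demand.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class DataSubjectRecord:
    user_id: str
    consent_given_at: Optional[datetime]  # explicit permission to collect
    purpose: str                          # what the data will be used for
    retention: timedelta                  # how long it will be stored
    data: dict = field(default_factory=dict)


class UserDataStore:
    def __init__(self) -> None:
        self._records: dict = {}

    def collect(self, record: DataSubjectRecord) -> None:
        # Only store data for which consent has actually been recorded.
        if record.consent_given_at is None:
            raise ValueError("no recorded consent; data may not be collected")
        self._records[record.user_id] = record

    def access_request(self, user_id: str) -> Optional[DataSubjectRecord]:
        # The individual may see, at any time, what is held about them.
        return self._records.get(user_id)

    def erasure_request(self, user_id: str) -> None:
        # The individual may demand to be erased from the system.
        self._records.pop(user_id, None)
```

Of course, in a real system these obligations extend to every place the data flows, and to actually honoring the stated retention period, not just to the point of storage.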

But in addition to protecting the data, we can work towards making people savvier about what they see on social media and how to distinguish reality from delusion or disinformation. For instance, some possibilities include (and I am sure that if a group put their minds to it, many more possibilities would emerge):

  • Transparency can be increased. For instance, just as restaurants in NYC get cleanliness ratings, social media sites could get ratings regarding the veracity of the information they carry. Is the information verified in any fashion, or is it just put out there?
  • Sites or TV stations that label themselves as news should adhere to certain journalistic standards in order to keep that designation. Each program should be clearly labeled as meeting “news standards” or clearly labeled “opinion,” not just at the start of the show but the whole time the show is on.
  • News organizations used to adhere to a separation of church and state, meaning the news side of the business should not be influenced by the business side. To achieve a certain news rating, this standard would have to be met.
  • One method of getting people to become better consumers of information is to educate them on how humans consume information and make decisions. It is a first step towards taking them out of System 1 thinking when appropriate and having them activate System 2 thinking.
  • Penalties can be implemented for knowingly spreading false information.

Social media now has the power to cause great harm to others and to our society. As the saying goes, with great power comes great responsibility, but so far social media has not proven itself capable of operating in that responsible fashion. We are in a variation of a “Tragedy of the Commons” moment when it comes to social media. The tragedy of the commons describes a situation where individuals acting independently put their own self-interest above a common interest in a shared resource. Because each is concerned only about their own interest, they each use the resource until it is despoiled and of no use to anyone. Collective action is needed to save the resource so it can be used by all to mutual benefit. While originally the concept of the commons was based on shared, unregulated grazing ground, over the years it has morphed to mean “any shared and unregulated resource such as atmosphere, oceans, rivers, fish stocks, or even an office refrigerator.” While social media may seem like an unlimited resource, it too will be despoiled if it is used by individuals only for their own self-interest, without regard to the harm it is causing others and society.

 

Written by Jeffrey M. Saltzman

April 15, 2018 at 4:28 pm

The Language of Business


One hobby I have (which I have to admit I have not had much time to engage in recently) is browsing through New York book stores looking for books with old Jewish folk tales or stories of Jewish life from the “old country”. While these old folk tales may seem out of place or out of time in our modern world, I find that sometimes they have enduring kernels of wisdom. I occasionally take these old folk tales and translate them into modern organizational terms which I can use in my day-to-day work.

For instance, there is an old folktale that begins with two travelers, strangers, walking down a long, dusty road. As they walked, one of the strangers asked the other, “What say you, shall I carry you or shall you carry me?” The second traveler ignored the question, for he was not about to carry the other.

Later on, the first traveler asked a second question as they passed a field of barley: “Has this barley been eaten or not?” Once again the second traveler ignored the first, for it was obvious for all to see that the barley was still growing in the field.

Then they passed a funeral procession, and the one stranger said to the other, “What do you think, is the person in the coffin alive or dead?” The second traveler could no longer contain himself and asked the first why he was asking such ridiculous questions.

The first one said, “When I asked if I should carry you or if you should carry me, what I meant was shall I tell you a story or shall you tell me one to make this long journey easier for us.”

“When I asked about the barley, what I meant was has the growing barley already been sold to a buyer, for if it has already been sold, it is as though it is already eaten for the farmer and his family cannot eat it themselves.”

“And when I asked about the person in the coffin, being alive or dead, what I meant was, do you think the person had descendants, for if they had descendants who will carry on their legacy it is as though they are alive. But if they passed away with no one to remember them and carry on their work they are truly dead.”

Communication between two people, between groups of people, or between organizations that do not know each other well is often very difficult and can be subject to vast misunderstandings, even when they are speaking the same language. And with those misunderstandings sometimes come suspicion and fear. It is as though the two groups who are communicating, whether they be black and white, Jew or Arab, Israeli or Palestinian, police or those being policed, are actually having two completely different conversations, with neither group able to translate what the other is saying or understand their actions.

Even when the communication is clear, there can be issues of saliency, the importance each party gives to the words used in the communication, which causes misunderstandings and what can be interpreted as behavioral anomalies.

One story that is not quite a folk tale, but that illustrates the point regarding saliency, is about a method Jews used to obtain passports to escape the horrors of WWII. Some families who could not get legitimate documents struggled to obtain fake passports in their attempts to flee. One such story takes place in Poland in the late 1930s and describes how documents were forged by studying old passports from different countries; their style and format were then copied, creating new fake passports for those trying to escape.

One day a man, who was part of the Jewish underground, set out to collect old passports through whatever method he could. He was extremely successful and by the end of the day he had collected a large number.

On the way back from his activities he was stopped by the Polish police, was asked for his papers, and then confronted when they discovered the large number of passports he was carrying.

He was sure that they would take him to the police station, torture and then kill him as they tried to learn about the activities of the underground organization collecting the passports.

The man thought about how troubled his family would be if they never saw him again, without any explanation. But it was near the end of the day, and the police told the man to go home and then come to the station in the morning for questioning. The man was terrified. The police knew who he was, and if he stayed home and did not show up the next day, or if he tried to flee that night, they would simply go to his house and kill his family.

If he showed up the next day at the police station, he was certain they would torture him to obtain information and then kill him anyway. He did not know what to do. After much deliberation and consultation with his family and rabbi, he went to the police station the next morning and approached the policeman who had stopped him.

The policeman asked him what he wanted and appeared not to remember the previous day’s incident. The man indicated that he had been stopped and a packet of passports had been taken from him and he was here to collect it. The policeman handed the man the packet of passports and told him to be on his way.

Saliency is a psychological concept which deals with how central an event, object, fact or perception is to you or another person. The amount of saliency an event or communication has is a combination of emotional, motivational and cognitive factors.

To the man with the passports in the story, being stopped by the police was extremely central to his very existence, for it was quite literally life or death. To the policeman, the man was one of hundreds of people that he had stopped and questioned that day.

The man who was stopped and questioned described the incident as a miracle, that his life and those of his family were spared. And from his perspective it certainly was, but what was the underlying mechanism of human perception that allowed that miracle to occur? Saliency.  To the policeman the incident was not nearly as salient, not nearly as memorable as it was to the passport procurer.

I have to admit something to you. Not being all that concerned about fashion, I buy irregular jeans. I grew up wearing jeans (one pair to wear while the other pair was being washed) and even today I am most comfortable pulling on a pair. I like to wear jeans. But I don’t like what jeans cost these days. So I go to a manufacturer’s outlet mall near me and I buy irregular blue jeans.

As I pull each pair off the shelf and examine them, I am usually hard-pressed to determine why they are called irregular. I look for the obvious; for instance, do they have 3 legs? (Well, I could always use a spare, in case I get a hole in one knee.) And I usually just can’t find anything wrong with them.

When I wear them, at least at first, I wonder if what is not obvious to me, the irregularities, is very obvious to those around me. I am sure I can hear people pointing at me and laughing as I walk by. But maybe it is not the pair of jeans that brings on the laughter.

But the reality is no one is looking at my jeans, let alone looking for the defect in my irregulars. I just think they are, at least for a moment, because the issue is more salient to me, until I forget about it.

When a manager makes an off-hand comment to a worker about that worker’s performance or future, what may be perceived by the manager as a minor topic or issue, just above the threshold of consciousness, may be perceived quite differently by the employee.

To the employee that comment might be indicative of whether or not they have a future with the organization, central to their very existence, while the manager might not even remember the comment the next day.

How many cases of miscommunication in the workplace, in politics, in negotiations, or in our everyday lives are derived from a comment that has very different levels of saliency for the various people who might be listening to it? Managers may use what they perceive as throwaway lines about “future opportunities” or “earnings potential” without really thinking about just how closely the employee is listening or just how salient those messages might be to the listener.

And now another aspect of communication that can impact decision-making has been shown to be whether the language you are speaking is your native tongue (Costa et al., “Your Morals Depend on Language,” 2014).

A very common problem used to illustrate ethics as it relates to decision-making is to ask a listener to imagine themselves on a bridge overlooking a trolley car track. There is a trolley car heading down the track that will shortly run into and kill five people, unless a heavy object is placed in its path. What can you do?

There is a very heavy man standing next to you. You are asked to consider pushing the heavy man onto the tracks to save the lives of five people. Sacrifice the one to save the five? Would you do it? Not surprising to those of you whose native tongue is English is that the vast majority of you would not push the heavy man in front of the trolley. That is reactionary thinking, or in the words of Daniel Kahneman, System 1 thinking.

But if the person you are asking is listening in English, and English is not their native tongue, you get a very different answer. Many would say that they would sacrifice the one to save the five. They need to apply some extra effort, which slows down their thinking and allows System 2 decision-making to kick in, which is not reactionary and allows people to consider more slowly, “is it worth sacrificing one to save five?”

Now take the scenario of having a group of managers in a global corporation sitting around making decisions about the operations of a company. The conversation is happening in English, but about half of the participants are not native English speakers. The two groups sitting around the table, the native vs. non-native English speakers may be coming to different conclusions, due to different decision-making systems that are kicking in as they consider organizational choices.

Language, and the ability to use it properly, is a critical aspect of good business performance. Yet as you dig into the use of language in business, whether the issue is definitions, saliency, or whether you are speaking in your native tongue, the challenges of effective language use in today’s large, complex, global corporations are immense.

Written by Jeffrey M. Saltzman

December 24, 2014 at 4:48 pm

WYSIATI


Daniel Kahneman coined the acronym WYSIATI, which is an abbreviation for “What you see is all there is.” It is one of the human biases he explores when he describes how human decision-making is not entirely based on rational thought. Traditionally, economists believed in the human being as a rational thinker, that decisions and judgments would be carefully weighed before being taken, and much of traditional economic theory is based on that notion. Dr. Kahneman’s life’s work (along with that of his collaborator Dr. Amos Tversky) explodes that notion and describes many of the shortcomings of human decision-making. He found that many human decisions rely on automatic or knee-jerk reactions, rather than deliberative thought, and that these automatic reactions (he calls them System 1 thinking) are based on heuristics, or rules of thumb, that we develop or have hard-wired into our brains. System 1 thinking is very useful in that it can help the individual deal with the onslaught of information that impinges on us each and every day, but the risk arises when a decision one is faced with should be thought through rather than made by knee-jerk reaction.

System 1 decisions are easy, they are comfortable, and unfortunately they can also be wrong. Wrong in the sense that if one learned how to take a step back and allow for more deliberative thought prior to the decision, some of these wrong decisions or judgments could be avoided. A simple example from Dr. Kahneman’s book “Thinking, Fast and Slow” will illustrate the point.

“A bat and a ball together cost $1.10. The bat costs $1.00 more than the ball. How much does the ball cost?” Fifty percent of the students who were posed this simple question, students attending either Harvard or Yale, got it wrong. Eighty percent of the students who were asked this question at other universities got it wrong. This is System 1 thinking at its finest and most error prone. It is fast, easy, and comfortable; it lets you come up with a quick answer or decision, but one that is likely wrong. Knowing who reads this blog, I’ll let you figure out the answer yourself.
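If you want to check your intuition without being handed the answer, the System 2 version of the problem is a single line of algebra. Writing b for the price of the ball in dollars, the two conditions in the question become:

$$b + (b + 1.00) = 1.10$$

The quick System 1 answer of ten cents fails this check, since it would make the bat $1.10 and the total $1.20 rather than $1.10; solving the equation for b gives the correct price.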

WYSIATI is the notion that we form impressions and judgments based on the information that is available to us. For instance, we form impressions about people within a few seconds of meeting them. In fact, it has been documented that, without careful training, interviewers who are screening job applicants will come to a conclusion about the applicant within about 30 seconds of beginning the interview. And when tested, these initial notions are often wrong. Interviewers who are trained to withhold judgment about someone do a better job at applicant screening, and the longer that judgment is delayed, the better the decision.

This notion of course flies in the face of Malcolm Gladwell’s best seller “Blink,” in which he talks about the wonders of humans’ ability to come to decisions instantly, and a whole generation of managers have eagerly embraced his beliefs, including a few CEOs I know. Why? It is easy, it is intuitive, it is comfortable, and it plays to the notion that I am competent and confident in my work. The only problem is that, when put to serious scientific scrutiny, it is often wrong.

A few months ago I introduced this concept to an HR group I was talking to. I explained how untrained HR people, in a rush to judgment, will jump to conclusions about someone, perhaps too rapidly. One 30-year HR veteran insisted that this may be all well and good but of course did not apply to her. After all, with her 30 years of experience, her rush to judgment was of course going to be accurate. She “just knew” who was going to be a good employee. I let it drop, and I think I was labeled a trouble-maker by the group. That is a label I can embrace.

We tend to develop stories based on the information at hand, piecing the information we do have into a narrative, often without asking the question, “What information am I missing?” In the area of survey research I have often seen researchers confidently presenting the “drivers” of one type of behavior or another, say, for instance, the drivers of employee engagement. But since the analysis is based on a “within” survey design, the only drivers that can possibly emerge are those that were asked about in the survey in the first place. So the researcher, in designing the 30-50 item survey, is limiting the drivers to the items they decided to ask about. The researcher likely has in their head a model of what is important in driving engagement when designing the questionnaire, a model that was itself based on another questionnaire of 30-50 items or fewer. It becomes a tautology: it becomes true because I tested it and it came out as true, but the only thing I tested is what I already believed. The small sketch below illustrates the trap.
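Here is a minimal sketch of that point, using simulated data and hypothetical item names. The “driver analysis” is just a regression on the surveyed items, so a factor that was never measured can never show up as a driver, no matter how much it actually matters:

```python
# Illustrative only: hypothetical survey items and simulated respondents.
import numpy as np

rng = np.random.default_rng(0)
n = 500  # simulated respondents

# Suppose the strongest true driver of engagement (say, workload) was
# never asked about, while two weaker items did make it onto the survey.
workload = rng.normal(size=n)        # NOT on the survey
recognition = rng.normal(size=n)     # survey item
career_growth = rng.normal(size=n)   # survey item
engagement = 0.7 * workload + 0.3 * recognition + rng.normal(scale=0.5, size=n)

# The "driver analysis" only ever sees the surveyed items.
surveyed = {"recognition": recognition, "career_growth": career_growth}
X = np.column_stack([np.ones(n)] + list(surveyed.values()))  # intercept + items
coefs, *_ = np.linalg.lstsq(X, engagement, rcond=None)

# Whatever ranking comes out, workload cannot appear in it:
# only what was measured can emerge as a "driver".
for name, coef in zip(surveyed, coefs[1:]):
    print(f"{name}: {coef:.2f}")
```

The same constraint holds in a real analysis: the candidate set of drivers is fixed at the moment the questionnaire is designed.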

There are techniques that can be applied that lead towards more deliberative and better decision-making processes. If you were walking briskly down a busy road and someone asked you, “How much is 17 x 24?” you would do what every other human would do to figure that out: you would stop and think.
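And the stopping is the point. No heuristic hands you the product; it takes a few seconds of effortful, step-by-step work, which is exactly what System 2 thinking is:

$$17 \times 24 = 17 \times 20 + 17 \times 4 = 340 + 68 = 408$$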

Written by Jeffrey M. Saltzman

April 8, 2013 at 9:55 am