Persuasion: The Art Of Getting What You Want

By Jeffrey Eisenberg
September 23rd, 2005

A friend of ours, Dave Lakhani, just sent me a copy of his new book Persuasion: The Art Of Getting What You Want. He has set up an interesting website, www.askthepersuader.com, to promote the book. It's worth exploring because it is different. Let us know what you think.


Comments (3)

  1. I’ve read this book – it’s very, very good. Not quite as ground-breaking as “Influence” by Cialdini… but a required tool in the kit of anyone who persuades for a living.

  2. To myth or be mythed

    Often I receive a list, either comic or serious, of myths that people seem to hold dear. Marketing, and its subset sales, has many myths to offer, and it might be fun to gather such a list and uncover what each myth supports or condemns. However, the thought of this exercise also made me wonder why it is that supposedly smart people end up believing such strange things.

    This pondering sent me to the annals of psychology. To some extent these errant beliefs seem to stem from bounded awareness/inattentional blindness and framing, but there may well be another factor at work: we seem to be hard-wired to ‘believe’.

    Daniel Gilbert, a professor of psychology at Harvard, has explored how we go about believing and understanding information. In a series of papers,[1] Gilbert and co-authors have examined the belief process through two alternative philosophical viewpoints. What is your take on Gilbert's position?

    Cartesian systems

    The first view is associated with the work of Rene Descartes. When it came to belief, Descartes suggested the mind performs two separate mental acts. First, it understands the idea (left brain). Second, it assesses the validity of the idea that has been presented. This two-stage process seems intuitively correct. After all, we can all imagine being presented with some novel idea, holding it in our minds and then pondering its truth or falsity. The Cartesian approach fits well with folk psychology.

    Descartes was educated by Jesuits and, like many 17th-century philosophers, generally deployed psychology and philosophy in the service of theology. Like anyone of any sense, Descartes was well aware that people were capable of believing things that were not true. In order to protect the Church, Descartes argued that God had given man the power to assess ideas. Therefore, it clearly was not God's fault when people believed things that were not true.

    As Gilbert (1993, op cit) notes, Descartes' approach consisted of two axioms: first, the mental separation and sequencing of understanding and believing; and second, that people have no control over how or what they understand, but are totally free to believe or disbelieve ideas as they please.

    Spinozan systems

    Spinoza's background and thinking could not have been much more different from Descartes'. Born a Jew, Baruch de Espinoza (later to become Benedict Spinoza) outraged his community and synagogue. The tensions finally resulted in Spinoza being excommunicated, accused of abominable heresies and monstrous deeds. The order of excommunication prohibited other members of the synagogue from having any contact with him.

    Freed of the need to conform to his past, Spinoza was able to explore anything he chose. One of the areas to which he turned his considerable mental prowess was the faults in the Cartesian approach. Spinoza argued that all ideas were first represented as true and only later (with effort) evaluated for veracity. Effectively, Spinoza denied the parsing that Descartes put at the heart of his two-step approach, arguing instead that comprehension and belief are a single step. That is to say, in order for somebody to understand something, belief is a necessary precondition. All information and ideas are first accepted as true, and only sometimes evaluated as to their truth; once that evaluation is completed, a corrected belief is constructed if necessary.

    Libraries

    Gilbert et al (1990, op cit) use the example of a library to draw out the differences between these two approaches. Imagine a library with several million volumes, of which only a few are works of fiction. The Cartesian approach to filing books would be to put a red tag on each volume of fiction and a blue tag on each volume of non-fiction. Any new book that appeared in the library would be read and then tagged as either fiction or non-fiction. Any unread book simply sits in the library, untagged, until it is read.

    In contrast, a Spinozan library would work in a very different fashion. Under this approach, a tag would be added to each volume of fiction and the non-fiction would be left unmarked. The appeal of this system should be clear: it requires a lot less effort to run than the Cartesian approach. However, the risk is that any new, as-yet-unread book will be treated as non-fiction.

    Gilbert et al note that, under ideal conditions, both systems produce the same outcome if allowed to run to conclusion. If you picked up a copy of Darwin's ‘The Expression of the Emotions in Man and Animals’ and asked the Cartesian librarian what he knew about the book, he would glance at the tag and say non-fiction. The Spinozan librarian would do pretty much the same thing, concluding the book was non-fiction because of the absence of a tag.

    However, imagine sneaking a new book into the library, say the latest Patricia Cornwell thriller. If you took the book to the librarian and asked what they knew about it, their response would reveal a lot about the underlying process governing the library's approach to filing. The Cartesian librarian would say, “I don't know what sort of book that is. Come back later when it has been read and tagged appropriately”. The Spinozan librarian would glance up, see the absence of a tag and say, “it doesn't have a tag so it must be non-fiction” – an obviously incorrect assessment.
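
    For readers who think in code, here is a minimal sketch of the two filing systems (a toy illustration in Python; the class names and the "unknown" response are my own assumptions, not anything from Gilbert's papers):

        # Toy model: a Cartesian library tags every assessed book explicitly,
        # while a Spinozan library tags only fiction and reads the absence of
        # a tag as non-fiction.

        class CartesianLibrary:
            def __init__(self):
                self.tags = {}  # title -> "fiction" or "non-fiction", set only after reading

            def read_and_tag(self, title, is_fiction):
                self.tags[title] = "fiction" if is_fiction else "non-fiction"

            def classify(self, title):
                # An unread book has no classification at all.
                return self.tags.get(title, "unknown - come back when it has been read")

        class SpinozanLibrary:
            def __init__(self):
                self.fiction_tags = set()  # only fiction ever gets a tag

            def read_and_tag(self, title, is_fiction):
                if is_fiction:
                    self.fiction_tags.add(title)

            def classify(self, title):
                # No tag is read as non-fiction, even for a book nobody has assessed yet.
                return "fiction" if title in self.fiction_tags else "non-fiction"

        cartesian, spinozan = CartesianLibrary(), SpinozanLibrary()
        # Sneak in a new thriller without letting either librarian read it:
        print(cartesian.classify("latest Patricia Cornwell thriller"))  # unknown
        print(spinozan.classify("latest Patricia Cornwell thriller"))   # "non-fiction" (wrong)

    Both libraries give the same answers for books that have actually been read; they diverge only for the unread book.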

    A testing structure

    The picture below, taken from Gilbert (1993), shows the essential differences between the two approaches and suggests a clever way of testing which of them has more empirical support.

    Say an idea is presented to the brain,[2] and then the person considering the idea is interrupted in some fashion. Under a Cartesian system, the person is left merely with an understanding of a false idea, but no belief in it. However, if people are better described by the Spinozan approach, then interrupting the process should lead to a belief in the false idea. Therefore, giving people propositions and then interrupting them with another task should help to reveal whether people operate as Cartesian or Spinozan systems when it comes to belief.
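
    The same contrast can be expressed as a toy simulation of the interruption logic (again only an illustrative sketch; the function names and return values are my own assumptions):

        # Under the Cartesian model, comprehension stores an idea with no truth value;
        # assessment is a separate second step that an interruption can prevent.
        def cartesian_process(proposition_is_true, interrupted):
            belief = None  # understood, but neither believed nor disbelieved yet
            if not interrupted:
                belief = proposition_is_true  # the assessment step completes
            return belief

        # Under the Spinozan model, comprehension itself encodes the idea as true;
        # "unbelieving" is a later, effortful correction that an interruption can prevent.
        def spinozan_process(proposition_is_true, interrupted):
            belief = True  # accepted as true simply by being understood
            if not interrupted:
                belief = proposition_is_true  # the correction step completes
            return belief

        # A false proposition whose processing is interrupted:
        print(cartesian_process(False, interrupted=True))  # None -> no belief either way
        print(spinozan_process(False, interrupted=True))   # True -> the false idea is believed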

    The empirical evidence

    It has long been suggested that distracting people can affect the belief they attach to arguments. For instance, in their 1994 review Petty et al[3] report an experiment from 1976 that clearly demonstrated the impact of distraction techniques.[4]

    To test the impact of distraction, students were exposed to a message arguing that tuition at their university should be cut in half. The students listened to the arguments over headphones; some heard strong arguments, others heard relatively weak ones. At the same time, they were given a distraction task that consisted of tracking the positions of Xs flashed on a screen in front of them. In the high-distraction version of the task the Xs flashed up at a fast pace; in the low-distraction version the rate was reduced heavily.

    The results Petty et al found are shown in the chart below. When the message was weak, people who were highly distracted showed much more agreement with it than people who suffered only mild distraction. When the message was strong and distraction was high, the students showed less agreement than when the message was strong and distraction was low. Distraction did exactly what it was meant to do: it prevented people from concentrating on the quality of the arguments.

    Petty et al conclude “Distraction, then, is an especially useful technique when a person’s arguments are poor because even though people might be aware that some arguments were presented, they might be unaware that the arguments were not very compelling.” Something to bear in mind at your next meeting with brokers perhaps? The next time an analyst comes around and starts showing you pictures of the next generation of mobile phones just stop and think about the quality of their investment arguments.

    Is there more direct evidence of our minds housing a Spinozan system when it comes to belief? Gilbert et al (1990, op cit) decided to investigate. They asked people to help them with an experiment ostensibly concerning language acquisition in a natural environment. Participants were shown supposedly Hopi words together with a definition (such as ‘a monishna is a bat’). They then had to wait until the experimenter told them whether the statement they had been given was actually true in Hopi or whether it was false.

    Subjects also had to listen for a specific tone and press a button whenever they heard it. The tone sounded very shortly after the participant had been told whether the statement was true or false, and was aimed at interrupting the natural processing of the information. Once they had responded to the tone, the next Hopi word appeared, preventing them from going back and reconsidering the previous item.

    If people work in a Spinozan way, then when subjects were later asked about their beliefs they should recall false propositions as true more often after an interruption than the rest of the time. As the chart below shows, this is exactly what Gilbert et al uncovered.

    Interruption had no effect on the correct identification of true propositions (55% when uninterrupted vs. 58% when interrupted). However, interruption did significantly reduce the correct identification of false propositions (55% when uninterrupted vs. 35% when interrupted). Similarly, one could look at the number of true-false reversals (the right side of the chart above). When false propositions were uninterrupted, they were misidentified as true 21% of the time, roughly the same rate at which true propositions were misidentified as false. However, when processing was interrupted the situation changed: false propositions were identified as true some 33% of the time, significantly more often than true propositions were identified as false (17%).

    In another test, Gilbert et al (1993, op cit) showed that this habit of needing to believe in order to understand could have some disturbing consequences. They set up a study in which participants read crime reports with the goal of sentencing the perpetrators to prison. The subjects were told that some of the statements they would read would be false and would appear on screen as red text, while the true statements would appear in black text.

    By design, the false statements exacerbated the crime in one case and attenuated it in the other. The statements were shown crawling across the screen – much like the tickers and prices on bubble vision. Below the text was a second row of crawling numbers. Some of the subjects were asked to scan this second row for the number 5 and to press a button whenever they saw it.

    At the end of the experiment, subjects were asked to state what they thought represented a fair sentence for the crimes they had read about. The chart below shows that, just as in the previous example, interruption significantly reduced the correct identification of false statements (69% vs. 34%) and increased the misidentification of false statements as true (23% vs. 44%).

    The chart below shows the average recommended sentence depending on the degree of interruption. When the false statements were attenuating and processing was interrupted, there wasn't a huge difference in the recommended jail term: the interrupted sentences were around 4% lower than the uninterrupted ones. However, when the false statements were exacerbating and interruption occurred, the recommended jail term was on average nearly 60% higher than in the uninterrupted case!

    Strategies to counteract naïve belief

    The thought that we seem to believe everything in order to understand it is more than a little disconcerting. It would seem to render us powerless to control our beliefs. However, the absence of direct control over our beliefs doesn’t necessarily imply we are at their mercy.

    Two potential strategies for countering our innate tendency to believe can be imagined. The first is what Gilbert (1993, op cit) calls ‘unbelieving’. That is, we can try to carry out the analytic work required to truly assess the veracity of an idea. This certainly appeals to the empiricist in me. My own view is that we should accept very little at face value and use evidence (left brain) to assess how likely a proposition actually is.

    For example, we are often told that stock market earnings can grow faster than nominal GDP over extended periods. Of course, in year-to-year terms there isn't a close linkage between the two. However, in the long run earnings (and dividends) have grown substantially below the rate of nominal GDP growth (on average, earnings have grown 1-2% below nominal GDP; see Global Equity Strategy, 16 August 2002, Return of the Robber Barrons for more on this).
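
    To see why a 1-2% annual gap matters, here is a quick back-of-the-envelope calculation (the 6% nominal GDP growth and 4.5% earnings growth figures are purely illustrative assumptions, not numbers from the report cited above):

        # Compounding a small annual shortfall over a long horizon
        nominal_gdp_growth = 0.06   # assumed 6% a year
        earnings_growth = 0.045     # assumed to lag nominal GDP by 1.5 percentage points
        years = 30
        gdp_multiple = (1 + nominal_gdp_growth) ** years        # about 5.7x
        earnings_multiple = (1 + earnings_growth) ** years      # about 3.7x
        print(f"GDP grows {gdp_multiple:.1f}x; earnings grow {earnings_multiple:.1f}x over {years} years")
        # A seemingly small annual gap compounds into a very large cumulative shortfall.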

    Growth investing is another example. I don't doubt that some growth (vs. value) investors are very successful. However, the empirical evidence shows that picking winners in terms of growth is exceptionally difficult and fraught with danger, in that buying expensive stocks with high expectations embedded in the price obviously opens up considerable downside risk if reality falls short of those expectations. In contrast, buying cheap stocks offers a margin of safety against disappointment. (See Global Equity Strategy, 16 March 2005, Bargain Hunter for more details.)

    So regularly confronting beliefs with empirical reality is one way of trying to beat the Spinozan system. However, ‘unbelieving’ is a risky strategy since it relies on you having the cognitive wherewithal to be on your guard. Gilbert et al have shown that cognitive load, pressure and time constraints all undermine our ability to reject false beliefs.

    The second potential belief-control mechanism is called ‘exposure control’. This is a far more draconian approach than ‘unbelieving’. False beliefs can be avoided by controlling what we are exposed to in the first place: just as a dieter who loves doughnuts may choose to avoid shops that sell doughnuts, we can try to avoid sources of information that lead us to hold false beliefs. This is a conservative strategy that errs on the side of exclusion: it screens out false beliefs, but it may also screen out some true ones. Unlike the ‘unbelieving’ strategy, however, it doesn't suffer from the problems of overload, pressure or time constraints.

    All of this suggests that a combination of these strategies is likely to be optimal. When you are really trying to assess the validity of an argument, do your best to avoid distraction. Turn off your screens, put your phone on call forward and try to cut yourself off from all sources of noise. Of course, management and colleagues may well think you have taken leave of your senses as you sit there with your screens off, but try to ignore them too. If you are likely to be distracted, then either wait until you can give the assessment the time and effort it requires or simply follow an exclusion strategy. Energy flows where attention goes.

    Where are the myths herein? Is persuasion effected by distraction?

    Jim Ronay

    ________________________________________

    Footnotes:

    [1] Gilbert, Krull and Malone (1990), Unbelieving the unbelievable: Some problems in the rejection of false information, Journal of Personality and Social Psychology, 59. Gilbert (1991), How mental systems believe, American Psychologist, 46. Gilbert, Tafarodi and Malone (1993), You can't believe everything you read, Journal of Personality and Social Psychology, 65. Gilbert (1993), The Assent of Man: Mental representation and the control of belief, in Wegner and Pennebaker (eds), The Handbook of Mental Control.

    [2] This hints that we support a Spinozan view of the human mind. Descartes was famous for arguing that the brain and the mind are distinct; Spinoza, in contrast, saw the two as impossible to separate: from a Spinozan viewpoint they are two sides of the same coin. For more on this see Antonio Damasio's first and third books, Descartes' Error and Looking for Spinoza respectively.

    [3] Petty, Cacioppo, Strathman and Priester (1994), To think or not to think, in Brock and Shavitt (eds), The Psychology of Persuasion.

    [4] Petty, Wells and Brock (1976), Distraction can enhance or reduce yielding to propaganda, Journal of Personality and Social Psychology, 34.

  3. Have not read the book, though I took a look at the site. He wants my email address; he did not get it.
    Though I am sure the book is very good.


Jeffrey Eisenberg, founder of FutureNow, is a professional marketing speaker and the co-author of New York Times and Wall Street Journal bestselling books Call to Action and Waiting For Your Cat to Bark. You can friend him on Facebook.
