Author: Francesca Gino

  • One Weird Trick to Save $345 Billion

    The April 15 deadline for filing personal income tax returns in the United States is quickly approaching. According to the 2012 Taxpayer Attitude Survey, 87 percent of Americans believe that cheating on taxes is wrong. Moreover, 95 percent of those surveyed reported that their personal integrity affects their honest reporting on tax forms.

    And yet: According to a recent estimate, the gap between actual and claimed taxes due in the United States amounts to roughly $345 billion, more than half of which, the IRS estimates, is lost because people misrepresent their income and deductions. These people give in to the temptation to cheat even though they must sign a statement at the bottom of the tax form declaring that the information provided is “true, correct, and complete.”

    The survey data captures what people think — but not how they act. Research that I’ve done suggests the same gap between intentions and actions exists in other facets of our lives. When confronted with the opportunity to cheat, most people engage in behavior that violates their own ethical goals.

    Fortunately, simple interventions can help. For instance, consider a study that my colleagues and I conducted a few years ago [PDF] in collaboration with a major U.S. car insurance company. As part of the study, we sent 13,488 of the company’s customers a form that asked them to report the number of miles they had driven the prior year, as indicated on their cars’ odometers. Cheating by under-reporting mileage would come with the financial benefit of lower insurance premiums.

    On about half of the forms sent out, customers were supposed to sign to indicate their truthfulness at the bottom of the form. The other half of the forms asked the customers to sign at the top of the form. The average mileage reported by customers who signed the form at the top was more than 2,400 miles higher than that reported by customers who signed at the bottom of the form.

    Our follow-up research [PDF] demonstrated that signing at the top of the form (before reporting information that could be inflated) increased the salience of ethical standards by highlighting people’s self-identity and improving their ethicality.

    This research hints at how simply nudging people toward more ethical behavior can have important implications for organizations, which commonly bear substantial costs from dishonesty. For instance, according to a recent estimate, U.S. companies lose approximately $600 billion per year to employee theft and fraud. Most of us understand that we slip up occasionally, despite our best intentions, and that others do as well. And so it’s useful for organizations to consider some simple interventions that can help their customers and employees stick to their ethical principles.

    Organizations often use codes of ethics that employees must read and sign to indicate their intended compliance. But codes are insufficient on their own. To be effective, they need to be integrated into the organization’s culture, and their importance must be stressed and discussed. For instance, the CEO and senior management should make their commitment to the code of ethics visible and clear to employees, and communicate the value they place on ethics in orientation programs, annual reports, newsletters, meetings, and training sessions.

    Organizations can also benefit from the type of ethical nudge that would likely improve our honesty on our tax forms. Think, for instance, about the contracts we sign that explicitly stipulate the terms that different parties are expected to adhere to during a transaction or negotiation. Though we might hope that people read these documents carefully before signing, having them sign at the bottom of the form might cause them to miss important information and agree to terms they cannot uphold. When organizational representatives provide inaccurate numbers and sign contracts without carefully considering all the details of a business agreement, the business relationship and the company’s reputation are put in jeopardy.

    Moving the signature line to the top of a contract, along with a statement declaring the numbers reported are accurate, might cause signees to be more truthful about the information they are declaring. And it may also lead them to pay more attention to the details specified in the document they are signing. Such simple ethical nudges could extend to other contexts, such as reminding financial advisers of their fiduciary duty to their clients and doctors of their Hippocratic oaths.

    The estimated U.S. personal income tax gap of $345 billion is clearly formidable. But we may be able to narrow it over the years, one signature at a time.

  • Can Light Make You More Honest at Work?

    If waking up this past week was harder than usual for you, you are not alone. Although daylight saving time throws our circadian rhythms out of sync, the ongoing rationale for changing the world’s clocks twice each year has been energy savings. And a recent national study suggests that there may be another benefit: crime reduction.

    Researchers from the University of Virginia and the College of William & Mary examined how daylight saving time influences crime rates in the U.S. using data from the three weeks before and after the springtime switch over a four-year period. Their analyses revealed some compelling results: daylight saving time reduced robbery by 51 percent, rape rates by 56 percent, and murder by 43 percent. The researchers estimated that since 2007, daylight saving time has resulted in over $550 million per year in avoided social costs of crime. More lighting, the researchers argue, increases the likelihood of being seen by witnesses (or the police), which in turn discourages crime.

    Empirical evidence seems to support their argument: research from the ’60s and ’70s shows that criminal assaults are most frequent during hours of darkness and that dark rooms promote aggressive behavior. Darkness promotes criminal activities by producing anonymity, as dishonesty is more likely when offenders cannot be identified.

    My colleagues Chen-Bo Zhong, Vanessa K. Bohns, and I wanted to investigate the extent to which lighting conditions would affect people’s honesty within organizations. If darkness spurred society’s criminal element, would it have the same effect on well-intentioned and apparently trustworthy employees?

    In one laboratory experiment, we placed participants in a dimly or well-lit room and asked them to complete 20 math problems under time pressure. The participants received a cash bonus for every correct answer. Since we were interested in whether darkness affects cheating rates, we left it up to the participants to score their own work and to pay themselves from a supply of money they had received at the beginning of the study. While there was no difference in actual performance on the math problems, almost 61 percent of the participants in the slightly dim room cheated while “only” 24 percent of those in a well-lit room did. Adding eight fluorescent lights to the room where the study took place reduced dishonesty by about 37 percentage points.

    We wanted to take this one step further. Was it the lighting levels that changed behavior or perceptions of the lighting levels? To figure this out, we introduced sunglasses into the equation. In another experiment, some participants wore a pair of sunglasses and others wore clear glasses while interacting with an ostensible stranger in a different room (in actuality they interacted with the experimenter). Each participant had $6 to allocate between him- or herself and the recipient and could keep what he or she didn’t offer. Participants wearing sunglasses were more selfish: the amount of money they gave was 14 percent less than the amount shared by those wearing clear glasses. In addition, they reported feeling more anonymous during the study.

    A dark alley actually provides some anonymity to criminals. But in this research, darkness had no bearing on actual anonymity — yet it still increased dishonesty and other morally questionable behaviors. The experience of darkness may thus induce a sense of anonymity that is disproportionate to the actual anonymity in a given situation. In fact, follow-up research has found that brightness increases self-awareness, reflective behavior, and self-control.

    I am not suggesting we flood executives’ offices with light to promote ethical behavior. But we should probably pay more attention to the many ways in which we are in the dark. Our work life is full of such situations: we may feel anonymous when we communicate via e-mail, when we post information online without revealing our identity (hello, internet trolls!), or when we work remotely rather than in the office. So, the next time you are on your computer to chat or text, consider raising the blinds and asking the person on the other end to do the same. More generally, being aware of the factors that make you feel you are in the dark will help you follow your moral compass.

  • The Strange Behavioral Logic of the Sequester Stalemate

    Imagine you’ve just returned from your annual physical, and it didn’t go well. Your doctor informed you that you’re overweight and are likely to have health problems if you don’t drastically change your diet and exercise routines. To make things worse, she set a hard deadline by scheduling a follow-up appointment with a specialist. You can avoid this follow-up, but only if you make some hard decisions and change your habits in meaningful ways. So why, when you have just days to go before the appointment, will you find yourself sitting on the couch, chip bag in hand?

    Now, swap “you” with “the U.S. government” and you can begin to understand the psychology of the sequester.

    A little background: Last year, amid quite a bit of negotiating, U.S. politicians could not come to an agreement regarding taxes and spending. Thus, they agreed to agree on a sensible budget by a certain deadline. As a result, Congress and the president must find some way to agree on meaningful fiscal changes before March 1, or the U.S. budget faces arbitrary and automatic cuts across the board.

    The hope was that by setting up these arbitrary cuts — something that would seem unacceptable and would negatively impact a whole host of U.S. citizens — Congress would be motivated to come to some reasonable compromise in the allotted time. And yet here we are, with no solutions.

    Now we’re all left asking: how did this happen?

    Aside from the usual banter about a dysfunctional government, behavioral science research can help explain the reasons behind the current stalemate. We know from hundreds of research studies that goals do motivate people: specific, difficult goals make people strive harder to accomplish what they set out to do. One example, ironically, comes from government: In 1961, President John F. Kennedy gave a speech that set the goal of getting people to the moon and safely back within a decade. At the time, the U.S. had only launched an astronaut 115 miles above the earth. Going to the moon was a much more difficult goal: astronauts would travel 270,000 miles from home. Of course, you know the rest of the story: less than 10 years later, the U.S. landed on the moon.

    We also know that people who set goals for themselves consistently perform better and more effectively on tasks (and particularly on onerous ones) than people who set no goals. These specific, difficult goals are intended to motivate us to do things that we do not like to do, like negotiating over budget issues with a counterpart who does not share our views. Or, in the case of your imaginary appointment, eating vegetables.

    But research also suggests that goals are not always beneficial. When people violate their goals (eating that bag of chips), they experience further delays in task completion and tend to perform poorly. So if you only have five pounds to go, you’re more likely to try hard to lose the weight. But if you fail to accomplish your goal, or think that reaching it is nearly impossible, you’re more likely to experience negative emotions and resignation.

    Making goals realistic also matters. In a 2004 study, participants were asked to complete a proofreading task within 30 days, and received payment upon task completion. In one condition, participants chose a difficult deadline for themselves. In another condition, they chose a realistic deadline for task completion. And in a third condition, they had no specific deadline. Over 83% of participants who identified a realistic deadline completed the task within the allotted 30-day timeframe, while only about 62% of the participants in the other two conditions did.

    In the case of the U.S. fiscal cliff negotiations, let’s first make two important assumptions: that the parties at the table actually want to come to some agreement, and that reaching this agreement is incredibly difficult. With this in mind, the March 1 deadline may have been demotivating rather than motivating. Knowing that the sequester is nearly inevitable, or that avoiding it will require a great deal of work, may have led to resignation rather than high motivation. And such resignation led us to the current stalemate in negotiations.

    It’s also important to consider the way time plays into solving problems. It was easy for the government to make plans to fix the budget crisis, but it’s unclear whether its goals were actually within reach. And because the commitment involves a large group of people, members of Congress may have felt that additional time afforded them the opportunity to convince the other side to see the issues their way. This, of course, is a problematic assumption. In fact, research by my Harvard colleagues Mike Norton and Todd Rogers indicates that people have an overly optimistic view of the future when it comes to wants and preferences.

    In one study, they asked respondents to indicate their political orientation and then answer the following question: “In twenty years (by 2031) how do you think the electorate in the United States will be different? Will it be more conservative than today, more moderate than today, or more liberal than today?” The results? Conservatives were more likely than the other two groups to believe that the future will be more conservative, moderates were more likely to believe it will be more moderate, and liberals were more likely to believe it will be more liberal.

    Similarly, in my own research, I find that when we experience disagreements or conflict in negotiation and decide to meet at a later time to discuss further, we believe that the additional time will help the other party realize that our perspective is the right one. But the problem is that both parties share this belief, and thus the additional time turns out to be unhelpful to reaching a compromise or any other form of agreement.

    When they agreed to the sequester deadline, both sides may have believed that the future would magically solve any disagreements. But such wishful thinking, as the budget debate is now showing, is often a good predictor of failure.