Author: Justin Fox

  • Jeff Bezos Brings His Low-Margin Ways to Newspapers

    Way back in the first decade of the new millennium, when Craigslist seemed like the biggest threat facing newspapers, founder Craig Newmark paid visits to lots of media companies and media conferences. Yes, his site was definitely taking classified advertising away from papers, he would say. But newspapers were still spectacularly profitable, he’d add, and might be just fine if they weren’t so intent on preserving those profit margins.

    These days, it’s an open question whether newspaper companies can maintain any kind of a profit margin at all. The New York Times seems to have turned the corner toward a modestly profitable future in which circulation revenue pays most of the bills. But The Washington Post, which billionaire Jeff Bezos agreed to buy yesterday, had been losing money since 2008.

    So Newmark was definitely right that newspapers need to learn to accept much lower profit margins. But switching from a high-margin business to a low-margin one is really hard. High margins sound like a good thing, and they can be. They’re evidence of what Warren Buffett — himself a long-time newspaper investor and major shareholder in the Washington Post Co. — dubbed a moat, which keeps competitors out and customers in. But when disruptive innovation threatens to breach a moat, high-margin companies usually find themselves especially ill-prepared to fight back.

    That’s partly because, as Clayton Christensen, Stephen Kaufman, and Willy Shih wrote in the 2008 HBR article “Innovation Killers,” standard financial metrics make new investments look much less attractive than existing business lines. It’s partly because managers of well-moated companies tend to turn complacent — or just don’t need much skill to run such a business in the first place. (Buffett, in his latest Berkshire Hathaway shareholder letter, tells of a newspaper publisher who confessed, “I owe my exalted position in life to two great American institutions — nepotism and monopoly.”) And finally, the owners or shareholders of high-margin businesses tend to see those margins as their due, and are thus unwilling to countenance lower-margin strategies. One of the things that bothered the newspaper industry most about Newmark, in fact, was that he didn’t seem interested in making much of a profit at all.

    The result of all this was that, while the decline of American newspapers (especially the big regional papers) was probably inevitable in the age of the Internet, the reluctance and at times inability of newspaper companies to transition from high-margin business models to low-margin ones has made things much worse. Layoffs and other cutbacks meant to preserve profit margins have only sped the decline in revenue, while bold new investments have been few. And for the most part the margins have declined anyway. Profits of more than 20% of revenue used to be common at newspaper companies. In 2012, the Pew Research Center said in its latest State of the News Media report, “the operating margin for Gannett was 9.9%, New York Times 5.4%, McClatchy 15.1%, E.W. Scripps 6.9% and A.H. Belo 8.1%. The Washington Post operated at a 9.2% loss.”

    Those are (apart from the Washington Post’s loss, of course) still pretty healthy margins by the standards of many industries. Amazon.com, which Bezos founded in 1995 and has run ever since, has an operating margin of just 1.5%. In a time of great change and disruption, Bezos has turned low margins — usually a sign of competitive weakness — into a competitive advantage. And while not much else is clear about what his ownership of the Washington Post will be like, a willingness to countenance low margins will surely be part of it.

    That’s happening across the industry as papers change hands. Some of the buyers are still financial investors like Buffett, who figures that if he gets in at a low enough price and concentrates on small-market papers with a monopoly of local news, he’ll make money even in a declining industry. But that’s rare. Private equity firms, which have long focused on buying into declining industries, have apparently deemed the decline of newspapers too precipitous for their tastes. Most of today’s new owners appear to be looking for something out of the investment beyond profit. Sometimes that’s political influence and hometown boosterism, as with the local developer and hotelier who bought San Diego’s daily paper. Sometimes it’s the opportunity to try out a new business model, as at the Orange County Register in California. And sometimes it’s just too early to tell, as with Bezos at the Post and John Henry’s acquisition a few days ago of the Boston Globe. One thing that unites all of these acquirers, though, is that they don’t seem to have bought in expecting a 20% annual return on their investment. And that’s progress, of a sort.

  • China’s Impending Slowdown Just Means It’s Joining the Big Leagues

    China’s era of spectacular economic growth is coming to an end. That’s a popular theme at the moment, with any number of culprits cited — an overleveraged financial system, pollution, too little consumer spending, corruption, anti-corruption campaigns, and of course bad driving. It’s reached the point that the Chinese government’s International Press Center felt compelled to gather a group of reporters in Beijing earlier this week just so that Justin Yifu Lin, the former World Bank chief economist who is now a professor at Peking University and a government adviser, could tell them that he’s “reasonably confident the Chinese government has the ability to maintain a 7.5% to 8% growth rate.”

    Here’s the thing: a 7.5% to 8% GDP growth rate already is a significant slowdown from the nearly 10% annual pace at which China’s economy had been growing until last year. And while all the economic issues cited above are real, the big issue confronting the Chinese economy is something simpler and more encouraging. The country is on the cusp of succeeding in its epic quest to break into the ranks of the world’s affluent nations. When that happens, growth tends to slow.

    That’s been the finding of economists Barry Eichengreen of UC Berkeley, Donghyun Park of the Asian Development Bank in Manila, and Kwanho Shin of Korea University in Seoul in two recent studies of growth slowdowns in emerging markets around the world. In the first, published in Asian Economic Papers last year, they reported a marked tendency toward slower economic growth when per capita incomes reach around $17,000 a year (in 2005 prices). In a newer working paper with more-complete data, they revise that to report two inflection points, one in the $10,000-$11,000 range and another around $15,000-$16,000.

    China, with a per capita GDP of $7,827 in 2011 (in 2005 dollars, according to the latest edition of the Penn World Tables), is getting close to that first landmark. It also has a couple of other characteristics that Eichengreen, Park, and Shin have found to be associated with growth slowdowns — an aging population and an economy that has favored investment over consumption. Basically, it’s due. Or, as Eichengreen put it in an email when I asked him about it:

    Financial systems, deleveraging, and environment and political problems all differ across countries, but all fast growing, late developing countries slow down once the low hanging fruit has been picked. China can add several percentage points of growth a year by shifting 20 million workers from rural underemployment to urban employment, but once the pool of underemployed labor is drained it’s, well, drained. If you’re a technological latecomer, you can grow fast by importing foreign technology, but once you’ve succeeded in that you have to start investing in and developing your own, which is a harder task.

    Remember, these difficulties are the fruits of success. At 7.5% annual growth, China would cross the $10,000 per capita threshold in 2015, and $15,000 in 2020. Well before then it would pass into the ranks of the world’s “high income” nations, according to the World Bank’s classification.
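    Those crossing dates fall out of simple compound growth from the 2011 figure cited above ($7,827 per capita, in 2005 dollars, growing at 7.5% a year). A quick back-of-the-envelope sketch:

```python
# Project per capita GDP forward at a constant growth rate and find the
# first year it crosses a threshold. Starting point and rate are the
# figures cited in the text ($7,827 in 2011, 7.5% annual growth).
def year_crossing(start_gdp, start_year, rate, threshold):
    """Return the first year in which per capita GDP exceeds `threshold`."""
    gdp, year = start_gdp, start_year
    while gdp < threshold:
        gdp *= 1 + rate
        year += 1
    return year

print(year_crossing(7827, 2011, 0.075, 10_000))  # -> 2015
print(year_crossing(7827, 2011, 0.075, 15_000))  # -> 2020
```

    At 7.5% a year, income roughly doubles every decade, which is why the two thresholds arrive only five years apart.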

    Not that stuff couldn’t go wrong along the way. Eichengreen and Shin are co-authors (with Dwight H. Perkins, an emeritus professor at Harvard Kennedy School) of the 2012 book From Miracle to Maturity: The Growth of the Korean Economy (no, I haven’t read it, but you of course should), and Eichengreen sees important parallels in the Korean experience:

    Korean governments attempted to resist the inevitable slowdown. Their legitimacy derived from delivering growth, and there was the external threat from North Korea that caused them to further prize economic strength. Our estimates there show a break-point in growth potential in 1989. But Korea sought to keep the old growth rates going by boosting investment. That worked for about 7 or 8 years, but no longer. And they ended up with a financial crisis in 1997-8. The Chinese have studied this history too, which is part of the explanation for why they are prepared to accept the current lower growth rate and, not incidentally, are in the process of clamping down on financial excesses.

    None of this means there aren’t big risks inherent in such a slowdown for the Chinese government (which, unlike South Korea’s in the late 1980s, is showing no signs of transitioning toward democracy) and for the global economy (for which a Chinese slowdown will have an impact that earlier slowdowns in Korea and the other Asian tigers did not). But it is important to remember that for an emerging market, slowing down can mean you’ve arrived.

  • Facebook’s Scramble-and-Shake Strategy

    It’s hard to think of a company providing a mostly useful, non-polluting service that gets rooted against as much as Facebook does.

    Well, maybe I can think of a couple: AOL and Microsoft in the second half of the 1990s. AOL wasn’t entirely non-polluting, given how many unasked-for CDs it flooded mailboxes with to lure people to its dial-up service. But the negativity was more about its status as the Internet on training wheels. Microsoft, meanwhile, was scarily powerful — the company that was going to assimilate the Internet, Borg-style, and force us all to pay tribute.

    Facebook gets both criticisms. It’s social networking for beginners — less cryptic than Twitter, less mercenary than LinkedIn, more comprehensible and comprehensive than all those strange things like Snapchat and Tumblr and Instagram and Path. Of course the cool kids don’t want to hang out there any more. But Facebook has also risen so quickly to ubiquity that it can seem inordinately powerful. Changes in its privacy settings feel like infringements on our civil liberties. It’s effectively built by those who use it, but owned by Mark Zuckerberg and a few other folks — making the users, as Steven Johnson put it in Wired last year, “ultimately just tenant farmers on the land.”

    So let’s get this straight: Facebook is either totally over, or all-powerful.

    Clearly, it can’t be both. As someone who was a pretty early adopter, for a grownup, but doesn’t spend much time on Facebook now (I’m more of a Twitter guy), I’ve been an eager consumer of the Facebook-is-doomed literature. But since its spluttering IPO last year, I’ve been coming around to the notion that Facebook is headed for a status more interesting than either flash-in-the-pan or all-conquering global behemoth.

    For the foreseeable future, it looks like Facebook will be an important, profitable company that’s constantly struggling to reinvent itself as technology, users’ tastes, and the competitive landscape shift. It’s not going to be the social network — the “social graph” through which all of us organize all of our relationships. Unlike, say, LinkedIn, it doesn’t have a clearly defined niche that it can defend. It may remain a puzzle to Wall Street. But it combines a giant user base — 1.1 billion users worldwide — with the nimbleness that comes of being just nine years old.

    This last characteristic stood out in Facebook COO Sheryl Sandberg’s appearance at the D11 conference this week. Her interactions with Walt Mossberg, Kara Swisher, and various audience members were notable chiefly as a textbook lesson in how to stay on message without being boring and/or wildly irritating. But they also revealed at least a little bit about how Facebook sees its competitive advantage.

    Consider how CEO and founder Zuckerberg spends most of his time. “What he really wants to do is be in the office in his conference room with our product teams reviewing products,” Sandberg said. “That’s what he loves to do.” Or the company’s reaction to the lukewarm reception that the new Facebook Home app for Android phones has received so far. “The feedback we’re getting is very bimodal,” Sandberg said. Heavy Facebook users love how it effectively takes over their phones; everybody else hates it because it takes over their phones. So Facebook will keep tinkering: “We’re committed to monthly rollouts of this,” she said. Or how the company shifted from treating mobile as an afterthought just two years ago to organizing everything around it — with mobile going from approximately zero to 30% of revenue in just a year.

    It’s basically hustle as strategy — scrambling and looking for the next thing, and understanding that you probably won’t get anything right the first time around. One of my favorite Facebook stories of the past year has been that of the posters plastered on the walls at Facebook HQ trying to persuade employees to switch to Android phones. The company had become too iPhone-centric, but ordering employees to get Androids wasn’t its style. So … recruiting posters.

    Getting Facebook employees to use Android phones was particularly important, TechCrunch’s Josh Constine wrote, because of the way Facebook looks for bugs in its apps:

    During my digging I found out Facebook forcibly updates employees to the most recent beta version of apps like Facebook For Android and Facebook Messenger. If they run across a problem in one of the Android (or iOS) apps, they can take advantage of a bug-reporting feature Facebook builds into its internal betas. It’s called “Rage Shake” and the name is spot-on. Employees just violently shake their phone and it automatically logs its current state and sends details to Facebook’s mobile bug-squashers.

    It’s hard to hate a company that builds “Rage Shake” functionality into its software. It’s also hard to believe it will be going away anytime soon.

  • Who Should Actually Have Say on Pay?

    It’s say-on-pay season at American corporations. What shareholders have been saying, in overwhelming numbers, is yes! At 74% of the 1,471 companies that have voted so far in 2013, according to Equilar’s say-on-pay tracker, the “yes” percentage exceeded 90%. That’s up from 69% in 2012 and 2011. Only 31 companies (2%) have gotten sub-50% no-confidence votes in 2013.

    One key reason for shareholders’ positive tone is that the stock market has been doing well. Since say-on-pay hit the U.S. in 2011 (it was part of the Dodd-Frank Act), academic researchers have found that the chief determinants of how shareholders vote appear to be (a) stock performance, and (b) the voting recommendations of proxy-voting advisors ISS and Glass Lewis, which are based in part on returns to shareholders over the previous three years. To a large extent, say-on-pay — which was introduced in the UK in 2002 and has spread to several other countries, most recently Switzerland — is a simple exercise in bandwagon-following.

    That’s not all it is, though. The size, growth, and design of paychecks do play into both the voting recommendations from ISS and Glass Lewis and the votes of shareholders. There is evidence that say-on-pay votes have led British companies to make executive paychecks more sensitive to poor performance. Say-on-pay votes do have an impact. The question is, what kind of impact?

    Say-on-pay is part of a big shift in recent years toward giving professional money managers more tools to affect the governance of (and in some cases discipline the managers of) corporations. Most of these theoretically increase the power of individual investors, too, but for the most part individuals aren’t a factor in corporate elections. Professionals appear to control somewhere around 60% of the shares of American corporations, and have an even higher percentage of the vote in corporate elections. (Individual investors tend not to vote, and while brokerage firms used to vote the shares of customers who didn’t get around to voting themselves — almost invariably siding with management — the SEC stopped allowing that practice three years ago.)

    Driving these changes is a widespread belief that more needs to be done to hold CEOs and boards accountable. That’s understandable. But it’s far from clear that professional money managers have what it takes to play the role of effective watchdog. When it comes to executive pay in particular, these people are a deeply compromised bunch.

    In the latest issue of the Journal of Economic Perspectives, economist Burton J. Malkiel argues that most of the gigantic growth in asset-management-industry profits since 1980 “is likely to represent a deadweight loss for investors.” His reasoning, as I discussed in an earlier post, is that active money managers as a group underperform the market indices and that, while active management does play a key role in setting stock market prices, there’s no evidence that today’s gigantic active management industry is doing that job any better than its much smaller precursor of three decades ago. American corporations outside the financial sector may have many flaws, but I’m pretty sure their increase in profits over the past few decades hasn’t been a “deadweight loss” to the economy.

    What’s more, the asset management industry — in particular the alternative-asset subset of hedge funds and private equity — has exported many of its pay practices into the corporate sector. The idea was to get away from paying CEOs “like bureaucrats,” as Michael C. Jensen and Kevin J. Murphy urged in a famous 1990 HBR article. It was a successful campaign: CEO paychecks came to consist mostly of stock options.

    This shift to financial-markets-based compensation had some of the promised impact — CEOs did become less risk-averse (bureaucrat-like) in their decision-making. But it also inflated what Mihir Desai has dubbed a “giant financial incentive bubble”. In Desai’s telling:

    Financial markets cannot be relied upon in simple ways to evaluate and compensate individuals because they can’t easily disentangle skill from luck. Widespread outsourcing of those functions to markets has skewed incentives and provided huge windfalls for individuals who now consider themselves entitled to such rewards. Until the financial-incentive bubble is popped, we can expect misallocations of financial, real, and human capital to continue.

    Say-on-pay has done nothing to deflate this bubble; executive pay has kept going up in the U.S. and UK. Which makes sense — most asset managers have a shared interest with CEOs in keeping top-of-the-scale paychecks high. If we wanted to have a real impact on executive pay levels, we should probably have employees vote.

    While highly paid hedge fund and mutual fund managers set the tone for the CEO-pay discussion, though, they do not as a rule get involved in the details of pay packages and say-on-pay votes. Instead, they mostly outsource the decision-making to Glass Lewis and ISS. The people who set the compensation policy guidelines at these two firms are not paid like CEOs or hedge fund managers, and lots of thought and empirical research go into their recommendations.

    They have, however, bought into the argument that the main metric of executive performance should be shareholder returns and that most executive pay should be in the form of stock. They’re supposed to represent shareholder interests, so this seems logical. But beyond the compensation bubble that stock-based pay has helped create, its incentive effects are also potentially perverse. As Roger Martin argued in his book Fixing the Game, stock prices are all about (often incorrect) expectations of future earnings. Linking top executives’ pay to stock prices thus rewards them more for creating high expectations than for running their company well. With banks there’s an even bigger problem: shareholders provide only a tiny percentage of their funding, and are thus motivated to encourage risk-taking that endangers depositors and taxpayers. So paying bank CEOs mostly in stock is a recipe for a financial crisis.

    The proxy advisers do attempt to counteract these forces somewhat, by frowning upon stock and option grants that aren’t linked to other performance metrics. But it’s not clear that their approach yields better results. One recent study by David F. Larcker, Allan L. McCall, and Gaizka Ormazabal found that the stock market reacts negatively when companies adjust their compensation policies to adhere to the proxy advisory firms’ recommendations. I’m not convinced that really proves anything one way or the other, but I do think the current state of knowledge about the impact of executive pay on corporate performance is muddled enough that standardizing pay practices to conform with what ISS and Glass Lewis think is best is probably a bad idea. Sometimes a board of directors will have a better sense than the stock market or a proxy advisory firm of how well a CEO is performing. Do we really want to make it impossible for boards to exercise discretion?

    It’s not that say-on-pay is necessarily a disaster. Unlike some other corporate-governance reforms, it hasn’t imposed major regulatory burdens on anybody (public corporations were already holding annual shareholder votes), and for the vast majority of companies it has been a nonissue. The votes are non-binding, and there’s at least a chance that they’re changing pay practices for the better.

    But it’s worth remembering that the explosion in American executive pay over the past three decades coincided with and was in part driven by an increase in shareholder clout. It may be that shareholders just had the wrong tools in the past, and say-on-pay will allow for a more surgical approach to governing CEO compensation. It’s also at least possible, though, that the shareholders have been the problem all along.

  • Seven Fun Facts About Corporate Taxes

    Thanks to U.S. Sen. Carl Levin’s Permanent Subcommittee on Investigations, corporate taxes were all over the headlines this week. Subcommittee staff produced a fascinating 40-page “memo” on Apple’s creative tax avoidance, then Apple CEO Tim Cook submitted to what turned out to be a not all that heated grilling from Levin and colleagues. On the assumption that, while this story will wax and wane over the coming months, it’s not going away, here’s some context (for you headline purists, the final two items are admittedly at least as much opinion as fact):

    Corporate profits are taxed twice. When a corporation makes a profit, it’s subject to corporate income tax. When it then distributes that profit to shareholders as dividends or buybacks, it gets taxed again as dividends or, eventually, capital gains. This isn’t the only double tax out there: Most of us in the U.S. pay income and/or payroll taxes when we earn money, and sales taxes when we spend it. Also, a lot of corporate shares are owned by tax-exempt pension funds, 401(k)s, and university endowments that don’t pay taxes on dividends or capital gains. But it’s worth remembering that the corporate income tax isn’t the only tax on corporate income.
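    The two layers compound rather than simply add, which is worth seeing with numbers. A minimal sketch — the 35% corporate rate and 15% dividend rate below are illustrative assumptions, not figures from the article:

```python
# Combined effective tax rate on a dollar of corporate profit that is
# distributed to a taxable shareholder: the profit is taxed once at the
# corporate level, and what remains is taxed again at the shareholder level.
def combined_rate(corporate_rate, dividend_rate):
    after_corporate = 1 - corporate_rate
    after_dividend = after_corporate * (1 - dividend_rate)
    return 1 - after_dividend

# Illustrative rates (assumed, not from the article): 35% corporate, 15% dividend.
print(combined_rate(0.35, 0.15))  # ~0.4475, i.e. a 44.75% combined bite
```

    Note the combined rate (44.75%) is less than 35% + 15%, because the second tax applies only to what survives the first.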

    Economists hate the corporate income tax. Double taxation is one issue. Another is that a tax on corporations is generally a tax on productive investment, and economists think we need to invest more in productive things (as opposed to say, housing). Yet another is that corporations are more mobile than individuals, and taxing them might cause them to move their operations elsewhere. Some also argue that taxes on corporations end up falling most heavily on those corporations’ employees. And there are other objections. One can find economists of widely varying political beliefs who think the optimal corporate tax rate is zero, or at least much lower than it is now.

    Economists can be kind of naïve. If the corporate tax rate were zero, we would all incorporate ourselves. Accountants, tax lawyers, and economists who actually spend time around the tax system point out that when you eliminate one form of taxation, you have to raise rates on other taxes to make up for the lost income, and those higher rates increase the incentives for tax avoidance and evasion. That means we should want a broad tax base and low rates. Along those lines, most developed countries tax both corporate income and shareholder income, but to make up for the double taxation and other issues they tax them at lower rates than they do other income. In the U.S., though, while dividends and capital gains are taxed at lower rates, corporate income tax rates have remained stubbornly high.

    U.S. corporate tax rates are the world’s highest. Since Japan lowered its rate last year, the U.S. has had the highest corporate income tax rate in the world, with a combined federal and state average of 39.2%. This doesn’t mean most U.S. corporations pay that high a rate — over the past three decades the share of corporate profits going to taxes in the U.S. has actually declined substantially even as official rates held steady (see the chart below).

    Profits and Taxes

    That’s partly because, starting in the Reagan years, lawmakers leery of the political backlash from lowering corporate tax rates found other ways to cut corporate income taxes, mainly through various provisions allowing corporations to write off capital investments quickly. It’s also because the share of U.S. corporate profits coming from overseas has risen steadily, to 24% of the total in 2011 from 3% in 1950.

    Finally, while it’s hard to quantify, it seems likely that corporations have gotten far more aggressive over the years at finding ways to minimize taxes. None of which changes the fact that the statutory corporate tax rates in the U.S. are, by global standards, really high. The corporate income tax is the only tax that I’ve heard economists other than Arthur Laffer argue might be on the wrong side of the Laffer curve — that is, that lowering the rate could bring in more revenue.

    The U.S. is especially aggressive about corporate taxes. Most developed countries now have what are called territorial tax systems, where an individual or corporation just pays taxes on the income earned in that country. U.S. citizens and U.S.-domiciled corporations, on the other hand, are taxed on their global income, with credits for any taxes paid overseas. But — and this is a big but — corporations don’t have to pay tax on their foreign earnings until they bring the money back to the U.S. This creates strange phenomena like the huge foreign cash stashes held by companies such as Google and Apple. It also leads to calls to switch to a territorial tax system in the U.S., which would at least remove this barrier keeping American companies’ overseas earnings from coming home in the form of new investments or (more likely) dividends and buybacks.
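    The credit-and-deferral mechanics described above can be sketched numerically. In this sketch the 35% U.S. rate and 12.5% foreign rate are illustrative assumptions (12.5% happens to be Ireland’s headline rate, but any low-tax jurisdiction works the same way):

```python
# Residual U.S. tax due when foreign earnings are repatriated: the U.S.
# rate applies to the foreign income, minus a credit for foreign taxes
# already paid. Until repatriation, this residual tax is deferred --
# which is why the cash piles up offshore.
def residual_us_tax(foreign_income, foreign_rate, us_rate=0.35):
    foreign_tax = foreign_income * foreign_rate
    us_tax_before_credit = foreign_income * us_rate
    # The credit can reduce the U.S. bill to zero, but not below it.
    return max(0.0, us_tax_before_credit - foreign_tax)

# $100 earned abroad at an assumed 12.5% local rate:
print(residual_us_tax(100, 0.125))  # ~22.5, owed only upon repatriation
```

    Under a territorial system the residual tax would simply be zero, which is why repatriation-averse multinationals favor the switch.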

    Foreign income is great. Stateless income isn’t. The fact that U.S. corporations are getting a bigger share of their income from overseas isn’t necessarily a bad thing, and the fact that different countries charge different corporate income tax rates isn’t such a bad thing either. A growing global economy is good for the U.S., and tax competition between countries can be healthy. But the new twist, available to multinationals such as Apple, Google, and Microsoft, whose earnings flow mostly from intellectual rather than physical capital, is what USC law professor Edward D. Kleinbard has dubbed “stateless income.” These are corporate profits that are never taxed anywhere thanks to extremely aggressive corporate tax planning and quirks in, in particular, Irish and Dutch tax laws (the “Double Irish With a Dutch Sandwich” is a favorite tax-avoidance technique). This may be legal, for the moment. But it appears to go against the spirit of the tax laws even of low-tax Ireland.

    Corporations (probably) can’t have their cake and eat it too. Apple’s Tim Cook told Levin’s subcommittee Tuesday that “we not only comply with the laws, but we comply with the spirit of the laws.” Actually, what Apple is doing is taking advantage of the inability of tax law and international tax treaties to keep up with rapidly changing economic reality and highly creative tax lawyers — it’s complying with the lawlessness, basically. Legally, that may amount to the same thing. Ethically and, more important, politically it probably isn’t. Apple’s shareholders would really love it if Congress moved to territorial taxation or declared a tax holiday that allowed Apple to bring home its $102 billion in overseas cash tax-free. Corporate America in general would love it if Congress lowered the corporate tax rate. And it appears likely that all of us would be better off with lower corporate rates and a tax setup that didn’t discourage American corporations from bringing their money home. But as long as Apple, Google, and their ilk keep trying so hard to keep so much of their income out of the reach of all tax collectors everywhere, it’s hard to see elected officials budging.

  • How Jamie Dimon Became a Risk Factor

    The annual 10-K report that JPMorgan Chase filed with the SEC in February includes a 13-page section on “Risk Factors.” It’s a lawyerly, exhaustive, exhausting rundown of all the things that could possibly weigh on the earnings of a giant global bank, from regulatory changes to loans going bad to a liquidity crisis to the possibility that “one or more of its employees causes a significant operational breakdown or failure.” What’s missing, though, is something like this:

    CEO Risk: Much of JPMorgan Chase’s excellent performance relative to its peers in recent years can be attributed to its Chairman and CEO, who has proved to be a uniquely valuable combination of careful risk manager and hard-driving business leader. He won’t be around forever, though. In fact, he has threatened to leave if shareholders vote (non-bindingly!) to strip him of his Chairman title. The Corporation’s post-Jamie-Dimon future is extremely uncertain.

    The shareholder proposal to split Dimon’s job is to a certain extent silly. As Ben W. Heineman, Jr. wrote here last week, JPMorgan Chase already has a pretty formidable de facto chairman in lead outside director Lee Raymond, the former CEO of Exxon Mobil. The evidence on whether splitting the chairman and CEO roles improves performance is mixed; when done under shareholder pressure, according to the University of Minnesota’s Aiyesha Dey, it may actually hurt.

    Also, there’s a lot of confusion out there as to what the job of corporate chairman is supposed to be. Roger Lowenstein stated on HBR.org that it’s to keep an eye on the CEO on behalf of shareholders. In a similar vein, Eliot Spitzer told Bloomberg Businessweek that, “Even if the leader is spectacular, we want checks on power. We might accept that Thomas Jefferson was a remarkable president, but that doesn’t mean we repeal checks and balances.”

    Those are both somewhat dubious assertions. Do we really want to subject our CEOs to the same checks and balances as our political leaders? A big part of the attraction of the corporate structure is that it allows for both quicker and more-long-term-oriented decision-making than the political process tends to produce. Corporations resemble dictatorships more than democracies — and that’s not necessarily a bad thing. Also, the chairman and the rest of the board are by law responsible not just to current shareholders but to the corporation, meaning they’re free to take into account the concerns of employees, creditors, customers, and other stakeholders. This is of particular import at a giant bank like JPMorgan Chase, where bondholders, depositors, the Federal Deposit Insurance Corp., and U.S. taxpayers together have far more money at stake than shareholders do.

    Still, chairman and CEO are different jobs, and it is worth asking if Dimon (or Raymond, if you prefer Heineman’s version) is performing the first adequately. The chairman’s duties — other than presiding over meetings and signing stuff — aren’t defined with much precision in JPMorgan Chase’s corporate bylaws. But the board of directors, with the chairman at its head, is responsible for determining the corporation’s long-term goals, for positioning it for the future, for making sure that its fortunes can withstand a change in CEO. It’s not clear that JPMorgan Chase’s board has succeeded at this. As pseudonymous investment banker The Epicurean Dealmaker recently wrote of Dimon,

    He has failed to accomplish one of the most important, difficult, and basic tasks a Chairman is supposed to do: establish a succession plan for the CEO. Each and every Board worth its perks and compensation should make finding and grooming successors to the firm’s current senior executives — especially the CEO — its most important agenda item. Not only has Jamie failed at this, he has actively fired key lieutenants and potential successors like Bill Winters and Steve Black, apparently on the basis that they posed too credible a threat to his own power.

    There are corporations where the chairman and CEO jobs are held by the same person that do a great job of succession. General Electric, where Heineman was senior vice president for law and public affairs, is one obvious example. But GE is a company with a well-established culture and a history of well-executed transitions. JPMorgan Chase is a relatively recent amalgam of three giant banks, two of which — Chase and Bank One — were themselves products of merger after merger after merger. The surviving corporate entity is actually Chemical Bank, which merged with Chase in 1995. And in the midst of the financial crisis, JPMorgan Chase added most of the operations of another giant, failed thrift Washington Mutual.

    That Dimon, who had only been at Bank One for four years when it merged with JPMorgan Chase in 2004, was able to steer this ungainly creature safely through the financial crisis was a spectacular accomplishment. Unlike at Goldman Sachs, where risk savvy has long defined the corporate culture, JPMorgan Chase’s ability to avoid company-threatening risks in the years leading up to the financial crisis can be chalked up largely to its CEO. This accomplishment alone makes Dimon perhaps the greatest banker, and one of the greatest chief executives, of his generation.

    Such a CEO, though, is most likely a one-of-a-kind phenomenon — and a good chairman should be focused on the risk of what happens when he’s gone. This is about more than just succession planning. What JPMorgan Chase really could have used after the crisis was for its board and chairman to define what the company stood for and what made it unique — and to help shape the environment in which it operated to favor a less crisis-prone variety of financial capitalism. Dimon seemed perfectly positioned for such a transformative role. He had unparalleled credibility, a happy shareholder base, and an apparent understanding that certain aspects of how banks operated in the 1990s and 2000s were neither healthy nor sustainable. He could have easily pushed for the kind of bold changes in executive pay and other practices that Sallie Krawcheck recommended in her HBR article “Four Ways to Fix Banks.”

    But he didn’t. It’s a little hard for an outsider to tell how hard Dimon has tried to reshape his bank and his industry since the crisis, and even how much headway he has made. But the superficial impression — reinforced by the current fight over his job titles — is that while he has continued to be a very good chief executive (“London whale” notwithstanding), Dimon has not succeeded in making himself dispensable.

  • Just How Useless Is the Asset-Management Industry?

    Writing under a pseudonym in the Financial Analysts Journal in 1960, mutual fund executive Jack Bogle made “The Case for Mutual Fund Management.” Bogle took the track records of four leading mutual funds going back to 1930 and compared them to the performance of the Dow Jones Industrials. Not only had the four beaten the Dow, handily, but during the period from 1950 through 1956, for which the brokerage Arthur Wiesenberger & Co. (the Lipper/Morningstar of its day) had calculated mutual fund volatility, all but one of them had fluctuated less than the Dow.

    “[M]utual funds in general have met the test of time, and performed in keeping with their stated policies and goals,” Bogle concluded.

    As tests go, Bogle’s had its flaws. The fact that four funds (they’re not named in the article, but Bogle once told me they were Massachusetts Investors Trust, Investors Incorporated — now Putnam Investors — State Street, and Wellington) that had survived since 1930 had performed well didn’t say anything about the performance of the many funds that didn’t survive, or the new ones that popped up in the 1950s. But it’s quite possible he was right that the tiny mutual fund industry of the 1930s, 1940s, and early 1950s had served its investors admirably.

    By 1960, though, the mutual fund business was booming, and selling investors on high-cost, high-risk products called “performance funds.” Within a few years, researchers armed with more statistical skills (and these new things called computers) were examining the industry’s performance and finding it wanting. “[W]e find no evidence to support the belief that mutual fund managers can outguess the market,” Jack Treynor and Kay Mazuy of the consulting firm Arthur D. Little reported in the July-August 1966 HBR (sadly, we don’t have the article online). Multiple academic studies soon backed up that conclusion.

    They’ve continued to back it up ever since. After costs, actively managed mutual funds trail the market. Yet while passively managed, much-lower-cost index funds have been available since 1976, when Bogle — who had a change of heart and, perhaps more to the point, had been ousted from his job running Wellington Management — launched the Vanguard 500 Index Fund, most investors still put most of their money in the hands of active managers.

    Why they do this is a long-running puzzle. In the new issue of The Journal of Economic Perspectives, economist and long-time Vanguard board member Burton G. Malkiel poses it for the umpteenth time, and adds to it the observation that expense ratios on actively managed funds have over the past three decades risen substantially, even though economies of scale would seem to dictate that today’s much larger funds ought to have lower expenses as a percentage of assets (domestic equity funds in the U.S. had $3.5 trillion in assets in 2010, up from $25.8 billion in 1980). And while it seems essential that we have at least some active managers in order to set security prices (if everybody put all their money in index funds, there would presumably be no link between stock price and value), Malkiel says that there’s no evidence that stocks were less efficiently priced decades ago than they are now. He concludes:

    The major inefficiency in financial markets today involves the market for investment advice, and poses the question of why investors continue to pay fees for asset management services that are so high. It is hard to think of any other service that is priced at such a high proportion of value.

    It’s a pretty harsh indictment. Malkiel is basically saying that the asset-management industry has no economic justification for being as big and rich as it is. He’s probably right about that, although I wouldn’t say his evidence is conclusive. The way he purports to show that markets haven’t become more efficient through the years is simply that mutual funds found it just as hard to beat the indexes in 1980 as they do now. And the troves of performance and expense data available for mutual funds allow us to subject them to scrutiny not really possible for most industries. I’d definitely be a little scared to learn what the true economic value added over the years by management advice has been, for example.

    One other thing that Malkiel fails to mention (although it’s clear from his data and even clearer in a recent report from the Investment Company Institute, the mutual fund trade group) is that the expense trend shifted a little over a decade ago. After rising sharply in the 1980s and modestly in the 1990s, mutual fund expense ratios actually dropped (from 0.84% to 0.69%) from 2000 to 2010. Part of that is the result of a continuing shift into index funds, which accounted for almost 30% of all equity mutual fund and ETF assets in 2010 (up from 0.3% in 1980). But expense ratios on actively managed funds have also been falling, from 0.94% in 2000 to 0.91% in 2010.
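    Differences of a few basis points may look trivial, but they compound. Here’s a back-of-the-envelope sketch of the effect; the 7% gross return and the 0.10% index-fund expense ratio are invented for illustration and aren’t drawn from Malkiel’s or the ICI’s data:

```python
def final_value(principal, gross_return, expense_ratio, years):
    """Compound an investment at the gross return minus the annual expense ratio."""
    return principal * (1 + gross_return - expense_ratio) ** years

# Hypothetical: $10,000 invested for 30 years at a 7% gross annual return.
active = final_value(10_000, 0.07, 0.0091, 30)  # 0.91% active-fund expense ratio
index = final_value(10_000, 0.07, 0.0010, 30)   # 0.10% assumed index-fund ratio

print(f"Active fund:  ${active:,.0f}")
print(f"Index fund:   ${index:,.0f}")
print(f"Cost of fees: ${index - active:,.0f}")
```

    Under these assumptions the fee gap consumes a five-figure chunk of the final balance, which is why the slow post-2000 drift in expense ratios matters more than it looks.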

    It’s possible that, thanks to the rise of index funds, investors have finally wised up to the role of costs in investment returns, and we’re entering a glorious new era of declining investment fees. It’s also possible that this is a cyclical phenomenon. During a bull market, investors don’t pay much attention to fees (or to the size of executive paychecks). When markets struggle, they do. So all it will take to get the asset-management industry down to an economically appropriate size and level of profitability is another decade or two of sideways stock markets. That should be fun.

  • The Great Netflix Doom-Avoidance Machine

    Reed Hastings has emerged from hiding. Well, maybe not hiding — he was still posting on Facebook and talking to the occasional magazine writer. But by his previous standards, the Netflix co-founder and CEO had been lying uncharacteristically low in the almost two years since the Great Qwikster Fiasco. It took a clear recovery from the company’s missteps, as evidenced by a blowout earnings report last month, to convince the Netflix PR team (and/or Hastings himself) that it was time to unleash him.

    First came a manifesto penned by Hastings that describes the competitive landscape in which Netflix finds itself and why Hastings thinks his company has what it takes to survive. Then Hastings talked to James B. Stewart at The New York Times about what he’d learned from his abortive effort to jettison Netflix’s DVD-rental business (lesson no. 1: don’t do things that “hurt people’s real love for Netflix”). And now, in the latest Bloomberg Businessweek, Ashlee Vance offers a fascinating cover-story look at how Netflix and Hastings work (complete with a visit to the glass cube atop Netflix’s HQ where Hastings goes for quiet time).

    Hastings’ return to the spotlight is excellent news for anyone interested in business. He is quite entertaining, as CEOs go. And more important, his company keeps posing such fascinating questions. Such as, how does it keep making money?

    Netflix competes in an industry where control of distribution channels was long key to success. If you owned the pipes, as it were, you owned the customers. Netflix piggybacks on distribution networks built by others. For DVD delivery, that distribution network is at least a neutral carrier: the U.S. Postal Service. For streaming, though, Netflix relies on internet service providers such as Comcast and AT&T, data centers owned by Amazon, and tablets and digital media receivers made by Apple (and Microsoft, and Samsung, and others) to get its movies and TV shows to customers. All of these companies are much bigger than Netflix, and all are direct or near-direct competitors. Yet it survives and, seemingly, thrives among them. According to Businessweek, Netflix now accounts for about a third of all downstream Internet traffic in North America on an average weeknight. It is becoming the mistletoe of the media business — a parasite (sort of) that is more prominent and beloved than its hosts.

    It’s fair to say that this continuing success has surprised a lot of people. Search on the phrase “Netflix doomed,” and you come across a bounty of dire predictions. Part of this must be psychological. While writing this post, I have repeatedly caught myself typing “Netscape” — an association that does Netflix no favors. But there have also been real reasons to be skeptical.

    In the DVD-rental business, Netflix faced content costs that weren’t appreciably different from those of bigger archrival Blockbuster, and had a much lower-cost delivery system. As venture capitalist Bill Gurley explained a couple of years ago, a 1908 Supreme Court ruling made it impossible to stop Netflix from buying and renting out any DVDs it wanted. So Netflix could exploit its advantages — a subscription-based business model, a huge catalog of titles, an ever-improving recommendation engine, and a super-efficient system for getting DVDs to your local post office — without having to fear that its reservoir of content would dry up.

    In streaming, Netflix has to cut deals with content owners for movies and TV shows, and as it has grown it has had to pay ever more. Last year, Netflix reported spending $30 million on DVDs — and $2.8 billion on streaming content. And it’s constantly losing content as licensing deals expire — forcing it to sign new deals at higher prices, and pay to develop its own shows.

    As a result, Netflix’s margins are much lower in the streaming business (21% in the most recent quarter) than in the DVD business (46%). But streaming is the future, DVDs are a declining business, and the last quarter was the first in which Netflix made more profit domestically from streaming than from DVDs.

    So how does Netflix stay profitable in streaming? One answer — probably the most important answer — is that it’s really good at software engineering. Hastings is a veteran software engineer, and Netflix pays its engineers more than the competition and sets them loose to solve interesting problems. Delivering its programming via tens of thousands of servers in Amazon Web Services data centers and getting it to work seamlessly on myriad gaming consoles, tablets, smartphones, and other devices takes tons of code and some really smart design. “We’re using Amazon more efficiently than the retail arm of Amazon is,” Netflix’s cloud architect told Businessweek’s Vance.

    Engineering is also key to figuring out what customers will like. Netflix’s recommendation engine has become a huge asset, driving 75% of viewing. Data on customer viewing habits is also increasingly driving Netflix’s decisions on which content to acquire and how much to pay for it.

    Netflix’s calculation is that if it can continue using its engineering prowess to keep its customers happy, and help it acquire new ones, its frenemies in the content and delivery business will decide that they can make much more money working with it than trying to thwart it. It’s also betting that once it has established itself as one of the big players in streaming, it won’t go away anytime soon. “Once a subscription video service has achieved profitability and scale in a market (20% to 30% of households),” Hastings writes in his “Long-Term View” manifesto, “it is very likely to be able to sustain that profit stream for many decades. At that percentage of households, our advantages in content acquisition and member acquisition are considerable.”

    This still feels a lot more tenuous than the competitive positions held by broadcast networks for decades, and the cable networks now. Quitting Netflix is easy; its subscriber churn rate is an estimated 40% to 50% a year. Holding on to customers will require continual upgrades in technology and content (not to mention avoiding Qwikster-like missteps).

    But upgrades are what software engineers do. Netflix has figured out how to succeed, for now at least, in a world where it doesn’t own the pipes and can never afford to stop improving. A lot of its would-be competitors really haven’t.

  • How Long Will You Be Willing to Tweet for Free?

    Seven years ago, Nicholas Carr and Yochai Benkler made a bet. They did this in the comments section of a blog post, there was no money at stake, they never followed up Benkler’s suggestion that they appoint “between one and three people” (I’d call that two) to determine the winner, and they never decided when exactly the bet would come due (two years? five years? ten?).

    For a time, though, it seemed like a big deal. Carr, the former HBR executive editor who now writes brilliantly skeptical books about the digital revolution, had observed entrepreneur Jason Calacanis’s attempt to unseat news aggregator Digg by offering paid jobs to the most influential diggers, and proposed that “large-scale social media” was on the verge of being “subsumed into professional media.” Benkler, the law professor (then at Yale, now at Harvard) and prophet of the transformative power of “commons-based peer production,” responded with a suggestion that they make it a formal wager:

    We could decide to appoint between one and three people who, on some date certain — let’s say two years from now, on August 1st 2008 — survey the web or blogosphere, and seek out the most influential sites in some major category: for example, relevance and filtration (like Digg); or visual images (like Flickr). And they will then decide whether they are peer production processes or whether they are price-incentivized systems. While it is possible that there will be a price-based player there, I predict that the major systems will be primarily peer-based.

    Carr said that was a good idea, but the term should really be ten years, or maybe five. And the bet was on, sort of. The Guardian wrote about it. Tim O’Reilly wrote about it. I wrote about it in my column for Time. And, of course, somebody created a Wikipedia page about it.

    And then, well, most everybody forgot about it. I finally thought a couple months ago to check in on where things stood and discovered that, last year, Carr and Benkler had each claimed victory. Sigh.

    After watching the collisions and collusions between peer-based and professional media during the hunt for the Boston marathon bombers, I decided to look again. I found that Matthew Ingram, a professional journalist with great enthusiasm for social media, had declared Benkler the winner. So had some guy on Quora. Benkler is a neighbor and friend, so I can’t claim to be entirely neutral here, but I think I’m a lot more sympathetic to Carr’s views than Ingram and the Quora guy (his name is Lee Ballentine) are. Still, given the wording of the bet, I’ve also got to say Benkler is the winner, so far (it seems like we ought to check back in one last time in 2016). Jason Calacanis’s professionals didn’t displace Digg, the volunteer users of reddit and Twitter and Facebook did. And if you survey the traffic-giants of the Internet, as Benkler did in his claim of victory, it’s clear that more of them are, to use the terms that Benkler laid down in 2006 and Carr agreed to, “peer production processes” than “price-incentivized systems.”

    What most of them aren’t, though, is charitable organizations. Of Alexa’s 25 top sites globally, only Wikipedia is nonprofit. The vast majority of the content created and shared on Facebook and YouTube and Blogger and Twitter may not be “price-incentivized,” but much of it is promotional and self-interested, as Carr noted in his claim of victory, while the people maintaining the infrastructure and determining the rules are definitely profit-incentivized.

    As far as the wager goes, this is irrelevant. While Benkler is clearly a big fan of Wikipedia, Project Gutenberg, and other such cooperative efforts, he never said the “peer production” he was talking about had to be nonprofit. In the 2002 Yale Law Journal article in which he introduced the concept, Benkler mentioned Google’s PageRank, Amazon’s customer reviews, and Sony’s EverQuest (a massively multiplayer game) as examples of it.

    But the coexistence of peer production and money making within the same organization does carry with it lots of potential for conflict. As Benkler writes in his latest book, The Penguin and the Leviathan: How Cooperation Triumphs Over Self-Interest, the evidence that humans are motivated by things other than money keeps growing every day. But the context in which we act can determine whether we emphasize self-interest or cooperation. Benkler cites a study by Varda Liberman, Steven M. Samuels, and Lee Ross in which they had subjects play the Prisoner’s Dilemma game, in which two people are given a choice between cooperating or not. When they labeled this the “Community Game,” subjects cooperated 70% of the time; when they called it the “Wall Street Game,” that dropped to 33%.

    This is a reality that all the for-profit empires built upon peer-production may eventually have to face. All of them have succeeded by persuading users to treat them as Community Games, and share freely. They have done this in large part by, to borrow Tim O’Reilly’s phrase, creating more value than they capture. But you have to wonder if, as companies like Google and Facebook and Twitter inevitably become more entwined in the Wall Street Game, they’ll be pressured to capture more of the value they create, and the willingness of their users to engage in unpaid commons-based peer production will diminish. If that happens, the results of the Liberman-Samuels-Ross experiment would seem to indicate that the fall-off would be not gradual but precipitous. Capitalizing on others’ unpaid labor is great business — until suddenly it’s not.

  • Experiencing the Media’s Pro-Am Future in Boston

    Last week began in the Boston area with reenactments of bloody battles fought 238 years ago. I got there at 5 a.m. for one of them, standing with my family near the edge of Lexington Green as dawn broke and then, a little while later, a guy on horseback (presumably supposed to be Paul Revere, although in reality he’d lost his horse by then) rode past us to Buckman Tavern to warn the gathered patriots that the British Army was not far behind.

    It was a surprisingly affecting moment. One got at least a faint sense of how eerie, and terrifying, it must have been for the local farmers and tradesmen waiting in that tavern in the dark. And when the sound of British drums and fifes started wafting up Massachusetts Avenue, it carried real menace. A heavily armed professional force was marching up the road to confront a bunch of amateurs.

    The professional soldiers beat the amateurs in that first skirmish on Lexington Green, but they lost the day. By the time the British got to Concord, where they were supposed to locate and capture a stash of military supplies (which had been moved by then anyway), the local minutemen had mustered a force big enough to turn the army back. And as the British retreated back to Charlestown they were harassed all the way by an enemy seemingly lurking in every house and behind every bush.

    The amateurs won the war too, although it took a few years and by then they weren’t all amateurs anymore. There were more of them, and they knew the territory better.

    The U.S. armed forces and government have of course since grown into massive, professional organizations. Then again, Lexington and Concord at least are still governed by town meetings, and the amateur roots of the American Revolution still inform political debate and shape civic life across the country.

    Which brings us (sort of) to the rest of what happened last week. On the afternoon of the battle reenactments, a pair of young men whose place on the amateur-professional continuum has yet to be definitively determined (I’m guessing pretty amateur, but really, who knows) set off deadly homemade bombs near the finish line of the Boston Marathon. I was working at home and distractedly looking at Twitter when I first saw reports of the explosions, well before the news was on TV or radio. It didn’t take long, in fact, for an exasperated line of chatter to develop in my Twitter stream berating CNN and the other cable news networks for not having switched over yet to marathon news.

    My immediate concerns centered around the marathon runner who had spent the previous night at our house. Before long Twitter had pointed me to the Boston Athletic Association site, where I was able to discover that my friend had crossed the finish line just over 10 minutes before the bombs went off, and thus was almost certainly okay (indeed he was). After that I became a mostly passive, if unnervingly close-by, observer of Boston’s — and the American news media’s — strange, scary week.

    Most of the strangeness had to do with the events themselves (bombs go off at marathon, well-liked local kid turns out to be apparent terrorist, major American city shuts down as police hunt him down), but the communications revolution of recent years meant that we experienced the events in an unfamiliar way, too. The internet, and especially social media, have elevated amateurs, giving all of us the opportunity both to report and spread news ourselves and to consume it in largely unfiltered fashion. This in turn has put pressure both on government authorities and established media to react to what’s going on online, by turns debunking it and passing it on.

    I’m really not sure what this is going to mean for the relationship between citizens and government. New communication tools and technologies may enable the occasional “Facebook revolution,” but they also allow for government surveillance and monitoring of previously unimaginable scope.

    For the news media, though, economics are dictating what appears to be a permanent shift in the professionals-to-amateurs balance. Lots of businesses are being affected by changes in how they communicate with their customers. But for those in the business of communicating, it’s been especially dramatic — and devastating. Newspaper ad revenues in the U.S. have, after you adjust for inflation, fallen to levels last seen in the mid-1950s. As Nicco Mele and John Wihbey wrote earlier this month:

    At about the time the Berlin Wall fell, there were roughly 56,000 editorial jobs among American newsrooms. That number is now likely below 40,000, according to Pew, and one can imagine it falling further.

    So it’s not just that the amateurs are newly empowered. There are also ever-fewer professionals, with fewer resources at their disposal. Mele and Wihbey described this decline in the context of a proposal for how to keep the number of pros from dropping below 30,000. Let’s hope somebody figures that out — beyond the obvious self-interest involved, I do happen to believe that professional newsgathering has a lot of societal value. But the amateurs aren’t going away, and we pros shouldn’t want them to. We need all the help we can get.

    In the last few days, a lot has been written about the failings of either the professionals or the amateurs in Boston, and both sides have plenty of examples to point to. There were lots of media pros repeating nonsense on the air, while on Twitter amateurs and pros joined forces to disseminate the names of two suspects who were not in fact suspects. There were also plenty of professionals — and amateurs — getting it right. Even in the case of the much-maligned reddit, if Ryan Sholin’s account is to be believed, amateur moderators were furiously trying to stamp out errors and misinformation last week.

    I’m pretty sure Sholin’s account is to be believed, actually. Not because he works at Gannett, a newspaper publisher, but because I’ve been following him on Twitter for a couple of years and have never found him to be a purveyor of nonsense. Which is sort of the way things are headed. We all have to make our own decisions on who or what to trust.

    The difficulty, of course, is that most of us don’t have the experience, the judgment, or, really, the time to make informed, rational assessments of everything we see and read. I think I’m reasonably sophisticated at sorting through the news, and I’m definitely experienced at it, but for a while there Friday morning I got totally sucked in by, although happily didn’t retweet, incorrect reports on Twitter (presumably from scanner chatter) that police had Dzhokhar Tsarnaev surrounded in a house in Watertown. It’s almost over, I thought. But it wasn’t. Reading Twitter last Friday was, as Felix Salmon put it, “an exercise in massively multivariate real-time Bayesian analysis.”
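    Salmon’s line about Bayesian analysis can be made concrete with a toy calculation. Every number here is invented for illustration — the prior you’d put on a scanner-chatter rumor, and how reliably a given source passes along true versus false reports:

```python
def bayes_update(prior, p_report_if_true, p_report_if_false):
    """Posterior probability a rumor is true, given that it was reported."""
    numerator = prior * p_report_if_true
    return numerator / (numerator + (1 - prior) * p_report_if_false)

# Invented numbers: a rumor you'd give a 30% prior, from a source that
# reports 80% of true events but also passes along 40% of false ones.
p = bayes_update(prior=0.30, p_report_if_true=0.80, p_report_if_false=0.40)
print(f"After one report:  {p:.0%}")

# A second, independent report from a similar source updates again.
p2 = bayes_update(prior=p, p_report_if_true=0.80, p_report_if_false=0.40)
print(f"After two reports: {p2:.0%}")
```

    The point of the sketch: a single unreliable report should move your belief only modestly, and corroboration from genuinely independent sources is what drives it up — which is roughly the sifting Twitter readers were doing by hand that Friday.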

    Still, I stuck with Twitter as my primary news filter, and I got better at sifting wheat from chaff as time went by. The array of information — from descriptions of what Pete Williams was saying on NBC to links to the best new articles in the Boston Globe or New York Times to speculations about what was really happening in Watertown as the day wore on to reports from co-workers of loud bangs down the street — was simply greater and delivered in more timely fashion than one could get from any other one source. Yes, some percentage of it was misinformation, but the media were publishing and broadcasting errors long before the social media age. What’s different now is the speed at which error can be disseminated and the fact that anybody can disseminate it. But bad information can also be debunked more quickly, and debunked by anyone.

    Sorting through the noise and getting stories straight should continue to be a way for media professionals to add real value. It may even help their employers stay in business. The professionals won’t be able to do it in a vacuum, though, and they shouldn’t want to. The media have gone pro-am. This brings all sorts of complications with it, but the amateurs can also provide essential reinforcement to the professional media’s dwindling ranks. There are more of them, and they often know the territory better. I’d rather have them on my side than fight against them.

  • Reinhart, Rogoff, and How the Macroeconomic Sausage Is Made

    After watching a presentation by Kaggle founder and CEO Anthony Goldbloom at a conference last year, I went up to the front of the room to ask him a question about macroeconomics.

    Kaggle organizes competitions in which data scientists (which in most cases means anybody who wants to sign up) compete to build predictive models based on huge troves of data. Goldbloom founded the company after working as a macroeconomic modeler at the Reserve Bank of Australia and the Australian Treasury.

    “Could you use the Kaggle approach to make macroeconomic predictions?” I asked him.

    “No,” he replied. “Not nearly enough data.”

    I couldn’t help but think back to that as controversy erupted this week over Harvard economists Carmen Reinhart and Kenneth Rogoff’s oft-cited three-year-old finding that economic growth plummets when a country’s debt-to-GDP ratio exceeds 90%. Three University of Massachusetts economists — Thomas Herndon, Michael Ash, and Robert Pollin — came out with a working paper that recrunched the Reinhart and Rogoff data set and arrived at a very different result: instead of average -0.1% growth in countries with debt/GDP of more than 90%, they came up with 2.2% growth.

    Most of the attention since then has focused on an Excel error that Herndon, Ash, and Pollin found — which caused five countries to be excluded from the analysis — and Reinhart and Rogoff have subsequently acknowledged. That’s pretty embarrassing, but it only changed the result by 0.3 percentage points. Most of the difference had to do instead with how Reinhart and Rogoff weighted the results from different countries. They chose to give each country’s average growth in a particular debt/GDP range the same weight, regardless of how many years the country had been in that situation. As Herndon-Ash-Pollin write, this isn’t an indefensible approach (they do argue that Reinhart and Rogoff should have devoted a lot more ink to defending it). But by taking a different approach, and instead weighting countries’ results by how many years they were above 90% debt/GDP, they were able to get a very different result.
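    The weighting dispute is easy to demonstrate with made-up numbers (these are not the actual Reinhart-Rogoff figures). Suppose one country spent 19 years above the 90% threshold with decent growth, another spent a single catastrophic year there, and a third spent four middling years:

```python
# Toy illustration of the weighting dispute (invented numbers, not the real data).
# Each entry: (country, years spent above 90% debt/GDP, average growth in those years)
episodes = [
    ("Country A", 19, 2.5),   # long episode, decent growth
    ("Country B", 1, -7.0),   # single disastrous year
    ("Country C", 4, 1.0),
]

# Reinhart-Rogoff-style: each country's average counts equally.
country_weighted = sum(g for _, _, g in episodes) / len(episodes)

# Herndon-Ash-Pollin-style: each country-year counts equally.
total_years = sum(y for _, y, _ in episodes)
year_weighted = sum(y * g for _, y, g in episodes) / total_years

print(f"Equal-country weighting: {country_weighted:.2f}%")
print(f"Country-year weighting:  {year_weighted:.2f}%")
```

    With these invented inputs, equal-country weighting yields negative average growth while country-year weighting yields solidly positive growth, because the one-year disaster counts as much as the 19-year episode in the first scheme and one-nineteenth as much in the second. That, in miniature, is how the same data set produced both -0.1% and 2.2%.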

    This is watching the sausage of macroeconomics being made. It’s not appetizing. Seemingly small choices in how to handle the data deliver dramatically different results. And it’s not hard to see why: The Reinhart-Rogoff data set, according to Herndon-Ash-Pollin’s analysis, contained just 110 “country-years” of debt/GDP over 90%, and 63 of those came from just three countries: Belgium, Greece, and the UK.

    This is a problem inherent to macroeconomics. It’s not like an experiment that one can run multiple times, or observations that can be compared across millions of individuals or even hundreds of corporations. In the words attributed to economist Paul Samuelson, “We have but one sample of history.” And it’s just not a very big sample.

    So what to do about it? One response is to dig for more data, and Reinhart and Rogoff have been doing that, going back to 1800 to examine episodes of public debt overhangs. Another is to have different people crunch it in different ways, which is what Herndon-Ash-Pollin did, or assemble different data sets, as several other scholars have done.

    But the biggest challenge may be how to present it. My reading of Reinhart-Rogoff, Herndon-Ash-Pollin, and the other papers linked to in the preceding paragraph is that rising debt loads do weigh on growth. Yes, there’s causation at work in both directions: low growth results in bigger debts — which has clearly been the case in the U.S. over the past couple of years. But attempts to separate that effect out by looking at growth rates well after a spike in debt do indicate slower growth after higher debt. And for economists of every school but so-called modern monetary theory, it’s logical that big debts would eventually eat up resources and slow growth.

    What there isn’t, though, is an obvious tipping point where debt becomes too high, and deficit spending becomes a drag rather than a stimulus. At least not one that’s obvious before the fact. The initial Reinhart-Rogoff research seemed to indicate a sharp dropoff in average growth after debt passed 90% of GDP. But they also reported a significantly smaller dropoff in median growth, and their subsequent analyses, as well as the Herndon-Ash-Pollin rework of their data, similarly show a dropoff but not a dramatic inflection point.

    In the 1990s, the consensus seemed to be that for the U.S. the inflection point was a public debt/GDP ratio of 50% — which is exactly what the country was nearing at the time. Higher than that, and the bond market vigilantes would punish the U.S. with much higher interest rates on government debt. The central teaching of what came to be known as Rubinomics was that cutting the deficit would actually stimulate the economy as it brought interest rates down.

    Now, of course, U.S. public debt is up to 76% of GDP, yet the bond market vigilantes all seem to have retired or moved to Europe. In the long aftermath of a global financial crisis, with deflation a real threat, the U.S. can get away with running huge deficits with no immediate consequence. In fact, the Keynesian reasoning goes, big deficits now will lead to a better long run growth picture (and thus lower future debt/GDP ratios).

    Is this reasoning correct? Well, right now the evidence would seem to support it: The U.S. is muddling through, while austerity measures have pushed Europe back into recession and most of Southern Europe into depression. For whatever it’s worth, Reinhart and Rogoff have advocated continued deficit spending too — at least for now.

    But this is macroeconomics. It’s hard to muster conclusive evidence, and almost impossible to generate much in the way of useful predictive ability. One response to this fog would be to throw up our hands and not do anything at all. Another is to acknowledge that our knowledge is limited and proceed anyway on a mix of data, theory, and intuition.

    This, to a certain extent, is what the Reinhart-Rogoff project of the past few years (most notably their book This Time is Different) has been all about. It’s a combination of history, data-crunching, and informed opinion — intended to be consumed and debated by an audience far beyond academic macroeconomics. Which is exactly what’s happening now. That can’t be a bad thing, can it?

  • Building a Better Bitcoin

    How much is a bitcoin worth?

    Well, it’s worth whatever somebody will pay for a unit of the online currency, which as I write this is $209, up from $142 last Friday, $44 a month ago, and $4.93 a year ago. This huge run-up — the latest spike began with the EU’s botched rescue of Cyprus’s banks — has led to much talk of a bitcoin bubble.

    The word “bubble” has been greatly overused in recent years. My understanding is that you’re in a bubble when the price of an asset becomes completely detached from its intrinsic value. It’s a bubble when the price you pay for a share of stock cannot conceivably be recouped from the earnings of the company (this was Cisco in 2000) or the price you pay for a house cannot conceivably be recouped from rental earnings (this was Phoenix in 2005). The only way you can avoid losing money on your investment is for a greater fool to come along — in the case of real estate, a greater fool backed by an even-greater-fool lender — and take the asset off your hands.

    Bitcoins have no intrinsic value. They lay claim to no stream of future earnings. A price of $198 per bitcoin is surely not justified by the fundamentals. But neither is a price of 10 cents. There are no fundamentals.

    So as an asset, Bitcoin (I’m trying to follow Maria Bustillos’ rule of capitalizing the system but lowercasing the coins) is clearly in a bubble, and always has been. But maybe asset pricing is the wrong lens to be looking through here. A dollar bill lays claim to no stream of future earnings, yet nobody says there’s a “dollar bubble” because somebody’s willing to give you a candy bar for one. This even though a dollar is almost certain to buy you less in a few years than it does now. According to the Bureau of Labor Statistics, a 2013 dollar has one-tenth the purchasing power of the 1950 version.
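    As a back-of-the-envelope check of that BLS figure: a tenfold loss of purchasing power over the 63 years from 1950 to 2013 implies only a modest average inflation rate, as this sketch shows:

```python
# Implied average annual inflation if a 2013 dollar buys one-tenth
# of what a 1950 dollar bought.
years = 2013 - 1950
annual_inflation = 10 ** (1 / years) - 1
print(f"{annual_inflation:.1%}")  # -> 3.7%
```

    Steady low-single-digit inflation compounds into a 90% loss of purchasing power over six decades, yet nobody calls the dollar a bubble.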

    By contrast, bitcoins have been skyrocketing in value. This sounds like a good thing, but for a currency it’s really not. An economy where bitcoins were the means of exchange would have experienced 98% deflation over the past year. No one would be able to repay any loans, or really do business at all. What we want out of a currency is not price appreciation but stability. Monetary economists differ on whether the optimal stability is inflation of 0% or in the low single digits. Nobody thinks 98% deflation is healthy, and all but a small minority seem to think any deflation at all is a bad thing.
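    Where does that 98% figure come from? A sketch of the arithmetic, using the dollar prices quoted above ($4.93 a year ago, $209 now) and assuming the dollar prices of goods stayed roughly flat over the year:

```python
# A good's price in bitcoin shrinks by the same factor
# the bitcoin/dollar exchange rate grew.
price_year_ago, price_now = 4.93, 209.0  # USD per bitcoin
deflation = 1 - price_year_ago / price_now
print(f"{deflation:.0%}")  # -> 98%
```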

    So … bitcoins are without intrinsic value as assets, yet they have risen too fast in value to be much use as a currency. Kind of makes your head hurt, doesn’t it? But it also sounds a bit like a familiar commodity, gold, that’s also been on a roll, with its dollar price quintupling over the past decade. Gold has, over time, not been the greatest of assets to invest in. It’s not the greatest of currencies, either: Back when the gold standard was widely adhered to, nations struggled regularly with deflation. There’s persuasive evidence that the primary cause of the Great Depression was a refusal to unlink currencies from gold until too late. Still, gold has held onto its purchasing power over time. It remains something that people turn to in times of financial uncertainty such as now. And while there are skeptics these days who talk of a gold “bubble,” they don’t really mean it. That is, they may expect the price of gold to decline from the current $1,575 an ounce, but they don’t expect it to suddenly lose most of its value, as assets tend to do when real bubbles burst.

    There are some key differences between gold and bitcoins: Gold is a shiny metal that can be made into jewelry, electronic components, and dental fillings — meaning it has some intrinsic value, albeit nowhere near $1,575 a troy ounce. Bitcoins are made of otherwise valueless digits. And while mankind has treated gold as a store of value for millennia, bitcoins were first unleashed upon the world in January 2009, by a mysterious and pseudonymous cryptographer (or cryptographers).

    But there are important similarities. Both bitcoins and gold are pretty much impossible to counterfeit. (That is, whatever fakes you might be able to produce won’t get past an expert.) Also, bitcoins are “mined” — by computers that have to solve a tough mathematical problem in order to free a block of 25 coins. This isn’t exactly the same as gold mining, but in one crucial aspect it’s the same. Unlike dollars, which can be created at will by the Federal Reserve, the supply of both bitcoins and gold is determined by forces outside the control of elected or appointed government officials. Given the long history of governments debasing their currencies to the point of worthlessness, the limited-supply, non-governmental nature of gold and of bitcoins has its attractions.

    Bitcoins actually have an advantage over gold in this regard, because bitcoin mining generates a steady, predictable increase in supply, whereas the gold supply grows in fits and starts. Bitcoin creation is thus a bit like Milton Friedman’s “k-percent rule,” which proposed that the money supply be made to grow automatically at a steady rate that averts both inflation and deflation.

    The advantage of a “quasi-commodity money” like Bitcoin, writes University of Georgia economist George Selgin, “is precisely that by resorting to it one can avoid leaving the management of money either to central bankers or to the blind forces of nature. Instead, supply is determined once and for all by artificially-arranged resource constraints.”

    The question, really, is whether you can pick the right “resource constraints” ahead of time. Friedman’s rule of a steady percentage increase in the money supply proved problematic when the Federal Reserve actually tried it from 1979 to 1982 — it wasn’t clear which measure of the money supply was the right one to target. Since then the focus has moved on to inflation or, recently, nominal-GDP targeting.

    With Bitcoin it’s the supply that’s targeted, with steadily decreasing returns to mining and the number of bitcoins set to top out at 21 million around 2040. From the Friedman perspective, this is not a great k-percent rule. Instead, it’s a recipe for severe deflation — too few coins chasing a growing number of goods for sale. The flip side of deflation is rising prices for bitcoins in other currencies, which is what we’re witnessing now. Prices have risen so far, so fast, that it seems inevitable that they’ll collapse at some point (because, remember, they have no intrinsic value). That doesn’t have to be the end: Bitcoin already survived a price boom and bust in 2011. But far fewer people were paying attention then, and on its current trajectory the market seems destined to become dominated by gamblers and greater fools.
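    The 21-million cap falls out of the halving schedule. These parameters come from the Bitcoin protocol itself, not the article: the reward started at 50 coins per block in 2009 and halves every 210,000 blocks, so cumulative supply is a geometric series. A sketch, ignoring satoshi-level rounding:

```python
# Cumulative bitcoin supply under the halving schedule.
BLOCKS_PER_HALVING = 210_000
reward = 50.0  # coins per block at launch; 25 as of this writing
total = 0.0
while reward >= 1e-8:  # stop once the reward falls below one satoshi
    total += BLOCKS_PER_HALVING * reward
    reward /= 2.0
print(total)  # converges just under 21,000,000
```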

    Then again, lots of important financial innovations start out like that. They become topics of public fascination, prices are bid up to crazy levels, then they crash. Sometimes that’s the result of design flaws, but often it’s just because we don’t know how to use them yet. The infamous 1720 South Sea Bubble in Great Britain springs to mind. It wasn’t that publicly traded corporations such as the South Sea Company were such horrible things, just that nobody really knew how to behave around them. Eventually, we figured it out. Sort of.

    I tend to agree with Felix Salmon’s verdict that Bitcoin would be better off if it had been “designed to be used primarily as a payments mechanism, rather than as a store of value and a unit of speculation.” (The 21-million-coin limit seems like the most obvious flaw in this regard.) But I also know that nobody really knows what the right currency for this networked, globalized age will be. It’ll take experimentation, trial and error, and the occasional financial bubble to get us there.

  • Why President Kagame Runs Rwanda Like a Business

    In Western business circles, Rwandan President Paul Kagame is widely regarded as a hero. The leader of the rebel army that put a halt to the massacre of the country’s Tutsi minority by its Hutu majority in 1994, Kagame has been the country’s president since 2000 (and was the vice president and de facto leader before then). He has presided over an economic and social rebirth, with Rwanda making dramatic gains in health and development indicators (watching its recent progress on Gapminder is a remarkable sight). And he has assembled a high-powered Western fan club consisting of, among others, Howard Schultz, Bill Gates, and Tony Blair.

    In other circles, Kagame is not so popular. Amnesty International and Human Rights Watch both accuse him of heavy-handedly stifling political dissent. A United Nations report held him responsible for killings by a rebel group in the neighboring Democratic Republic of the Congo. Britain and Belgium have cut back on aid. In a lengthy Newsweek article in January, former New York Times correspondent Howard French depicted Kagame as an out-of-control tyrant.

    The best attempt I’ve seen at sorting out these opposed narratives was an article published last fall by Time’s Alex Perry that weighs the scales at least modestly in Kagame’s favor. So when Harvard Business School Professor Michael Porter invited HBR last month to attend a class where Kagame was the guest speaker, and to talk with Porter and Kagame afterwards, I was curious but also a little worried about being enlisted as a Kagame salesman.

    Porter is a member in good standing of the business-community Kagame fan club, and has just finished a new version of his case study (an earlier one is available here) on Rwanda’s economic transformation. It describes the country’s successful efforts to build what Porter long ago dubbed “clusters,” concentrations of industry and expertise that enable it to build competitive advantage. So far, Rwanda’s three big clusters are coffee, tea, and tourism, but Porter is convinced there are more to come.

    The case study doesn’t hide the fact that Kagame has many critics, but it doesn’t dwell on political issues. Curiously, though, Kagame’s Q&A with Porter’s students ended up dwelling only on political issues. This was mostly Kagame’s fault — he was only asked two questions, one about the Congo and one about what will happen when his current (and, according to Rwanda’s constitution, last) presidential term ends in 2017, and gave such long, rambling, combative answers to both that there was no time for anything else. Kagame’s staff said I could quote anything he said in class, but it was just too much; to get a flavor of what it sounded like, see Alex Perry’s epic Q&A with Kagame from last year.

    After witnessing that, I tried a different approach in my interview, mostly staying away from politics. The edited results are below:

    Clearly you’ve been very interested in getting outside input from the business community. And yet you bristle at getting it from the multilateral community.

    President Paul Kagame: If you want to learn anything about a country, I think you need to ask the one who wants to make investment in that country. The one who is thinking about the risks involved. They’re thinking about the return. If somebody comes to your country and says, “You know, this is a place to invest,” actually that is a good place. You see?

    But if you send someone and say, “Go on, look and find for me something that is at fault,” in any place in this world somebody will come up with piles and piles of things to report about.

    These human rights groups, they come with that kind of mindset. They’re critical. They even start being critical on issues that the people where they have gone don’t find a problem with.

    That’s the difference between these two worlds. Therefore, we always want to ally ourselves with these ones [gestures at Porter] because that’s where the real life is. That’s where the people living in my country are going to find something to make a difference for their lives. When somebody’s coming to invest in Rwanda and finds it ripe for investment, it’s a good place. No matter what else you say about it.

    This, for me, is the focus. People started asking in 2005, “Oh, President, are you going to leave when your term is up?” Then after that they say the same question. And I say, “What does it matter to you? You’re diverting me from the real issues of the day.”

    [To Porter] How did you hook up with this guy? How did this partnership evolve?

    Michael Porter: We met through Michael Fairbanks, who was at Monitor originally and then founded his own firm. It was more than a decade ago, probably closer to 15 years ago. The country was at a very interesting place on a very interesting track. So I’ve had the opportunity to be involved in the journey.

    There’s a zillion countries that say they’re the next Silicon whatever, and then there’s lots of countries that do really basic resource exploitation. Rwanda seems to have picked interesting places in between, especially with coffee and figuring out that coffee washing was really important.

    Was that something that came bottom up? Was that advice from people like Professor Porter? How did you focus on a strategy like that when so many countries struggle to focus?

    Kagame: It is a combination of factors. We have seen in our country that good ideas and different initiatives, well, they come from where they come from. Sometimes they may be picked up by the leaders from the people on the ground.

    For example, coffee. Rwanda used to grow coffee many years ago, and because they were getting nothing out of it they gave up on it. And we said, “You know what? We used to be good at coffee. Now the coffee has kind of disappeared, but we have an idea of how it can work for you.”

    You start in one area. Then success leads to another, you know? It keeps going like that.

    Porter: The principle that Rwanda illustrates so well is that building and diversifying an economy has to start with what you have. In these cases Rwanda had eroded assets, but there was a foundation and a proof point, a market test that these areas could be successful and they could be competitive.

    Then the discussion was really about, “How do we move forward? How do we upgrade? How do we make things more sophisticated? What are the bottlenecks? What are the constraints?”

    In the initial three areas of coffee, tea, and tourism, that effort has advanced quite far now. Rwanda is winning international awards and marketing globally, and tourism is booming.

    The next areas of growth partly are ones that are connected to the first ones. I think it was very clear that Rwanda has to develop its logistics and its physical infrastructure. IT [information technology] was an area where I think the president and his team understood that this was not only good for citizens in general, and good to enable government and healthcare and education, but it also provided an area where Rwanda could build economic activity. There was no other place in the region that really had taken that space.

    My biggest effort in Rwanda really has been on private sector development and organization and upgrading. Ultimately that’s what makes a country successful or unsuccessful. Government can’t do it. You have to have a vibrant private sector. It has to be competitive. It has to create good jobs. It has to be profitable.

    Kagame: Right. And what people like Professor Porter brought to this situation is that critical thinking, that understanding of these global issues and how they interconnect. And for us it is also the readiness to actually test and implement. You start with one thing, it gives good results, and it becomes an incentive to keep trying.

    So what do you want to be the next big thing after coffee, tea, and tourism? Or are you going to wait and see how it develops?

    Kagame: We are continuing with that and concentrating on the progress we’re making. But we’ve also discovered mining in Rwanda. We have more resources than we knew we had. So that’s an area that brings in money. And the services industry has been critical. In fact, it is among the leaders contributing to our GDP growth, with huge potential.

    All of these need powering. We need energy. So we are doing everything 24 hours a day thinking about how to increase our energy capacity.

    Of course, building human capacity is critical. We keep sending our people to institutions of higher learning in the sciences, engineering, and management. It’s the focus because we want our people to understand how the new world works.

    Porter: I think the IT area and financial services have now risen to the point that they represent a genuine opportunity. Broadband access is really quite unique for a country at this stage of development. And IT is now starting to interface with healthcare and education, and is powering financial services.

    Here’s a country where a critical part of the strategy was bringing the citizens together and giving them a sense that they are part of the solution, part of a nation, that they are Rwandans, not members of an ethnic group. Given the history, I think task number one was nation building and reconciliation.

    Kagame: Sometimes direct and simple conversations make a difference. I go to these rural areas and meet people and ask them, “How many of you own small businesses? Or have shops?” Many of them put up their hands, and I ask, “When somebody walks into your shop and is looking for soap or sugar or whatever, do you first ask them whether they are Catholics or Hutus or Tutsis? What does it matter to you? You want a customer, and that’s all you want.”

    People grasp it very quickly. They start valuing each other. They say, “Oh, I need him for what I don’t have and he needs me for what he doesn’t have.” That’s creating an awareness in society like never before: Yes, we need each other. We are more similar than different. It helps the society to move forward.

    And it’s your sense that business and economic activity do that?

    Kagame: I would rank it number one. The rest will follow. At the end of the day we’re just human beings. You want food and you want it for your family. Plus, you really need dignity, to be able to do something on your own and benefit from it. And there’s nothing that does that better than being able to do business.

    Porter: Or have a good job. Or make your farm more productive. These basic truths have become more understood in Rwanda. I give the president and his team a lot of credit for creating that atmosphere.

    I also think that Rwanda is unique, in my experience, in government being able to actually get things done. In most countries, things don’t get done. Roads don’t get built on time. Schools don’t get established. Teachers don’t get trained. Vocational training doesn’t work. And I think Rwanda, partly out of scarcity of resources and partly out of good leadership, has been able to actually implement and execute.

    The government is very disciplined, very focused on plans. Very focused on accountability. There’s an annual kind of national retreat of all the leaders in the country who really think about where we are, where we’ve been, where we need to go.

    It sounds like it’s run like a corporation.

    Porter: It’s really run much more rationally than most governments. Again, I think that’s partly possible because of the history.

    Kagame: Yes. It’s like you’re thrown in a swimming pool and you are trying to learn how to swim on the spot and get yourself out of trouble.

    Again, it’s an issue of incentivizing the people to act in a certain way. Even by using some of the simple conversations I was talking about. For example, when aid has been suspended or cut, you have to explain what has happened, and how and why it has happened.

    Then you also challenge them, saying, “But for how long are we really going to depend on handouts? When someone has decided to take it away, what happens to you? It is better to start focusing on what we can do for ourselves so we don’t always find ourselves stranded.”

    And you know, you see people lighting up. That is the moment when you bring in ideas, initiatives, some of the things that can work on the ground. They just grab it so quickly.

    We used to have people who would be fed by World Food Program and so on. That’s the situation we inherited. We said, “No, we need to feed ourselves. We can feed ourselves. This is how to do it because one day the World Food Program won’t show up.” In just three years from that time, we have had surpluses all the time.

    Porter: There’s some just marvelous data now in terms of just how much progress has been made, in education and healthcare and FDI and all kinds of areas. It seemed impossible a decade ago, but it’s happening and it’s reinforcing itself.

    And it’s all because of a certain pragmatic, forward-looking, we-have-to-figure-this-out mindset. That philosophy, that mindset really does have to come from the top but I think it’s going to sustain itself.

    Kagame: Right. Leadership, combined with the sense people have of how they link up with the leadership. Working together for a common goal that is theirs.

    Porter: It’s a very rich story about management and leadership and strategy and communication. And I think this is not a politics story. At the core is the private sector of economy — self-sufficiency, running a business well. It’s fascinating to see that play itself out.

    Photo by Jimmy Ushkurnis

  • The GOP Needs a New Product, Not a New Brand

    Since coming up short in the November elections, its fifth popular-vote loss in the last six presidential elections, the Republican Party has been engaged in an anguished discussion of what went wrong, and what needs to change.

    The latest example is the 100-page Growth and Opportunity Project (GOP, get it?) report that the Republican National Committee released last week. According to the RNC, “the Republican Party needs to stop talking to itself,” and figure out how to be more appealing to minorities, the young, and women. It also needs to get with this digital stuff (the biggest part of the report is devoted to campaign mechanics, especially the GOP disadvantage in use of data, social media, and other digital tools) and figure out how to use billionaire donors to its advantage rather than letting them hijack the party’s agenda.

    For somebody who came of voting age in the 1980s, this is all quite disorienting. I always thought self-flagellation was a Democratic thing. But times change, and the Republicans now are in a situation a lot like the Democrats then — still holding on to an advantage in statehouses and in the House of Representatives, but facing ever-stiffer headwinds at the national level.

    So what should the Republicans do about it? A key word in much of the discussion so far has been “rebranding.” (The RNC report uses the word “brand” five times and “rebranding” twice.) If only the party didn’t come across as so old and so angry and so white, the reasoning goes, it’d get more votes. This explains the sudden bursts of enthusiasm for the likes of Florida Senator Marco Rubio and — over the past few weeks — Maryland neurosurgeon Benjamin Carson.

    It could well be that a charismatic candidate who appealed to minorities, made better use of campaign technology, and embraced some modest policy changes (mainly on immigration and gay marriage) could sweep Republicans back into the White House. It’s not like Obama’s popular-vote majority was that overwhelming, and the current Democratic mix of affluent professionals, minorities, unionized workers, and the young isn’t exactly a natural coalition.

    But I think the Republicans are going about this all wrong. The party has been selling pretty much the same product for more than three decades now, while market conditions have changed. So far the self-examination has focused chiefly on its sales techniques; as detailed in the RNC report and Robert Draper’s New York Times Magazine cover story last month on young Republican operatives, GOP pollsters have been convening lots of focus groups in which people tell them the party comes across as old, angry, and out of touch. What most Republican leaders don’t seem to have worked very hard at yet is figuring out what voters outside the GOP base need and want.

    It’s like the flailing companies in Ted Levitt’s classic HBR article “Marketing Myopia” that err by thinking their job is to sell a product rather than satisfy a customer need. And because all of us at HBR have A.G. Lafley and Roger L. Martin’s Playing to Win on the brain these days, I also can’t help but contrast the Republican reaction so far to what Lafley’s Procter & Gamble did to revive the skin-care brand Olay.

    Oil of Olay was a bit like today’s GOP — its customers were aging and dying off, and younger women didn’t even really consider buying it. So P&G decided to target a different demographic, women from their mid-30s onward who were just beginning to notice signs of wear and tear on their skin, and reformulated the product using better, more expensive ingredients to fight “the seven signs of aging.” After that came a rebranding (from Oil of Olay to Olay), and all sorts of smart marketing and pricing choices that led to a spectacular revival.

    I realize this is skin lotion we’re talking about, not politics and policy. But the idea of starting by rethinking of what potential customers need, then building a strategy aimed at winning those customers over, has widespread application.

    In their book Grand New Party, published a few months before Barack Obama’s 2008 electoral triumph, young Republican thinkers Ross Douthat and Reihan Salam took just such a strategic approach to building a future Republican majority. The need they identified among the working-class “Sam’s Club voters” they deemed crucial was for policies that battled the economic insecurity that bedevils more and more Americans. Among other things, they recommended shifts in the tax code to favor young families (a much-expanded child tax credit, for example), health-care reform that combined catastrophic coverage for all with the removal of a lot of the current incentives for overspending, and job-creating investments in alternative energy.

    I don’t know if this would actually be a winning strategy for the Republican Party (one conservative critic at the time called it “Sam’s Club socialism“), but it is at least a strategy, and a forward-looking and hopeful one, that is not inconsistent with Republican tradition.

    After the book came out, the financial crisis and subsequent Great Recession made the problem of economic insecurity much, much worse. Instead of embracing the Douthat-Salam policy solutions or even the basic notion that insecurity is a problem, Republicans on the national level mostly ran bellowing loudly in the opposite direction. They did this in large part because the Obama administration was trying to address these problems, and they figured that outright opposition was a better tactic than tinkering at the margins of President Obama’s plans. As a result, I am hard-pressed to think of a single serious Republican legislative proposal at the national level over the past four years that addressed the problems of economic insecurity in a constructive way (yes, I’m sure somebody more wonkish and sympathetic than I will be able to identify a couple, but I think the basic point will stand).

    The tone has changed a bit since the November election, with even hard-line House Majority Leader Eric Cantor backing a new “Making Life Work” initiative “to improve the lives of you and your family” and the expanded child-tax credit undergoing a revival. But so far it’s just talk, and has yet to run the apostasy-punishing gauntlet of a Republican primary season.

    The primaries are of course a big part of the Republicans’ dilemma. If the people in charge of Oil of Olay had to be reelected by customers every two years, it would have been a lot harder for them to abandon those customers in search of new, more-profitable ones. And in a two-party system, the need to cobble together a majority adds a complexity to strategic choices that corporations selling products often don’t face.

    But without a strategic approach that starts with the wants and needs of potential voters — not with the current set of GOP policies — the party is doomed to drift.

  • Why Apple Has to Become More Open

    This is an age of openness and transparency. Hierarchies and tyrants are so 20th century. So why is it that the most successful corporation on the planet in this new millennium has been a secretive hierarchy run by a (now-deceased) tyrant? It’s a question I couldn’t help asking Don Tapscott after reading the new ebook he has co-authored with Anthony D. Williams, Radical Openness: Four Unexpected Principles for Success. (There’s a related TED talk by Tapscott, in case that’s how you prefer to digest your information.)

    Tapscott and Williams are the guys behind Wikinomics and Macrowikinomics (I’m still waiting for Wackywikinomics), and their new book is a nice, brisk little read that tempers its cheerleading for openness with some welcome grains of sober realism. But then there’s that Apple question, which I posed to Tapscott in an email. What follows is our edited exchange:

    I spent the first third of the book, where you talk about how companies need to be more open and transparent, thinking to myself, “But what about Apple?” Then to your credit, you bring up the paradox yourself. Apple is probably the most successful and innovative company on earth over the past decade, and it’s extremely closed and secretive. What’s up with that?

    I think people misunderstand Apple. It’s more open than most observers note, and that’s been at the heart of much of its success. Its greatest liability is a legacy of traditional closed behavior in several areas. And today the company is becoming more open in a number of those areas, as market pressures push it to open up even further.

    To begin, Apple is more transparent than one might think. Like all companies it has four major stakeholders: customers, employees, business partners, and shareholders. Yes, Apple is obviously super-secretive with its customers about product announcements. That can be a powerful marketing technique if you have the market muscle to pull it off. At the moment Apple can get away with this behavior, because it is so powerful in the niche markets in which it operates and it has a small array of products. But as it tries to expand its share of the corporate market, business customers will demand to be well briefed regarding Apple’s intentions and its product roadmaps.

    Most companies don’t have the luxury of creating market hype through secrecy. For them, openness is a better strategy. RIM is a good example of this. The struggling company was forced to share details about its BlackBerry 10 platform. It had to pump up the buzz around its platform to encourage its current customers to stay loyal and developers to make apps. And as Apple continues to lag well behind Android in market share, eventually it will be forced to be more open with consumers as well.

    It’s also true that Apple keeps its own employees in the dark. They have few details about what is in the company’s product pipeline. But such opacity does not derive from some inherent benefit of closed work systems. Rather, it is purely in aid of keeping the company’s product strategy secret from customers, and over time this will change too, because closed silos hurt serendipity and innovation. The open work systems within Google, for example, show how full collaboration within an enterprise pays off.

    So you’re predicting that the Great New Secret Apple Product hype will be a thing of the past in a few years?

    Yes. As competition intensifies and as Apple gets more serious about the enterprise marketplace, it won’t be able to maintain its product secrecy hype. It already needs to share important information with enterprise customers.

    What about its other stakeholders?

    In terms of its supply chain partners, Apple itself has huge transparency and visibility. It can see down through Foxconn all the way to suppliers two levels below. True, this visibility has not extended to the rest of the world, but in this age of transparency Apple can no longer keep its supply chain practices secret.

    Everyone knows about Foxconn’s labor relations and how its factories are run like minimum-security prisons. This is an enormous problem for Apple, not just for its reputation but also for the disruptions such working conditions can cause in the supply chain. So as Apple becomes naked it is being forced to get buff — to clean up its supply chain practices.

    As for transparency with its shareholders, Apple is actually very open — respected as one of the best companies on the planet for providing shareholders with pertinent information. The only area in which it has been faulted is whether it was sufficiently candid about Steve Jobs’s declining health.

    Paradoxically, Apple’s success is largely due to other kinds of openness — sharing and collaboration. The iPhone and iPad are nice pieces of hardware, but that’s not what created their market success. By opening up what are called “application programming interfaces,” Apple has enabled its customers and the world of software developers to build apps on its platform. In this sense Apple’s corporate borders are quite open and porous. It is a design and marketing company at the heart of an enormous business web of suppliers and software developers — based on openness.

    To be sure, Apple is obsessed with guarding its intellectual property, and has sought refuge in outmoded laws that today stifle innovation in our economy. But increasingly this comes at a cost too. Android, the open-source software platform developed by Google, quickly became the dominant operating system in the mobile marketplace. Open-source software stimulates creativity and attracts attention. Over time, openness tends to win out in the market. Google was smart to release the Android software’s inner workings so that manufacturers could tweak the operating system to their specific devices. Other companies have made similar moves. IBM gave away $400 million worth of software to the open-source Linux movement, and in exchange received many billions in savings and new business. That decision also blocked Microsoft from the enterprise computing market. I’d attribute 10 percent of IBM’s value to the strategic decision to embrace Linux.

    I still don’t think Android generates anything like the profits Apple makes from the partly closed iOS ecosystem. Again, do you figure that’s just a matter of time, that eventually Android will win?

    Yes. All things being equal (they haven’t been equal in the past, due to the ineptitude of mobile device manufacturers and Microsoft’s inability over two decades to create a coherent and truly user-friendly PC operating system), openness will win out. Of course there can be benefits to being closed, and if you have a proprietary lockup in a market, margins can be significant. But the arc of competitiveness is toward transparency, collaboration, and increased sharing of intellectual property, as Anthony and I discuss in the book.

    Every company needs a portfolio of intellectual property — kind of like a mutual fund. IP such as trade secrets should be protected. Some should be shared in a limited way, as when Nike placed 400 patents in the GreenXchange. And some should be placed in a commons, as IBM did with Linux.

    In fact it’s not only companies but entire industries that need to rethink their strategy. A great example is the pharmaceutical industry, which must place clinical trial data in a commons or the industry will collapse. Such sharing doesn’t undermine profitability. It enables it, as a rising tide lifts all boats and pharma companies compete on a higher level where they provide protected or proprietary products and services.

    Apple became the world’s most valuable company for a number of complex reasons, including the design genius of its brilliant founder. But when you look under the surface, it is more open than you might think and there is no real evidence that its secrecy has helped it succeed. As it comes under more intense competitive pressures, Apple will need to become an even more open company. This will be good for its customers and everyone else.

  • The (Not Very Deep) Meanings of the Dow’s New Record

    You’ve got to admire the staying power of the Dow Jones Industrial Average. It was replaced as the best measure of U.S. stock market performance in 1923, when Standard Statistics Co. unveiled its new stock index, which later became the S&P 500 (the Dow was and is simply the crude average of the prices of 30 subjectively chosen stocks). Within the investing business, both the Dow and the S&P have long since been supplanted by the more focused indexes compiled by the likes of Russell Investments and MSCI.
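
    To make the contrast concrete, here’s a minimal sketch (all prices and share counts invented, not real index data) of the two weighting schemes: the Dow’s price-weighted average versus an S&P-style capitalization-weighted index.

    ```python
    # Hypothetical illustration — figures are invented, not market data.

    def price_weighted_index(prices, divisor):
        """Dow-style: sum of component share prices over a divisor."""
        return sum(prices) / divisor

    def cap_weighted_index(prices, shares, base_value):
        """S&P-style: total market capitalization scaled to a base value."""
        total_cap = sum(p * s for p, s in zip(prices, shares))
        return total_cap / base_value

    prices = [100.0, 50.0, 25.0]
    shares = [1e6, 10e6, 4e6]   # here the $50 stock belongs to the biggest company

    dow_style = price_weighted_index(prices, divisor=3.0)
    sp_style = cap_weighted_index(prices, shares, base_value=1e4)
    ```

    In the price-weighted version, the same percentage move in a higher-priced stock shifts the index more than in a lower-priced one, regardless of how big the underlying companies are — which is the “crude” part of the Dow’s construction.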

    Yet it was the Dow’s busting through its all-time record Tuesday that got all the headlines — not the Russell 3000, which surpassed its 2007 peak last month, or the S&P 500, which as I write this is about 15 points short of its record close of 1565.15 on Oct. 9, 2007. (And don’t forget the poor Nasdaq Composite, the glamor index of the late 1990s, which is still miles away from its March 2000 pinnacle.)

    It’s a remarkable testament to the power of brand, and of habit. (And to having the nation’s leading financial newspaper invested in your success for more than a century — although not so much any more.) It’s also a point it would be pedantic to harp on, given that, despite its flaws, the Dow is sending pretty much the same message as the S&P and Russell 3000: the stock market is back!

    Yeah, the comeback is less impressive if you adjust for inflation. But if you factor in dividends, as the Wall Street Journal‘s Justin Lahart points out, all the gains are restored and then some.

    So the stock market is doing great. What exactly does that mean? There are three main ways of explaining stock prices:

    The first is basic economics — a share of stock is worth the present value of the future cash flows associated with it. Put another way, a company’s stock-market value is a function of how much money investors think it will make in the coming months and years. So a rising Dow or S&P 500 means investors think big American companies will be making more money. Which would be great economic news, except for a couple of things. One is that the link between big-company earnings and U.S. prosperity is weaker than it once was (the corporations that make up the S&P 500 get something more than 40% of their revenue from outside the U.S.). The other, even more important, caveat is that stock market investors aren’t reliable oracles. The S&P 500 hit its all-time nominal peak in October 2007, with a global financial unraveling already under way. Investors can miss on the downside, too: “The stock market has predicted nine of the last five recessions,” as economist Paul Samuelson liked to say.
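
    As a toy illustration of that first explanation (all numbers invented), discounting forecast cash flows back to the present looks like this:

    ```python
    # Hypothetical sketch of present value — forecasts are made up.

    def present_value(cash_flows, discount_rate):
        """Discount a list of annual cash flows (year 1, 2, ...) to today."""
        return sum(cf / (1 + discount_rate) ** t
                   for t, cf in enumerate(cash_flows, start=1))

    # Invented per-share cash flow forecasts for the next five years:
    forecast = [5.0, 5.5, 6.0, 6.5, 7.0]

    value_optimistic = present_value(forecast, discount_rate=0.08)
    value_pessimistic = present_value(forecast, discount_rate=0.12)
    ```

    Anything that raises expected cash flows, or lowers the rate at which investors discount them, raises the computed value — which is why the same rising index can reflect rosier forecasts, cheaper money, or both.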

    A second way to think of stock prices is as a reflection of dynamics only tangentially related to economic value. Wall Street Journal co-founder Charles Dow saw “waves on a beach” in the movements of the average he created; subsequent observers saw all sorts of other, more intricate patterns. For a time this chart-reading was dismissed by financial economists as purest hokum, but it has regained some favor in recent years. Academic research has shown that stock prices clearly exhibit momentum — once they’re moving in one direction, they tend to keep moving that way. And when they change direction, they often do so far more sharply than any calmly rational revision of economic forecasts would warrant. So the fact that stock prices are nearing an all-time high could mean that they’ll keep rising. Or it could mean they’re about to collapse.

    Finally, there’s supply and demand. Stock prices go up when more people are buying stocks. This could be because they’re gung ho about corporate earnings prospects (explanation 1), or that they’re mindless trend-followers (explanation 2). But it could also just mean that they have some money and need a place to put it. Right now the main alternative to investing in stocks — fixed-income investments ranging from savings accounts to corporate bonds — is not attractive. Interest rates are infinitesimally low, and because there’s simply no room for them to go much lower, the other way of making money on fixed income — by seeing the value of bonds go up as interest rates fall — isn’t available. Stocks may be volatile and unpredictable, but at least they can go up. So people buy them. And they go up.

    As explanations go, this last one is pretty frustrating: stock prices are going up because people are buying stocks. But it has the virtue of being true, something one can never be sure of with the other two.

  • Confronting Just-Back-From-Vacation Dread

    On the way to work one day last week, I caught myself having a terrible thought.

    “Let’s see,” I mused. “No more vacations coming up soon. In fact, probably no more long vacations this year. Excellent.”

    It’s not that I don’t like vacations. It’s just that returning from them, as I was doing last week, is such a pain.

    I had pretty much wiped out my to-do list before I left, so it wasn’t that I was way behind. I’d also spent 20 minutes every day during vacation checking my emails and responding to the handful of urgent ones, so I didn’t return to a scary backlog. But there was still no avoiding a big post-vacation pile of little questions and proposals begging for responses. And, because I’d told the people I’m working with that I’d be back in the office on Tuesday, there was some big stuff, too. By the end of the day I had five article drafts and a book manuscript in my inbox, all requiring some sort of response. Plus I had a moderately busy calendar of meetings and events for the week.

    The result: I started the week playing catch-up, yet never really caught up. I wasn’t working on the things I really wanted to be working on, I was feeling guilty about not getting back to people, and I didn’t feel like I was making much progress. It seemed like it would take a week or two to get back to where I felt in control again — which is what got me looking forward to the long stretch of vacation-free months ahead of me.

    Not all jobs are like this. Back when I was a newspaper reporter, a colleague would simply cover for me when I was gone. When I got back, I’d have to do a little bit of catching up on what I’d missed, but there wasn’t ever any kind of backlog I needed to work through — if something really important happened on my beat, the paper had already run an article on it. Same goes for any job where someone else has to take over your duties while you’re on vacation.

    Interestingly, bank regulators in the U.S. require that bank employees “in key or influential positions” take off at least two straight weeks every year — long enough that someone else has to take over their duties for that time, making it harder to cover up any ongoing shenanigans.

    I’m not going to hold my breath for a similar decree from magazine regulators. For one thing, there aren’t really any magazine regulators, at least not in my country. And the nature of the job that I do — which is, at least in this respect, similar to the work of most white-collar workers these days — doesn’t lend itself to vacation handoffs. I work on projects that last weeks, months, and sometimes years, usually involving several people, with each playing a distinct role. It’s usually easy enough to put off my part of the work for a week, while it would take tons of coordination and bringing-up-to-speed to hand it to somebody else. So, most of the time, finding someone to take over for me isn’t practical — and vacations become postponements of work rather than vacations from it.

    This wouldn’t be a problem if I perfectly calibrated my workload to leave room for a few weeks of vacation a year. I have lots of freedom to decide how much work to take on, so in theory this would be possible. But in practice, I’m quite impressed with myself when I get the work/time balance right for a single workweek. Adding in the vacation calculation is simply beyond me.

    Seeking advice, or maybe just solace, I typed “vacation” into the hbr.org search box. It turns out that if we wanted to start a spinoff publication called the Harvard Vacation Review, we’d totally be able to fill it. We’ve run lots of pieces on the importance of vacations — so okay, I’ll keep taking vacations.

    There was also Dorie Clark’s piece on “How to Take a Month Off,” which I remembered reading when it came out. But part of the attraction of taking a week off is that you don’t have to spend a year planning for it, as Dorie did for her big trip.

    Then I came across Peter Bregman’s “The Right Way to Come Back from Vacation,” which addressed exactly the catch-up problem I was facing. Peter’s advice was basically to stop trying so hard to catch up. Block out lots of time for activities that aren’t related to clearing out your backlog, and spend time thinking about who you are at your best and what your goals are. Then tackle the inbox, but with a clear sense of what’s important and what you can leave hanging.

    There was no way I was going to do exactly what Peter advised. I’m too wary of self-help advice for that, especially when it’s self-help advice from a guy I’ve known since he was a freshman in college (and I was a junior). But I found myself nonetheless following his recommendations in spirit, by ignoring my to-do list for three hours and writing this. And now I really do feel much better — I took control, I accomplished something tangible, and I forced myself to think about things bigger than just catching up. True, I’m still far from caught up. But I’m no longer resolving never to take another vacation.

  • When the Minimum Wage Makes Economists Smile

    Chairman of the President’s Council of Economic Advisers is a grand title, and it’s been held by some pretty impressive academic economists over the years (Arthur Okun, Marty Feldstein, Joe Stiglitz, Ben Bernanke, Christina Romer — to name a not-entirely-randomly chosen few). But it’s usually hard to detect the Chairman’s fingerprints in administration economic policy. The big decisions are made in the West Wing of the White House, not over in the Executive Office Building where the Council is housed.

    So when a chairman does have a clear impact, it gets noticed. Then-chairman Glenn Hubbard, for example, pushed for and got a reduction in taxes on dividends in 2003. And, in the State of the Union Address Tuesday night, current Chairman Alan Krueger got a kind of shout-out from the President, in the form of a proposal to raise the federal minimum wage all the way from $7.25 to $9 and index it to inflation after that.

    In 1992, when New Jersey raised the state minimum wage from $4.25 to $5.05, Krueger and his then-Princeton colleague David Card surveyed 410 fast-food restaurants in New Jersey and eastern Pennsylvania before and after the wage hike. The idea was to compare changes in fast-food employment in New Jersey, where the minimum wage had risen, with those in Pennsylvania, where it stayed constant at $4.25. The surprising result: fast-food employment went up in New Jersey relative to Pennsylvania.

    This was surprising because the basic supply-and-demand model of economics teaches that, when you raise the price of something (in this case, low-skilled labor) demand for it will go down. There had been a number of philosophical objections posed to this approach through the years — among them the argument that employers possess more power and information than individual workers in most labor markets, allowing them to push wages below the optimal level in the absence of collective bargaining or government intervention. But Card and Krueger now had empirical evidence that the “textbook model,” as they put it, didn’t work. The New Jersey fast food restaurants did pass their increased wage costs on to customers in the form of higher prices — but they weren’t enough higher to hurt business.
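
    The Card-Krueger design is a difference-in-differences comparison: subtract Pennsylvania’s employment change from New Jersey’s, so that trends common to both states are netted out and what remains can be attributed to the wage hike. A minimal sketch, with hypothetical employment figures (not their actual survey averages):

    ```python
    # Hypothetical illustration of difference-in-differences — the
    # numbers below are invented, not Card and Krueger's data.

    def diff_in_diff(treated_before, treated_after, control_before, control_after):
        """Change in the treated group minus change in the control group."""
        return (treated_after - treated_before) - (control_after - control_before)

    # Made-up average full-time-equivalent employment per restaurant:
    nj_effect = diff_in_diff(treated_before=20.4, treated_after=21.0,   # New Jersey
                             control_before=23.3, control_after=21.2)   # Pennsylvania
    # A positive value means NJ employment rose relative to PA after the hike.
    ```

    The control group matters: in this invented example NJ employment barely changes on its own, but because PA employment fell over the same period, the estimated effect of the wage increase is positive.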

    This research was a sensation, as economic research goes. It got lots of media attention back in the early 1990s, and has continued to inspire economist after economist to attempt to refute or back up its conclusions (Google Scholar lists 8,780 citations, and Wikipedia summarizes some of the major work). It’s probably accurate to say that most economists still don’t believe that raising minimum wages is a reliable way to increase employment (one hopes that Brian Barry and Anil Kashyap of the University of Chicago will ask their Economic Experts Panel about this soon) — but I also get the sense that the percentage of economists who think it has a substantial negative effect on employment has declined since the initial Card-Krueger research was published. Economists in general have become a bit less trusting of “textbook models” than they were in the 1970s through 1990s. And while it’s dangerous to equate changing fashions in the economics profession with truth, I’ll go ahead and say that the business groups making dire claims about the negative economic impact of a minimum wage increase are mostly blowing smoke.

    What ails the U.S. economy, and in particular its workers, though, goes well beyond the minimum wage. According to the Bureau of Labor Statistics, 3.8 million people, or about 3% of the country’s wage and salary workers, made the minimum wage or less in 2011. Yet workers across the income spectrum, except for those at the very top, have been stuck in neutral for a while. Except for a brief uptick during the dot-com era, labor’s share of income has been on a steady decline since the early 1970s. Through the years, this has been mostly attributed to globalization (capital can go anywhere on the globe in search of the highest return, while workers are generally stuck in the country they came from) and technological change (machines are replacing workers). Lately there’s increasing sentiment, perhaps not so much among economists as among others who care about economic policy, that maybe political decisions and changing social mores have played a big role, too.

    As Jonathan Schlefer wrote on hbr.org in November, early economists like Adam Smith and David Ricardo believed wages were set by “habits and customs of the people,” to use Ricardo’s words, as much as by economic forces. That’s where something like the minimum wage — or collective bargaining by labor unions — comes in. If, in a free market, the wages and salaries paid closely approximate the actual value of the work done, then minimum-wage laws and unions can only get in the way. But if labor markets are naturally riddled with inefficiency and affected by custom and habit, then laws and unions can conceivably bring a healthier economy — and higher profits for business — by raising wages.

    Studies of retailers seem to indicate that this might be the case. So does the current example of the Northern European countries, which combine strong unions and high wages with higher competitiveness rankings than the U.S.

    Of course, it’s really hard to imagine at this point that unions will ever regain much of a foothold in the U.S. private sector. The minimum wage is irrelevant to most of the workforce, too. And it may still be that the most reliable way to increase the wages of American workers is simply to upgrade their skills. But I get the sense that the conversation about pay in the United States is just getting started — and that it’s not going to be dominated by the models out of economics textbooks.

  • Can the NFL Stop the Goalposts from Moving?

    This year’s Super Bowl, as has become custom, brought Americans together like no other event. There was the game, which in the end turned out to be pretty exciting. There was the power outage, which was pretty dramatic, too. There was Beyoncé’s half-time performance. And there were of course the ads. In an age where TV audiences keep shrinking and fragmenting, the Super Bowl appears to have once again set a viewership record. It is a spectacle that has become so popular that many of us pay attention mainly because everyone else is paying attention, too — a self-reinforcing network effect that can be really hard to break.

    The only hint that the Super Bowl’s status might be at all endangered came, interestingly, from a series of ads sponsored by the National Football League. One, showing the development of football rules and safety equipment through the decades, has been running during games for a while. But the league debuted several more on Sunday night, most of them heartstring-tuggers meant to play up our sentimental attachment to the game.

    Clearly, the NFL is worried — and for good reason. The medical evidence that the game exacts a terrible toll on its players has been piling up. As Paul Barrett details in a cover story in the latest Bloomberg Businessweek, some of the same plaintiffs’ lawyers that successfully took on asbestos manufacturers and tobacco companies are now targeting the NFL, accusing it of long covering up the risk of severe brain injuries to players.

    Barrett figures that, since these lawyers want the NFL to keep thriving so it can pay for a big ongoing settlement, the league isn’t in any real danger. But it’s not as simple as that. Norms for what is acceptable behavior (and acceptable entertainment) can shift quickly and unpredictably. Just in the past few decades we’ve seen rapid shifts in public attitudes on, for example, smoking, race, sexual orientation, and seatbelt-wearing. And in the narrower field of sports that pose a danger to noggins, boxing has gone in the U.S. from spectacle with near-universal appeal to somewhat unsavory fringe sport. Surely this could happen to football, too.

    Legal scholar Cass Sunstein, in a fascinating 1996 examination of social norms (that link is to JSTOR; an earlier, free version is here), referred to such shifts as “norm cascades.” He said they were often set off by “norm entrepreneurs” — activists or politicians who by signaling commitment and building coalitions can induce “a ‘tipping point’ when norms start to push in new directions.” Sunstein’s very interesting (if also very 1996) list of norm entrepreneurs: civil rights leader Martin Luther King Jr., conservative thinker William Bennett, Nation of Islam leader Louis Farrakhan, feminist legal scholar Catharine MacKinnon, President Ronald Reagan, and religious activist Jerry Falwell.

    This is what has the NFL concerned. If football, in its current form and with its current health risks, were suddenly presented to the American people out of the blue, it’s pretty hard to imagine it catching on. Despite the NFL’s best efforts, it hasn’t really caught on in a big way anywhere outside of North America. In the U.S., the sport has history and ubiquity on its side. Here accepting its risks (at least as spectators; I’ve never played tackle football, and happily my kid has never shown the slightest interest) is the social norm.

    Football can also, I should add, be hugely entertaining. But this fall, despite the fact that two of my favorite teams (the Alabama Crimson Tide and the San Francisco 49ers) had spectacular years, I found myself watching with increasing queasiness. The culprit, I think, was a column I’d read in August by the conservative commentator and baseball fan George Will. “Are you ready for some football?” it began. “First, however, are you ready for some autopsies?”

    Football, Will concluded, “is a mistake because the body is not built to absorb, and cannot be adequately modified by training or protected by equipment to absorb, the game’s kinetic energies.” Now I don’t know that this qualifies Will as a “norm entrepreneur” — it was just one column, and I haven’t noticed him harping on the subject since. But he did begin shifting my norms. I now pay close, dismayed attention to on-field injuries — one of the big attractions of this Super Bowl for me was that there weren’t many of them. And I find it harder and harder to sit through entire games.

    If the NFL can make football less dangerous by improving equipment and changing on-field norms (as it has tried to do with its harsh treatment of the New Orleans Saints after it came out that players got bounties for injuring opposing quarterbacks), the threat to the league will fade. Failing that, though, football remains at constant risk of a norm change. Or, to put it differently, the NFL may find the goalposts being moved on it.

    This phrase has become popular lately in American politics (although it may have originated in the United Kingdom, with reference to rugby). It is usually wielded as a complaint: moving the goalposts is cheating. In football, if moving the goalposts were even remotely practical, it would be cheating. In politics and other social activities, it is actually — if you can get away with it — a sign that you’re succeeding. Norms, be they behavioral, political, social, or of some other sort, play an enormous and underacknowledged role in shaping our world.

    In business, for example, the current norm allows U.S. corporations to go to absurd lengths to avoid paying taxes without facing ostracism from peers or all that much disapprobation from anywhere else. Without a shift in that norm, Mihir Desai argued in HBR last year, it’s hard to imagine a truly successful reform of the corporate tax system. Jonathan Schlefer argued recently on hbr.org that rising income inequality is the product more of changing norms than of economic forces. And then there’s the NFL, currently the most economically successful sports league in the world. Its continued success may rest on what sure seems like a potentially fragile social norm — our willingness to keep supporting an activity that permanently and seriously damages a high percentage of its participants.

  • How Amazon Trained Its Investors to Behave

    In March 2000, Barron’s reported that 51 Internet companies were burning cash so fast that they’d be broke by the end of the year. The article (it’s behind a seemingly unbreachable paywall) has acquired the reputation of having marked the end of the dot-com boom. The Nasdaq composite index peaked on March 10 at 5132, and by the end of the month was in a full-on collapse (as I write this, it’s only at 3155, despite years of gains).

    The Burn Rate 51 was made up mostly of now-forgotten companies like drkoop.com and CDNow. But it also included a certain Internet bookseller from Seattle. The Barron’s article mentioned that a 690 million euro convertible bond sale in February had bought Amazon some more time (the list was based on 1999 year-end data) — but that the company would still run out of cash in 21 months.
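
    The arithmetic behind a burn-rate list like Barron’s is simple division: months of runway equals cash on hand over monthly burn. A sketch with hypothetical figures (not Amazon’s actual balance-sheet numbers):

    ```python
    # Back-of-the-envelope runway calculation — figures are invented.

    def months_of_runway(cash_on_hand, annual_burn):
        """Months until cash runs out at the current burn rate."""
        monthly_burn = annual_burn / 12.0
        return cash_on_hand / monthly_burn

    # E.g. $700 million in cash against a $400 million annual burn:
    runway = months_of_runway(cash_on_hand=700.0, annual_burn=400.0)
    ```

    The key assumption, of course, is that the burn rate stays constant — exactly the assumption Amazon proceeded to break by slashing its spending once cheap capital dried up.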

    In fact, Amazon was only operating at such a high burn rate because it could. Once investors stopped giving it free money, the company quickly cut back on its investments and its losses. By the fourth quarter of 2001 — that is, within about 21 months — it was turning a profit.

    That opportunistic approach to financial markets has defined Amazon since it went public in 1997. And while it has certainly burned many buyers of Amazon shares through the years — Amazon’s stock price took a decade to get back to its 1999 peak — the long-run returns have been spectacular. In Morten T. Hansen, Herminia Ibarra, and Urs Peyer’s latest ranking of long-run CEO performance in HBR, Amazon’s Jeff Bezos now ranks No. 2, behind the late Steve Jobs, with an industry-adjusted shareholder return of 12,266% during his tenure.

    So when Amazon reports below-consensus earnings, as it did Tuesday, and the share price jumps, as it did after-hours Tuesday and again Wednesday morning, the reaction isn’t quite the puzzle it seems. Slate’s Matthew Yglesias cracked that “Amazon, as best I can tell, is a charitable organization being run by elements of the investment community for the benefit of consumers.” But what’s really going on is that Jeff Bezos has trained elements of the investment community to expect that low profits (or big losses) now represent investments that will eventually pay off, not signs of trouble.

    How has Bezos done this? Well, he’s a hedge fund veteran who has always taken a skeptical view of Wall Street, treating it more as a loopy rich uncle than the efficient information processor of standard finance theory. When Uncle Wall Street (also known as Mr. Market) is in a generous mood, Bezos is always ready to take advantage by putting investment ahead of profitability. But he’s also always been ready to shift gears when the mood turns stingy.

    And so Amazon thrived in the crazy late 1990s, when Henry Blodget made his name as an analyst by making outrageous guesses about how high Amazon’s stock price would go and seeing them come true long before he expected. It also thrived right through the gloomy early 2000s, when bond analyst Ravi Suria made his name picking apart Amazon’s balance sheet and worrying that the company wasn’t generating enough cash to make its bond payments (along with the 690 million euro issue in 2000, the company had sold $1.25 billion in bonds the year before). Suria was wrong about that. In fact, Amazon retired the last of the $1.25 billion bond issue just before the debt-market meltdown of autumn 2008. Nice timing, huh?

    Of course, none of this would have worked if Bezos hadn’t been making the right strategic bets, running the company spectacularly well, and catching the occasional lucky break. There are lots of corporate executives who think they know better than Wall Street. Most turn out not to.

    But when you combine Amazon’s success with its resolute unwillingness to take financial markets too seriously, the result is an amazing thing to see. Clayton Christensen has long complained that standard financial metrics can be enemies of innovation and growth. As he and two co-authors wrote in 2008:

    The emphasis on earnings per share as the primary driver of share price and hence of shareholder value creation, to the exclusion of almost everything else, diverts resources away from investments whose payoff lies beyond the immediate horizon.

    With Amazon, though, nobody emphasizes EPS. Or, when they emphasize earnings, it’s in the opposite direction from the one Christensen worries about. A few months ago, I heard analyst Mark Mahaney, now of RBC Capital Markets, argue (at about minute 26 on the video) that Amazon’s razor-thin profit margins were a source of competitive advantage:

    You really develop very sustainable moats around a business when you run it at low margins. Very few companies want to come into Amazon’s core businesses and try to compete with them at 1% margins or 2% margins.

    This sounds eerily similar to what Yglesias was saying, half-jokingly, on Tuesday:

    Competition is always scary, but competition against a juggernaut that seems to have permission from its shareholders to not turn any profits is really frightening.

    Amazon has this permission because it has trained its shareholders to believe that everything will work out in the end. As a result, it has a shareholder base that’s geared for the long run. The biggest holder, by far, is Bezos himself. After that, the No. 1 institutional holder, by a good margin, is Capital Group, the giant Los Angeles mutual fund complex with a reputation for having long investment horizons. I’ve been looking through transcripts of the company’s past couple years of quarterly earnings conference calls with analysts (thanks, Seeking Alpha), and have learned that Bezos never participates (most CEOs do), while CFO Tom Szkutak always concludes his remarks with the sentence, “We believe putting customers first is the only reliable way to create lasting value for shareholders.” Nobody complains.

    Being long-term oriented isn’t necessarily the same as being right. Amazon could make the wrong bets. Bezos could get more interested in space travel than selling massive quantities of stuff at just above cost. But Bezos seems to get this. From an interview with Fortune’s Adam Lashinsky last year:

    “We believe in the long term, but the long term also has to come,” says Bezos, explaining that periodically Amazon wants to “check in” with its ability to make money.

    Just “check in,” mind you. Wouldn’t want to get hung up on flawed financial metrics when there’s a world to conquer. Which Bezos can get away with — now that he has housebroken the investment community. The key to success in dog training, I’m learning (we just got a puppy), is to appeal to instinct and memory. Reasoning with the animal, or getting mad at it, doesn’t get you anywhere. Neither does mockery, whatever Will Ferrell says. Bezos did lose his cool a bit over Suria’s claims back in 2000 (“hogwash,” he kept calling them). But in general he has exuded the steady authority of a good dog trainer. Hey, Wall Street! Roll over!