Author: Grist – the Latest from Grist

  • Egger’s Head: School lunches

    by Grist

    Robert Egger has a lot going on in his head. Just ask him.

    As a nonprofit entrepreneur, a serial searcher for ordinary people
    doing extraordinary things, and a deeply deep-thoughts kind of guy, Egger
    gives us something to ponder every week.

    In this installment, he scratches his noggin over the school lunch crisis, which has become a hot topic of conversation and debate lately. We’ve got it covered from soup to nuts, as they say. You can catch up here on the tasty Grist discussion about school lunches.

    Related Links:

    Why even the childless should care about school lunch

    A teacher openly crusades for better school food—and gets seared

    Underground school lunch blogger hits ‘Good Morning America’






  • Paul Krugman on ‘Building a Green Economy’

    by Joseph Romm

    Nobelist Paul Krugman has a long piece in the upcoming Sunday New York Times Magazine, basically climate economics 101.

    It is nearly 8,000 words, so while you should read the whole thing,
    I’ll post some of the highlights below. I’ll also throw in some links to
    the scientific and economic literature that the NYT, in its infinite wisdom/stupidity, refuses to include.

    The essay isn’t primarily about the science, but this is what Krugman has to say on that, starting with the opening paragraph:

    If you listen to climate scientists—and despite the relentless campaign to discredit their work, you should—it is long past time to do something about emissions of carbon
    dioxide and other greenhouse gases. If we continue with business as
    usual, they say, we are facing a rise in global temperatures that will
    be little short of apocalyptic. And to avoid that apocalypse, we have
    to wean our economy from the use of fossil fuels, coal above all …

    This is an article on climate economics, not climate science. But
    before we get to the economics, it’s worth establishing three things
    about the state of the scientific debate. The first is that the planet
    is indeed warming. Weather fluctuates, and as a consequence it’s easy
    enough to point to an unusually warm year in the recent past, note that
    it’s cooler now and claim, “See, the planet is getting cooler, not
    warmer!” But if you look at the evidence the right way—taking
    averages over periods long enough to smooth out the fluctuations—the
    upward trend is unmistakable: each successive decade since the 1970s
    has been warmer than the one before.

    Second, climate models predicted this well in advance, even getting
    the magnitude of the temperature rise roughly right. While it’s
    relatively easy to cook up an analysis that matches known data, it is
    much harder to create a model that accurately forecasts the future. So
    the fact that climate modelers more than 20 years ago successfully
    predicted the subsequent global warming gives them enormous credibility.

    Yet that’s not the conclusion you might draw from the many media
    reports that have focused on matters like hacked email and climate
    scientists’ talking about a “trick” to “hide” an anomalous decline in
    one data series or expressing their wish to see papers by climate
    skeptics kept out of research reviews. The truth, however, is that the
    supposed scandals evaporate on closer examination, revealing only that
    climate researchers are human beings, too. Yes, scientists try to make
    their results stand out, but no data were suppressed. Yes, scientists
    dislike it when work that they think deliberately obfuscates the issues
    gets published. What else is new? Nothing suggests that there should
    not continue to be strong support for climate research.

    And this brings me to my third point: models based on this
    research indicate that if we continue adding greenhouse gases to the
    atmosphere as we have, we will eventually face drastic changes in the
    climate. Let’s be clear. We’re not talking about a few more hot days in
    the summer and a bit less snow in the winter; we’re talking about
    massively disruptive events, like the transformation of the
    Southwestern United States into a permanent dust bowl over the next few
    decades.

    Now, despite the high credibility of climate modelers, there is
    still tremendous uncertainty in their long-term forecasts. But as we
    will see shortly, uncertainty makes the case for action stronger, not
    weaker. So climate change demands action …

    At this point, the projections of climate change, assuming we
    continue business as usual, cluster around an estimate that average
    temperatures will be about 9 degrees Fahrenheit higher in 2100 than
    they were in 2000. That’s a lot—equivalent to the difference in
    average temperatures between New York and central Mississippi. Such a
    huge change would have to be highly disruptive. And the troubles would
    not stop there: temperatures would continue to rise.

    Furthermore, changes in average temperature will by no means be the
    whole story. Precipitation patterns will change, with some regions
    getting much wetter and others much drier. Many modelers also predict
    more intense storms. Sea levels would rise, with the impact intensified
    by those storms: coastal flooding, already a major source of natural
    disasters, would become much more frequent and severe. And there might
    be drastic changes in the climate of some regions as ocean currents
    shift. It’s always worth bearing in mind that London is at the same
    latitude as Labrador; without the Gulf Stream, Western Europe would be
    barely habitable.

    But there are at least two reasons to take sanguine assessments of
    the consequences of climate change with a grain of salt. One is that,
    as I have just pointed out, it’s not just a matter of having warmer
    weather—many of the costs of climate change are likely to result from
    droughts, flooding, and severe storms. The other is that while modern
    economies may be highly adaptable, the same may not be true of
    ecosystems. The last time the earth experienced warming at anything
    like the pace we now expect was during the Paleocene-Eocene Thermal
    Maximum, about 55 million years ago, when temperatures rose by about 11
    degrees Fahrenheit over the course of around 20,000 years (which is a
    much slower rate than the current pace of warming). That increase was
    associated with mass extinctions, which, to put it mildly, probably
    would not be good for living standards …

    For what the science says we risk if we stay anywhere near our current path of unrestricted emissions, see:

    Nature Geoscience study: Oceans are acidifying 10 times faster today than 55 million years ago when a mass extinction of marine species occurred

    M.I.T. doubles its 2095 warming projection to 10 degrees F – with 866 ppm and Arctic warming of 20 degrees F

    Our hellish future: Definitive NOAA-led report on U.S. climate impacts warns of scorching 9 to 11 degrees F warming over most of inland U.S. by 2090, with Kansas above 90 degrees F some 120 days a year—and that isn’t the worst case, it’s business as usual!

    Science: CO2 levels haven’t been this high for 15 million years, when it was 5 to 10 degrees F warmer and seas were 75 to 120 feet higher—“We have shown that this dramatic rise in sea level is associated with an increase in CO2 levels of about 100 ppm.”

    An introduction to global warming impacts: Hell and High Water

    He has a discussion of the low cost of action:

    Just as there is a rough consensus among climate
    modelers about the likely trajectory of temperatures if we do not act
    to cut the emissions of greenhouse gases, there is a rough consensus
    among economic modelers about the costs of action. That general opinion
    may be summed up as follows: Restricting emissions would slow economic
    growth—but not by much. The Congressional Budget Office,
    relying on a survey of models, has concluded that Waxman-Markey “would
    reduce the projected average annual rate of growth of gross domestic
    product between 2010 and 2050 by 0.03 to 0.09 percentage points.”
    That is, it would trim average annual growth to 2.31 percent, at worst,
    from 2.4 percent. Over all, the Budget Office concludes, strong
    climate-change policy would leave the American economy between 1.1
    percent and 3.4 percent smaller in 2050 than it would be otherwise. And
    what about the world economy? In general, modelers tend to find that
    climate-change policies would lower global output by a somewhat smaller
    percentage than the comparable figures for the United States. The main
    reason is that emerging economies like China currently use energy
    fairly inefficiently, partly as a result of national policies that have
    kept the prices of fossil fuels very low, and could thus achieve large
    energy savings at a modest cost. One recent review of the available
    estimates put the costs of a very strong climate policy—substantially
    more aggressive than contemplated in current legislative proposals—at
    between 1 and 3 percent of gross world product.

    Such figures typically come from a model that combines all sorts of
    engineering and marketplace estimates. These will include, for
    instance, engineers’ best calculations of how much it costs to generate
    electricity in various ways, from coal, gas and nuclear and solar power
    at given resource prices. Then estimates will be made, based on
    historical experience, of how much consumers would cut back their
    electricity consumption if its price rises. The same process is
    followed for other kinds of energy, like motor fuel. And the model
    assumes that everyone makes the best choice given the economic
    environment—that power generators choose the least expensive means of
    producing electricity, while consumers conserve energy as long as the
    money saved by buying less electricity exceeds the cost of using less
    power in the form either of other spending or loss of convenience.
    After all this analysis, it’s possible to predict how producers and
    consumers of energy will react to policies that put a price on
    emissions and how much those reactions will end up costing the economy
    as a whole.

    There are, of course, a number of ways this kind of modeling could
    be wrong. Many of the underlying estimates are necessarily somewhat
    speculative; nobody really knows, for instance, what solar power will
    cost once it finally becomes a large-scale proposition. There is also
    reason to doubt the assumption that people actually make the right
    choices: many studies have found that consumers fail to take measures
    to conserve energy, like improving insulation, even when they could
    save money by doing so.

    But while it’s unlikely that these models get everything right, it’s
    a good bet that they overstate rather than understate the economic
    costs of climate-change action. That is what the experience from the
    cap-and-trade program for acid rain suggests: costs came in well below
    initial predictions. And in general, what the models do not and cannot
    take into account is creativity; surely, faced with an economy in which
    there are big monetary payoffs for reducing greenhouse-gas emissions,
    the private sector will come up with ways to limit emissions that are
    not yet in any model.
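
    As a quick sanity check on the CBO growth figures quoted above, here is a minimal back-of-envelope sketch (my own illustration, not Krugman’s or the CBO’s actual arithmetic); it simply compounds the two growth rates from 2010 to 2050:

        # Compound 2.40 percent annual growth (no policy) vs. 2.31 percent
        # (growth trimmed by the CBO's worst-case 0.09 percentage points)
        # over the 40 years from 2010 to 2050.
        baseline = 1.0240 ** 40
        with_policy = 1.0231 ** 40
        shortfall = 1 - with_policy / baseline
        print(f"2050 GDP is {shortfall:.1%} smaller")  # roughly 3.5%, right around the CBO's 3.4% upper bound

    A trim of less than a tenth of a percentage point per year thus compounds into the “between 1.1 percent and 3.4 percent smaller in 2050” range the Budget Office reports.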

    I have links to some of the key literature on this here:

    Introduction to climate economics: Why even strong climate action has such a low total cost—one tenth of a penny on the dollar

    Despite its many flaws, EIA analysis of climate bill finds 23 cents a day cost to families, massive retirement of dirty coal plants and 119 GW of new renewables by 2030—plus a million barrels a day oil savings

    New EPA analysis of Waxman-Markey: Consumer electric bills 7 percent lower in 2020 thanks to efficiency—plus 22 GW of extra coal retirements and no new dirty plants

    The triumph of energy efficiency: Waxman-Markey could save $3,900 per household and create 650,000 jobs by 2030

    And of course he discusses what scientific uncertainty means for economic modeling:

    Finally and most important is the matter of uncertainty.
    We’re uncertain about the magnitude of climate change, which is
    inevitable, because we’re talking about reaching levels of carbon
    dioxide in the atmosphere not seen in millions of years. The recent
    doubling of many modelers’ predictions for 2100 is itself an
    illustration of the scope of that uncertainty; who knows what revisions
    may occur in the years ahead. Beyond that, nobody really knows how much
    damage would result from temperature rises of the kind now considered
    likely.

    You might think that this uncertainty weakens the case for action, but it actually strengthens it. As Harvard’s
    Martin Weitzman has argued in several influential papers, if there is a
    significant chance of utter catastrophe, that chance—rather than what
    is most likely to happen—should dominate cost-benefit calculations.
    And utter catastrophe does look like a realistic possibility, even if
    it is not the most likely outcome.

    Weitzman argues—and I agree—that this risk of catastrophe,
    rather than the details of cost-benefit calculations, makes the most
    powerful case for strong climate policy. Current projections of global
    warming in the absence of action are just too close to the kinds of
    numbers associated with doomsday scenarios. It would be irresponsible—it’s tempting to say criminally irresponsible—not to step back from
    what could all too easily turn out to be the edge of a cliff.

    For more on Weitzman, see

    Harvard economist: Climate cost-benefit analyses are “unusually misleading,” warns colleagues “we may be deluding ourselves and others”

    Krugman’s key conclusions are:

    Stern’s moral argument for loving unborn generations as
    we love ourselves may be too strong, but there’s a compelling case to
    be made that public policy should take a much longer view than private
    markets. Even more important, the policy-ramp prescriptions seem far
    too much like conducting a very risky experiment with the whole planet.
    Nordhaus’s preferred policy, for example, would stabilize the
    concentration of carbon dioxide in the atmosphere at a level about
    twice its preindustrial average. In his model, this would have only
    modest effects on global welfare; but how confident can we be of that?
    How sure are we that this kind of change in the environment would not
    lead to catastrophe? Not sure enough, I’d say, particularly because, as
    noted above, climate modelers have sharply raised their estimates of
    future warming in just the last couple of years. So what I end up with
    is basically Martin Weitzman’s argument: it’s the nonnegligible
    probability of utter disaster that should dominate our policy analysis.
    And that argues for aggressive moves to curb emissions, soon …

    If it does, the economic analysis will be ready. We know how to
    limit greenhouse-gas emissions. We have a good sense of the costs—and
    they’re manageable. All we need now is the political will.

    Hear! Hear!

    Related Links:

    The problem with a green economy: economics hates the environment

    Revkin wants to talk ‘energy quest’ not ‘climate crisis’

    Krugman says what political media won’t: economists agree climate action is necessary, affordable






  • Sarah Palin recounts recent Mensa meeting

    by David Roberts

    “I knew that we’d be buddies when I met her when she said, ‘Drill here, drill now.’ And then I replied, ‘Drill, baby, drill’ and then we both said, ‘You betcha!’”

    —former half-term Alaska Gov. Sarah Palin, on meeting Rep. Michele Bachmann (R-Minn.)

    Related Links:

    Filling our short-term fossil-fuel needs

    Senate climate bill to fund Utah tar sands development

    Understanding the allure of ‘drill baby drill’






  • Senate Energy spox responds; more on fossil-fuel safety and our energy future

    by David Roberts

    Yesterday I took issue with this quote from Senate Energy Committee spokesperson Bill Wicker, about the mine accident in West Virginia:

    “This is a mining accident,” says Bill Wicker, communications director for the Senate Energy Committee. “This issue involves the health and safety of our miners, not our energy future.”

    As I wrote, it seems to me the dangers of coal mining should very much be part of our discussion about our energy future. I was puzzled, so I wrote Wicker to seek clarification. He (promptly and courteously) sent a response, which he said I could share:

    Mining coal is a dirty, dangerous business. And you’re right—too many people die doing it. The coal industry needs to clean up its act (every pun intended) in many ways, including safety.

    But people die regularly in car accidents. I don’t hear folks saying that traffic fatalities should be part of conversations about clean, green cars of the future. Automobile fatalities are seen mostly as a safety issue.

    This weekend, five people died in an explosion at a refinery near Seattle. I haven’t heard of anyone suggesting that our nation stop making transportation fuels. However, people are saying that the oil industry needs to do a lot more to protect the safety and health of its workers and the communities in which it operates.

    Closer to home, several people have died this year in multiple accidents on Metro. Does Metro’s track record call into question the role that public transportation can play in a cleaner energy future for our country … or is it more of a local management/safety problem?

    I could go on and on with the examples, David, but won’t. Thanks for asking, and I hope this helps.

    I think I better understand where Wicker’s coming from, but if anything, I disagree even more strongly now.

    To put it in his terms: The fact that more than a million people die every year in auto accidents worldwide should very much be part of conversations about the future of transportation. Driving two-ton machines around—even “clean, green” two-ton machines—is inherently dangerous. Road accidents take a horrible toll on American families and the economy. Should that not inform our decisions about what transportation options we want to pursue in the future? And the rate of injuries and fatalities on public transit should also be part of the discussion. How common are they relative to automobile injuries? What are the prospects for minimizing them? Could they be reduced at a reasonable cost? These questions are core to our transportation decisions, not peripheral. Creating more sustainable communities means creating safer communities, and that means, among other things, creating communities with fewer cars and more spaces devoted to people.

    And yes, people regularly die in refinery accidents (and in coal mines). We can and should try to make them safer, but ultimately, working with fossil fuels is dangerous, not just to the public that suffers from the pollution but to the workers involved. It may not be a particularly compelling reason to leave fossil fuels behind—perhaps we think the benefits outweigh the suffering—but it should absolutely be part of the conversation.

    Putting these kinds of issues in a silo doesn’t help. Well, no, scratch that: it helps the fossil-fuel industry for the public to think of every incident of maiming and death, every oil spill or coal-ash-impoundment failure, as an unforeseeable “accident” that’s a “safety issue” incidental to larger policy choices. But it doesn’t help the clarity of our public dialogue or the quality of our policy choices.

    Fossil fuels are dangerous. They have always killed people and will always kill people, no matter what safety measures we undertake. Confronting that fact clearly ought to be the first step we take in grappling with our energy future.

    Related Links:

    What does coal mining have to do with geoengineering?

    World Bank vote gives billions to coal

    Another Tragic Iceberg Awaiting Massey’s Titanic Violations: Brushy Fork Dam






  • Meet America’s most extreme energy geeks

    by Amanda Little

    Photo courtesy PNNL via Flickr

    Jet-engine wind turbines, fuel made from big batches of algae, enzymes that trap power
    plant CO2. Sound seriously far-fetched? They may be. But these concepts are fetching
    serious investment dollars from the Department of Energy. DOE Secretary Steven
    Chu—a Nobel Prize-winning inventor himself—has launched a new program dubbed “ARPA-E.” It’s modeled
    after DARPA (Defense Advanced Research Projects Agency), the Pentagon’s
    technology-innovation program that was responsible for the
    internet, cell phones, GPS, and other technical breakthroughs. ARPA-E is doling out multimillion dollar grants
    to the nation’s most visionary energy innovators—thrill-seeking, over-achieving
    uber-geeks from start-up companies and universities across America. To offer a
    glimpse of what they’re up to—and what America’s energy future might look like—we
    singled out seven of ARPA-E’s 37 recipients. These guys (yes, they’re all guys)
    are pursuing high-risk endeavors that may never see commercial applications.
    But if they do, the rewards could be staggering in scope.

     

     

    The pioneer: Dr. Walter Presz, founder and Senior Technical Advisor at FloDesign Wind Turbine Corp.

    The concept: An entirely new spin on wind energy. With compact blades enclosed in a cylindrical casing, this high-efficiency turbine looks—and operates—like a jet engine. Instead of using energy to create thrust, it uses the thrust of the wind to create energy. An air pump behind the blades pulls in twice as much air as a conventional machine. In wind tunnel experiments, FloDesign’s small-scale prototype generated three times more energy than a standard long-blade turbine of the same size. The encased blades are also quieter and safer for humans—and birds—and the turbine’s compact size means it can be placed along highways, medians, and bridges, in suburbs, and maybe even cities, all places where bulky conventional wind turbines cannot go.

    The payout: $8,325,400.00

    The goal: A commercially viable prototype within two years, and ultimately a machine that is 30 percent cheaper than a conventional wind turbine of the same size.

    The hurdles: Because of its expensive fiberglass casing, a FloDesign turbine requires more materials than a conventional turbine of the same size—which adds to production costs. And if winds exceed certain speeds, the generator that creates power inside the turbine could overheat. Presz is exploring cheaper materials for mass production, and also designing better air-flow controls for cooling.

    The promise: “I’ve worked on propulsion technologies for practically every aircraft in the skies today, from Stealth Bombers and the F-16 to the Boeing 737. But this is by far the biggest reward I’ve worked for in my career. The U.S. is way behind Europe and Asia in wind. Now we have the potential to change the entire industry—pushing it from the propeller age into the jet age.”

     

    The pioneer: Dr. Donald Sadoway, Professor of Materials Chemistry, Massachusetts Institute of Technology

    The concept: Batteries made of liquid metals. Picture a container of oil and vinegar—these liquids don’t mix, they stratify into two layers. The liquid metals in MIT’s battery stratify too—into three distinct layers (cathode, anode and electrolyte) that interact with each other and conduct electrical current. Conventional batteries made of solid metals are expensive and hard to build big. But liquid batteries could be enormous in size—large enough to store power from wind, solar, and other intermittent sources of energy, and discharge it on demand. They could also be sited at or near the buildings they’re powering, eliminating the need for new transmission lines to urban centers. Don’t expect to see liquid car batteries, though—all that sloshing would disrupt the current.

    The payout: $6,949,624.00

    The goal: In the next 18 months Sadoway and his team plan to scale up their prototype “from the size of a shot glass to the size of a deep-dish pizza box,” which could provide enough power for a home office. (By 2015, he plans to have a trash barrel-sized liquid battery that would power a small home.) For continuous wind and solar power on the grid, however, the batteries might have to be as big as an eighteen-wheeler, or bigger. It’s too early to put a timeframe on that super-sizing.

    The hurdles: Cost, scale, and the laws of
    physics. A lithium-ion battery—commonly used in small-scale
    applications like cell phones and laptops—that was big enough to power a house or a
    neighborhood would cost more than 1,000 times what we now pay for energy
    from the grid. We need a new approach. The question is whether the
    laws of physics will cooperate. Energy doesn’t like to be stored; it likes to
    move. Capturing and containing energy cheaply and on a grand scale “is a
    seemingly impossible challenge,” said Sadoway, “but that’s what makes it so
    exciting.”

    The promise: “All these people working to improve solar-cell efficiency and wind-turbine performance—that’s great. But it won’t make a difference if you can’t store and discharge that power on demand. Liquid batteries could give us electricity from the sun even when the sun isn’t shining, and from the wind when it isn’t blowing. Storage is everything. It’s a world-changer.”

     

    The pioneer: Ross Youngs, founder and CEO of Algaeventures Inc.

    The concept: An affordable method for mass-producing algae to make alternative fuels, animal feeds, fertilizers, plastics, chemicals, and oils. The trick is the mass production part, because while it’s easy to grow algae, it’s hard to separate these tiny aquatic plants from their watery environment. Algaeventures’ new method uses an absorbent plastic membrane to rapidly “sop up” the water around the algae, making it possible to harvest, de-water, and dry algae on a massive scale using relatively little energy. Youngs’ process could make algae-based biofuel cost-competitive with gasoline.

    The payout: $5,992,697.00

    The goal: Youngs is currently harvesting algae from water at a rate of 500 liters per hour. His goal is to reach 15,000 liters per hour—for proof of concept—by next year, and 50,000 liters per hour—for commercial applications—by 2012.

    The hurdles: Scaling up the volume and bringing down the cost. Youngs is fine-tuning the chemistry of his machine’s permeable membrane, experimenting with new, more absorbent and durable materials and perfecting the weave of the membrane’s tiny plastic threads. He’s also tinkering with ways to move the algae-laden water through the machine in ever-greater volumes.

    The promise: “All terrestrial plants evolved from algae. It has been around for billions of years. As a resource it’s incredibly versatile—in theory, it could be used in virtually every application fossil fuels are used for, but without the negative environmental effects. To me, it’s a panacea. It could be as critical to the future of civilization as it was to its formation.”

     

    The pioneer: Bruce Lanning, Director of Thin-Film Technologies, ITN Energy Systems, Inc.

    The concept: Smart windows: glass coated with a thin plastic layer of “electrochromic film” which, when excited by an electrical current, can control the amount of light and heat that passes through. (Think those eyeglass lenses that automatically tint in sunlight, only on a much bigger scale.) On hot August afternoons your office windows could switch from translucent to opaque—shutting out excess light and heat. On bright winter days they’d let the warmth penetrate. Smart controls can tint and un-tint windows automatically—maximizing daylight and minimizing the use of overhead lighting. The energy efficiency benefits could be huge, given that buildings lose 30 to 40 percent of their heat through windows.

    The payout: $4,986,249.00

    The goal: Scale up the window size, and develop a mass production process that will hold down cost. ITN’s current prototype window measures from 18 to 40 inches; most commercial applications require a 60-inch span. Lanning plans to make a 60-inch window, and predicts full-scale manufacturing of ITN’s plastic-coated smart windows within four years.

    The hurdles: Cost and durability. Window-dimming technology has been in development for years. Initially the film was deposited directly onto the glass window, a difficult process to affordably mass-produce. ITN cut costs by depositing the film onto a flexible plastic sheet that can be adhered to glass. Costs have to shrink even further, and the plastic film must prove durable enough to last for decades and withstand the elements. Lanning is also working on dimming speed (how long it takes the window to transition from clear to tinted and back again) and on the color of the tint (rose, yellow, blue, or grey).

    The promise: “Energy loss associated with windows totals four quads annually in the U.S. If we switched all the windows in the nation to LowE [the industry standard for highly efficient windows], still two quads of energy would leak out. Smart windows could eliminate all four quads.” [“Quads” is short for quadrillion BTUs of energy. But you knew that.]

     

    The pioneer: Dr. Emanuel Sachs, founder and Chief Technical Officer of 1366 Technologies, Inc.

    The concept: Silicon-based solar at the cost of coal. Right now, more than 80 percent of all solar panels sold worldwide are made with high-cost crystalline silicon. Next-gen, thin-film technologies show some promise, but those depend on rare elements such as indium and tellurium. Silicon—refined from silica, the principal component of sand, and the second most abundant element in the earth’s crust after oxygen—could be the ticket to affordable solar. Making solar panels from silicon is wasteful; thin wafers are shaved off large cylindrical columns of refined silicon, which means that half the silicon ends up as dust. With his new “direct wafer” method, Sachs solves the problem by casting wafers directly from molten silicon—no sawing needed. The single-step manufacturing process uses much less energy too, which cuts the cost of each wafer by more than 70 percent. If successful, “direct wafers” would open up a market for solar that’s unconstrained by cost or materials.

    The payout: $4,000,000.00

    The goal: Sachs’s prototype wafers are four inches square with efficiencies of roughly 12 percent. That’s a bit higher than thin-film solar, but not as efficient as the 15 to 21 percent range of standard crystalline silicon panels. Sachs plans to produce six-inch square wafers—the commercial standard—with 16 percent efficiency by the end of 2011. Long term, he aims for 21 percent efficiency at one-third the cost of today’s installed silicon-based panels.

    The hurdles: Efficiency and mass production. Currently Sachs’s molten direct-wafer technology is about 20 to 50 percent less efficient than the standard. The more efficiency you strive for, the more difficult the challenge becomes. In other words, it’s a lot harder to get from 18 to 20 percent efficiency than from 15 to 18. One efficiency-boosting approach Sachs is trying is to introduce textures into the molten crystalline wafers in order to trap more light. He also has to develop commercial-scale production methods. It’s not clear yet that his molten-wafer method can make that leap.

    The promise: “Sunlight is the original, omnipotent form of energy—fossil fuels themselves are a product of plants grown by sunlight. The question is how to capture this diffuse resource. We are trying to harness the primordial power of nature with the least effort—to find out what nature wants to do and help it on its way.”

     

    The pioneer: Dr. Harry Cordatos, Chemical
    Engineer and Project Manager at United Technologies

    The concept: Equip coal-burning power plants with a filter that uses an artificial enzyme to capture CO2. Along with other air-breathers, we humans use the
    enzyme carbonic anhydrase to remove CO2 from our bodies. This enzyme reacts
    with CO2 faster and more efficiently than any chemical known to man. Taking a cue from the human body, Cordatos
    is incorporating a synthetic version of carbonic anhydrase into a thin polymer membrane
    which can capture CO2 before it enters smokestacks and channel the pollutant into a different chamber where it can be compressed and piped underground.

    The payout: $2,251,183.00

    The goal: Perfect the recipe for a synthetic version of carbonic anhydrase that can be placed inside a membrane (or filter), and measure the membrane’s performance in a smokestack environment. Within two years Cordatos hopes to show proof of concept for a process that could capture CO2 at two-thirds the cost of prevailing commercial methods.

    The hurdles: Knowledge, durability, and cost. We
    currently use chemicals called “amines” to scrub CO2 from the air in enclosed
    environments such as submarines and space shuttles (five percent CO2 in the air
    can be lethal). The amine method could remove 90 percent of CO2 from
    smokestacks too, but it would raise the cost of electricity by about 80
    percent. Carbonic anhydrase is a vastly cheaper alternative, if Cordatos can determine how his synthetic
    anhydrase will behave inside a smokestack. There’s a high risk that contaminants
    in the flue gases could deactivate the enzyme.

    The promise: 
    “It’s humbling to see how much better nature is than industry at doing
    the things we need to do. Over millions of years of evolution, human bodies
    have developed an extremely efficient method for removing carbon dioxide. This
    is as good as it gets! It would behoove us to try to mimic that.”

     

    The pioneer: Steve Bobzin, Director of
    Technology, CERES

    The concept: “Super crops” that produce high
    yields with far less water and nitrogen fertilizer. Adapting technologies from
    the human genome project, CERES identified traits within sorghum, switchgrass,
    miscanthus, and other biofuel crops that enable the plants to use nitrogen and water more efficiently.
    Test crops grown in greenhouse laboratories have gotten as much as double the yield per
    acre for each crop, and the same yield per acre using half the nitrogen. These super
    crops could produce cheap cellulosic biofuels or be used as a biomass feedstock
    in power plants—competing with coal as well as oil.

    The payout: $4,989,144.00

    The goal: 
    To reproduce laboratory yields in the fields. Bobzin is testing four
    genetic traits in three crops (sorghum, switchgrass and miscanthus) on roughly 10-acre
    plots in Arizona, Georgia, Tennessee and Texas—states with different
    climate challenges. If the three-year experiment reproduces greenhouse
    results, they’ll begin testing the seeds on larger plots in more places, putting the innovation on track for commercial-scale development.

    The hurdles: Mother Nature. Transitioning from the controlled greenhouse environment to
    the great outdoors introduces a range of risks: weather, humidity, insects,
    soil moisture, wind, and mold, to name a few. These stresses could inhibit the genetically tweaked traits from functioning as well as they did in the greenhouse experiments.

    The promise: “I have spent my entire career
    with a desire to do things that would improve life for society, and the promise here is greater than any other innovation I’ve worked for. We
    could replace oil, we could significantly offset the use of coal with homegrown
    crops—providing energy security and freeing ourselves from dependence on the
    Middle East while reinvigorating the rural economy.” 

    Related Links:

    Nuclear arms reduction is better than nuclear warfare

    Whoops: Energy Star approves gas-powered alarm clock

    Racing for cleantech jobs: Why America needs an energy education strategy






  • Major economies to hold climate talks in U.S. this month

    by Agence France-Presse

    WASHINGTON—The world’s 17 major economies, which account for the bulk of carbon emissions, will meet this month in Washington in hopes of pushing forward slow-moving climate talks, a U.S. official said Thursday.

    Officials from the so-called Major Economies Forum—which accounts for more than 80 percent of the emissions blamed for global warming—will meet on April 18 and 19 in Washington, the official told AFP on condition of anonymity.

    The meeting marks part of a renewed push to seek progress after the rancorous U.N.-led climate summit in Copenhagen in December, which ended with a vague agreement that left few happy. A small number of developing nations including Sudan, Cuba, and Venezuela vociferously criticized Western nations at the Copenhagen conference, preventing formal approval of the fine-tuned agreement.

    Negotiators under the U.N. Framework Convention on Climate Change (UNFCCC) will gather in Bonn from Friday to Sunday for their first official talks since the strife-torn summit in the Danish capital. Germany has also invited some 50 environment ministers to a May 2-4 conference in Bonn.

    Todd Stern, the top U.S. climate negotiator, said after the Copenhagen meeting that he was leaning toward working out details of the next climate agreement in smaller settings while not bypassing the U.N. process.

    Negotiators hope to seal the next global agreement on climate change at a summit this December in Mexico. The treaty would succeed the Kyoto Protocol, whose obligations on cutting emissions expire at the end of 2012.

    Related Links:

    Northwest mountain towns become home efficiency lab

    Nuclear arms reduction is better than nuclear warfare

    Revkin wants to talk ‘energy quest’ not ‘climate crisis’






  • Ask Umbra’s Book Club: The three L’s—laziness, learning, and lawlessness

    by Umbra Fisk

    Dearest readers,

    I’ve so
    enjoyed reading all of your comments thus far about Dolly Freed’s Possum Living. The 9-to-5 grind,
    raising and slaughtering your own meat—stimulating threads.

    You know, I
    couldn’t help but notice how often Freed talks about the basis for her and her
    father’s lifestyle choice being that they are lazy. Tending a garden, raising bunnies,
    foraging for found edibles, distilling liquor, canning and preserving food,
    cooking, running, fishing, chopping wood, reading—to me, that doesn’t sound
    like laziness. What are your thoughts? Why do you think Freed considers herself
    to be lazy?

    In that same vein, Freed’s father “thinks compulsory education is a fraud—nothing but glorified babysitting,” and allowed her to quit school in seventh grade. Freed
    comes across in the book as a confident and bright, if a bit rough around the
    edges, teenager, and we know that she went on to ace the SATs, put herself
    through college, and become a NASA aerospace engineer, environmental educator,
    business owner, and college professor. However, do you think her father did her
    an injustice by allowing her to drop out of school? Or did he give her a
    greater gift than a conventional education?

    For me, it
    was reminiscent of the Paskowitz family, whose story was detailed in the 2007
    documentary Surfwise. Dorian and his
    wife, Juliette, raised their nine children in a small camper—the kids never
    attended school and were forced to adhere to a strict diet and surfing regimen.
    I had a difficult time discerning whether the now-adult children were better or
    worse off for their unconventional upbringing. How about Dolly Freed? What do
    you think?

    And
    speaking of learning from our elders, what was your take on the Law chapter? Were
    you pleased to see as a footnote (in the newest edition) that an older and
    wiser Dolly Freed no longer agrees with what she wrote in this chapter?

    Learnedly,
    Umbra

    Related Links:

    Americans eat more processed food than, well, anyone

    Farm saved by community featured on CNN

    Ask Umbra’s Book Club: Is eating animals eating you?






  • Towns invest in smarter streets … in Mississippi

    by Jonathan Hiskes

    Two Mississippi towns want better options than auto-only streets, and now they’ve made it official. The towns of Tupelo (pop. 36,223) and Hernando (pop. 6,812) each passed Complete Streets legislation that ensures roads will be built and maintained for walkers, cyclists, and other forms of transportation—along with drivers.

    Yesterday St. Louis citizens voted to fund better mass transit. Now this in Mississippi—this stuff is getting around. Towns of these sizes don’t build a lot of transit infrastructure, so sidewalks, bike paths, and road safety features are all the better.

    “I’m proud of our city council’s unanimous support of this initiative as we pro-actively change Tupelo’s culture into a more walkable, cyclist-friendly community,” Tupelo Mayor Jack Reed said in a prepared statement.

    The National Complete Streets Coalition works to promote what its name suggests—streets designed for more than one use, and ones that work for children, seniors, wheelchair users, and sidewalk retailers. It’s fiscally responsible, says walkability guru Dan Burden:

    “The big win for city government is that anything built to a walkable scale leases out for three to five times more money, with more tax revenue on less infrastructure,” he said in a news release.

    Note that this is all about happier, healthier, and safer living. It just so happens to be sustainable, but you don’t even have to use environmental selling points if they’re too distracting.

     

    Related Links:

    St. Louis votes for better transit, despite Tea Party campaign

    This week in comically evil corporate behavior

    WHO mobilizes 1,000 cities in urban health drive






  • Scientific models predict continued decline in Washington Post circulation

    by Joseph Romm

    OK, the Washington Post’s circulation will probably keep declining even in the unlikely event their coverage of global warming improves. But my headline is at least as scientific as the WP’s latest climate piece “Scientists’ use of computer models to predict climate change is under attack.”

    Memo to WashPost: Scientists’ use of computer models to predict/project climate change has been under attack by the anti-scientific disinformers for a long, long time. That ain’t news. The real news, which you almost completely ignore, is:

    The models have made accurate projections (see NASA: “We conclude that global temperature continued to rise rapidly in the past decade” and “that there has been no reduction in the global warming trend of 0.15-0.20°C/decade that began in the late 1970s”).
    When the models have gone awry, it is primarily in underestimating how fast the climate would change.
    Staying anywhere near our current emissions path — i.e. listening to the disinformers and doing nothing significant to restrict emissions — removes most uncertainty about the future climate impacts and leads with high probability to human misery on a scale never seen before.

    But what do you expect from an article that begins this way:

    The Washington Nationals will win 74 games this year. The Democrats will lose five Senate seats in November. The high Tuesday will be 86 degrees, but it will feel like 84.

    And, depending on how much greenhouse gas emissions increase, the world’s average temperature will rise between 2 and 11.5 degrees by 2100.

    The computer models used to predict climate change are far more sophisticated than the ones that forecast the weather, elections, or sporting results.

    Uhh, it’s not really that the climate models are more sophisticated. It’s that the climate is considerably easier to forecast than any of those other three.

    Climate has always been easier to predict than the weather: We know with incredibly high certainty that July of this year (or any year) will be hotter than January of this year (or any year) — and we know with high certainty the 2020s will be hotter than the 2000s — but it is basically a coin toss as to whether July 15, 2010 will be hotter than July 15, 2009.  As NASA notes, “When we talk about climate change, we talk about changes in long-term averages of daily weather.”  Long-term averages simply don’t change as rapidly as the weather and are inherently easier to project.
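
    That point lends itself to a toy illustration (a sketch with made-up numbers, not real temperature data): a steady warming trend buried in noisy year-to-year “weather” is hard to spot in any single comparison, but it shows up clearly in decadal averages.

        import random

        random.seed(0)

        # Synthetic series: a 0.02 degree-per-year warming trend plus random
        # "weather" noise of a few tenths of a degree (illustrative values only).
        temps = {y: 0.02 * (y - 1970) + random.gauss(0, 0.3) for y in range(1970, 2010)}

        # Comparing two individual years is close to a coin toss...
        print("2009 hotter than 2008?", temps[2009] > temps[2008])

        # ...but decadal averages make the underlying trend easy to see.
        for start in (1970, 1980, 1990, 2000):
            decade_mean = sum(temps[y] for y in range(start, start + 10)) / 10
            print(f"{start}s average anomaly: {decade_mean:+.2f}")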

    The analogies to sporting events and elections are simply inane. They involve human behavior and thus aren’t model-able with the same basic laws of physics. They are apparently included in the article simply to amuse and confuse.

    The piece is a long litany of mostly irrelevant information and disinformer talking points:

    Climate scientists admit that some models overestimated how much the Earth would warm in the past decade. But they say this might just be natural variation in weather, not a disproof of their methods.

    Uhh, “some models”?  So some unnamed models may not have gotten it right. Or maybe it was just that some of the groups doing the measuring lowballed actual warming. The U.K.’s Met Office — which many scientists have said has underestimated recent warming — posted an analysis in December which concluded, “The global temperature rise calculated by the Met Office’s HadCRUT record is at the lower end of likely warming.”

    In fact, NASA’s analysis makes clear that warming continues just as the models had projected. Indeed, the WashPost buries this central point, which by itself renders the entire article mostly moot:

    Put in the conditions on Earth more than 20,000 years ago: they produce an Ice Age, NASA’s Schmidt said. Put in the conditions from 1991, when a volcanic eruption filled the earth’s atmosphere with a sun-shade of dust. The models produce cooling temperatures and shifts in wind patterns, Schmidt said, just like the real world did.

    If the models are as flawed as critics say, Schmidt said, “You have to ask yourself, ‘How come they work?’”

    The models were actually used to accurately predict the cooling from the Pinatubo eruption.

    The Washington Post entirely misses the even more important point that the models used for the 2007 IPCC report consistently underestimated recent climate changes (and emissions trends):

    “The recent [Arctic] sea-ice retreat is larger than in any of the (19) IPCC [climate] models” — and that was a Norwegian expert in 2005. The retreat has accelerated since 2005, especially in volume.
    The ice sheets appear to be shrinking “100 years ahead of schedule.” That was Penn State climatologist Richard Alley in March 2006. In 2001, the IPCC thought that neither Greenland nor Antarctica would lose significant mass by 2100. They both already are.
    Sea-level rise from 1993 to 2006 — 3.3 millimeters per year as measured by satellites — was higher than the IPCC climate models predicted.
    The subtropics are expanding faster than the models project.
    Since 2000, carbon dioxide emissions have grown faster than any IPCC model had projected.

    Needless to say, the Post never talks about the paleoclimate record, which provides both support for the climate models — and more evidence that they lowball likely future impacts (see Science: CO2 levels haven’t been this high for 15 million years, when it was 5° to 10°F warmer and seas were 75 to 120 feet higher — “We have shown that this dramatic rise in sea level is associated with an increase in CO2 levels of about 100 ppm”).

    The models’ biggest flaw is that they ignore most of the major amplifying carbon-cycle feedbacks (see “An illustrated guide to the latest climate science”). But rather than explaining even once that the necessarily imperfect models almost certainly underestimate future impacts, the Post chooses to repeat without explanation this misleading point:

    All the major climate models seem to show that greenhouse gases are causing warming, climate scientists say, although they don’t agree about how much. A 2007 United Nations report cited a range of estimates from 2 to 11.5 degrees over the next century.

    Now this appears to willfully conflate two very different issues. It seems to imply that the climate models don’t agree on how much warming we’ll see — by a factor of nearly 6! But in fact much of that disparity is due to the use of very different scenarios of how much emissions will grow this century.

    As I’ve noted many times, the IPCC wastes a huge amount of time and effort modeling countless low emissions scenarios that have no basis in reality. Now if you take a low climate sensitivity (warming caused by a doubling of CO2 concentrations) and multiply it by a low emissions scenario, you get a low total warming. The anti-science crowd then gloms onto that low number as evidence global warming won’t have serious consequences. (And the media gloms onto that number and compares it to the high emissions, high sensitivity case as evidence the IPCC modelers “don’t agree” by a wide amount.)
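
    A minimal sketch of how those two dials multiply (the sensitivity values and concentration endpoints below are my own illustrative assumptions, not numbers from the article or from any particular IPCC scenario):

        import math

        def equilibrium_warming(sensitivity_per_doubling, co2_ppm, preindustrial_ppm=280.0):
            # Standard logarithmic relation: warming scales with log2 of the CO2 ratio.
            return sensitivity_per_doubling * math.log2(co2_ppm / preindustrial_ppm)

        # Low sensitivity (2.0 C per doubling) times a low-emissions scenario
        # (roughly 550 ppm in 2100) vs. high sensitivity (4.5 C) times a
        # high-emissions scenario (roughly 900 ppm).
        low = equilibrium_warming(2.0, 550)    # about 2 C
        high = equilibrium_warming(4.5, 900)   # about 7.6 C
        print(f"low: {low:.1f} C, high: {high:.1f} C")

    Multiply two low-end assumptions and you get a small number; multiply two high-end assumptions and you get several times as much warming. Much of the spread in the quoted range comes from that, not from the models “disagreeing” about the physics.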

    But the IPCC has never clearly explained that all of the low emissions scenarios presuppose we ignore the anti-science crowd’s plea to do nothing and instead take very strong action to reduce emissions.

    On the other hand, the IPCC has explained it is far more likely that the climate sensitivity is quite high than it is quite low — but very few people in the media follow the science closely enough to realize that.

    And so what the scientific literature and climate models tell us today with increasing certainty is that if we take no serious action, catastrophic change is best considered business as usual — that is, highly likely (see M.I.T. doubles its 2095 warming projection to 10°F — with 866 ppm and Arctic warming of 20°F and Our hellish future: Definitive NOAA-led report on U.S. climate impacts warns of scorching 9 to 11°F warming over most of inland U.S. by 2090 with Kansas above 90°F some 120 days a year — and that isn’t the worst case, it’s business as usual!).

    But the media and opinionmakers and most economists have been led to believe those scenarios are the extreme worst case and very unlikely, when in fact they are simply what is projected to happen if we keep doing nothing.

    The true plausible worst case — which combines staying on our current high-emissions trend with a more accurate attempt to model carbon-cycle feedbacks — is far, far worse: U.K. Met Office: Catastrophic climate change, 13-18°F over most of U.S. and 27°F in the Arctic, could happen in 50 years, but “we do have time to stop it if we cut greenhouse gas emissions soon.”

    But you won’t learn any of that crucial information from the Washington Post. So why not join hundreds of thousands of others and stop reading it entirely!

    UPDATE:  MIT’s Joint Program on the Science and Policy of Climate Change had a very useful figure based on its 2009 peer-reviewed paper, which makes the point with more probabilistic detail:

    Here is how MIT describes what it calls the “Greenhouse Gamble” in “an attempt to better convey the uncertainty in climate change prediction”:

    Depicted as a roulette wheel, the image portrays the MIT Program’s estimations of climate change probability, or the likelihood of potential (global average surface) temperature change over the next hundred years, under different possible scenarios. Estimates of the risks of climate change are based on the best available information at the time the estimates are made, and thus as continued observations are made and scientific investigation proceeds the likelihood estimates that underlie these wheels must be updated.

    Based on new research we provide updated estimates of the likelihood of different amounts of global warming over the century under a reference case, in which it is assumed “no policy” action is taken to try to curb the global emissions of greenhouse gases, and a “policy case” that limits cumulative emissions of greenhouse gases over the century to 4.2 trillion metric tons of greenhouse gases (GHGs) measured in CO2-equivalent.

    The notion is that as humans allow global emissions of greenhouse gases to continue to increase, the roulette wheel continues to spin. We can control emissions — the policy case represents one choice for cumulative allowed emissions over the century — and by doing so we can limit risk. Uncertainties in the Earth system response to increasing emissions are given by nature; we can learn more about these responses but we can not directly control them. The results show much higher likelihood of higher temperature increases than for the previous wheels.

    On our current emissions path, using the “best available information,” MIT projects a 9 percent chance of an incomprehensibly catastrophic warming of 7°C by century’s end, but less than a 1 percent chance of under 3°C warming. As one MIT professor put it:

    “The take home message from the new greenhouse gamble wheels is that if we do little or nothing about lowering greenhouse gas emissions that the dangers are much greater than we thought three or four years ago,” said Ronald G. Prinn, professor of atmospheric chemistry at MIT. “It is making the impetus for serious policy much more urgent than we previously thought.”
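
    To make the roulette-wheel metaphor concrete, here is a minimal sketch of drawing from a discretized outcome distribution. The wedge probabilities are placeholders chosen only to be roughly consistent with the two no-policy figures cited above; they are not MIT’s published values.

        import random

        # Placeholder wedges: about a 9 percent chance of more than 7 C of
        # warming and under a 1 percent chance of less than 3 C, with the
        # remainder split between the middle outcomes (illustrative only).
        NO_POLICY_WHEEL = {
            "under 3 C": 0.009,
            "3 to 5 C": 0.45,
            "5 to 7 C": 0.45,
            "over 7 C": 0.09,
        }

        def spin(wheel):
            outcomes = list(wheel)
            weights = list(wheel.values())
            return random.choices(outcomes, weights=weights, k=1)[0]

        print(spin(NO_POLICY_WHEEL))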

    The time to act was quite some time ago, but now is far better than later!

    Related Links:

    Colbert’s climatologist vs. weathercaster catfight

    Underground school lunch blogger hits ‘Good Morning America’

    Why climate realists and skeptics talk past each other






  • St. Louis votes for better transit, despite Tea Party campaign

    by Jonathan Hiskes

    Here’s some good news: St. Louis citizens want robust mass transit, and they’re willing to pay for it. Despite a Tea Party opposition campaign, St. Louis County voters on Tuesday approved a half-cent sales tax increase to stabilize and eventually expand the region’s ailing transit network.

    The measure passed by a monstrous 24-point margin. The St. Louis Tea Party focused its energy on defeating the civic project, calling the campaign a test run for defeating Democrats in this fall’s midterm elections. So it’s a setback for them.

    But it’s good news for those wanting to get around the St. Louis metro area. The “Proposition A” measure will restore bus lines that had been de-funded, pay for more frequent buses, prevent future cuts, and, eventually, expand the reach of transit further into area suburbs. The future cuts would have been drastic—about 50 percent of service and 650 jobs, beginning in June, according to Metro Transit Executive Director Robert Baer.

    Even more interesting, voters defeated similar tax increases in 1997 and 2008.

    In Los Angeles, too, voters approved a very similar tax on themselves in 2008—a half-cent sales tax increase to fund a large-scale electric rail system. And now they largely support Mayor Antonio Villaraigosa’s plan to build the network in 10 years instead of 30.

    People want this stuff.

    Related Links:

    Towns invest in smarter streets … in Mississippi

    This week in comically evil corporate behavior

    WHO mobilizes 1,000 cities in urban health drive






  • A lesson from California’s bad ballot measure

    by Eric de Place

    California’s nascent cap-and-trade program appears to be threatened by a ballot measure that is both substantively idiotic and yet diabolically clever. Basically, the measure would suspend implementation until California’s unemployment rate declines to below 5.5 percent. Financial backing comes from oil companies and other big polluters. Shocking, I know.

    Anyway, it’s a stupid idea on the merits because, apart from one industry-funded study, detailed analyses have shown that the cap-and-trade law would actually be beneficial to California’s economy, lowering energy bills and creating jobs. (Plus, the law would, of course, reduce the state’s sizeable contribution to the planet’s greenhouse gas emissions; California’s carbon footprint is roughly equivalent to that of France.)

    So why is it clever? Because linking the outcome to unemployment cements the (incorrect) impression that carbon reduction is bad for the economy. In fact, simply describing the measure requires one to link climate protection and unemployment. It’s a masterstroke of framing, and it’s a technique that progressives could stand to learn.

    Fortunately, this is a game that anyone can play! It shouldn’t be too hard to start adopting this kind of thing in the ballot-measure-saturated states of the West. So to get the conversation started, here’s a formula I just invented:

    [State] will levy an X percent tax on [bad-guy industry product] until [desired social or economic outcome] occurs.

    Here’s how it might work:

    Oregon will levy a 25 percent tax on cigarettes until teen smoking declines below 10 percent;
    Montana will levy a 50 percent tax on the in-state profits of payday lending firms until the state’s child poverty rate declines below 10 percent;
    Washington will levy a 10 percent tax on petroleum products until U.S. oil industry profits decline below $50 billion per year.

    You could tweak the dials here: change the numbers, change the targets, whatever. But I think you get the idea. In a best-case scenario, progressives could win big fat Pigovian tax victories and net a bunch of revenue to direct toward their objectives.

    But even if the ballot measures failed, they would amount to a campaign season’s worth of PR for their pet issue. Tobacco sales are bad for kids; payday lending hurts poor children; the oil industry is screwing consumers. It’s fun, right? 

    I mean, if your state’s politics are locked in the death spiral of cynical bought-and-paid-for faux-populist ballot initiatives, the least you can do is leverage the dysfunction to do a little good for the world. 

    This post originally appeared at Sightline’s Daily Score blog.

    Related Links:

    Lowering income taxes while raising pollution taxes reaps great returns

    Filling our short-term fossil-fuel needs

    Can we get some attention for our issues now?






  • Don Blankenship’s record of profits over safety: ‘Coal pays the bills’

    by Brad Johnson

    Cross-posted from The Wonk Room.

    After the worst coal mining disaster in at least 25 years, Massey Energy CEO Don Blankenship is facing long-overdue scrutiny for his record of putting coal profits over fundamental safety and health concerns. Blankenship, a right-wing activist millionaire who sits on the boards of the U.S. Chamber of Commerce and the National Mining Association, used his company’s ties to the industry-dominated Bush administration to paper over Massey’s egregious environmental and health violations. Massey rewarded Republicans with massive donations after the company avoided paying billions in fines for a 2000 coal-slurry disaster in Martin County roughly 30 times the volume of the Exxon Valdez spill. After both mine inspectors and Massey employees got the same message that it was more important to “run coal” than to follow safety rules, a deadly fire broke out in the Aracoma Alma mine in 2006, burning two men alive.

    Blankenship was abetted by former employees placed at the highest levels of the federal mine safety system. Massey COO Stanley Suboleski was named a commissioner of the Federal Mine Safety and Health Review Commission in 2003 and was nominated in December 2007 to run the Energy Department’s Office of Fossil Energy. Suboleski is now back on the Massey board. After being rejected twice by the Senate, one-time Massey executive Dick Stickler was put in charge of the Mine Safety and Health Administration (MSHA) in a recess appointment in October 2006. In the 1990s, Stickler oversaw Massey subsidiary Performance Coal, the operator of the deadly Upper Big Branch Mine, after managing Beth Energy mines, which “incurred injury rates double the national average.” Bush kept Stickler in charge of the agency in an acting capacity when the recess appointment expired in January 2008.

    Below are further details of these two past incidents that foretold Blankenship’s latest disaster:

    The fatal Aracoma Mine fire

    Blankenship branded deadly fire at dangerous Aracoma Mine ‘statistically insignificant’. In the most egregious case of preventable death before the Upper Big Branch explosion, Massey’s Aracoma Coal Co. agreed to “plead guilty to 10 criminal charges, including one felony, and pay $2.5 million in criminal fines” after two workers died in a fire at the Aracoma Alma No. 1 Mine in Melville, West Virginia. Massey also paid $1.7 million in civil fines. The mine “had 25 violations of mandatory health and safety laws” before the fire on January 19, 2006, but Massey CEO Don Blankenship passed the deaths off as “statistically insignificant.” [Logan Banner, 9/1/06; Charleston Gazette, 12/24/08]

    Federal mine inspector who wanted to shut down mine told to ‘back off’. Days before the fire broke out in the Aracoma mine, a federal mine inspector tried to close down that section of the mine, but “was told by his superior to back off and let them run coal, that there was too much demand for coal.” Massey failed to notify authorities of the fire until two hours after the disaster. [Pittsburgh Post-Gazette, 4/23/06]

    Blankenship memo: “Coal pays the bills.” Three months before the Aracoma mine fire, Massey CEO Don Blankenship sent managers a memo saying, “If any of you have been asked by your group presidents, your supervisors, engineers or anyone else to do anything other than run coal … you need to ignore them and run coal. This memo is necessary only because we seem not to understand that the coal pays the bills.” [Logan Banner, 9/1/06]

    The Martin County coal-slurry disaster

    Roughly 30 times the volume of the Exxon Valdez spill. Massey Energy is the parent of Martin County Coal, responsible for the “nation’s largest man-made environmental disaster east of the Mississippi” until the 2008 Tennessee coal-ash spill. In October 2000, a coal-slurry impoundment broke through an underground mine shaft and “spilled over 300 million gallons of black, toxic sludge into the headwaters of Coldwater Creek and Wolf Creek” in Martin County, Ky. [Lost Mountain, p. 128]

    Site denied Superfund status. Bush’s Environmental Protection Agency “determined that the slurry spill was not a release of a hazardous substance,” leaving the site ineligible for Superfund status. [KY EQC]

    Sen. McConnell and wife stopped MSHA investigation. U.S. Secretary of Labor Elaine Chao, wife of Sen. Mitch McConnell (R-Ky.), oversaw the Mine Safety and Health Administration. Chao “put on the brakes,” slowing the MSHA investigation into the spill by placing a McConnell staffer in charge. In 2002 a $5,600 fine was levied. That September, Massey gave $100,000 to the National Republican Senatorial Committee, chaired by McConnell. [Lexington Herald-Leader, 10/2/06; OpenSecrets]

    $2.4 billion becomes $20 million. In May 2007, the EPA filed suit against Massey for $2.4 billion for violating the Clean Water Act “more than 4,500 times from the beginning of 2000 to the end of 2006” in West Virginia and Kentucky, including the Martin County spill. In January 2008, Massey agreed to pay $20 million to settle the case. [Lexington Herald-Leader, 1/18/08]

    Related Links:

    Does coal mining matter to our energy future?

    This week in comically evil corporate behavior

    Massey’s mine in Montcoal has been cited for over 3,000 violations, over $2.2 million in fines






  • Does coal mining matter to our energy future?

    by David Roberts

    So I’m reading a story about what effect the recent West Virginia coal mine disaster will have on national energy policy, and I run across this jaw-dropping quote:

    “This is a mining accident,” says Bill Wicker, communications director for the Senate Energy Committee. “This issue involves the health and safety of our miners, not our energy future.”

    You could not ask for a more craven illustration of the bankruptcy of national energy politics and the obeisance national legislators still must pay the coal industry, no matter what havoc it wreaks.

    This particular energy choice—coal—involves destroying land, polluting air and water, impoverishing communities, sickening tens of thousands of people a year, and yes, killing the people who dig it up. The industry has a long history of safety violations, environmental violations, lying to regulators, bribing public officials, busting unions, and violently suppressing local protests.

    Are we just supposed to consider each of those discretely, as marginal side issues? Is coal’s purported “cheapness” really the only consideration that bears on national energy policy?

    Right now the energy we use kills people. If we keep using the same kinds of energy, we’re going to kill even more people. Seems to me that’s germane to our energy future.

    Related Links:

    This week in comically evil corporate behavior

    Do coal companies put profit over human life?

    Massey’s mine in Montcoal has been cited for over 3,000 violations, over $2.2 million in fines






  • Lowering income taxes while raising pollution taxes reaps great returns

    by Lester Brown

    As economic decisionmakers—whether consumers, corporate planners, government policymakers, or investment bankers—we all depend on the market for guidance. In order for markets to work and economic actors to make sound decisions, the markets must give us good information, including the full cost of the products we buy.

    Unfortunately, markets largely ignore the indirect costs of goods and services, thus grossly distorting the structure of the economy. The market price of burning coal, for example, includes only the direct costs, those of mining the coal and transporting it to the power plant. By neglecting the substantial indirect costs of burning coal—the costs of air pollution, acid rain, devastated ecosystems, and climate change—the market is giving us bad information. As a result of this and other distortions, we are making bad decisions.

    The most effective way to correct this massive market failure is to restructure taxes—lowering taxes on income while raising those on environmentally destructive activities. Widely endorsed by economists, tax shifting helps make sure the price of products reflects their full costs to society.

    The first step in creating an honest market is to calculate these indirect costs. Perhaps the best model for this is a U.S. government study on smoking from the Centers for Disease Control and Prevention (CDC). In 2006 the CDC calculated the cost to society of smoking cigarettes—including both the cost of treating smoking-related illnesses and the lost worker productivity from these illnesses—at $10.47 per pack.

    This calculation provides a framework for raising taxes on cigarettes. In New York City, smokers now pay $4.25 per pack in state and local cigarette taxes. Since a 10 percent price rise typically reduces smoking by 4 percent, the health benefits of tax increases are substantial.
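
    As a rough illustration of that rule of thumb, here is a back-of-the-envelope sketch; the price increases fed to it are hypothetical, and the linear scaling is only a crude approximation for large changes.

        # Elasticity rule of thumb cited above: a 10 percent price rise cuts
        # smoking by about 4 percent, i.e. a price elasticity of roughly -0.4.
        ELASTICITY = -0.4

        def smoking_change(percent_price_increase):
            """Approximate percent change in smoking for a given price increase."""
            return ELASTICITY * percent_price_increase

        for rise in (10, 25, 50):  # hypothetical price increases, in percent
            print(f"A {rise}% price rise cuts smoking by roughly {-smoking_change(rise):.0f}%")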

    The many indirect costs of using gasoline—including climate change, oil industry tax breaks and subsidies, oil supply protection, and treatment of auto exhaust-related respiratory illnesses—total around $12 per gallon ($3.17 per liter), based on a conservative estimate by the International Center for Technology Assessment. If this external or social cost were added to the roughly $3 per gallon average price of gasoline in the United States, a gallon would cost $15. These are real costs. Someone bears them. If not us, our children.

    Gasoline’s indirect cost of $12 a gallon provides a reference point for raising taxes to where the price reflects the environmental truth. Gasoline taxes in Italy, France, Germany, and the United Kingdom—averaging more than $4 per gallon—are a good start. That the average U.S. gas tax is less than 50 cents per gallon helps explain why the United States uses more gasoline than the next 20 countries combined. The high gasoline taxes in Europe have contributed to an oil-efficient economy and to far greater investment in high-quality public transportation, making those economies less vulnerable to oil supply disruptions.

    Phasing in an incremental gasoline tax that rises by 40 cents per gallon per year for the next 10 years, and offsetting it with a reduction in income taxes, would raise the U.S. gas tax to roughly the $4 per gallon prevailing today in Europe. This would still fall short of the $12 per gallon in indirect costs, but combined with the rising price of producing gasoline, it should be enough to encourage motorists to use improved public transport and to buy plug-in hybrid and all-electric cars as they come to market.
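
    Here is a tiny sketch of that phase-in schedule, assuming the roughly 50-cent average U.S. gas tax mentioned above as the starting point.

        # Phase-in described above: the gasoline tax rises 40 cents per gallon
        # each year for 10 years, on top of a roughly 50-cent starting tax.
        START_TAX = 0.50        # dollars per gallon, approximate U.S. average
        ANNUAL_INCREASE = 0.40  # dollars per gallon per year
        YEARS = 10

        for year in range(1, YEARS + 1):
            tax = START_TAX + ANNUAL_INCREASE * year
            print(f"Year {year:2d}: ${tax:.2f} per gallon")
        # Year 10 lands around $4.50 per gallon, in the neighborhood of the
        # European taxes cited above.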

    If gasoline taxes in Europe, which were designed to generate revenue and to discourage excessive dependence on imported oil, were thought of as a carbon tax, the $4 per gallon would translate into a carbon tax of $1,650 per ton. This is a staggering number, one that goes far beyond any carbon emission tax or cap-and-trade carbon-price proposals to date. It suggests that the official discussions of carbon prices in the range of $15 to $50 a ton are clearly on the modest end of the possible range of prices.
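
    A quick check of that conversion, assuming a gallon of gasoline contains roughly 2.4 kilograms of carbon (an outside estimate, not a figure from the text) and that the $1,650 is per ton of carbon rather than per ton of CO2.

        # Rough check of the gasoline-tax-to-carbon-tax conversion above.
        TAX_PER_GALLON = 4.00         # dollars, the European-level gasoline tax
        CARBON_PER_GALLON_KG = 2.42   # approximate kg of carbon per gallon (assumption)

        tax_per_ton_carbon = TAX_PER_GALLON / (CARBON_PER_GALLON_KG / 1000.0)
        print(f"Implied carbon tax: ${tax_per_ton_carbon:,.0f} per ton of carbon")
        # Prints roughly $1,653, in line with the ~$1,650 figure above.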

    Tax shifting is not new in Europe. A four-year plan adopted in Germany in 1999 systematically shifted taxes from labor to energy. By 2003, this plan had reduced annual carbon dioxide (CO2) emissions by 20 million tons and helped to create approximately 250,000 jobs. It also accelerated growth in the renewable energy sector.

    Between 2001 and 2006, Sweden shifted an estimated $2 billion of taxes from income to environmentally destructive activities. Much of this shift of $500 or so per household was levied on road transport, including hikes in vehicle and fuel taxes. France, Italy, Spain, and the United Kingdom are among the countries also using this policy instrument. In Europe and the United States, polls indicate that at least 70 percent of voters support environmental tax shifting once it is explained to them.

    Some 2,500 economists, including nine Nobel Prize winners in economics, have endorsed the concept of tax shifts. Harvard economics professor and former chairman of George W. Bush’s Council of Economic Advisors N. Gregory Mankiw wrote in Fortune magazine: “Cutting income taxes while increasing gasoline taxes would lead to more rapid economic growth, less traffic congestion, safer roads, and reduced risk of global warming—all without jeopardizing long-term fiscal solvency. This may be the closest thing to a free lunch that economics has to offer.”

    Environmental taxes are now being used for several purposes. For example, a number of cities now tax cars that enter the city center. Some governments simply impose a tax on automobile ownership. In Denmark, the registration tax on the purchase of a new car exceeds the price of the car by 180 percent: a new car that sells for $20,000 costs the buyer $56,000. In Singapore, the tax on a $14,200 Ford Focus more than triples the price, pushing it to $45,500.
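
    The registration-tax arithmetic works out as in the sketch below; note that the Singapore rate shown is inferred from the two dollar figures above rather than stated in the text.

        # Registration tax levied as a percentage of the sticker price and
        # paid on top of it.
        def purchase_cost(sticker_price, registration_tax_percent):
            """Total cost to the buyer: sticker price plus the registration tax."""
            return sticker_price * (1 + registration_tax_percent / 100.0)

        # Denmark: a 180 percent registration tax on a $20,000 car
        print(f"${purchase_cost(20_000, 180):,.0f}")   # $56,000, matching the figure above
        # Singapore: the cited $14,200-to-$45,500 jump implies roughly a 220 percent tax
        print(f"${purchase_cost(14_200, 220):,.0f}")   # about $45,440, close to the cited $45,500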

    Cap-and-trade systems using tradable permits are sometimes an alternative to environmental tax restructuring. The principal difference is that with permits, governments set the allowed amount of an activity and let the market set the price of the permits as they are auctioned off or given away. With environmental taxes, in contrast, the environmentally destructive activity’s price is incorporated in the tax rate, and the market determines the amount of the activity that will occur at that price.

    The use of cap-and-trade systems with marketable permits has been effective at the national level, ranging from restricting the catch in an Australian fishery to reducing sulfur emissions in the United States, but it also has serious limitations. Edwin Clark, former senior economist with the White House Council on Environmental Quality, observes that tradable permits “require establishing complex regulatory frameworks, defining the permits, establishing the rules for trades, and preventing people from acting without permits.” While economists largely prefer tax shifting for its efficiency, transparency, and predictable prices, both carbon taxes and cap-and-trade schemes are likely to result in a higher cost for burning carbon, thereby helping to correct the current market failure.

    A market that is allowed to ignore the indirect costs in pricing goods and services is irrational, wasteful, and self-destructive. The key to building a global economy that can sustain economic progress is the creation of an honest market, one that tells the ecological truth. To create an honest market, we need to restructure the tax system by reducing taxes on work and raising those on carbon emissions and other environmentally destructive activities, thus incorporating indirect costs into the market price. If we can get the market to tell the truth, then we can avoid being blindsided by a faulty accounting system that leads to bankruptcy.

    Stay tuned for a discussion of another tool to correct market failures—shifting subsidies—in Earth Policy Institute’s next Plan B Book Byte.

    To read about the Plan B proposal for phasing in a carbon tax of $200 per ton by 2020 to help stabilize climate, visit www.earthpolicy.org/index.php?/books/pb4/PB4ch8_ss4.

    Adapted from Chapter 10, “Can We Mobilize Fast Enough?” in Lester R. Brown, Plan B 4.0: Mobilizing to Save Civilization (New York: W.W. Norton & Company, 2009), available on-line at www.earthpolicy.org/index.php?/books/pb4

    Related Links:

    A lesson from California’s bad ballot measure

    Can we get some attention for our issues now?

    Louisiana environmental racism case gets hearing from Inter-American Commission on Human Rights






  • This week in comically evil corporate behavior

    by Jonathan Hiskes

    It’s only Wednesday and we’ve already got way more than a week’s worth of comically evil behavior from the fossil-fuel sector.

    Item the first:

    A Chinese coal freighter tried to take a shortcut through Australia’s Great Barrier Reef Marine Park and rammed into the world-renowned ecological treasure. The stranded ship remains in danger of breaking apart and spilling its 1,075 tons of heavy engine fuel into the marine park (2.5 tons have already leaked).

    Captain Wang Jichan, of the Chinese state-controlled conglomerate Cosco, doesn’t get it. He told authorities the leak is “not serious” and that he’s more worried about pesky rescuers consuming his crew’s food and water: “They need some more water because the rescue team is consuming the water and food. They need that. That is a problem at the moment.”

    Item the second:

    In response to the tragic coal-mine explosion that killed at least 25 of his workers, notorious coal baron Don Blankenship shrugged off his company’s history of safety violations. “Violations are unfortunately a normal part of the mining process,” said the Massey Energy CEO.

    This from a businessman who wrote a 2005 memo instructing supervisors not to waste time on safety precautions and whose company was called “one of the worst in the industry” by a government safety regulator.

    Item the third:

    ExxonMobil paid no U.S. income taxes last year, despite reaping a record $45 billion profit. By using legal accounting methods and Caribbean tax shelters, the energy giant was able to avoid paying a cent to the IRS. At the same time, it complains about its tax “burden.”

    Remember, these are the captains of industry who tell us we can’t afford to change our current way of doing things. This is the status quo they’re working frantically to defend. Good times.

    Related Links:

    Exxon Mobil paid no federal income tax in 2009!

    Does coal mining matter to our energy future?

    WHO mobilizes 1,000 cities in urban health drive






  • Americans eat more processed food than, well, anyone

    by Tom Laskawy

    The New York Times had a small article and a big graphic recently on America’s love affair with processed, packaged food:

    Americans eat 31 percent more packaged food than fresh food, and they consume more packaged food per person than their counterparts in nearly all other countries. A sizable part of the American diet is ready-to-eat meals, like frozen pizzas and microwave dinners, and sweet or salty snack foods.

    This probably doesn’t come as too much of a surprise to anyone, especially given our outsized obesity rates compared to other countries. But the accompanying graphic helpfully illustrates a most unfortunate kind of “American exceptionalism.”

    It’s worth pointing out, however, that there’s nothing inherently evil about packaged foods. It’s what’s in them that counts.

    For example, though we favor processed foods less than the Japanese do, ours tend to be frozen pizzas and microwave dinners, while theirs are predominantly minimally processed frozen seafood or dried seaweed (both of which are very nutritious, and neither of which has much in the way of additives).

    The Spanish and the French, meanwhile, consume packaged food at comparable rates overall, but more of it is dairy and baked goods. Although American food companies have figured out how to pack our “bread” with high-fructose corn syrup and other corn-based additives and sweeteners, Europeans tend to enjoy more fresh-baked breads that lack industrial byproducts.

    In other words, it’s not that we need to abandon packaged food if we’re to address obesity and reform the food system. It’s that we need to recognize that outsourcing our, and more importantly our children’s, nutrition to food industry scientists was a bad idea and isn’t working.

    At the end of the day, the food industry’s primary concern—indeed its legal obligation—is to shareholders, public and private. Our nutrition is secondary, and a factor only insofar as it affects sales.

    That said, this chart also makes clear that broad categories like “processed” or “packaged” food can be misleading. There simply isn’t any excuse for ignoring what’s in what you’re putting in your mouth. At some point in the last few decades, Americans decided to do just that. And now we’re paying for our ignorance not just with our wallets, but with our waistlines and with our health.

    Related Links:

    Making my neighborhood more walkable, sociable, sustainable, and safe

    Farm saved by community featured on CNN

    A teacher openly crusades for better school food—and gets seared






  • Imaginary, underwater subway lines are always the most convenient route

    by Ashley Braun

    Images: Transit Authority Figures

    For publicly transitive folks like myself, why does it seem that the fastest way between two points is an imaginary subway line? And a watery one, to boot!

    If I were an East Coaster, I’d definitely submerse myself in these non-existent, though wish-listily handy transit routes, even if their actual construction would be a big, wet flop.

    All a-surfboard!



    Related Links:

    Colbert’s climatologist vs. weathercaster catfight

    KFC: Who needs buns when a chicken-bacon-chicken sandwich will do?

    How much renewable juice does it take to power an Apple iPad?






  • WHO mobilizes 1,000 cities in urban health drive

    by Agence France-Presse

    Photo: Urbanization in the third world, a Haitian slum outside of Port-au-Prince.

    GENEVA – The U.N. health agency on Wednesday launched a global “1,000 cities, 1,000 lives” drive to combat a triple threat to health in fast-growing urban areas, now home to more than half of the world’s population.

    The World Health Organization predicted that most population growth in coming decades will take place in overcrowded, polluted, and often impoverished cities that house a concentrated array of the health problems faced by local societies.

    “Urban health matters in critical ways for more and more people,” said WHO Director General Margaret Chan on World Health Day, urging cities to place health concerns at the centre of their planning. “Poor health, including mental health, is one of the most visible and measurable expressions of urban harm,” she told a WHO meeting.

    The world’s urban population passed 3 billion in 2007, exceeding the rural population for the first time, according to the United Nations.

    The WHO warned that urban areas condense a threefold burden: infectious diseases exacerbated by poverty; chronic diseases such as heart trouble, cancers, and diabetes fueled by smoking, unhealthy “convenient” diets, and sedentary lifestyles; and injuries caused by accidents or crime.

    One of the WHO officials behind the drive, Lori Sloate, said a global network of cities could influence urban planning and management, “while there’s still time because we’ve just passed the tipping point.”

    By 2030, six out of 10 people will be city dwellers, rising to seven out of 10 people by 2050, with explosive growth in Asia and Africa, according to Chan.

    “In many of these cities, slums have become the dominant type of human settlement,” she warned. “Slums are productive breeding grounds for TB, hepatitis, dengue, pneumonia, cholera, and diarrhoeal diseases that spread very easily in highly concentrated populations.”

    Big cities are also growing far more quickly than in the past, outpacing the ability of authorities to build or plan for essential infrastructure including adequate health services, water, and sanitation, the WHO said in a report.

    Living and working conditions vary widely both within and between cities across the world and are the “causes of the causes” of ill health, according to the agency.

    The WHO warned that cities in both rich and poor nations can house huge disparities in health. They include a 28-year difference in the life expectancy of people living in different neighborhoods within Glasgow, Scotland. Meanwhile, in Nairobi, a child living in a slum is four times more likely to die before the age of five than one in another part of Kenya’s capital.

    Chan called the concentration of poverty in cities an “ominous trend.”

    “In developing countries, the best urban governance can help produce 75 years or more of life expectancy,” she said. “With poor urban governance, life expectancy can be as low as 35 years.”

    Some 1,300 cities in 120 countries have come forward to join the year-long campaign starting on World Health Day, a WHO spokeswoman said. The agency is encouraging them to wage clean-up campaigns, foster exercise in parks, and open up public spaces by closing off portions of streets to traffic.

    Related Links:

    This week in comically evil corporate behavior

    America’s most bike-friendly cities and big green pledges

    Energy trumps the environment, poll finds