Author: Main Feed – Environmental Defense

  • EPA IG report: New Chemicals Program fails to assure protection

    Richard Denison, Ph.D., is a Senior Scientist.

    In a post to this blog nearly a year ago, I noted that many voices in the chemical industry were claiming that EPA’s New Chemicals Program (NCP) was robust and served as an excellent model for TSCA reform. My post took considerable issue with that point of view, noting the many structural constraints TSCA imposes on EPA in its effort to review new chemicals:

    • No data, no problem: No up-front testing requirement or minimum data set applies to new chemicals.
    • Guessing game: EPA is forced to heavily rely on limited models and methods to predict the toxicity or behavior of a new chemical.
    • Catch-22: While EPA can require testing of a new chemical on a case-by-case basis, it must first show the chemical may pose a risk – not an easy task without any data in the first place!
    • One bite at the apple: EPA typically gets only a single opportunity to review a new chemical.
    • Crystal-ball gazing: EPA has to try to anticipate, for all time, a new chemical's future production and use.
    • Black box: New chemical reviews lack transparency.
    • Anti-precaution: In deciding whether to require testing or controls for a new chemical, EPA equates lack of evidence of harm with evidence of no harm.

     Lately, I’ve been hearing chemical industry representatives trying to resuscitate the NCP-as-model-for-TSCA-reform mantra. So it is especially timely that a new report from EPA’s Office of Inspector General (OIG) has just been released that again thoroughly dismantles that notion. The new report’s critique of the NCP closely mirrors the appraisal I provided earlier. And adding weight to its analysis is the fact that EPA’s senior management has fully concurred with the report’s conclusions and recommendations.

    Let me first note that this new report covers more than just the NCP. Among its other conclusions:

    • “Oversight of regulatory actions designed to reduce known risks is a low priority,” reflected in the report’s documentation of the fact that compliance assurance and enforcement under TSCA is virtually nonexistent.
    • “EPA’s procedures for handling confidential business information requests are predisposed to protect industry information rather than to provide public access to health and safety studies.” (The report’s findings on CBI provide updated documentation for many of the conclusions of the 1992 EPA-commissioned report about which I recently blogged.)

    With regard to the New Chemicals Program, as I’ve covered much of this critique in my earlier post, I won’t repeat it here but strongly suggest you read the new report for more detail.

    Anti-precaution in practice

    But I do want to note one especially compelling highlight of the OIG report that strongly reinforces my point that the NCP’s new chemical review process is actually anti-precautionary.

    I had noted earlier how EPA’s presumption going in to a review of a new chemical is essentially that, unless it has good evidence indicating a potential risk, it effectively finds the chemical does not pose a risk. (A great example of this came to light in a consent order EPA issued for a carbon nanotube it reviewed last year.) 

    This approach, which in part reflects TSCA’s placement of the burden of proof on EPA to show harm rather than on industry to show safety, is especially ironic given the lack of any upfront data requirement and the paucity of data available on new chemicals.

    EPA’s OIG reached similar conclusions about the NCP:

    • “EPA’s assurance that new chemicals introduced into commerce do not pose unreasonable risks to workers, consumers, or the environment is not supported by data or actual testing.”

    and

    • “In cases where full information does not exist or analyses are limited [which is the case for the great majority of new chemicals], EPA reports the new chemicals as not having risk.”

    But the OIG report goes on to nicely illustrate the aggregate impact of EPA’s arithmetic that equates no data with no harm. 

    As a performance measure for the NCP, EPA annually reports to Congress the “percentage of new chemicals introduced into commerce that do not pose unreasonable risks to workers, consumers, or the environment” (emphasis added).

    How is that measured? Bear with me as I try to explain the convoluted process stepwise – I assure you the conclusion is startling enough to warrant following this through to the end.

    1. As I’ve discussed previously, if a company develops or obtains data it believes indicate a chemical it produces poses a substantial risk, it is required to provide a notice to EPA summarizing the data. The requirement to submit these so-called “substantial risk” notices is specified under TSCA section 8(e), but the information actually received is entirely based on self-disclosure by industry.
    2. To measure NCP performance, EPA does a review of the notices received in a given year for chemicals that at some point in the past went through a new chemical review; about 30 such notices are received each year. 
    3. If EPA staff believe that the concern raised about a chemical in the substantial risk notice would have been flagged via a new chemical review of that same chemical had it been reviewed at the present time, EPA counts that as a chemical that does not pose an unreasonable risk.
    4. In FY 2005 and 2006, EPA staff decided that for all (100%) of the small number of such chemicals reviewed, the concern would have been flagged in a new chemical review; in FY 2007, they concluded that all but one (96%) of the concerns would have been flagged.
    5. Here's the final leap: Based on this analysis, EPA reported to Congress that 100% (for FY 2005 and 2006) and 96% (for FY 2007) of all of the chemicals introduced into commerce did not pose any unreasonable risk.

     Still with me? 

     What’s wrong with this picture?

    • Note that EPA reviews about 1,500 new chemical notices each year, and about half of those chemicals go on to actually enter commerce. So the roughly 30 such chemicals for which substantial risk notices are received annually represent only about 4% of the number of new chemicals entering commerce. Nothing comes in on the other 96% – and that's good enough for EPA to maintain they do not present an unreasonable risk.
    • Moreover, few if any of the notices received in a given year are for new chemicals that entered commerce in that same year; rather, they may have entered many years or even decades earlier.
    • Substantial risk notices are not the outcome of any kind of systematic testing of chemicals in commerce – recall that under TSCA companies are not required to do any routine testing of their chemicals, and EPA has rarely required them to do any testing. So the chances of a company stumbling on evidence on substantial risk for one of its chemicals are rather remote.
    • Because there are no routine testing requirements and no specified set of data requirements for chemicals in commerce, even where a company does encounter evidence of substantial risk, it cannot be concluded that the resulting notice reflects all of the adverse outcomes that would have been found had the chemical been subject to testing for a robust set of health and environmental endpoints. 
    • Under this performance measure, no other data available on the chemical – whether in EPA’s files or in the literature – are considered.
    • Hence, it cannot be concluded that the concern identified in the notice is the only concern that chemical would raise if thoroughly tested. So how can EPA imply that its new chemical review would have flagged all potential concerns with a given chemical, and hence that the chemical does not present any unreasonable risk?
    • Throughout the history of TSCA, EPA has raised concerns that industry compliance with the substantial risk notice requirement is incomplete; in 1991 it even offered a limited amnesty program in an effort to unearth additional substantial risk data. And the OIG report notes that EPA enforcement of this and other TSCA requirements is exceedingly limited and therefore that “EPA does not have assurance that industry submits all Section 8(e) notices for identified risks.”
    • The report further describes the high-profile case EPA brought against DuPont in 2004 for failure to submit substantial risk notice(s) – a very rare enforcement case that was widely regarded as a “shot across the bow” by EPA to try to spur greater compliance with this TSCA requirement (how successful it was is unclear).
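    As a back-of-the-envelope sketch, the coverage arithmetic in the first bullet above works out as follows (all figures are the rough approximations cited in this post, not EPA data):

    ```python
    # Sketch of the coverage gap behind EPA's NCP performance measure.
    # All figures are the approximate numbers cited in this post.

    notices_reviewed_per_year = 1500    # new chemical notices EPA reviews annually
    entering_commerce = notices_reviewed_per_year // 2  # about half enter commerce
    risk_notices_per_year = 30          # TSCA section 8(e) notices received per year

    # Share of new chemicals entering commerce for which EPA receives ANY risk data
    coverage = risk_notices_per_year / entering_commerce

    print(f"New chemicals entering commerce each year: ~{entering_commerce}")
    print(f"Share covered by a substantial risk notice: {coverage:.0%}")  # 4%
    ```

    Nothing at all comes in on the remaining ~96 percent of chemicals, which is the fraction EPA nonetheless reports as posing no unreasonable risk.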

    All of this makes it pretty clear that EPA possesses only a tiny fraction of the tiles in the huge mosaic of the 23,000 chemicals that have passed through the New Chemicals Program review process. 

    It might be justifiable for EPA to conclude that it has not been made aware of unreasonable risks posed by many or most of the new chemicals it has at one point or another reviewed and allowed to enter commerce. But for EPA to rely on such a flawed and spotty performance measure to affirmatively conclude that virtually all of those chemicals do not pose unreasonable risks to workers, consumers, or the environment – well, that vividly shows how under TSCA, EPA relies on the absence of evidence of harm instead of requiring evidence of absence of harm.

  • Weather and Climate in the Face of the “Snowpocalypse”

    While Washington was buried under several feet of snow, we all needed some entertainment. Fortunately, leaders of the anti-science movement were happy to provide it. Sen. DeMint (R-SC) said: "It's going to keep snowing in DC until Al Gore cries 'uncle,'" while Sen. Inhofe (R-OK) built an igloo dubbed "Al Gore's New Home." Sean Hannity reported "it's the most severe winter storm in years, which would seem to contradict Al Gore's hysterical global warming theories."

    I suppose it doesn't matter to them that the National Academy of Sciences and all major scientific organizations that have studied the question have concluded that pollution is causing changes to our climate. Or that there is some evidence that climate change could make blizzards like this more common, even as the world continues to warm. According to TIME:

    "Hotter air can hold more moisture, so when a storm gathers it can unleash massive amounts of snow. Colder air, by contrast, is drier; if we were in a truly vicious cold snap, like the one that occurred over much of the East Coast during parts of January, we would be unlikely to see heavy snowfall."

    One day's weather does not define our climate. It's one slide in the filmstrip — meaningful when strung together, but relatively uninformative on its own. (See our previous post on this.) That is why it is so important to follow the scientists unearthing past weather, recording present weather and modeling future weather — a theme The Colbert Report and the Daily Show picked up in their shows last week.

    Unfortunately, some people are attempting to exploit the recent snow to mislead the public about a carbon cap. There's an ad attacking Congressmen Boucher (D-VA) and Perriello (D-VA) for voting for the House climate bill. Far from "kill[ing] tens of thousands of Virginia jobs," this bill would bolster the Virginian and American economies. LessCarbonMoreJobs.org shows just shy of 100 Virginian companies — already employing over 16,000 — are poised to grow under a carbon cap. That's just one snapshot of what the United States could achieve with climate legislation.

  • Upcoming Events: Building Resilience Workshop in New Orleans, February 25-27

    Want to know the latest about amphibious housing and urban strategies for reducing storm damage? Then mark your calendar for "Building Resilience: Implementing Innovative, Sustainable Flood Mitigation Solutions for the Gulf Coast", a workshop that will be held next week in New Orleans from Thursday, February 25th through Saturday, February 27th.

    More than a dozen organizations, including AIA New Orleans, the Center for Hazards Assessment, Response & Technology at the University of New Orleans (UNO-CHART), and the Lower 9th Ward Center for Sustainable Engagement and Development (L9W-CSED), will be sponsoring seminars at the Old U.S. Mint and the Bourbon Orleans Hotel. Featured speakers and participants will include representatives from the U.S. Army Corps of Engineers and several universities in Germany and the Netherlands.

    The three-day seminar series is a great opportunity for local architects and engineers to mix with innovative designers and industry leaders while earning Professional Development Credits. Visit the event website to learn more and register.

  • Fighting Back a Wave of Unemployment: Coastal Louisiana Needs Federally-Funded Restoration Jobs Now

    Unlike storm surges and hurricanes in the state's recent past, the rising tide of joblessness sweeping through coastal Louisiana is not a natural disaster. Nonetheless, the effects of this crisis are equally pervasive, and its resolution demands equally urgent attention at the state and federal levels. At Restoration and Resilience, we believe that cutting unemployment and curbing wetland loss are complementary strategies for southern Louisiana. To that end, we strongly urge framers of the impending Congressional jobs bill to include substantial grants for restoration projects in coastal Louisiana.

    Nearly five years after Hurricanes Katrina and Rita, Louisiana is “fortunate” to post a state unemployment rate, at 7.2% in December 2009, that is far lower than the national average of 9.7%. However, this statistic ignores widespread reductions in Louisiana workers’ hours, which have cut take-home pay for many households. In addition, as post-Katrina federal funding runs out, there are indications that impending budget cuts at the municipal and state levels will translate into higher rates of joblessness in coastal Louisiana. Taken together, these cutbacks, along with a pronounced slowdown in post-Katrina reconstruction, have contributed to rising unemployment in southern Louisiana.

    Unemployment in Louisiana's metropolitan areas, Fall 2009 (Source: Bureau of Labor Statistics)


    Despite indications that the national unemployment rate is finally falling, the rate of joblessness in Louisiana remains on the upswing. In all eight of the state’s metropolitan areas, unemployment rose by half a percentage point or more between November and December 2009.

    Coupled with the continued erosion of Louisiana’s coastal marshes at a pace of 1-3 square miles a month, the state’s economic (and ecological) challenges demand attention from a targeted jobs program in coastal protection. While we applaud the Obama Administration for including $35.6 million for Louisiana wetland projects in the FY 2011 budget, we believe that substantially more funding is needed to implement the sort of comprehensive restoration program that will generate jobs now.

    CWPPRA projects lingering in the engineering and design phase could be expedited with targeted money to supplement state funding. For example, construction of box culverts, collecting ponds, and sediment channels in the proposed Maurepas Swamp restoration could potentially generate hundreds of jobs for builders, machinery operators, and sub-contractors in the Baton Rouge and New Orleans metropolitan areas. In addition, reforestation of the estimated 36,000 acres of restored marsh could employ dozens more. Inclusion of restoration projects like Maurepas Swamp in the Build America Bonds (BABs) program could be one step towards expediting funding for wetland protection and job creation.

    Jobs were the centerpiece of the President’s State of the Union address, and they have become the center plank of both Republican and Democratic campaigns in this election year. Representatives from both parties should come together to support a jobs campaign that prioritizes work in sustainable sectors like coastal restoration. This work will help to reduce the rate of joblessness in Louisiana, and bolster the state’s defenses against natural disasters for years to come.

  • EDFix Call #7: Bottom-Up Global Problem Solving

    Sustainability movements worldwide have created major new institutions and exchanges, from high-level conferences and carbon taxes to national markets and associated currencies.

    Anthony Williams, co-author of Wikinomics and its forthcoming sequel, Macrowikinomics, has a hunch these efforts are the wrong way to go about precipitating the broad, deep changes we need if we really are going to change how we all get around, get power, eat, shop, learn, share and make things. We need to rely less on centralized control and more on self-organizing efforts everywhere initiating small experiments and piloting social innovations. Some of these will mushroom into pervasive changes in societal behavior.

    As examples of such experiments in progress today, Anthony points to initiatives such as Carbonrally, MapEcos, Global Forest Watch and CARMA for climate change; ZipCar, GoLoco and Better Place for transportation; Crocodyl, The Extraordinaries and NetSquared for activism and volunteerism. The list of initiatives is endless and fascinating.

    We'll talk with Anthony about these experiments, the forces behind them and what they imply for existing organizations. Join us on February 22, 2010 at noon ET (9am PT) for the call:

    • Phone number: +1 (213) 289-0500
    • Code: 267-6815

    Get Updates about EDFix Conference Calls

    If you'd like to get announcements about upcoming EDFix conference calls and podcast releases of the results, please sign up here:

  • EDFix Call #6 afterthoughts: Cutting Holes in the IP Funnel

    EDFix Call #6 – Summary (9:01)

    Download MP3 | Subscribe in iTunes

    EDFix Call #6 – Full (42:36)

    Download MP3 | Subscribe in iTunes

    Get Call Updates by Email

    John Wilbanks from Science Commons and Kelly Lauber, Director of Nike's Sustainable Business & Innovation Lab, joined us on the February 8th EDFix call to discuss the GreenXchange project, which was announced at the World Economic Forum in January.

    Highlights from the call included:

    • an explanation of the new patent tools that GreenXchange will provide, including a research non-assertion pledge to encourage more non-profit research on commercial patents and a model patent license that inverts the traditional patent and allows reuse of the technology.
    • a discussion of a "3rd layer" of language to allow constraints on top of the model patent license. These constraints would allow a patent-owner to share the patent with exceptions (e.g., not with direct competitors).

    Wilbanks explained these interventions as "cutting holes" in the intellectual property funnel so that knowledge to generate innovation will leak out to other organizations that might put it to work, while preserving the IP originator's rights.

    GreenXchange is beginning what will be a multi-year process with the release of the legal language. The next steps are to recruit more of the right early partners, study what leaders like Nike are doing with their patents, and build a network. Wilbanks's goal is to add a zero to the number of companies using GreenXchange every year.

    One of the surprising insights, via Lauber, is that Nike's legal team was a champion, not an opponent, of this new approach. She explained that their team had already been looking at open innovation and saw the upside that GreenXchange offered. She says that Nike is particularly pleased by the research opportunities that have been created and sees a huge opportunity.

    Wilbanks echoed Bill Joy's observation that most of the smart people are outside your company. Once the IP community has a standard infrastructure in place for searching and using patents across companies and even industry sectors, network effects can kick in. That's the goal for the next few years.

    Listen to or download the podcast of the full discussion (43 min.) or Jerry's summary of the call (9 min.).

    Be sure to join upcoming EDFix Conference Calls.

  • Grist Recommends Two Great Transportation Reads

     

    Source: Amazon.com

    We'd like to thank David Roberts of Grist for posting on two transportation books that blew his mind:

    Tom Vanderbilt's Traffic: Why We Drive the Way We Do (and What It Says About Us), which looks at human behavior and how it affects traffic, safety, and all those accidents in parking lots.

    William Mitchell, Christopher Borroni-Bird, and Lawrence Burns's ("three brilliant supergeeks, two from GM’s advanced auto division; one from MIT's Smart Cities program") Reinventing the Automobile: Personal Urban Mobility for the 21st Century, which discusses the need to transform the DNA of the automobile and our automobile culture.

    While we disagree with Grist and think that the transportation genre is the most exciting in the world, we agree that you should check these out!

  • Groundbreaking Goals Hiding in Plain Sight

    Colin Meehan

    You probably saw today's announcement on the formation of "Clean Energy for Austin", a group of businesses, faith groups, low-income advocates and environmentalists that have come together to support the Austin City Council as it works to pass a forward-thinking plan for our utility.

    With more than 70 local businesses big and small, 18 non-profits and 200 individuals in this new group, it's pretty clear that the generation plan has strong and broad support in Austin. Some of the reasons you've already heard:

    • To protect customers from rising fossil fuel costs and regulatory risk
    • To bring the booming green jobs market to Austin
    • To add flexibility to Austin's generation planning process

    But you don't often hear two of the most compelling reasons: energy efficiency and transparency. 

    Energy Efficiency
    At Austin Energy's recommendation, the city council will commit to 800 MW of energy efficiency and a thorough study to evaluate whether it can meet a more aggressive goal of 1,000 MW of energy efficiency within the next 10 years. Everyone, even those who think we should wait before planning for the future, agrees that investments in energy efficiency are the most cost-effective moves we can make. Austin Energy has already shown that these investments pay off – the utility set (and met) a goal of 700 MW of energy efficiency over the last 10 years, saving Austinites the cost of an entire new power plant. 

    Energy efficiency is one of the best reasons to start working toward Austin Energy's goals immediately: the new programs needed to meet this goal will take time to ramp up. At the same time, setting this goal will ensure that the programs already in place continue and even expand. Austin Energy has already expanded its eligibility for low-income weatherization services from households with incomes of 125% of the poverty level to 200% of the poverty level, but that could go back down once federal funding is exhausted. Approving this plan will ensure not only the current level of coverage but the potential for expansion of those low-income programs to households earning up to 400% of the poverty level.

    Transparency
    Throughout the development of this plan, Austin Energy adopted a level of transparency unprecedented among public utilities. It reached beyond the normal large industrial consumer base historically involved in U.S. utility planning processes to include stakeholders from several communities. Minority business owners, home builders, environmentalists, the faith community and low-income advocates have all had the opportunity to work with Austin Energy and the Mayor's Generation Task Force to develop priorities for the generation planning process. 

    Austin Energy even went a step further by performing the same modeling and analysis on several stakeholder-proposed scenarios that it performed on its own scenarios. This is the first time I know of that the public has had a chance to propose investment strategies that the utility would take so seriously that it would fully analyze them and actually incorporate some of them into its final recommendation.

    Most importantly, though, we need to ensure that the future process is even more open and transparent. Clearly some stakeholders – particularly the low-income and faith communities – did not receive the same level of outreach as others, but that will change with the utility's latest recommendations. Representation for low-income and residential consumers will be required in any future generation planning advisory group. Any significant investments will need to go through the City's Electric Utilities and Resource Management Commissions before being brought to City Council twice for a vote. This level of transparency and community involvement would make Austin Energy a leader in open governance in the U.S. and is, in my mind, one of the best reasons to support city council and Austin Energy as they take on this plan.

    In the end, this plan does a lot of good for the citizens of Austin: it helps protect us against rising fossil fuel prices; it will ensure that Austin is a big part of the green economy; by planning ahead and doing regular evaluation, it gives our city the flexibility to make changes as needed. I could go on (and I will if you ever see me talking about it) but I really think the two things that get overlooked in these discussions are that energy efficiency goal and the big commitment Austin Energy is making to open governance. I'll challenge anyone to find a utility with a more open, inclusive and transparent planning process than the one Austin Energy has had over the last two years or the one we will have when the city council approves the recommendations.

  • Solutions Labs 2010 Kick-Off at Duke

    I like structure. I like to know what my day will look like so I can plan ahead. Itineraries and calendars are my friends (especially when they are color coordinated). So when I decided to go to an "unconference" with no set agenda, I really didn’t know what to expect. All of these questions fluttered into my head: what the heck is an unconference? No agenda? How do I prepare? For those of you out there who have similar questions, this post is for you.

    A few weeks ago, the Green Innovation in Business Network (GIBN) kicked off the first of ten events scheduled for 2010. The Solutions Labs use an open space "unconference" format to gather professionals from a variety of industries, government and non-profits for a day of discussion, experience-sharing, brainstorming and problem solving. The Solutions Lab held on January 28th was graciously hosted at Duke University's Fuqua School of Business in Durham, North Carolina, by Duke's Corporate Sustainability Initiative.

    The day began with an introduction by Dan Ariely, author of Predictably Irrational, Professor of Behavioral Economics at Duke University and a founding member of the Center for Advanced Hindsight. In his casual, conversational style, Ariely helped us explore the reasons people aren’t more motivated to act in favor of the environment. He then led us through possible mechanisms and motivators for change: things such as reward substitution, ego utility and the power of competition. After Dan’s passionate and comedic introduction, we were all riled up and ready to tackle these issues. So far, the day seemed pretty organized, but I was still worried about the potential for disorganization that would come with agenda setting.

    The reasoning behind the unconference format for the Solutions Labs is that by having participants in the room set the agenda according to their needs, discussions are sure to be appropriate and engaging. Still, I could feel the train wreck lurking. No organization. Chaos. But I was so very wrong.

    Odin Zackman organizing the agenda

    Odin Zackman, our conference facilitator, patiently guided the more than 100 attendees through the process with his superb mediating skills. We split ourselves up into groups of five or so to discuss our burning questions, what we wanted to learn and how we could help each other find solutions. Then, using proposals from participants, Odin allocated topics to rooms and times, crafting the agenda for the rest of the day. The great thing was, we didn’t have to choose just one session per set. In fact, we were encouraged to float around until we became engaged in a particular discussion. It was certainly not an ordinary conference where people are talked at for hours; instead people talked to each other. Participants were constantly engaged in stimulating conversation and left inspired.

    At the end of the day there had been 25 sessions on topics ranging from "Calories & Carbon" where people talked about what we can learn from the food industry about sustainable product information, to matters of economics with "Show me the money!" and a discussion about making sustainability affordable. One common theme was the difficulty in effectively demonstrating and communicating the value of sustainability efforts and translating that into behavior change (more to come on that in a later post). The full agenda for the day and notes from most of the sessions are available on the Green Innovation in Business wiki.

    The day closed with a round of feedback and planning for next steps. Two concrete next steps agreed to were:

    1. Convene a "roundtable" of non-profit and business representatives to discuss policy opportunities and how to partner.
    2. Explore the idea of creating a vision of the Research Triangle area becoming the "Green Silicon Valley."

    Once again, we would like to extend a big thank you to all the supporters that made this event possible! Special acknowledgments go out to RTI, SAS, and Dan Vermeer at Duke’s Corporate Sustainability Initiative. And thanks again to Jill Newbold who herded space, food and people.

    Stay tuned for more posts and updates on these predictably unpredictable Solutions Labs.

    Get Solutions Labs Updates via email

    Add your email address to this list to learn about upcoming events, results and planning.


  • Worse than we thought: Decades of out-of-control CBI claims under TSCA

    Richard Denison, Ph.D., is a Senior Scientist.

    I recently obtained – not without some effort on both EPA’s and my part – a scanned copy of a 1992 report commissioned by EPA innocuously titled “Influence of CBI Requirements on TSCA Implementation,” authored by the now-defunct Hampshire Research Associates. I subsequently found a copy in an old EPA docket, located here (6 MB PDF file).

    This understated yet remarkable report is a veritable treasure trove of information that painstakingly documents the rampant rise in illegitimate confidential business information (CBI) claims made by the chemical industry in the first decade after passage of the Toxic Substances Control Act (TSCA) – and the very limited options available to EPA to stop such activity (despite recent admirable efforts on its part).

    Now, some of you may be saying: “Wow, that report is old, surely things have improved since then.” To which I respond there is absolutely no reason to believe that is the case (see this earlier blog post for just one indication of the continuing excess of CBI claims under TSCA). I would welcome any evidence to the contrary, but as you’ll see, the underlying reasons for this problem are structural to TSCA and EPA’s implementing regulations.

I have been a squeaky wheel on this CBI issue for some time, of course. But this report elucidates several new dimensions of the nature and extent of the problem. And it documents them thoroughly: because the authors were contracted by EPA, they had access to internal databases and records of submissions EPA had received under TSCA from 1979 to 1990.

    In this post, I’ll summarize 10 key findings from the report that document the problem. In a subsequent post, I’ll look at what the report had to say about solutions.

    Key findings (I’ll list them all here so you can take them in all at once, and then elaborate on each one below):

    1. Half or more of all information submitted to EPA under TSCA was claimed as CBI.
    2. The fraction of information claimed CBI under TSCA was initially low and then rose, often dramatically, over time.
    3. When EPA reversed a policy it had in place until 1982 that required up-front substantiation of CBI claims for new chemicals, the number of such claims shot up.
    4. When examined by EPA, a large fraction of CBI claims were found to be illegitimate – the information so claimed was not eligible under TSCA or EPA regulations.
    5. However, the vast majority of CBI claims have never been reviewed by EPA. And EPA has accepted without challenge CBI claims for information which TSCA does not allow to be so claimed.
    6. Industry faces no penalty for making a false or erroneous CBI claim under TSCA; in contrast, EPA personnel face criminal penalties for wrongful disclosure of CBI – even if the information is not eligible for CBI protection.
    7. Claiming information CBI under TSCA is simple and facilitated by EPA procedures; in contrast, challenging such claims is highly cumbersome and resource-intensive.
    8. Processing and protecting CBI imposes heavy direct and indirect costs on EPA; in contrast, there is virtually no cost to industry to assert a CBI claim.
    9. EPA has routinely failed to disclose the extent of CBI claims asserted overall, or what types of information it receives have been claimed CBI, to what extent and by whom.
    10. The extent of CBI claims asserted under TSCA exceeds by orders of magnitude that under other federal laws – most notably the Toxics Release Inventory (TRI) – even for very similar types of information submitted by companies.


    Elaboration of findings (quotes below are taken from the report):

     1. Half or more of all information submitted to EPA under TSCA was claimed as CBI.

    While the extent varies by submission type and information element, CBI claims were made for:

    • more than 25% of all “substantial risk” notices received under Section 8(e) of TSCA (80% of these claimed the chemical identity CBI);
    • more than 20% of all health and safety studies;
    • about half of all EPA-requested records of significant adverse reactions (required to be kept under TSCA Section 8(c)); and
    • more than 90% of all new chemical notices.

    Submissions for all of the first three categories, and for quite a few of the fourth category, constitute or contain what EPA defines to be “health and safety studies.” These CBI claims were made, therefore, in direct contravention of the plain language of TSCA, which expressly precludes such studies from CBI protection (see discussion of this issue in earlier posts).

    2. The fraction of information claimed CBI was initially low and then rose, often dramatically, over time.

    The report examined trends over time in CBI claims, and revealed a “learning curve” that appears to have been followed: companies increased the frequency of such claims as they learned there was little or no consequence to their asserting them, even for information clearly off-limits for CBI protection under TSCA or EPA regulations. For example:

    • About 70% of premanufacture notification (PMN) submissions for new chemicals submitted to EPA during the first 4 years of the PMN program (1979-1982) claimed the chemical identity as CBI. That number rose considerably thereafter, reaching 92% by 1990, the last year of data covered by the report. (Note that this high rate of CBI claims for PMNs has if anything increased further since 1990: EPA indicated in 2007 that about 95% of PMNs contain information, including chemical identity, designated by the submitter as CBI; see p. 10 of this report).
    • Very few “substantial risk” notices were submitted until 1983 (fewer than 15 per year). From that year onward, the number of such submissions increased – but so did the fraction of them claiming CBI, rising from a mere 15-18% in 1983-85 to a whopping 48% by 1990.

    3. When EPA reversed a policy it had in place until 1982 that required up-front substantiation of CBI claims for new chemicals, the number of such claims shot up.

One contributing factor to the jump in CBI claims accompanying PMNs starting in 1983 appears to have been EPA’s reversal of a policy, in place prior to that year, that required up-front substantiation of CBI claims at the time the claims were asserted. This is one of several factors the report identifies as clearly indicating that the lower the “cost” or effort required to assert CBI claims, the more claims are made – regardless of whether the claims are warranted.

    4. When examined by EPA, a large fraction of CBI claims were found to be illegitimate – the information so claimed was not eligible under TSCA or EPA regulations.

These facts and trends apparently aroused sufficient suspicion about CBI claim validity to finally lead EPA in 1990 to initiate a pilot program to review and challenge CBI claims. Specifically, EPA challenged all CBI claims made in association with a significant number of the submissions of health and safety data it received over a limited period under either Section 8(d) or 8(e) of TSCA. Remember that TSCA expressly excludes health and safety studies from eligibility for CBI protection, and EPA regulations expressly define chemical identity as an integral part of a health and safety study (for a refresher on these points, click here).

    So what happened?

    In every case in which EPA challenged a claim, the submitter agreed to remove or reduce the scope of the claim. The report states that this result “indicates that EPA is correct in challenging the validity of these CBI claims.” This high frequency of questionable or invalid claims appears to have continued: It was reconfirmed by an EPA official cited in a 2005 report by the Government Accountability Office (see page 33), who indicated that, while only about 14 CBI claims are reviewed per year, nearly all challenged claims were withdrawn.

    The report provides numerous examples of spurious claims and justifications uncovered by this review, concluding that “they illustrate an apparent reliance on CBI claims to avoid embarrassment or adverse public reaction, rather than to protect trade secret information,” and “an effort to prevent disclosure of precisely the sort of information that the framers of TSCA sought to make available to the public.”

    The examples make for entertaining reading; I’ll cite just one here: A submitter of a “substantial risk” notice claimed both its own identity and the identity of the chemical in question as CBI. When asked to justify the claim, the submitter said the health effect identified in the study was “highly unusual” and that it sought to avoid public release of this information until it could conduct further research, so as to avoid “premature and possibly unnecessary concern” about its chemical.

    Seeking in this manner to use TSCA’s CBI provisions for a purpose for which a company might otherwise hire a public relations firm is not, of course, what Congress had in mind when it mandated immediate disclosure of such information. Nor does it come close to a justification that the information constitutes a trade secret, which is the sole legitimate basis for CBI assertions.

    Unfortunately, to the best of my knowledge, that review program at EPA was short-lived and has not been repeated.

    5. The vast majority of CBI claims have never been reviewed by EPA. And EPA has accepted without challenge CBI claims for information which TSCA does not allow to be so claimed.

TSCA allows companies submitting information to claim as confidential any information they want, whether or not it actually meets statutory or regulatory descriptions of eligible information. The onus then shifts to EPA to challenge any claim it considers invalid (more below on the process EPA must follow).

    Because the resources required to conduct such case-by-case challenges are lacking, the report found that “the vast majority of claims submitted are not reviewed” and substantiation is rarely even requested. Substantiation is requested and claims are challenged typically only when a Freedom of Information Act (FOIA) request is filed for the information (but see finding 8 below on the limitations of this as a trigger for review).

    Lest you think things might have improved, all indications are that this minute rate of review of CBI claims continues to the present day. As noted earlier, EPA confirmed in the 2005 GAO report (see page 33) that only about 14 CBI claims are reviewed per year.

    Another key conclusion of the Hampshire report is that “Agency practice in accepting CBI claims has, in fact, been more lenient than the statute (or its implementing regulations) requires.”

For example, EPA routinely allows PMN submissions to be claimed CBI in their entirety – even when they contain health and safety studies. (Elsewhere I have noted how few PMNs actually contain any such studies; for example, 85% of PMNs contain no health data. But 15% of the roughly 1,500 PMNs filed annually is still a good number of PMNs with health data – which should be, but are not being, released by EPA.)

    All of this contributes to quite a vicious circle: The more CBI claims are made, the fewer EPA can review; the fewer EPA reviews, the greater the incentive to make unwarranted claims.

    Ah, but we’re not nearly done yet: There are still more factors that contribute to this perverse downward spiral that serves to reduce disclosure of chemical information that Congress meant for the public to see; read on.

    6. Industry faces no penalty for making a false or erroneous CBI claim under TSCA; in contrast, EPA personnel face criminal penalties for wrongful disclosure of CBI – even if the information is not eligible for CBI protection.

    The report calls out this remarkable imbalance, noting that it has contributed to the proliferation of CBI claims. It has also led EPA to create a level of protection for CBI equivalent to that granted top-secret national security information elsewhere in government, and has engendered an “institutional culture” at EPA that invariably tilts far to the side of nondisclosure over public right to know.

    7. Claiming information CBI under TSCA is simple and facilitated by EPA procedures; in contrast, challenging such claims is highly cumbersome and resource-intensive.

    Further lowering the transaction costs for asserting CBI claims and raising those for challenging them are the procedures EPA has developed.

    In many cases, merely checking a box is all that is required to designate part or all of a submission as CBI. Until and unless a specific claim is challenged, the confidentiality of that information must be protected by EPA.

In contrast, EPA policy specifies that, for each CBI claim it wishes to scrutinize, it must typically first contact the submitter to request substantiation of the claim, then review the substantiation, then – if it believes the claim is unwarranted – again contact the submitter to seek its consent to release the information. If unsuccessful, EPA must then convince its Office of General Counsel that the case warrants issuance of a notice of denial, which must be sent by certified mail. Disclosure must then still await a 30-day period during which the submitter can challenge the impending disclosure in court and halt it pending judicial review and decision.

    8. Processing and protecting CBI imposes heavy direct and indirect costs on EPA; in contrast, there is virtually no cost to industry to assert a CBI claim.

    The report describes a range of substantial costs CBI protection imposes on EPA, ranging from direct costs to establish and maintain the needed security infrastructure, to indirect costs associated with limiting or complicating the ability of EPA staff to access information critical to performance of their jobs. While these costs may be legitimate for information that truly warrants CBI protection, they clearly are excessive in light of the large number of unwarranted claims made.

In contrast, under TSCA there is not even a processing fee associated with making a CBI claim. The report points out that such a fee would serve a dual function, based on experience under other laws: It could reduce the number of claims made merely by imposing a cost on doing so. And it would provide EPA with resources sufficient to cover its costs of processing, reviewing and, where necessary, challenging such claims.

    9. EPA has routinely failed to disclose the extent of CBI claims asserted overall, or what types of information it receives have been claimed CBI, to what extent and by whom.

    In protecting information claimed as CBI, EPA practice has gone beyond merely protecting that information, to shielding from the public even the fact that such information was claimed CBI. As a result, the report concludes, “there is no way for outside users to know whether or not EPA is in possession of data relevant to their interests.” 

    Yet TSCA provides no basis for EPA to hide from the public the fact that a company has claimed certain information to be CBI. Nor can EPA legitimately hide the extent to which certain types of data are claimed CBI.

Surely the public has a right to know that a certain company claims all of the information it submits to be CBI, while another company claims little or none of what it submits. And the public should be able to know how often companies claim the health and safety data they submit to be CBI – in direct contravention of TSCA’s prohibition on doing so.

    How else can we know what we don’t know?

    One bright spot of late was EPA’s effort to tally and publicize the extent of CBI claims made for data elements required to be reported under its most recent TSCA Inventory Update Reporting (IUR) cycle. EPA’s summary report of the information submitted under the 2006 IUR has a nifty table (see Exhibit 3 in that report) indicating the – often large – extent to which certain types of information were claimed CBI by companies.

But EPA needs to take the next step: For each submission it receives, it should either make public each information element in the submission, or clearly indicate that the element is claimed by the submitter to be CBI. That is in addition to providing (as it did in that latest IUR report) aggregated statistics characterizing the frequency and extent of CBI claims, both for individual information elements and overall for a given submission type.

    10. The extent of CBI claims asserted under TSCA exceeds by orders of magnitude that under other federal laws – most notably the Toxics Release Inventory (TRI) – even for very similar types of information submitted by companies.

    This finding falls under the category of “clearly, there is a better way.” The report highlights CBI policy and practice under the Emergency Planning and Community Right-to-Know Act (EPCRA), which established the Toxics Release Inventory (TRI). 

    The report notes that, in 1988, only 23 trade secret claims were made under TRI – out of more than 70,000 forms submitted. That’s 0.03%. Contrast that with TSCA, under which the report estimates 50% or more of the submitted information was subject to CBI claims.

    The report’s conclusion: “CBI claims under TSCA are far in excess of what is needed to protect true trade secrets.”

    What accounts for the radical disparity in the CBI experience under these two laws? The report identifies five key differences. EPCRA (Section 322) and its associated regulations:

    • require up-front substantiation of all CBI claims at the time they are made;
    • mandate claims to be certified by a senior company official;
    • provide civil and criminal penalties for false claims;
    • limit CBI claims to a narrow set of information elements; and
    • require that each submission be made available – with each information element claimed CBI clearly identified so the public understands what is being withheld.

    These EPCRA provisions help to inform the report’s excellent assessment of solutions to excessive CBI claims under TSCA – which I will delve into in another post in the near future. So stay tuned – or just read the report!

  • TCEQ: At It Again

Dr. Elena Craft is a toxicologist.

We hear a lot about the jobs that will be created as we transition to a clean-energy economy, but as a toxicologist, I like to focus also on the improved air quality that will result. However, until the day comes when everyone drives plug-in hybrids and industrial facilities are non-polluting, we must take immediate steps to ensure cleaner air for ourselves and our children.

That's why I was encouraged by the turnout in support of cleaner air at an event last week. The EPA held one of three national hearings in Houston on its proposed new national ambient air quality standard (NAAQS) for ozone. The hearings gave the public the opportunity to comment on EPA’s proposal to tighten the ozone standard from 75 parts per billion (ppb) to somewhere between 60 and 70 ppb.

Still, while doctors and health professionals, mothers, environmental advocates, and other interested parties all testified to the need to protect sensitive populations from ozone exposure, our own state environmental agency – the Texas Commission on Environmental Quality (TCEQ) – questioned the science behind the proposal and made it clear the new standard would be too costly and even unattainable.

    The testimony of Mike Honeycutt, TCEQ chief toxicologist, was revealing. Here are some telling excerpts:

    • “These studies are based on the supposition that the majority of people breathe outside air 8 to 24 hours each day while the scientific data clearly show this is not the case.”

    Should we take this to mean that those who do spend more time outdoors – construction workers, carpenters, utility workers, lifeguards and athletes, to name a few – don’t deserve protection from the health impacts of ozone? 

    • “We hear anecdotally that hospital visits for asthma rise when ozone levels rise, but hospital admissions data show this is not the case. Texas Inpatient Hospital Discharge data on numbers of hospital visits for asthma between 1999 and 2001 actually show that fewer children in Texas visit the hospital for asthma during peak summer ozone season as compared to wintertime. Results from a 4-year (2000-2003) air quality study conducted by Texas A&M University and Driscoll Children’s Hospital indicate hospital admissions to be weakly correlated with ambient daily maximum ozone levels. The Kaiser Permanente Report and the Gauderman study in 2004 found no increased hospital admissions in elderly patients and health effects in children due to ozone alone.”

Do bronchial problems increase in winter? Yes, because of complications due to viruses and other illnesses that peak during this time. Honeycutt hears “anecdotally” about the relationship between asthma and hospital visits, but one must wonder if he's actually read the reports on this subject. Studies continue to demonstrate a causal relationship between high ozone concentrations and respiratory hospitalizations; the latest, just released in the Journal of Allergy and Clinical Immunology, reported a 19 percent increase in ICU admissions on higher ozone days.

Is it just coincidence that Honeycutt chose to single out these reports from among more than 1,700 papers on the issue? When the EPA’s independent, statutorily established expert panel, the Clean Air Scientific Advisory Committee (CASAC), convened to develop a health-based ozone standard – after examining all 1,700 papers on the issue – its verdict was explicit: a unanimous recommendation for decreasing the primary standard to within the range of 60-70 ppb. Further, it stated that any standard above this range fails to satisfy the explicit stipulations of the Clean Air Act requiring an adequate margin of safety for all individuals, including sensitive populations.

    Should we listen to Honeycutt, speaking for a notoriously politicized agency, or should we rely on the nation’s top experts who have spent their professional lives studying the subject?

    Honeycutt also suggested Texans simply won’t stand for the control measures necessary to meet the lower ozone standard. [My favorite: We won’t be able to idle our car engines at the drive-thru while waiting for our Big Mac and fries.] 

    The truth is that TCEQ isn’t serious about making ANY real effort to reduce ozone. In the latest state implementation plan (SIP) it submitted to EPA to demonstrate attainment with the 1997 ozone standard, the only control measures that TCEQ suggested were a paper reduction in the industry trading credits for some ozone precursors, and some small adjustments in emissions from the printing industry. [According to TCEQ’s own analysis, neither of these measures will make even a dent in reducing ozone concentrations.]

If TCEQ were really serious about protecting Texans’ health, there are plenty of opportunities out there to reduce ozone concentrations. For instance:

    • TCEQ could stop issuing air quality permits that fail to consider emissions of ozone precursors from newly proposed facilities. For example, a recent ruling on the White Stallion coal/pet coke permit in Matagorda county disregards ozone modeling data that demonstrates that the new facility will contribute to ozone concentrations in Houston.
    • TCEQ could enforce more penalties when facilities violate their air permits. Currently, TCEQ takes enforcement action on only about 50 percent of the unplanned emission events from stationary sources across the state. These emission events release pollutants that generate large plumes of high ozone in our region.
    • TCEQ could operate more meaningful trading schemes for the precursors that result in ozone formation. The market for nitrogen oxide credits, for instance, is so over-allocated that no one is even trading credits.
    • TCEQ could encourage mass transit to help get cars off the roads, or could support legislation that would increase fuel efficiency standards. 

    Fortunately for Houstonians, some elected officials testified in support of following the real science used to develop the new health-protective ozone standard. Newly elected Houston Mayor Annise Parker was realistic: “There is no doubt it would be a significant challenge for Houston to meet the lower standard, so the amount of lead time and our ability to achieve regional coordination is significant to us, but we do want to be on the record in supporting a goal that is protective of our citizens and is based on real science.”

How does Texas meet ANY new ozone standard as long as we have a state environmental agency determined to stall or block the measures necessary to attain it?

    Will we get cleaner air in Texas? Yes, eventually – but in spite of TCEQ’s efforts, not because of them.

    *** All Texans are invited to comment on strengthening the ozone standard. Go to http://www.regulations.gov [Note: You must include the Docket ID: No. EPA–HQ–OAR–2005–0172 ]

  • Protecting Marine Life in Cuba

Cuba coastline in the Guanahacabibes National Park. This rocky coastline is prime habitat for 3-foot-long Cuban iguanas that are native to the area.

Guanahacabibes National Park is located in the far southwest corner of Cuba, just across the Yucatan Strait from Cancun and smack dab in hurricane alley. Last week Pam Baker (from our Texas office) and I paid a visit to Cuban colleagues there to learn more about new efforts to protect fish populations, coral reefs, sea turtles, and coastal ecosystems.

    Most of the park’s waters are off-limits to the harvesting of reef fish, spiny lobster and other species. La Bajada, a small village perched on the rocky coastline of the park, is home to a few dozen subsistence fishermen who are still allowed to catch fish in a designated area. 

    The park’s pristine beaches provide important nesting grounds for endangered turtle species, including loggerhead, green, and hawksbill turtles. Park scientists and volunteer students provide round-the-clock monitoring during nesting season and have virtually eliminated poaching of turtles in the park. 

    Just recently, Cuban scientists have initiated a project to assess and control growing populations of the deadly Pacific lionfish, a non-native species that threatens native fish and fishermen alike. We are working with them to develop strategies to combat this invasive and dangerous species. We are also teaming up to assess the impacts of sea level rise along the southern coast of Cuba and to examine adaptation and mitigation opportunities. By some estimates, all of the park’s mangrove forests could be submerged in ocean waters by 2050. For more on these and other collaborative projects in Cuba go to www.edf.org/cuba.

  • Calling All California Truck Fleets – Free Money to Purchase Hybrids Now Available

    Have trucks in California? You’d better get to your dealer fast, because the California Hybrid Voucher Incentive Program is open for business.

    On February 4, the landmark program officially opened, and boy, were fleets ready. In the first 12 hours, about 25 percent of the program’s $20 million in vouchers had been requested.

All fleets with trucks operating in California are eligible for up to 100 vouchers each, on a first-come, first-served basis. With nearly $15 million still up for grabs, it’s not too late to claim your share.

    The program, launched last week by the California Air Resources Board and administered by CALSTART, will provide $20 million in funding assistance for fleets to replace old diesel trucks with newer, cleaner hybrid trucks. Depending on the vehicle weight and model, the voucher amounts range from $10,000 to $45,000, including a $5,000 bonus for each participating fleet’s first purchase. Details about participating in the program, approved dealers and eligible vehicles are available on the program website at http://www.californiahvip.org/

What sets the Hybrid Voucher Incentive Program (HVIP) apart from other programs is that it is the nation’s first voucher program, meaning that fleets receive the benefit at the point of sale. There are no applications to fill out or rebates to wait for; dealers throughout California have been trained and approved to fill out the necessary paperwork, request vouchers and take the savings off the sticker price of your new hybrid truck. Environmental Defense Fund applauds CARB for taking the needs of vehicle purchasers into account and hopes HVIP’s success can be a model for other states.

    In addition to reducing greenhouse gas emissions and soot on California’s highways, the 800 hybrids expected to be sold through the program will boost our economy. In 2009, researchers at Duke University completed a study on the hybrid truck value chain. They found that the hybrid truck industry employs workers in at least 143 sites across the United States, in some of the cities hardest hit by the recession. HVIP will increase the total number of hybrid trucks on the road by about 50%. Now that’s what I call job creation!

    For more information on hybrid trucks and other opportunities to reduce emissions from corporate vehicles, visit the Green Fleet section of EDF’s website. There you can find a guide to incentive programs to support the purchase of hybrid trucks and track the makes and models of hybrid trucks currently available to fleets.

  • Southeast fishery closures make the New York Times

Southeast fishermen recently finished the first month of closures on many popular fish. Many fisheries won’t open again for several months, and reality is sinking in across the region.

    The New York Times is even taking notice. When a region’s fishery woes make ink in one of the most prominent papers in the nation, you know it’s a big deal.

It’s apparent that closures aren’t working for fishing businesses, restaurants or local economies. Fishermen can’t make a living when they can’t fish. Businesses that rely on local fish must turn to far-off places to get it. What’s not as apparent is that closures aren’t even very good for the region’s ecosystem, because they force fishermen to fish harder on other species that aren’t closed. This can cause market gluts and an early end to the fishing season for many species, which just multiplies current problems.

    However, in all the sobering news coverage that’s come out lately, outlets are overlooking a solution that’s good for fish and fishing businesses.

    Southeast fishermen need catch shares, which allow fishermen the flexibility to fish when the weather and prices are good and improve collection of fishery data, all while rebuilding fish populations. 

    The good news is that the South Atlantic Fishery Management Council is already exploring this solution. In fact, the Council has scheduled a catch shares workshop for March 1 preceding the Council’s next regular meeting. If you’re interested in learning more about the best solution for Southeast fisheries, I encourage you to attend. The meeting is open to the public.

  • How an inside look at EDF changed my perspective on corporate environmental management

    If you happened to miss my previous post, I recently finished an externship in Environmental Defense Fund's Corporate Partnerships Program, working on the Green Returns team. After graduating from Wharton last spring, I got the opportunity to work at EDF before beginning my full-time consulting job at Bain & Company. When I started at EDF, I hoped that the experience would teach me about corporate environmental management and expose me to a new perspective. After five months, I would say – mission accomplished.

    During my time at EDF, I have worked in-depth on two projects. For the first two months, I scoped the landscape of corporate environmental sustainability benchmarking tools and initiatives. Most recently, I collected sustainability best practice content around key topics such as data center energy efficiency, paper use, and commercial lighting.

After working on these projects and supporting the Green Returns team on other tasks, I realize the importance and benefit of including environmental considerations in both long-term strategy and daily operations. The knowledge I acquired at EDF around environmental management is a new addition to my toolkit that I will use to help companies improve. I feel that I am now better able to holistically assess a company’s risks and opportunities by incorporating its impact on the environment into my strategic analysis. And from an operational perspective, I will be able to communicate to companies how addressing areas such as energy efficiency and waste reduction results in quick wins that save money and lessen environmental impact. Put quite simply, when I am faced with a company needing to reduce costs, my first question will now be – do they turn off the lights and computers at the end of the day?

  • Another day another closure

[Screenshot: South Atlantic Fishery Management Council website homepage, February 2010]

I recently spent a few hours taking stock of how our nation is doing using traditional approaches to fisheries management. My conclusion: not very well. As of December 31st, sixty federally managed fish stocks and stock complexes (containing an additional twenty species or so) were either overfished, being overfished, or both. That doesn’t even count the stocks for which the scientific information is so poor that we are “flying blind,” or the many important but overfished non-federal stocks.

    Take a look at the homepage of the South Atlantic Council's website (to the right). It shows closures for most of the “money fish” in the region – king mackerel, black sea bass, vermilion snapper, red snapper, groupers, and most of the other shallow-water reef fish. The costs of business-as-usual to commercial and recreational fishermen, fishing families, coastal communities, and coastal economies are staggering . . . not to mention to ocean ecosystems.

    Thankfully, there is a better way — catch shares. While there are no silver bullets that will fix these fishery problems overnight – the problems have been decades in the making – catch shares are the clear solution. You can read more about how to design a catch share at EDF's Catch Share Design Center.

  • Toxic America: Time for Reform – Ask EDF Scientists

    We recently interviewed two EDF scientists on the need for reforming America's toxic chemicals laws.

    Read our Toxic America Q&A.

    You can also pose your own questions below and we'll get back to you here in the Green Room.

  • Toxic Chemicals in Consumer Products: More than Just Consumer Exposure

    Cal Baier-Anderson, Ph.D., is a Health Scientist.

    An article recently published in the journal Macromolecules reports on the development of a new process that the authors claim can prevent the migration of phthalates from PVC plastic. This “breakthrough” will undoubtedly be used to argue that industry should be allowed to continue using a range of toxic chemicals in the manufacture of PVC destined for a broad variety of applications.

    Concern for consumer exposures is often the main argument made against the use of toxic chemicals in consumer applications. With evidence of exposure to chemicals like phthalates in nearly everyone who has been tested, including pregnant women, this is understandable. 

    But even if the new claims are proven to be true, there are many other reasons we need to find safer substitutes for such chemicals: worker exposures, environmental releases and end-of-life recycling and disposal issues, to name a few. The potential impacts from continued use of toxic chemicals must be examined across their entire lifecycle.

    PVC lifecycle concerns extend beyond phthalates

    Polyvinyl chloride (PVC) plastic is a prime example of a material that should be reserved for use only in critical applications that have no available substitutes. PVC is made from vinyl chloride, a known human carcinogen. To protect workers, exposures must be tightly controlled; in the past, documented worker exposures have resulted in cancer. Both accidental and incidental releases to the environment are an ongoing concern, and there have been instances of groundwater contamination at some production sites (for example, see here and here).

    When it comes to consumer products and medical uses, exposure to vinyl chloride itself has been less of a concern than exposure to the plasticizing agents used to soften the PVC, such as phthalates. These have proven problematic due to migration out of the plastics and into humans. And then there are the end-of-life recycling and disposal issues. Unfortunately, PVC plastic is not readily recyclable and most plastic winds up in incinerators (which can generate ultra-toxic dioxins), in landfills (which must be monitored for leakage in perpetuity), or in water bodies (case in point is the vast floating island of plastic debris in the North Pacific). 

    While we can and should take steps to reduce consumer exposures to chemicals of concern, such as phthalates, we need to do so by broadly evaluating the materials we use, including how they are made and how they are managed after use. In short, we need to find ways to reduce both the use of toxic chemicals and their impacts throughout the entire lifecycle. And while it may not be feasible to eliminate all uses of such chemicals, we can and should reserve them for critical applications that have no safer substitutes.

  • Environmental management can help the PE sector create “green returns”

    Environmental Defense Fund’s Green Returns team is just back from the Dow Jones Private Equity Analyst Outlook 2010 conference in New York, where we sponsored, exhibited and presented. There is no doubt that EDF is the first environmental group to do all three at a Dow Jones conference. It was a great chance for us to connect with executives from a number of leading private equity and venture capital firms, including 3i, Apollo Management, Clayton, Dubilier & Rice, Huntsman Gay Global Capital, THL Partners and others.

    Our message at the conference and for the private equity sector in the future is straightforward:

    Environmental management and innovation should be one of private equity’s key strategies now and for the future.

    A changing world and challenging economy are forcing companies in all industries – including private equity – to transform to remain competitive. As a result, private equity firms are looking for new ways to lead and create value. Taken to scale across portfolios, a creative approach to environmental issues can create value by improving due diligence, boosting portfolio company performance, presenting new growth and investment opportunities and building stronger relationships with LPs and other stakeholders.

    Today’s announcement [PDF] by private equity leader Kohlberg Kravis Roberts & Co. (KKR) is proof that environmental management can help drive value creation at scale. In fact, the release includes a quote from KKR Co-founder Henry Kravis stating that “The business case for environmental management has never been stronger.”

    That’s why KKR has expanded the Green Portfolio program that we co-developed and tested at three companies in 2008 (U.S. Foodservice, PRIMEDIA and Sealy) to include twenty percent of its global private equity portfolio today.

    In 2008, EDF and KKR worked with U.S. Foodservice, PRIMEDIA and Sealy to measure and improve business and environmental performance. These pilots helped EDF develop our Green Returns approach and helped the three companies capture over $16M in annual cost savings and cut 25,000 tons of CO2 pollution. Based on these results, KKR expanded the Green Portfolio Program in 2009 to include Accellent Inc., Biomet Inc., Dollar General Corp., SunGard Data Systems Inc. and HCA Inc. Today’s announcement adds four additional companies: First Data, Lehigh Phoenix (a division of Visant), Oriental Brewery and Tarkett.

    EDF believes that this is just the beginning. Environmental management can play a much larger role in value creation across the broader private equity industry. That’s why we recently released our Green Returns approach, resources and case studies to help industry leaders take advantage of this opportunity.

    Green Returns is a tested and flexible approach for private equity firms to measure and improve business and environmental performance across their portfolios. It is tailored to the strengths of the private equity sector and has quickly proven to yield significant business and environmental results, including millions of dollars in annual cost savings and thousands of tons of avoided pollution.

    Visit http://edf.org/GreenReturns to learn more and get started today.

  • EDFix Call #5 afterthoughts: Governing the Commons

    EDFix Call #5 – Summary (12:19)

    Download MP3 | Subscribe in iTunes

    EDFix Call #5 – Full (55:38)

    Download MP3 | Subscribe in iTunes

    Get Call Updates by Email

    On January 25, 2010 we talked with Charlotte Hess, Jesse Ribot and Ruth Meinzen-Dick about recent insights on governing the commons and our emerging understanding of the "New Commons."

    Hess described those New Commons as shared resources for which there are no pre-existing rules or norms. Often a new commons emerges because of erosion of public goods or new opportunities brought about by technology, such as the Internet and data about the human genome. The New Commons are less about "property" than they are about the question, "how do we share and protect these resources?"

    Many interesting points came up, such as the importance of visibility and the lack of a single best governance model. On visibility, the starting quote was "trees are easier to manage than fish or water quality." Ironically, storing water visibly in a reservoir might be considerably less efficient than storing it in the aquifer, where much of it wouldn't evaporate; but reservoir water is far easier to monitor (rogue wells are hard to detect).

    On governance models, the consensus is that there is no one best way. Generally, people local to a commons are the best informed to design its governance. Participation really matters, as do known rules. Our conversation continued, touching on:

    • the distinction between "good" and "goods" and the need for better language to discuss these topics,
    • Elinor Ostrom's Nobel Prize,
    • the need to establish property rights for groups or the public (not just individuals and organizations),
    • lessons we can learn from the study of traditional commons and open-source projects, and
    • the threat of the anti-commons and enclosure.

    Resources we discussed include:

    Listen to the full podcast to learn more.

    Please join us for our next EDFix call on February 8, at 9am Pacific, on the recently announced GreenXchange.

    You can also: