Blog

  • New From NAP 2012-11-19 10:45:01

    Final Book Now Available

    The growth of electronic publishing of literature has created new challenges, such as the need for mechanisms for citing online references in ways that can assure discoverability and retrieval for many years into the future. The growth in online datasets presents related, yet more complex challenges. Addressing them depends upon the ability to reliably identify, locate, access, interpret, and verify the version, integrity, and provenance of digital datasets. Data citation standards and good practices can form the basis for increased incentives, recognition, and rewards for scientific data activities that are currently lacking in many fields of research. The rapidly expanding universe of online digital data holds the promise of allowing peer examination and review of conclusions or analyses based on experimental or observational data, the integration of data into new forms of scholarly publishing, and the ability for subsequent users to make new and unforeseen uses and analyses of the same data, either in isolation or in combination with other datasets.

    The problem of citing online data is complicated by the lack of established practices for referring to portions or subsets of data. A number of initiatives are already underway in different organizations, countries, and disciplines. An important set of technical and policy approaches has already been launched by the U.S. National Information Standards Organization (NISO) and other standards bodies regarding persistent identifiers and online linking.

    The workshop summarized in For Attribution — Developing Data Attribution and Citation Practices and Standards: Summary of an International Workshop was organized by a steering committee under the National Research Council’s (NRC’s) Board on Research Data and Information, in collaboration with an international CODATA-ICSTI Task Group on Data Citation Standards and Practices. The purpose of the workshop was to examine a number of key issues related to data identification, attribution, citation, and linking to help coordinate activities in this area internationally, and to promote common practices and standards in the scientific community.

    [Read the full report]


  • Renewable Energy Law News – Week of November 12

    Fate of wind energy production tax credit in hands of Obama, House GOP, officials say

    The fate of a tax credit that advocates say is needed to maintain tens of thousands of wind energy jobs will be decided during high-stakes, last-minute negotiations between President Obama and House Republicans over fiscal issues, officials said Tuesday.

    The wind energy production tax credit is due to expire at the end of the year. Its extension stalled in Congress this summer amid fierce opposition from some conservative House Republicans. The last chance to extend the measure is in the budget deal that will be cut between Obama and Republicans in the lame duck session of Congress.

    Backers of the credit tried to ramp up pressure to extend the $12 billion break Tuesday with a teleconference featuring several governors, who noted that uncertainty over its fate has led to thousands of job losses across the country. A study by a wind energy group found that 37,000 jobs would be lost if the credit expires.

    The credit’s supporters say the government has subsidized fossil fuels like oil for more than a century. Opponents argue it distorts the energy marketplace and leads to higher prices.

    Governors Urge Congress to Renew Wind Energy Production Tax Credit

    Salt Lake City, UT — With the expiration of the wind energy Production Tax Credit looming and the clock ticking down to the end of 2012, a bipartisan group of U.S. governors is urging Congress to act now to save jobs. In a joint press conference held today, Senator Chuck Grassley (R-IA) stressed that uncertainty over the extension of the wind energy Production Tax Credit (PTC) is already beginning to have an impact on renewable energy jobs.

    “The uncertainty about the future of this tax incentive,” Grassley said, “hurts the economic good that these policies do.” Grassley, who authored the original wind energy PTC in 1992 and has also sponsored a Senate bill (S. 3521) that aims to extend the tax credit for at least another year, pointed to the expiration of the biodiesel tax credit in 2010 as an example that he says resulted in 23,000 jobs being “put on hold.” This is a situation that all involved are keen to prevent from happening to wind energy in their states.

    Governor Terry Branstad (R-IA) also cited uncertainty about the wind energy PTC’s fate as a major factor in some companies’ decisions to begin eliminating jobs. “Due to the uncertainty,” Branstad said, “we’ve begun to see a negative economic impact and loss of jobs in our states. In Iowa, Siemens recently announced the layoff of 400 employees at their plant in Fort Madison, and Clipper Windpower laid off 100 workers at their plant in Cedar Rapids. We have literally thousands of wind energy related jobs in our state. These are high tech, high paying jobs.” Branstad says he remains hopeful that Congress will act quickly to extend the PTC.

    Branstad is the chair of the Governors’ Wind Energy Coalition, a group of 28 state governors who share the goal of leveraging wind energy resources in pursuit of lasting energy independence.

    “Nationally, wind energy drives about $10 to $20 billion a year in private sector capital investment and employs almost 75,000 Americans,” said John Kitzhaber (D-OR), Governor of Oregon and vice chair of the Governors’ Wind Energy Coalition. Kitzhaber used Oregon’s own Sherman County as an example of how rural communities can utilize wind energy production to drive revenue. “The county now receives $33 million per year in revenue from wind farms,” Kitzhaber said. “That’s revenue that has proved essential to sustain schools, fire departments and road maintenance.”

    Hawaii issues new rules for renewable energy tax credit

    HONOLULU – The state Department of Taxation on Friday issued new rules for the renewable energy tax credits that have spurred more residents to install solar panels.

    The department said it is doing so to provide clarity to taxpayers, but environmentalists and renewable energy advocates said the new rules jeopardize the state’s progress in moving away from imported fossil fuels.

    The rules, which will take effect on Jan. 1, require renewable energy systems to meet set output capacity requirements. The Sierra Club and Earthjustice said the change would limit the solar tax credit for the average residential solar power system to $5,000. This would effectively cut the tax credit in half and put solar power out of the reach of many families, they said.

    The department explained its decision by saying the previous rules, issued in 2010, created uncertainty and an unlevel playing field. The department has been receiving complaints about the rules for more than a year, it said.

    The law grants residents and businesses a tax credit for installing a renewable energy system. Some people, however, have been advised by the companies installing their solar panels to say their installation consists of multiple systems and then claim the credit multiple times. This has made solar panels more affordable and encouraged many more people to buy them, but it has also depleted tax revenues and made it harder for the state to balance its budget.

    “After listening to taxpayers’ concerns, the department is issuing these new temporary rules in order to provide consistent, uniform and fair application of the tax credit law, while still supporting the State’s public policy goal of reducing our reliance on fossil fuel,” the department said in a statement.


  • New From NAP 2012-11-15 13:45:01

    Prepublication Now Available

    Following a 2011 report by the National Research Council (NRC) on successful K-12 education in science, technology, engineering, and mathematics (STEM), Congress asked the National Science Foundation to identify methods for tracking progress toward the report’s recommendations. In response, the NRC convened the Committee on an Evaluation Framework for Successful K-12 STEM Education to take on this assignment. The committee developed 14 indicators linked to the 2011 report’s recommendations. By providing a focused set of key indicators related to students’ access to quality learning, educators’ capacity, and policy and funding initiatives in STEM, the committee addresses the need for research and data that can be used to monitor progress in K-12 STEM education and make informed decisions about improving it.

    The recommended indicators provide a framework for Congress and relevant federal agencies to create and implement a national-level monitoring and reporting system that: assesses progress toward key improvements recommended by a previous National Research Council (2011) committee; measures student knowledge, interest, and participation in the STEM disciplines and STEM-related activities; tracks financial, human capital, and material investments in K-12 STEM education at the federal, state, and local levels; provides information about the capabilities of the STEM education workforce, including teachers and principals; and facilitates strategic planning for federal investments in STEM education and workforce development when used with labor force projections. All 14 indicators explained in this report are intended to form the core of this system. Monitoring Progress Toward Successful K-12 STEM Education: A Nation Advancing? summarizes the 14 indicators and tracks progress toward the initial report’s recommendations.

    [Read the full report]


  • North Carolina, Delmarva Coastlines Changed by Hurricane Sandy

    USGS releases new before-and-after photos


    ST. PETERSBURG, Fla. – The USGS has released a series of aerial photographs showing before-and-after images of Hurricane Sandy’s impacts on the Atlantic Coast. Among the latest photo pairs to be published are images showing the extent of coastal change in North Carolina, Virginia, Maryland, and Delaware.

    The photos, part of a USGS assessment of coastal change from as far south as the Outer Banks of North Carolina to as far north as Massachusetts, show that the storm caused dramatic changes to portions of shoreline extending hundreds of miles. Pre- and post-storm images of the New Jersey and New York shoreline in particular tell a story of a coastal landscape that was considerably altered by the historic storm. Meanwhile, images from hundreds of miles south of the storm’s landfall demonstrate that the storm’s breadth caused significant coastal change as far south as the Carolinas.

    “Sandy taught us yet again that not all Cat-1 hurricanes are created equal: the superstorm’s enormous fetch over the Atlantic produced storm surge and wave erosion of historic proportions,” said USGS Director Marcia McNutt. “We have seized this opportunity to gather unique data on a major coastline-altering event.”

    As major storms approach, the USGS conducts pre-storm and post-storm flights to gather aerial images along the length of the coastline expected to experience impacts from the storm’s landfall. Identifying sites of such impacts helps scientists understand which areas are likely to undergo the most severe impacts from future storms, and improves future coastal impact forecasting. 

    Photo pairs from North Carolina to Massachusetts are now available online.

    “This storm’s impact on sandy beaches included disruption of infrastructure in the south, such as overwash of roads near Pea Island, Buxton, and Rodanthe in N.C., and some dune erosion near Duck, N.C.,” said St. Petersburg-based USGS oceanographer Nathaniel Plant. Such storm-induced changes to the coastal profile can jeopardize the resilience of impacted coastal communities in the path of subsequent storms.

    “Houses and infrastructure may be more vulnerable to future storms because beaches are narrower and dunes are lower,” Plant said.

    Overwash occurs when storm surge and waves exceed the elevation of protective sand dunes, thereby transporting sand inland. In addition to threatening infrastructure like roadways, it can bury portions of buildings and cause extensive property damage.

    The configuration of a coastline’s physical features determines how it will respond to storm forces, and whether it will experience erosion, overwash, or inundation.

    In South Bethany, Delaware, the storm appears to have eroded a low dune that had stood between the Atlantic and a row of beachfront homes. Like overwash, beach and dune erosion can compromise a coastline’s natural defenses against future storms.

    The Hurricanes and Extreme Storms team aims to quantify the degree to which these defenses have weakened in all areas Hurricane Sandy impacted.

    Data collected from these surveys are also used to improve predictive models of potential impacts from future severe storms. Before a storm makes landfall, USGS makes these predictions to help coastal communities identify areas particularly vulnerable to severe coastal change, such as beach and dune erosion, overwash, and inundation.

    For instance, in the days before Sandy approached the eastern seaboard, the USGS ran models forecasting that 91 percent of the Delmarva coastline would experience beach and dune erosion, while 98 percent and 93 percent of beaches and dunes in New Jersey and New York, respectively, were likely to erode. Preliminary analysis suggests that Hurricane Sandy rapidly displaced massive quantities of sand in ways that visibly changed the landscape.

    The USGS assessment also includes pre- and post-landfall airborne lidar data, which offers a more quantitative look at the extent of coastal change caused by Sandy. Lidar, or light detection and ranging, is an aircraft-based remote sensing method that uses laser pulses to collect highly detailed ground elevation data.
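
    Lidar reduces elevation mapping to precise timing: each pulse’s two-way travel time gives the sensor-to-ground distance. The sketch below illustrates only that core principle; the numbers, function names, and nadir-pointing simplification are illustrative assumptions, not the USGS processing chain.

    ```python
    # Toy sketch of the lidar ranging principle: a laser pulse's two-way
    # travel time gives the sensor-to-ground distance, and subtracting that
    # range from the aircraft's altitude gives ground elevation for a
    # nadir-pointing pulse. Real lidar processing also handles scan
    # geometry, multiple returns, and georeferencing.

    C = 299_792_458.0  # speed of light, m/s

    def pulse_range_m(two_way_time_s: float) -> float:
        """Sensor-to-target distance: the pulse travels out and back."""
        return C * two_way_time_s / 2.0

    def ground_elevation_m(altitude_m: float, two_way_time_s: float) -> float:
        """Ground elevation beneath the aircraft for a nadir pulse."""
        return altitude_m - pulse_range_m(two_way_time_s)

    # A return arriving ~6.67 microseconds after emission, from 1,500 m
    # altitude, puts the ground at roughly 500 m elevation.
    print(ground_elevation_m(1500.0, 6.67e-6))  # ~500.2
    ```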

  • Researchers report potential new treatment to stop Alzheimer’s disease

    Last March, researchers at UCLA reported the development of a molecular compound called CLR01 that prevented toxic proteins associated with Parkinson’s disease from binding together and killing the brain’s neurons.
     
    Building on those findings, they have now turned their attention to Alzheimer’s disease, which is thought to be caused by a similar toxic aggregation or clumping, but with different proteins, especially amyloid-beta and tau.
     
    And what they’ve found is encouraging. Using the same compound, which they’ve dubbed a “molecular tweezer,” in a living mouse model of Alzheimer’s, the researchers demonstrated for the first time that the compound safely crossed the blood–brain barrier, cleared the existing amyloid-beta and tau aggregates, and also proved to be protective to the neurons’ synapses — another target of the disease — which allow cells to communicate with one another.
     
    The report appears in the current online edition of the journal Brain.
     
    “This is the first demonstration that molecular tweezers work in a mammalian animal model,” said Gal Bitan, an associate professor of neurology at UCLA and the senior author of the study. “Importantly, no signs of toxicity were observed in the treated mice. The efficacy and toxicity results support the mechanism of this molecular tweezer and suggest these are promising compounds for developing disease-modifying therapies for Alzheimer’s disease, Parkinson’s and other disorders.”
     
    Molecular tweezers are complex molecular compounds capable of binding to proteins. Shaped like the letter “C,” these compounds wrap around chains of lysine, a basic amino acid that is a constituent of most proteins. Bitan and his colleagues, including Aida Attar, first author of the study and a graduate student in Bitan’s lab, have been working with a particular molecular tweezer called CLR01.
     
    In collaboration with scientists at the Università Cattolica in Rome, the researchers, working first in cell cultures, found that CLR01 effectively inhibited a process known as synaptotoxicity, in which clumps of toxic amyloid damage or destroy a neuron’s synapses.
     
    Synapses in transgenic mice with Alzheimer’s may shut down and the mice may lose their memory, but upon treatment, the mice form new synapses and regain their learning and memory abilities.
     
    “For humans, unfortunately, the situation is more problematic because the neurons gradually die in Alzheimer’s disease,” Bitan said. “That’s why we must start treating as early as possible. The good news is that the molecular tweezers appear to have a high safety margin, so they may be suitable for prophylactic treatment starting long before the onset of the disease.”
     
    Next, using a radioactive “label,” the researchers were able to confirm that the compound had crossed the mouse’s blood–brain barrier and was effective in clearing the brain of amyloid-beta and tau aggregates.
     
    “This work shows that molecular tweezers do a number of things — they help to ameliorate multiple pathologic features of Alzheimer’s, including amyloid plaques, neurofibrillary tangles and brain inflammation, and our cell culture experiments demonstrated that molecular tweezers block the toxic effect of amyloid-beta on synaptic integrity and communication,” Bitan said.
     
    “We call these unique tweezers ‘process-specific,’ rather than the common protein-specific inhibitors,” he added, meaning the compound only attacks the targeted toxic aggregates and not normal body processes. “That’s a big deal, because it helps confirm evidence that the molecular tweezers can be used safely, ultimately supporting their development as a therapy for humans.”
     
    The next step, Bitan hopes, is to confirm that the tweezers improve memory and not just brain pathology. The researchers say they are working on this question and already have encouraging preliminary data.
     
    There were multiple authors on the study in addition to Bitan and Attar. Please see the study for the complete list.
     
    The work was supported by the UCLA Jim Easton Consortium for Alzheimer’s Drug Discovery and Biomarker Development; American Health Assistance Foundation grant A2008-350; RJG Foundation grant 20095024; a Cure Alzheimer’s Fund grant; individual pre-doctoral National Research Service Award 1F31AG037283; National Institutes of Health grant R01AG021975; and a Veterans Administration Merit Award.
     
    The UCLA Department of Neurology, with over 100 faculty members, encompasses more than 20 disease-related research programs, along with large clinical and teaching programs. These programs cover brain mapping and neuroimaging, movement disorders, Alzheimer’s disease, multiple sclerosis, neurogenetics, nerve and muscle disorders, epilepsy, neuro-oncology, neurotology, neuropsychology, headaches and migraines, neurorehabilitation, and neurovascular disorders. The department ranks in the top two among its peers nationwide in National Institutes of Health funding.
     

  • Airborne particles smuggle pollutants to far reaches of globe

    Pollution from fossil fuel burning and forest fires reaches all the way to the Arctic, even though it should decay long before it travels that far. Now, lab research can explain how pollution makes its lofty journey: rather than ride on the surface of airborne particles, pollutants snuggle inside, protected from the elements on the way. The results will help scientists improve atmospheric air-quality and pollution transport models.

    The results also show that the particles enveloping the pollutants benefit from this arrangement. The new study in Environmental Science & Technology shows that the airborne particles, made from natural molecules mostly given off by live or burning plants, last longer with a touch of pollutant packed inside. The pollutants are known as polycyclic aromatic hydrocarbons, or PAHs, and are regulated by environmental agencies due to their toxicity.

    “What we’ve learned through fundamental studies on model systems in the lab has very important implications for long-range transport of pollutants in the real world,” said physical chemist Alla Zelenyuk of the Department of Energy’s Pacific Northwest National Laboratory. “In this study, we propose a new explanation for how PAHs get transported so far, by demonstrating that airborne particles become a protective vessel for PAH transport.”

    Floating in the air and invisible to the eye, airborne particles known as secondary organic aerosols live and die. Born from carbon-based molecules given off by trees, vegetation, and fossil fuel burning, these airborne SOA particles travel the currents and contribute to cloud formation. Along for the ride are pollutants, the PAHs, that have long been thought to coat the particles on their surface.

    For decades, atmospheric scientists have been trying to explain how atmospheric particles manage to transport harmful pollutants to pristine environments thousands of miles away from their starting point. The particles collected in areas such as the Arctic also pack higher concentrations of pollutants than scientists’ computer models predict.

    The predictions are based on the assumption that the particles are like liquid spheres, whose fluidity allows PAHs to escape. But they don’t escape, and one recent advance has helped pin down why PAHs remain stuck in their particle lairs. Zelenyuk and her colleagues at EMSL, DOE’s Environmental Molecular Sciences Laboratory at PNNL, developed an ultra-sensitive instrument that can determine the size, composition and shape of individual particles.

    Called SPLAT II, the instrument can analyze millions of tiny particles one by one. The ability of this novel instrument to characterize individual particles provides unique insight into their properties and evolution.

    Using SPLAT II to evaluate laboratory-generated SOA particles from alpha-pinene, the molecule that gives pine trees their piney smell, Zelenyuk has already discovered that SOA particles aren’t liquid at all. Her team’s recent work revealed they are more like tar — thick, viscous blobs that are too solid to be liquid and too liquid to be solid.

    Armed with this data, Zelenyuk and researchers from Imre Consulting in Richland and the University of Washington in Seattle set out to determine the relation between the SOA particle and the PAHs. Again they used alpha-pinene for the SOA. For the PAH, they used pyrene, a toxic pollutant produced by burning fossil fuels or vegetation such as forests.

    They created two kinds of particles. The first kind exemplified the classical SOA: first they produced the particles with alpha-pinene and then coated them with pyrene. The second kind resembled what likely happens in nature: they mixed alpha-pinene and pyrene and let the particles form with both molecules present. Then they sent the particles through SPLAT and watched what happened to them over time.

    With the pyrene-coated particles, the team found the PAH pyrene evaporating off the surface of the particle quickly, all of it gone after four hours. By the next day, the particle itself had shrunk by about 70 percent, showing that the alpha-pinene SOA also evaporates, although more slowly than pyrene.

    When they created the particles in the presence of both SOA and PAH, the PAH evaporated much more slowly. Fifty percent of the original PAH still remained in the particle after 24 hours. In addition, the SOA particle itself stayed bulky, losing less than 20 percent of its volume.

    These results showed the team that PAHs become trapped within the highly viscous SOA particles, where they remain protected from the environment. The symbiotic relationship between the atmospheric particles and pollutants surprised Zelenyuk: SOAs help PAHs travel the world, and the PAHs help SOAs survive longer.

    Zelenyuk and her colleagues performed comparable experiments with other PAHs and SOAs and found similar results.

    In the real world, Zelenyuk said, the evaporation will be even slower. These results will help modelers better simulate atmospheric SOA particles and transport of pollutants over long distances.

    This work was supported by the Department of Energy Office of Science and PNNL’s Chemical Imaging Initiative.


    Reference: Alla Zelenyuk, Dan Imre, Josef Beránek, Evan Abramson, Jacqueline Wilson and Manish Shrivastava, Synergy between Secondary Organic Aerosols and Long-Range Transport of Polycyclic Aromatic Hydrocarbons, Environmental Science & Technology, Nov. 7, 2012, doi: 10.1021/es302743z.

  • Streams Show Signs of Degradation at Earliest Stages of Urban Development

    The loss of sensitive species in streams begins to occur at the initial stages of urban development, according to a new study by the USGS. The study found that streams are more sensitive to development than previously understood.

    “We tend not to think of waterways as fragile organisms, and yet that is exactly what the results of this scientific investigation appear to be telling us,” said USGS Director Marcia McNutt. “Streams are more than water, but rather communities of interdependent aquatic life, the most sensitive of which are easily disrupted by urbanization.”

    Contaminants, habitat destruction, and increasing streamflow flashiness resulting from urban development can degrade stream ecosystems, with adverse downstream effects on biological communities and on economically valuable resources, such as fisheries and tourism.

    For example, by the time urban development had approached 20 percent in watersheds in the New England area, the aquatic invertebrate community had undergone a change in species composition of about 25 percent.

    The study also found that the health of highly-degraded streams can be improved by implementing management actions that are designed to reduce specific stressors.

    “Biological communities were not resistant to even low levels of urban development. In the study, sensitive invertebrate species were being lost over the initial stages of development in relatively undisturbed watersheds,” said Dr. Gerard McMahon, lead scientist on the study. “Understanding how stream ecosystems are impacted by urban development can assist in the development of management actions to protect and rehabilitate urban stream ecosystems.”

    Multiple streams in nine metropolitan areas across the continental U.S. were sampled to assess the effects of urban development on stream ecosystems. Study areas include Atlanta, Ga., Birmingham, Ala., Boston, Mass., Dallas, Texas, Denver, Colo., Milwaukee, Wis., Portland, Ore., Raleigh, N.C., and Salt Lake City, Utah.

    The study also found that the effects of urbanization on the biological community vary geographically depending on the predominant land cover and the health of the community prior to urban development. In the study, the greatest loss of sensitive species occurred in Boston, Portland, Salt Lake City, Birmingham, Atlanta, and Raleigh metropolitan areas, where the predominant land cover was forested prior to urban development. The smallest loss of sensitive species occurred in Denver, Dallas, and Milwaukee metropolitan areas where land cover was primarily agriculture before urban development.

    “The reason for this difference was not because biological communities in the Denver, Dallas, and Milwaukee areas are more resilient to stressors from urban development, but because the biological communities had already lost sensitive species to stressors from pre-urban agricultural land use activities,” said McMahon.

    Although urban development creates multiple stressors that can degrade stream health, such as increased concentrations of insecticides, chlorides, and nutrients, no single factor was universally important in explaining the effects of urban development on stream ecosystems. The USGS developed an innovative modeling tool to predict how different combinations of urban-related stressors affect stream health. This tool, initially developed for the New England area, can provide insights on how watershed management actions to improve one or more of these stressors may increase the likelihood of obtaining a desired biological condition.

    The effects of urbanization on streams, including information about this and past studies, as well as graphics, maps, and videos, can be found online.

    Results of this nationwide study and details about the effects of urbanization on the nine metropolitan areas can be found in a new USGS publication titled, “Effects of urban development on stream ecosystems in nine metropolitan study areas across the United States.”

    Management strategies used throughout the U.S. to reduce the impacts of urban development on stream ecosystems are described in a new USGS report written in partnership with the Center for Watershed Protection in Maryland titled, “Strategies for Managing the Effects of Urban Development on Streams.”

    This study was done by the USGS National Water-Quality Assessment Program, which conducts regional and national assessments of the nation’s water quality to provide an understanding of water-quality conditions, whether conditions are getting better or worse over time, and how natural features and human activities affect those conditions.

  • New global subsidy that provides access to most effective malaria drugs shows promise

    A new international program, conceived in part by a UCLA physician, has rapidly transformed access to lifesaving anti-malarial drugs by providing cheap, subsidized artemisinin-based combination therapies in seven African countries that account for a quarter of the world’s malaria cases.
     
    The first independent evaluation of the Affordable Medicines Facility–malaria (AMFm) program was recently published in the journal The Lancet. The program is based at the Global Fund in Geneva, an international financing institution dedicated to disbursing funds to prevent and treat infectious diseases. The evaluation shows that the program improved access to key artemisinin combination therapies, or ACTs, which offer broader protection and less drug resistance than the anti-malaria medications currently available in those African nations.
     
    The Oct. 31 Lancet study was accompanied by an editorial by a panel of some of the world’s most eminent scientists in this field, which praised AMFm’s ability to reach critical populations but also warned that despite the program’s success, its future funding could be threatened. 
     
    “Losing African children to malaria is such an unnecessary tragedy,” said Dr. Claire Panosian Dunavan, a clinical professor of infectious diseases at the David Geffen School of Medicine at UCLA, who was one of eight co-authors of the Lancet editorial. “Now that the global subsidy for ACTs has been proven to work through AMFm, I would hate to see the program end.”
     
    Panosian Dunavan, an expert in tropical diseases, is also one of the original authors of the 2004 Institute of Medicine report “Saving Lives, Buying Time,” which first proposed a global subsidy for modern anti-malarial drugs and led to the development of the AMFm program. 
     
    “Over the last 10 years, I’ve learned a lot from my economist colleagues,” she said. “Leveraging private markets to deliver lifesaving treatments to the global poor is indeed possible, as this global subsidy for malaria drugs has now demonstrated.”
     
    Panosian Dunavan worked closely with economist and Nobel laureate Kenneth J. Arrow, Dr. Ramanan Laxminarayan of the Center for Disease Dynamics, Economics and Policy, and others in writing the original financing report and the Lancet editorial.
     
    In their comments, the editorial authors write, “In November 2012, the Board of the Global Fund will vote to either continue AMFm in a modified form after December 2013, or terminate the program. There is a strong push from donors (though not from countries) to integrate AMFm into the regular Global Fund model, whereby countries would choose how much of their country budget envelopes, which are already committed to other priorities supporting the public sector, to reallocate to AMFm. We believe that this approach will create instability in artemisinin demand, lower the number of ACT manufacturers, increase ACT prices, and abandon the millions who depend on AMFm-subsidized ACTs.”
     
    Worse, they say, “With the world’s largest global health funder [the U.S. President’s Malaria Initiative (PMI)] expressing unremitting opposition, even after the positive independent evaluation, the program’s future is uncertain. PMI has yet to suggest an alternative that would come close to the access afforded by AMFm in the private sector.”
     
    The Lancet study evaluated national AMFm pilot programs in Ghana, Kenya, Madagascar, Niger, Nigeria, Tanzania (including Zanzibar) and Uganda.
     
    “Africa is home to 80 percent of malaria cases, yet most of the population do not have access to affordable ACTs,” said Kara Hanson of the London School of Hygiene and Tropical Medicine, one of the lead authors of the evaluation study. “Access is restricted by unreliable public health facility supply, high prices and limited availability in the private sector, where most people go to buy medicines. Cheaper, less effective anti-malarials currently dominate the market. Worryingly, artemisinin monotherapies (artemisinin alone, rather than in combination) are also widely available in some countries, and use of these medicines can encourage development of resistance to ACTs.”
     
    Changes in availability, price and market share were assessed in each country using nationally representative surveys of public- and private-sector outlets that stock anti-malarial drugs — both before the introduction of subsidized quality-assured ACTs (QAACTs) and supporting interventions, such as communication campaigns, and six to 15 months after their introduction.
     
    Between August 2010 and the end of 2011, more than 155 million doses of QAACTs were subsidized by AMFm. QAACT availability more than doubled in five countries, and market share more than doubled in four. The effect of AMFm was more limited in Niger and Madagascar, where AMFm ACT orders were lower.
     
    AMFm had a particularly dramatic effect on the private sector, where QAACT market share increased in all pilot programs, with the increase exceeding 30 percentage points in five. What is more, private, for-profit QAACT prices fell substantially (by up to 80 percent) in six countries, with the decrease ranging from $1.28 to $4.82 (U.S.) per dose.
     
    The market share of artemisinin monotherapies also experienced large declines in Nigeria and Zanzibar, the two countries where their presence on the market was highest at the start of the program.
     
    Although AMFm had less impact on public health facilities’ ACT supply, the study authors point out that there were substantial delays in ordering drugs and implementing the full program in some countries.
     
    “But not all of the changes observed can be attributed to AMFm,” the authors cautioned. “There was some evidence from two countries that prices had already begun to fall before AMFm started and the market share of ACTs had started to increase, although most of this increase occurred in the public sector.”
     
    According to study author Hanson, “It is clear that tapping into the private sector distribution chain can have a major influence on which anti-malarial treatments are available and their price and quality in just a few months, but more information is needed about whether the subsidized drugs are reaching those most in need and on how diagnostics can be scaled up in the public and private sectors.”
     

  • New From NAP 2012-11-14 13:45:01

    Final Book Now Available

    The electric power delivery system that carries electricity from large central generators to customers could be severely damaged by a small number of well-informed attackers. The system is inherently vulnerable because transmission lines may span hundreds of miles, and many key facilities are unguarded. This vulnerability is exacerbated by the fact that the power grid, most of which was originally designed to meet the needs of individual vertically integrated utilities, is being used to move power between regions to support the needs of competitive markets for power generation. Primarily because of ambiguities introduced by the recent restructuring of the industry and cost pressures from consumers and regulators, investment to strengthen and upgrade the grid has lagged, with the result that many parts of the bulk high-voltage system are heavily stressed.

    Electric systems are not designed to withstand or quickly recover from damage inflicted simultaneously on multiple components. Such an attack could be carried out by knowledgeable attackers with little risk of detection or interdiction. Further, well-planned and coordinated attacks by terrorists could leave the electric power system in a large region of the country at least partially disabled for a very long time. Although there are many examples of terrorist and military attacks on power systems elsewhere in the world, at the time of this study international terrorists had shown limited interest in attacking the U.S. power grid. However, that should not be a basis for complacency. Because all parts of the economy, as well as human health and welfare, depend on electricity, the results could be devastating.

    Terrorism and the Electric Power Delivery System focuses on measures that could make the power delivery system less vulnerable to attacks, restore power faster after an attack, and make critical services less vulnerable while the delivery of conventional electric power is disrupted.

    [Read the full report]

    Topics: Conflict and Security Issues | Energy and Energy Conservation

  • PNNL Science Artfully Displayed in Calendar and Traveling Exhibit

    For the first time in Pacific Northwest National Laboratory’s 47-year history, PNNL is showcasing its science in a print calendar available to the general public. The artwork will also go on the road as part of a traveling exhibit throughout Washington state.

    PNNL’s 2013 “Discovery in Action” calendar features thirteen captivating scientific images along with the stories behind them — from technology used to cool buildings more efficiently and minerals used to treat radioactive waste, to microbes important to improving human health and materials that capture the sun’s energy.

    “Science is amazing and beautiful,” said John LaFemina, PNNL’s director of Institutional Strategy. “The images in this calendar and traveling art exhibit clearly illustrate that the work we do at PNNL contributes to the safety, security and prosperity of our nation. They are also inspirational expressions of the creative skill and imagination of our staff; they are beautiful works of art.”

    Calendar images were selected from 99 staff-submitted entries during PNNL’s third annual Science as Art contest. The twelve winning entries were selected by the general public, who could vote for their favorite images on PNNL’s Facebook site this spring. All 99 photos are available for viewing on PNNL’s Facebook page.

    PNNL’s 2013 “Discovery in Action” calendar, published by BrownTrout Publishers, Inc., is available in limited quantities for purchase online at Amazon.com. Suggested retail price is $14.99. A downloadable PDF of the calendar is also available.

    Adjacent to each image in the calendar are the names of the PNNL team members as well as research partners, including Thomas Jefferson National Accelerator Facility, University of Notre Dame, Washington State University and University of Central Florida. Funding agencies include the Department of Energy, Department of the Interior, Nuclear Regulatory Commission, National Security Agency, Department of Health and Human Services and the National Science Foundation.

    Images within PNNL’s 2013 calendar were captured using instrumentation at PNNL and at two DOE national user facilities: EMSL, the Environmental Molecular Sciences Laboratory at PNNL, and the Advanced Photon Source at Argonne National Laboratory.

    Artwork presented in the calendar will also be on display at Columbia Basin College Planetarium Dec. 3-Feb. 1, and LIGO Hanford Observatory Jan. 28-March 1. Other upcoming exhibit locations include the Pacific Science Center in Seattle and WSU Tri-Cities. Dates will be posted online when confirmed, along with more information about PNNL’s science in art.

  • UCLA Nursing researchers spotlight role of nursing in social justice at major symposium

    Nurses who conduct research on aging issues often hear stories from older adult patients that highlight the inequalities in our health care system, illustrate the boundaries of ethical decision-making that can impact clinical outcomes, and bring into focus unresolved social policy issues.
     
    Sadly, these voices do not get the attention they need or deserve.
     
    Now, researchers from the UCLA School of Nursing will examine these issues of social justice and how nurses can give voice to the elderly and other vulnerable populations to influence policy and care delivery during a symposium at the Gerontological Society of America Annual Scientific Meeting on Nov. 15. Their presentation, “Advocating for Hidden Voices, Social Justice Among Vulnerable Populations,” runs from 8 to 9:30 a.m.
     
    “In their research, our nurses have heard stories about the discrimination and disparities among the marginalized in our society,” said Linda Phillips, director of the Center for Advancement of Gerontological Nursing Sciences at the UCLA School of Nursing. “As nurses and researchers, we have a responsibility to not tolerate these disparities for vulnerable populations.”
     
    During the symposium, researchers will discuss how the issues of social justice have arisen in a variety of areas during the development and implementation of their research:

    Abuse in California’s skilled nursing facilities
    Nursing homes are a place where seniors should be safe. Yet according to government figures, one-third of nursing homes in California have been cited for causing serious harm or death to patients. This presentation will discuss elder abuse in skilled nursing facilities and how the lack of fines and enforcement offer little incentive to initiate change in practice.
     
    African American men and prostate cancer
    Prostate cancer incidence and mortality rates are highest among older African American men; inadequate health care access, low socioeconomic status and race are all key factors. This presentation will focus on the role gerontological nurse–researchers can play in addressing these types of problems and will discuss outcomes associated with financially based treatment inequities and how to use these stories to influence policy.
     
    Aging among older homeless women
    The “golden years” are often looked upon by older adults as a period for reflection and enjoyment, but many find themselves destitute and homeless. Approximately 33 percent of chronically homeless adults are over 50 and are at high risk for chronic illness, social isolation and victimization. Moreover, they lack housing and access to health care. This presentation will discuss the development and implementation of programs that can meet the needs of this vulnerable population.
     
    Disparities among racial and ethnic groups
    Despite strong and convincing evidence of health disparities and expansive differences in health outcomes, there are limited studies that focus on the unique challenges faced by certain racial and ethnic groups. The final presentation will showcase the need for funding to address health disparities among these groups and the current state of funded research.
     
     
    “By sharing this research, we hope to raise awareness of these healthcare discrepancies and start the work to make changes that build a healthy community for all,” Phillips said.
      
    The UCLA School of Nursing is redefining nursing through the pursuit of uncompromised excellence in research, education, practice, policy and patient advocacy. For more information, visit nursing.ucla.edu.
     

  • New From NAP 2012-11-14 10:45:01

    Prepublication Now Available

    The U.S. Department of Homeland Security (DHS) is responsible for securing and managing the nation’s borders. Over the past decade, DHS has dramatically stepped up its enforcement efforts at the U.S.-Mexico border, increasing the number of U.S. Border Patrol (USBP) agents, expanding the deployment of technological assets, and implementing a variety of “consequence programs” intended to deter illegal immigration. During this same period, there has also been a sharp decline in the number of unauthorized migrants apprehended at the border.

    Trends in total apprehensions do not, however, by themselves speak to the effectiveness of DHS’s investments in immigration enforcement. In particular, to evaluate whether heightened enforcement efforts have contributed to reducing the flow of undocumented migrants, it is critical to estimate the number of border-crossing attempts during the same period for which apprehensions data are available. With these issues in mind, DHS charged the National Research Council (NRC) with providing guidance on the use of surveys and other methodologies to estimate the number of unauthorized crossings at the U.S.-Mexico border, preferably by geographic region and on a quarterly basis. Options for Estimating Illegal Entries at the U.S.-Mexico Border focuses on Mexican migrants, since Mexican nationals account for the vast majority (around 90 percent) of attempted unauthorized crossings of the U.S.-Mexico border.

    [Read the full report]

    Topics: Behavioral and Social Sciences

  • PNNL expertise highlighted at Supercomputing

    From identifying common patterns in data to speeding up computers, researchers from the Department of Energy’s Pacific Northwest National Laboratory will share their computational expertise at this year’s Supercomputing conference.

    Also referred to as SC12, the annual gathering is the international conference for high-performance computing, networking, storage and analysis. It runs Nov. 10-16 at the Salt Palace Convention Center in Salt Lake City. Two noteworthy talks featuring PNNL research are described below.

    New algorithm pinpoints similar data in seconds

    Data is everywhere these days. Biologists sift through vast amounts of error-prone data to understand how our cells work. Even librarians slog through mountains of information to better understand the materials they catalog. The key to comprehending today’s information explosion is finding meaningful patterns buried in the data — and then finding comparable data patterns in other, related sources. This technique is called network alignment. Computational scientists at PNNL and Purdue University have developed new methods to identify similar patterns in any type of data. Their procedures help find proteins that act the same in humans and mice, and help find ideas that act the same for librarians and Wikipedia editors.

    The existing methods used to solve these kinds of problems have been too slow to cope with the growing amount of data, prompting the PNNL and Purdue team to make them faster. To do this, they developed a new algorithm that uses an approach called approximate matching, which saves time by matching nearly identical patterns instead of exactly identical ones. They also developed new computer implementations that enabled the algorithm to use all of a computer’s processors in parallel to quickly identify relationships between two different networks. Tests using both of these improvements showed that the algorithm found similar interactions between thousands of proteins in two species in just seconds and found comparable ideas between hundreds of thousands of topics in library systems and Wikipedia entries in less than a minute.
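
    The SC12 paper’s algorithm and parallel implementation are not reproduced here, but the core idea of approximate matching can be sketched with the classic greedy half-approximation for maximum-weight matching: score candidate pairs by similarity, then accept pairs heaviest-first as long as neither endpoint is already matched. Below is a minimal serial sketch under that assumption; the function name and toy data are invented for illustration.

    ```python
    # Greedy 1/2-approximate maximum-weight matching: consider candidate
    # pairs from heaviest to lightest and accept a pair only if neither
    # endpoint is already matched. Serial sketch of the general technique,
    # not the multithreaded SC12 implementation.

    def approximate_matching(edges):
        """edges: iterable of (weight, u, v) similarity-scored pairs
        linking nodes of one network to nodes of another.
        Returns a dict mapping matched u-nodes to v-nodes."""
        matched_u, matched_v = set(), set()
        matching = {}
        for weight, u, v in sorted(edges, reverse=True):  # heaviest first
            if u not in matched_u and v not in matched_v:
                matching[u] = v
                matched_u.add(u)
                matched_v.add(v)
        return matching

    # Toy example: similarity scores between proteins in two species.
    candidates = [
        (0.9, "human_p53", "mouse_p53"),
        (0.8, "human_actb", "mouse_actb"),
        (0.7, "human_p53", "mouse_p63"),  # skipped: human_p53 taken
    ]
    print(approximate_matching(candidates))
    # {'human_p53': 'mouse_p53', 'human_actb': 'mouse_actb'}
    ```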

    PNNL’s Mahantesh Halappanavar led the research on how to quickly find approximate matchings with help from Purdue’s Arif Khan and Alex Pothen. And, Purdue’s David Gleich led the work on how to use approximate matchings to align networks. Gleich will present a paper describing this research Wednesday.

    4:30-5 p.m., Wed., Nov. 14: A multithreaded algorithm for network alignment via approximate matching, Arif Khan, David Gleich, Mahantesh Halappanavar & Alex Pothen, Room 355-EF.

    Software translates code, speeds up data-crunching

    Large and complex networks in parallel computers can lead to inefficient communication between processors, which slows down computation. This makes it difficult to achieve exascale computing, which is one thousand times faster than today’s fastest petascale supercomputers. Scientists are developing strategies to reduce the time it takes to compute data and communicate those results between parallel processors. A team of researchers from PNNL, the University of California, San Diego, and Lawrence Livermore National Laboratory has developed new software called Bamboo to help do just that.

    Traditionally, scientists have broken up a complex algorithm to speed things up. Different processors calculate bits of the algorithm and then each processor communicates its results to the others. Such division of labor is quicker than one processor doing all the work by itself. But communicating bunches of data between multiple processors can cause information bottlenecks that slow down the whole process. One solution is to initially calculate a portion of a processor’s data and communicate those results while the other portion is still being calculated. Called overlapping communications and calculations, this approach can reduce the overall time it takes to complete a job, but it requires extremely complex code. That’s where Bamboo comes in. Bamboo automatically translates standard MPI parallel codes into a format that can easily overlap communication with available computation. Without Bamboo, scientists have the onerous task of manually developing overlapping MPI code. Tests showed Bamboo-generated code was as good as or better than human-developed codes.
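
    Bamboo performs this transformation automatically; what follows is a minimal hand-written sketch of the overlap pattern itself, expressed with nonblocking MPI calls through the mpi4py binding. The binding choice and the toy one-element halo exchange are illustrative assumptions, not Bamboo output.

    ```python
    # Hand-written sketch of overlapping communication with computation:
    # post a nonblocking halo exchange, compute on interior data while the
    # messages are in flight, then finish the boundary-dependent work once
    # communication completes. Run under an MPI launcher, e.g.
    # `mpiexec -n 4 python overlap.py`.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    left, right = (rank - 1) % size, (rank + 1) % size

    local = np.random.rand(1_000_000)        # this rank's chunk of data
    halo = np.empty(1, dtype=local.dtype)    # incoming boundary value

    # Start nonblocking communication of the boundary element.
    reqs = [comm.Isend(local[-1:], dest=right),
            comm.Irecv(halo, source=left)]

    interior = local[1:].sum()               # overlap: compute meanwhile

    MPI.Request.Waitall(reqs)                # ensure the halo has arrived
    total = interior + local[0] + halo[0]    # boundary-dependent finish
    print(f"rank {rank}: {total:.3f}")
    ```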

    PNNL’s Eric Bylaska drew on his experience developing complex code for NWChem, DOE’s premier molecular modeling software package, to help develop realistic test programs for the Bamboo framework. The University of California, San Diego’s Scott Baden, who led the project, will present a paper describing the team’s results Wednesday.

    10:30-11 a.m., Wed., Nov. 14: Bamboo – Translating MPI Applications to a Latency-Tolerant, Data-Driven Form, Tang Nguyen, Pietro Cicotti, Eric Bylaska, Dan Quinlan & Scott Baden, Room 255-EF.

  • New From NAP 2012-11-13 10:45:01

    Prepublication Now Available

    Adolescence is a distinct, yet transient, period of development between childhood and adulthood characterized by increased experimentation and risk-taking, a tendency to discount long-term consequences, and heightened sensitivity to peers and other social influences. A key function of adolescence is developing an integrated sense of self, including individualization, separation from parents, and personal identity. Experimentation and novelty-seeking behavior, such as alcohol and drug use, unsafe sex, and reckless driving, are thought to serve a number of adaptive functions despite their risks.

    Research indicates that for most youth, the period of risky experimentation does not extend beyond adolescence, ceasing as identity becomes settled with maturity. Much adolescent involvement in criminal activity is part of the normal developmental process of identity formation, and most adolescents will mature out of these tendencies. Evidence of significant changes in brain structure and function during adolescence strongly suggests that these cognitive tendencies characteristic of adolescents are associated with biological immaturity of the brain and with an imbalance among developing brain systems. This imbalance model implies dual systems: one involved in cognitive and behavioral control and one involved in socio-emotional processes. Accordingly, adolescents lack a mature capacity for self-regulation because the brain system that influences pleasure-seeking and emotional reactivity develops more rapidly than the brain system that supports self-control. This knowledge of adolescent development has underscored important differences between adults and adolescents with direct bearing on the design and operation of the justice system, raising doubts about the core assumptions driving the criminalization of juvenile justice policy in the late decades of the 20th century.

    It was in this context that the Office of Juvenile Justice and Delinquency Prevention (OJJDP) asked the National Research Council to convene a committee to conduct a study of juvenile justice reform. The goal of Reforming Juvenile Justice: A Developmental Approach was to review recent advances in behavioral and neuroscience research and draw out the implications of this knowledge for juvenile justice reform, to assess the new generation of reform activities occurring in the United States, and to assess the performance of OJJDP in carrying out its statutory mission as well as its potential role in supporting scientifically based reform efforts.

    [Read the full report]

    Topics: Behavioral and Social Sciences

  • New From NAP 2012-11-12 15:28:13

    Final Book Now Available

    What is climate? Climate is commonly thought of as the expected weather conditions at a given location over time. People know when they go to New York City in winter, they should take a heavy coat. When they visit the Pacific Northwest, they should take an umbrella. Climate can be measured at many geographic scales – for example, cities, countries, or the entire globe – by such statistics as average temperatures, average number of rainy days, and the frequency of droughts. Climate change refers to changes in these statistics over years, decades, or even centuries.
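
    As a toy illustration of the statistics the passage mentions (the daily observations and the rainy-day threshold below are made up), climate summaries are simply long-run aggregates of weather records:

    ```python
    # Toy sketch: climate as long-run summaries of weather records.
    # The daily observations and the 1.0 mm rainy-day threshold are
    # invented for illustration.
    daily = [
        {"temp_c": 3.1, "rain_mm": 0.0},
        {"temp_c": 5.4, "rain_mm": 2.5},
        {"temp_c": 1.8, "rain_mm": 0.3},
    ]

    avg_temp = sum(d["temp_c"] for d in daily) / len(daily)
    rainy_days = sum(1 for d in daily if d["rain_mm"] >= 1.0)

    print(f"average temperature: {avg_temp:.1f} C, rainy days: {rainy_days}")
    ```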

    Enormous progress has been made in increasing our understanding of climate change and its causes, and a clearer picture of current and future impacts is emerging. Research is also shedding light on actions that might be taken to limit the magnitude of climate change and adapt to its impacts.

    Climate Change: Evidence, Impacts, and Choices is intended to help people understand what is known about climate change. First, it lays out the evidence that human activities, especially the burning of fossil fuels, are responsible for much of the warming and related changes being observed around the world. Second, it summarizes projections of future climate changes and impacts expected in this century and beyond. Finally, the booklet examines how science can help inform choices about managing and reducing the risks posed by climate change. The information is based on a number of National Research Council reports, each of which represents the consensus of experts who have reviewed hundreds of studies describing many years of accumulating evidence.

    [Read the full report]

    Topics: Environment and Environmental Studies

  • New From NAP 2012-11-12 00:00:00

    Final Book Now Available

    On September 8-9, 2011, experts in solar physics, climate models, paleoclimatology, and atmospheric science assembled at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, for a workshop to consider the Sun’s variability over time and potential Sun-climate connections.

    While it does not provide findings, recommendations, or consensus on the current state of the science, The Effects of Solar Variability on Earth’s Climate: A Workshop Report briefly introduces the primary topics discussed by presenters at the event. As context for these topics, the summary includes background information on the potential Sun-climate connection, the measurement record from space, and potential perturbations of climate due to long-term solar variability. This workshop report also summarizes some of the science questions explored by the participants as potential future research endeavors.

    [Read the full report]

    Topics:

  • New From NAP 2012-11-08 10:45:01

    Prepublication Now Available

    Across the United States, thousands of hazardous waste sites are contaminated with chemicals that prevent the underlying groundwater from meeting drinking water standards. These include Superfund sites and other facilities that handle and dispose of hazardous waste, active and inactive dry cleaners, and leaking underground storage tanks; many are at federal facilities such as military installations. While many sites have been closed over the past 30 years through cleanup programs run by the U.S. Department of Defense, the U.S. EPA, and other state and federal agencies, the remaining caseload is much harder to address: the nature of the contamination and the subsurface conditions make it difficult to achieve drinking water standards in the affected groundwater.

    Alternatives for Managing the Nation’s Complex Contaminated Groundwater Sites estimates that at least 126,000 sites across the U.S. still have contaminated groundwater, and their closure is expected to cost at least $110 billion to $127 billion. About 10 percent of these sites are considered “complex,” meaning restoration is unlikely to be achieved in the next 50 to 100 years due to technological limitations. At sites where contaminant concentrations have plateaued at levels above cleanup goals despite active efforts, the report recommends evaluating whether the sites should transition to long-term management, where risks would be monitored and harmful exposures prevented, but at reduced costs.

    [Read the full report]

    Topics: Environment and Environmental Studies | Earth Sciences

  • New From NAP 2012-11-07 12:45:08

    Final Book Now Available

    The Workshop on the Future of Antennas was the second of three workshops conducted by the National Research Council’s Committee for Science and Technology Challenges to U.S. National Security Interests. The objectives of the workshop were to review trends in advanced antenna research and design and to review trends in commercial and military use of advanced antennas that enable improved communication, data transfer, soldier health monitoring, and other overt and covert methods of standoff data collection.

    The first day’s sessions, consisting of five presentations and discussions on antennas and wireless communications and control, were open to committee members, staff, guests, and members of the public. The second day was a data-gathering session addressing vulnerabilities, indicators, and observables; presentations and discussions during this session included classified material and were not open to the public.

    The committee’s role was limited to planning and convening the workshop. This report is organized by topic in the order of presentation and discussion at the workshop. For Day 1 the topics were Future of Antennas, Commercial State of the Art of Wireless Communications and Control, Military State of the Art of Wireless Communications and Control, and Future Trends in Antenna Design and Wireless Communications and Control. For Day 2 the topics were Vulnerabilities of Ubiquitous Antennas, and Indicators and Observables, followed by a wrap-up discussion. Summary of a Workshop on the Future of Antennas describes what happened at the workshop.

    [Read the full report]

    Topics:

  • New From NAP 2012-11-07 12:45:01

    Final Book Now Available

    In 2012, the Defense Intelligence Agency (DIA) approached the National Research Council’s TIGER standing committee and asked it to develop a list of workshop topics to explore the impact of emerging science and technology. From the list of topics given to DIA, three were chosen to be developed by the Committee for Science and Technology Challenges to U.S. National Security Interests. The first in the series of three workshops was held on April 23-24, 2012. This report summarizes that first workshop, which explored the phenomenon known as big data.

    The objective for the first workshop is given in the statement of task, which explains that the workshop would review emerging capabilities in large-scale computational data, including the speed, fusion, use, and commodification of data used in decision making, as well as the vulnerabilities that accompany those new capabilities and their significance to national security. The committee devised an agenda that helped the committee, sponsors, and workshop attendees probe issues of national security related to so-called big data and gain an understanding of the potential related vulnerabilities. The workshop gathered the data described in this report, which presents views expressed by individual workshop participants.

    Big Data: A Workshop Report summarizes the first in a series of three workshops held in 2012 to further the ongoing engagement among the National Research Council’s (NRC’s) Technology Insight-Gauge, Evaluate, and Review (TIGER) Standing Committee, the scientific and technical intelligence (S&TI) community, and the consumers of S&TI products.

    [Read the full report]

    Topics:

  • New From NAP 2012-11-07 00:00:00

    Final Book Now Available

    Intelligence, surveillance, and reconnaissance (ISR) capabilities have expanded situation awareness for U.S. forces, provided for more precise combat effects, and enabled better decision making both during conflicts and in peacetime. Reliance on ISR capabilities is expected to increase in the future. ISR capabilities are critical to three of the 12 Service Core Functions of the U.S. Air Force, namely Global Integrated ISR (GIISR) and the ISR components of Cyberspace Superiority and Space Superiority, and they contribute to all the others.

    In response to a request from the Air Force Deputy Chief of Staff for ISR and the Deputy Assistant Secretary of the Air Force for Science, Technology, and Engineering, the National Research Council formed the Committee on Examination of the Air Force Intelligence, Surveillance, and Reconnaissance (ISR) Capability Planning and Analysis (CP&A) Process. In this report, the committee reviews the current approach to the Air Force corporate planning and programming process for ISR capability generation; examines various analytical methods, processes, and models for large-scale, complex domains like ISR; and identifies best practices for the Air Force.

    In Capability Planning and Analysis to Optimize Air Force Intelligence, Surveillance, and Reconnaissance Investments, the current approach is analyzed and best practices for the Air Force corporate planning and programming process for ISR are recommended. The report also recommends improvements and changes to existing analytical tools, methods, roles and responsibilities, and organization and management that would be required to ensure that the Air Force corporate planning and programming process for ISR succeeds in addressing all Joint, National, and Coalition partners’ needs.

    [Read the full report]

    Topics: