Author: Serkadis

  • Young celebs to attend ‘Mattel Party on the Pier’ benefiting Mattel Children’s Hospital UCLA

    WHAT:
    Young celebrities from popular children’s television shows are scheduled to attend the 13th annual “Mattel Party on the Pier” benefiting Mattel Children’s Hospital UCLA. Many of the actors and actresses have attended the event in the past and enjoy returning each year to support the hospital’s fundraising efforts. 
     
    The celebrities will sign autographs and volunteer in game booths. All event guests will be treated to unlimited rides, carnival games stocked with prizes donated by Mattel, arts and crafts, and a silent auction featuring entertainment packages, jewelry and collector’s memorabilia.
     
     
    WHEN:  
    10 a.m.–2 p.m. on Sunday, Oct. 21  
     
    WHERE:         
    Pacific Park on the Santa Monica Pier (map)
     
    PHOTO | INTERVIEW OPPORTUNITIES:  
    In addition to hospital staff and pediatric patients and their families, celebrities scheduled to appear include:   
    • Allie Deberry, Carlon Jeffrey, China Anne McClain, Sierra McCormick, Aedin Mincks, Stefanie Scott, Jake Short (“A.N.T. Farm”)
    • Laura Marano, Calum Worthy (“Austin & Ally”)
    • Katherine McNamara, Brendan Meyer (“Girl vs. Monster”)
    • Cameron Boyce, Skai Jackson (“Jessie”)
• Tucker Albrizzi, Kelli Berglund, Tyrel Jackson Williams, Billy Unger (“Lab Rats”)
    • Trevor Jackson, Coco Jones (“Let It Shine”)
    • Davis Cleveland, Adam Irigoyen (“Shake It Up”)
    • Allisyn Ashley Arm (“So Random”)
    • Austin Anderson (“Victorious”)
    • Nathan Kress (“iCarly”)
• Dan Povenmire, Swampy Marsh (“Phineas & Ferb”)
    • Kaitlyn Jenkins (“Bunheads”)
    • Olivia Holt, Alex Jones, Dylan Riley Snyder (“Kickin’ It”)
• Sammi Hanratty (“The Suite Life of Zack & Cody,” “The Greening of Whitney Brown”)
    • Mia Talerico (“Good Luck Charlie”)
    • Ryan Ochoa, Geno Segers (“Pair of Kings”)
    • Julianna Rose (member of “American Girl at The Grove”) 
     
    BACKGROUND:   
    The “Mattel Party on the Pier” is the signature fundraiser for Mattel Children’s Hospital UCLA, selling out each year, with some 1,400 attendees. The event generates funding for the hospital, allowing it to launch innovative projects and meet its most urgent needs. Proceeds from last year’s event provided direct support for the hospital’s inflammatory bowel diseases programs, research into autism-associated epilepsy and treatments to improve immune function for children who have undergone chemotherapy and bone marrow transplants.
     
    Mattel Inc. is the event’s title sponsor. Additional major sponsors include Duracell; the Goldhirsh-Yellin Foundation; the John W. Carson Foundation; Beth and Neal Cutler, M.D.; the Dodgers Dream Foundation; the Littman Family; Lori and Michael Milken and the Milken Family Foundation; Richard and Ellen Sandler; Helene Spiegel and the Thomas Spiegel Family Foundation; and Liz and Evan Greenspan. Media and entertainment sponsors include L.A. Parent magazine and Feet First Eventertainment.
     
Mattel Children’s Hospital UCLA, one of the highest-rated children’s hospitals in California, is a vital component of Ronald Reagan UCLA Medical Center, ranked the fifth best hospital in the nation and best in the western United States by U.S. News & World Report. Mattel Children’s Hospital offers a full spectrum of primary and specialized medical care for infants, children and adolescents. The hospital’s mission is to provide state-of-the-art medical and surgical treatment for children in a compassionate atmosphere and to improve the understanding and treatment of pediatric diseases.
     
    Follow the “Party on the Pier” conversation on Twitter at @MCHUCLA and #MattelPOP.  

     

    MEDIA CONTACT:  
    Amy Albin, UCLA Health Sciences Media Relations, [email protected]
    310-794-8672 (office) | 310-597-5765 (cell)  
     
    PARKING:  
    Email media contact to confirm attendance and parking. Media credentials are required.

  • UCLA Longevity Center presents Healthy Aging 2012 Conference

    WHAT:
    The UCLA Longevity Center will hold its Healthy Aging Conference, focusing on the theme “Healthy Aging – Taking Control of Your Life.” The event will feature a diverse group of speakers representing the UCLA community and beyond. Panels include:
    • Nutritious Eating for Healthy Aging
    • The Centenarians: Life Past the Century Mark
    • Train Your Brain: Boot Camp for Your Mind
    • Alzheimer’s Research Update
    • Sex After 70
    • Medicare, Social Security and Health Care Reform
    • Keep Moving: Fitness and Athletic Choices
    • The Bucket List: Setting and Focusing on Goals
    WHO:
    Among the event’s speakers will be:
     
    Dr. Gary Small
    Author and director of the UCLA Longevity Center
     
    Dr. David Merrill
    UCLA geriatric physician and researcher
     
    Joan Moran
    Motivational speaker and author
     
    Tim Carpenter
    Founder and executive director of EngAGE
     
    Dr. Walter E. Brackelmanns
    Noted UCLA couples/sex therapy expert
     
    Dr. L. Stephen Coles
    Director of the LA Gerontology Research Group and the Supercentenarian Research Foundation
     
    WHEN:
    10 a.m.–4 p.m., Saturday, Oct. 27
     
    WHERE:
    Olympic Collection Conference Center (map)
    11301 Olympic Blvd., West Los Angeles 90064
     
    INFORMATION | REGISTRATION:
    Registration is $125 for members of the general public and $75 for UCLA students through Oct. 19 (includes lunch). A full list of sessions and speakers, as well as registration information, can be found at www.healthyaging2012.com.   
     
    BACKGROUND:
    Through its many programs, the UCLA Longevity Center promotes healthy aging lifestyles and strives to build a community that helps people live better, longer. More information can be found at www.longevity.ucla.edu or 310-794-0676.
     
    MEDIA CONTACTS:
     
    UCLA: Rachel Champeau, [email protected]
    310-794-2270
     
    Abuzz Productions: Ashli Lewis [email protected]   
    415-823-4540

  • UCLA scientists discover sleeping brain behaves as if it’s remembering something

UCLA researchers have for the first time measured the activity of a brain region known to be involved in learning, memory and Alzheimer’s disease during sleep. They discovered that this region, called the entorhinal cortex, behaves as if it’s remembering something, even during anesthesia-induced sleep — a finding that counters conventional theories about sleep-time memory consolidation.
     
    The research team simultaneously measured the activity of single neurons from multiple parts of the brain that are involved in memory formation. The technique allowed them to determine which brain region was activating other areas and how that activation was spreading, said the study’s senior author, Mayank R. Mehta, a professor of neurophysics in UCLA’s departments of neurology, neurobiology, and physics and astronomy.
     
    In particular, Mehta and his team looked at three connected brain regions in mice — the neocortex, or “new brain,” the newest part of the cerebral cortex to evolve; the hippocampus, or “old brain”; and the entorhinal cortex, an intermediate brain that connects the new and the old brains.
     
    While previous studies have suggested that the dialogue between the old and the new brain during sleep was critical for memory formation, researchers had not investigated the contribution of the entorhinal cortex to this conversation, which turned out to be a game-changer, Mehta said.
     
    Mehta’s team found that the entorhinal cortex showed what is called persistent activity, which is thought to mediate working memory during waking life — for example, when people pay close attention to remember things temporarily, such as recalling a phone number or following directions.
     
    “The big surprise here is that this kind of persistent activity is happening during sleep, pretty much all the time,” Mehta said. “These results are entirely novel and surprising. In fact, this working memory–like persistent activity occurred in the entorhinal cortex even under anesthesia.”
     
    The study appears Oct. 7 in the early online edition of the journal Nature Neuroscience.
     
    The findings are important, Mehta said, because humans spend one-third of their lives sleeping, and a lack of sleep results in adverse effects on health, as well as learning and memory problems.
     
    It had been shown previously that the neocortex and the hippocampus “talk” to each other during sleep, and it is believed that this conversation plays a critical role in memory consolidation, the establishing of memories. However, no one had been able to interpret the conversation.
     
    “When you go to sleep, you can make the room dark and quiet, and although there is no sensory input, the brain is still very active,” Mehta said. “We wanted to know why this was happening and what different parts of the brain were saying to each other.”
     
    Mehta and his team developed an extremely sensitive monitoring system that allowed them to follow the activities of neurons from each of the three targeted portions of the brain simultaneously, down to the activity of a single neuron. This allowed them to decipher the precise communications, even when the neurons were seemingly quiet. They then developed a sophisticated mathematical analysis to decipher the complex conversation.
     
During sleep, the neocortex goes into a slow-wave pattern about 90 percent of the time. During this period, its activity slowly fluctuates between active and inactive states about once every second.
     
    Mehta and his team focused on the entorhinal cortex, which has many parts. The outer part mirrored the neocortical activity. However, the inner part behaved differently. When the neocortex became inactive, the neurons in the inner entorhinal cortex persisted in the active state, as if they were remembering something the neocortex had recently “said,” a phenomenon known as spontaneous persistent activity.
     
    Further, they found that when the inner part of the entorhinal cortex became spontaneously persistent, it prompted the hippocampus neurons to become very active. On the other hand, when the neocortex was active, the hippocampus became quieter. This data provided a clear interpretation of the conversation.
     
    “During sleep, the three parts of the brain are talking to each other in a very complex way,” he said. “The entorhinal neurons showed persistent activity, behaving as if they were remembering something — even under anesthesia, when the mice could not feel or smell or hear anything. Remarkably, this persistent activity sometimes lasted for more than a minute, a huge time-scale in brain activity, which generally changes on a scale of one-thousandth of a second.”
     
    The findings challenge current theories of brain communication during sleep, in which the hippocampus is thought to talk to, or drive, the neocortex. Mehta’s findings instead indicate that the entorhinal cortex is the third key actor in this complex dialogue and that the neocortex is driving the entorhinal cortex, which in turn behaves as if it is remembering something. That, in turn, drives the hippocampus, while other activity patterns shut it down.
     
    “This is a whole new way of thinking about memory consolidation theory,” Mehta said. “We found there is a new player involved in this process and it’s having an enormous impact. And what that third player is doing is being driven by the neocortex, not the hippocampus. This suggests that whatever is happening during sleep is not happening the way we thought it was. There are more players involved, so the dialogue is far more complex, and the direction of the communication is the opposite of what was thought.”
     
    Mehta theorizes that this process occurs during sleep as a way to unclutter memories and delete information that was processed during the day but is irrelevant. This results in the important memories becoming more salient and readily accessible. Notably, Alzheimer’s disease starts in the entorhinal cortex, and Alzheimer’s patients suffer from impaired sleep, so Mehta’s findings may have implications in that area.
     
    For this study, Mehta teamed with Thomas Hahn and Sven Berberich, both of Heidelberg University in Germany and the Max Planck Institute for Medical Research, and James McFarland of Brown University and the UCLA Department of Physics. Going forward, the team will further study this brain activity to uncover the mechanisms behind it and determine if it influences subsequent behavioral performance. These results and related findings can be found at http://www.physics.ucla.edu/~mayank.
     
    “These results provide the first direct evidence for persistent activity in medial entorhinal cortex layer neurons in vivo, and reveal its contribution to cortico-hippocampal interactions, which could be involved in working memory and learning of long behavioral sequences during behavior, and memory consolidation during sleep,” the study states.
     
    The study was funded by the Whitehall Foundation, the National Institutes of Health, the National Science Foundation, the W. M. Keck Foundation, the German Ministry of Education and Research and the Max Planck Society.
     
    The UCLA Department of Neurology, with over 100 faculty members, encompasses more than 20 disease-related research programs, along with large clinical and teaching programs. These programs cover brain mapping and neuroimaging, movement disorders, Alzheimer’s disease, multiple sclerosis, neurogenetics, nerve and muscle disorders, epilepsy, neuro-oncology, neurotology, neuropsychology, headaches and migraines, neurorehabilitation, and neurovascular disorders. The department ranks in the top two among its peers nationwide in National Institutes of Health funding.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • UCLA sponsors annual fundraising walk for polycystic kidney disease

    WHAT:
    Polycystic kidney disease (PKD) is an inherited, incurable disorder that affects about one in 1,000 Americans. It is the third leading cause of kidney failure, behind diabetes and high blood pressure, yet is relatively unknown among the general public.
     
    To help increase awareness of the disease and raise funds for a cure, the UCLA Health System is co-sponsoring the PKD Foundation’s annual Los Angeles Walk for PKD, and nearly 90 UCLA Health System volunteers organized by UCLA’s Dr. Anjay Rastogi will participate in the event. Dr. Rastogi will also speak.
     
    The event will feature prizes, raffles, information booths and more.  
     
    WHO:
    Speeches will be made by the following individuals:
     
    Dr. Anjay Rastogi
    Medical director of the UCLA Living Kidney Donor Program and director of UCLA Dialysis Services
     
    Gary Godsey
    President of the PKD Foundation
     
     
    WHEN:
    Saturday, Oct. 6
     
    8:30 a.m.
    Check-in
     
    8:30–9:30 a.m.
    Speeches by Rastogi and Godsey
     
    9 a.m.
    Kids’ race
     
    9:30 a.m.
    2-mile walk
     
     
    WHERE:
    Santa Monica Beach Park (map)
    Ocean Park Boulevard and Barnard Way, Santa Monica 90405
     
    SPONSORS:
UCLA Health System, Health Plus Inc., AST Enzymes, Great American Seafood, PlasCor, Savage BMW, Bramasol, GNC, Tea Pot Brand, The Vitamin Shoppe, Valcon Masonry and Construction Services, Alpha Atlantic, Atlantis Packaging, Orange County’s Pacific Symphony, Perry’s, Samson Productions, Elbows Mac ‘n’ Cheese, Vellano.
     
    PARKING:
    Parking is available for $8 in the lot at Beach Park 1 (map)
     
    MEDIA CONTACT:
    Enrique Rivero | UCLA Health Sciences Media Relations | [email protected]
    310-794-2273 (office) | 310-597-5768 (cell)

  • ‘Graduates’ of neonatal intensive care unit reunite with those who saved their lives

    WHAT:
    Approximately 400 former patients and their families will attend a reunion of “graduates” who were cared for in the neonatal intensive care unit (NICU) at Mattel Children’s Hospital UCLA in Westwood or UCLA Medical Center, Santa Monica. This is the 27th NICU reunion organized by UCLA nurses. 
     
    The party will include activities for kids and will give nurses the chance to meet with former patients who are now thriving. These former patients today range in age from 3 months to 35 years old. 
     
    The NICU takes care of medically fragile newborns, including micro-preemies born as early as 24 weeks’ gestation and weighing only a pound or two, as well as full-term babies who are born with life-threatening illnesses.
     
    WHEN:
    Noon–3 p.m., Sunday, Oct. 7
     
    WHERE:
    Picnic area at UCLA’s Sunset Canyon Recreation Center (map)
    111 De Neve Drive, Los Angeles 90095
     
    INTERVIEW | PHOTO OPPORTUNITIES:
    The following people will be among those available at the reunion:
     
    Quezada family (Palmdale)
Parents Manuel and Beatriz and their 3-year-old son, Daniel, who was born at 25 weeks (Manuel speaks English and Spanish)
     
    Weaker family (Sun Valley)
    Parents Jenna and Jeff and their 2-year-old son Luke, who was born at 32 weeks
     
    Kinsey family (Simi Valley)
    Parents Stacey and Scott; their 2-year-old daughter, who was born at 24 weeks; and their 9-week-old twins, who were born at 34 weeks
     
    Sara Van der Linden (Santa Paula)
    Sara, 34, was in the NICU in 1977 and has attended more than 20 NICU reunions
     
    Shohreh Samimi
    NICU unit director at Mattel Children’s Hospital UCLA
     
    Joyce Keeler
    NICU nurse at UCLA for more than 35 years
     
     
    BACKGROUND:
UCLA NICUs offer the most advanced interventions available for critically ill babies, with medical and surgical specialists available 24 hours a day to address every possible physiological need. UCLA experts are also involved in extensive neonatal research, and NICU staff provide a variety of developmental interventions for infants and families. Learn more about the NICU at Mattel Children’s Hospital UCLA and the NICU at UCLA Medical Center, Santa Monica.
     
    MEDIA PARKING:
    Please R.S.V.P. to media contact. Parking will be available in Lot 11 on De Neve Drive. Parking is $11, or show attendant media credentials for complimentary parking. Media trucks with placards in the window can park at the temporary parking spots outside Lot 11.
     
    MEDIA CONTACT:  
    Amy Albin | UCLA Health Sciences Media Relations | 310-794-8672

  • UCLA public health researchers get $20M grant to promote health and fitness, fight obesity

    Researchers at the UCLA Fielding School of Public Health and UCLA’s Jonsson Comprehensive Cancer Center have been awarded a $20 million federal grant to further their innovative efforts to curb obesity, a global pandemic that has reached the level of a national crisis in the United States.
     
    The UCLA project, rather than requiring busy, stressed individuals in low-resource neighborhoods to seek out physical activity and nutrient-rich foods, will engage them as “captive” audiences in settings they already frequent — including schools, offices and churches — making healthier options a default that can only be avoided with effort or by “opting out.”
     
    The five-year grant from the Centers for Disease Control and Prevention is intended to address health disparities among racial and ethnic groups across the country and is part of the agency’s Racial and Ethnic Approaches to Community Health (REACH) initiative.
     
    The UCLA project will be led by Dr. Antronette Yancey and Roshan Bastani, professors of health policy and management at the Fielding School and co-directors of the school’s UCLA Kaiser Permanente Center for Health Equity. Other faculty members on the team include assistant professor Beth Glenn, professor Annette Maxwell and professor William J. McCarthy, all of the school’s department of health policy and management.
     
    For more than 20 years, UCLA has been recognized as a leader in promoting health among a diverse array of ethnic groups in a variety of settings, with programs that address critical health issues ranging from obesity and tobacco control to cancer screening and vaccinations. This work is conducted in partnership with several hundred community-based organizations, primarily in the Los Angeles region.
     
    The new CDC funding enables the researchers to build on knowledge gained from their prior work and to expand the geographic scope of their efforts. They will concentrate on promoting healthy nutrition and physical activity in 30 to 40 medium- to large-sized cities throughout the U.S. Southeast, Midwest and Southwest, focusing on geographic hubs in those metropolitan areas where ethnic or racial minorities make up the majority of residents.
     
    The program will be disseminated through national networks of community-based organizations, allowing the program to reach large segments of the African American, Asian American/Pacific Islander, Hispanic/Latino and American Indian populations in these urban centers.
     
    The core of the program is Yancey’s “Instant Recess,” which she developed nearly 14 years ago to help prevent obesity and promote health and well-being. “Instant Recess” focuses on integrating short physical-activity breaks into non-discretionary time — during non-P.E. time in school, “paid time” at work and Sunday church services, for example — and establishing policies to ensure that appealing healthy options are accessible whenever food is served at meetings or gatherings, in cafeterias, or in vending machines.
     
    The “Instant Recess” exercise breaks consist of 10-minute dance- and sports-themed movements scientifically designed to maximize enjoyment and energy expenditure while minimizing injury risk and perceived exertion in the average sedentary, overweight individual. A library of more than 50 “Instant Recess” CDs and DVDs has been produced, including American Indian pow-wow, Latin salsa, Appalachian “talking dance,” cumbia, reggae, hip hop, line dancing and African dance, along with basketball, baseball, football, boxing and soccer. The CDs and DVDs also include suggestions for low-resource healthy nutrition policies that can be adopted by organizations.
     
    “This award is a truly amazing validation of our obesity prevention and control work of the past 20 years in stimulating communities to seek and embrace the healthy, culturally situated choice,” Yancey said. “We are incredibly honored to have received this funding, which will allow us to take our work to scale at a national level and evaluate the sustainability of our interventions.”
     
    “This project represents a true public health approach,” Bastani said. “We will continue to partner with a wide range of community organizations that have national reach and assist them in adapting our program for the specific populations they serve.”
     
    Preventable risk factors — including tobacco use, poor nutrition and a lack of physical activity — are more common in communities of color and low-income neighborhoods and often result in chronic conditions such as heart disease, diabetes, asthma, cancer and stroke, among others.
     
    As part of the project, UCLA will work collaboratively with national partners to promote and implement sustainable “Instant Recess” initiatives within schools, youth programs, religious institutions, public health and health care agencies, small businesses, and professional sports teams to support healthy lifestyle behaviors in the African American, Asian American/Pacific Islander, Hispanic/Latino and American Indian communities.  
     
    The UCLA Fielding School of Public Health is dedicated to enhancing the public’s health by conducting innovative research; training future leaders and health professionals; translating research into policy and practice; and serving local, national and international communities.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • Agreement will lead to commercialization of batteries for renewable energy storage

A Washington state firm with a 27,000-square-foot manufacturing and design facility in Mukilteo has signed a license agreement with Battelle to further develop and commercialize a type of advanced battery that holds promise for storing large amounts of renewable energy and providing greater stability to the energy grid.

    The agreement with UniEnergy Technologies LLC is intended to advance and commercialize “redox flow” battery technology. 

Developing a technology that can smoothly integrate energy from variable and intermittent sources — such as wind and solar power — onto the electricity grid while maintaining grid stability has proven challenging. First developed in the 1970s, redox flow batteries are one type of storage technology that has shown the ability to meet this challenge. But until now, these batteries have been limited by their inability to work well across a wide range of temperatures, their relatively high cost and their limited capacity to store energy, otherwise known as energy density.

Recently, however, with funding from the Energy Department’s Office of Electricity Delivery & Energy Reliability, researchers at DOE’s Pacific Northwest National Laboratory have made significant progress in improving the performance of redox flow technology.

    Redox flow batteries are a type of rechargeable battery that stores electrical energy in two tanks of electrolytes, which are then pumped through a reactor to produce energy. The PNNL-developed vanadium electrolytes incorporate two novel approaches to overcome the limitations of previous generations of redox flow batteries. The result is a dramatically improved operating range, higher energy density and lower cost for vanadium redox flow batteries.
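Because a flow battery stores its energy in the electrolyte tanks, its energy capacity scales with tank volume, while its power rating is set by the reactor — which is why the energy density of the electrolyte matters so much for system size and cost. As a minimal back-of-the-envelope sketch, using hypothetical illustrative numbers (not figures from PNNL or UET):

```python
# Illustrative sketch: flow-battery energy capacity scales with tank volume.
# The numbers below are hypothetical examples, not figures from the article.

def capacity_kwh(tank_volume_liters: float, energy_density_wh_per_liter: float) -> float:
    """Stored energy (kWh) = electrolyte volume x volumetric energy density."""
    return tank_volume_liters * energy_density_wh_per_liter / 1000.0

# A hypothetical vanadium electrolyte at 25 Wh/L in two 10,000 L tanks:
print(capacity_kwh(20_000, 25))  # → 500.0 kWh
```

The sketch shows why a higher-density electrolyte shrinks the tanks needed for a given capacity, and why capacity can be scaled up independently of power simply by adding electrolyte.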

The licensing agreement with UniEnergy will lead to enhanced commercial products for utilities, power generators and industry that will enable the energy grid to operate more reliably and efficiently, with better integration of renewable resources, such as energy produced by wind and the sun.

    “The redox flow battery is well-suited for storing intermittent, renewable energy on the electricity grid. The technology can help balance supply and demand, prevent disruptions and meet the grid’s varying load requirements,” said Imre Gyuk, energy storage program manager at DOE’s Office of Electricity Delivery & Energy Reliability in Washington, D.C.

    “Redox flow batteries can also help utilities during times of peak demand on the grid, providing additional power when it is needed,” he added. “Successful commercialization of DOE-sponsored technology development, such as this, is vital for creating the grid of the future, and sustaining U.S. leadership in advanced technology.”


    About UniEnergy Technologies LLC

UniEnergy Technologies LLC, or UET, is a privately held clean energy company, founded in Washington state and based in Mukilteo, Wash. UET’s founders are President Gary Yang and Chief Technology Officer Liyu Li, both experts in energy storage technologies. UET’s mission is to scale up and commercialize new-generation redox flow batteries and other advanced electricity storage technologies through wide collaboration with partners that include leading industries, associations and research institutions in related fields, as well as government bodies.


    About Battelle and PNNL

Interdisciplinary teams at Pacific Northwest National Laboratory address many of America’s most pressing issues in energy, the environment and national security through advances in basic and applied science. PNNL employs 4,600 staff, has an annual budget of nearly $1 billion, and has been managed for the U.S. Department of Energy by Ohio-based Battelle since the laboratory’s inception in 1965. For more, visit PNNL’s News Center, or follow PNNL on Facebook, LinkedIn and Twitter.

  • GridLAB-D: A one-of-a-kind energy grid simulator

A one-of-a-kind, high-tech modeling tool designed to simulate different situations on the electric power grid will be on display at the White House today. The tool, GridLAB-D™, is the result of a multi-year funding effort by the U.S. Department of Energy’s Office of Electricity Delivery and Energy Reliability. Pacific Northwest National Laboratory researchers will join Energy Secretary Steven Chu to demonstrate how GridLAB-D can help power system operators, industry, innovators and entrepreneurs understand how making a change to one part of the power system affects other parts of the grid.

    “GridLAB-D provides first-of-its-kind analysis and simulation of all aspects of grid operations, from generation to consumption, in unprecedented detail. Using this open source, freely available tool, we can understand how making changes to one part of the electric system, such as incorporating more solar or wind power, impacts other parts of the system, and in different types of situations, such as inclement weather, record heat or even drought,” said Carl Imhoff, PNNL’s Electricity Infrastructure Market Sector lead. “In a sense, GridLAB-D lets users see the future of the grid like never before,” he said.

The GridLAB-D research team joins other teams of researchers from government, industry and academia as part of “Datapalooza,” an event sponsored by the White House Office of Science and Technology Policy, the Council on Environmental Quality, the U.S. Department of Energy and the U.S. Environmental Protection Agency. It’s designed to showcase various ways in which data analysis can be used to improve the nation’s power system. PNNL researchers will be demonstrating how GridLAB-D allows users to see how solar panels in various community scenarios can impact the performance of the power system. From there, utilities can understand what investments should be made to ensure the solar power can come online in a safe and effective manner.

    How it works

    The smart grid is often referred to as the merger of the internet and the electric grid. It includes information technology that identifies how much power is being used. It also generates a lot of data, which must be analyzed in real time to enable optimal energy resource usage. GridLAB-D has been in development at PNNL for nearly 10 years and is made available as an open-source tool for utilities, universities, researchers, consultants, and government and defense agencies.

    “GridLAB-D is a powerful tool capable of pulling together and simultaneously considering the effects of multiple technologies on the electric grid over periods of time that can range from seconds to decades,” said David Chassin, project manager for GridLAB-D and a PNNL scientist. “It’s a power systems simulator, market simulator, communication simulator and building simulator, all tied into one. Every piece shares information with every other piece to build a clearer picture of how the electric grid will evolve over time.”

    “For example, GridLAB-D allows us to drill down to see how changing prices or weather conditions impact voltage levels and communications loads on a minute-to-minute basis,” said Chassin. “It has also been used to study whether different utility rate designs make sense when new demand response technology becomes more widely available.”
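The coupled-simulator idea Chassin describes can be illustrated with a toy co-simulation loop, in which several models advance in lockstep and read each other’s outputs from shared state. This is a conceptual Python sketch only: the model names and the relations between them are invented for illustration, and GridLAB-D itself is driven by its own model files rather than by any interface like this.

```python
# Conceptual sketch of co-simulation: several coupled models advance in
# lockstep, each reading the others' outputs from a shared state dict.
# This is NOT GridLAB-D's real interface; the models below are toy inventions.

def weather_model(state, t):
    # Hypothetical cloud cycle over the day, dimming solar output (kW).
    state["solar_kw"] = max(0.0, 5.0 - 0.5 * (t % 10))

def building_model(state, t):
    # Hypothetical load: base demand minus whatever solar supplies locally.
    state["net_load_kw"] = 8.0 - state.get("solar_kw", 0.0)

def grid_model(state, t):
    # Voltage sags slightly as net load rises (toy linear relation, per-unit).
    state["voltage_pu"] = 1.0 - 0.005 * state.get("net_load_kw", 0.0)

def run(steps):
    state, history = {}, []
    for t in range(steps):
        for model in (weather_model, building_model, grid_model):
            model(state, t)          # each model sees the others' updates
        history.append(dict(state))
    return history

result = run(3)
```

Here `grid_model` sees the solar output that `weather_model` wrote in the same timestep; that kind of cross-model feedback, propagated step by step from weather to load to voltage, is the behavior a grid simulator of this sort has to track.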

    Seeing solar at the White House

For the Datapalooza event, PNNL researchers are using GridLAB-D to simulate what would happen if a large number of solar panels were incorporated into a typical southern California distribution system. They worked with colleagues from PNNL’s National Visualization and Analytics Center in Richland to develop an animation that shows what may occur as more homes and businesses install solar panels. The PNNL team used building design parameters for southern California and weather data from Bakersfield, Calif., to represent a typical southern, inland-California energy usage model, where utilities are currently experiencing a rapid rise in solar power on their systems.

    “We’ve found that as clouds move through neighborhoods, the voltage can rise or fall outside allowable ranges, depending on conditions,” said Kevin Schneider, a PNNL lead power engineer on the project. “Utilities may be required to manage voltage differently depending on how much solar power is found in various parts of their systems.”
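    The voltage effect Schneider describes can be sketched with a toy calculation. This is not GridLAB-D itself, and every parameter below is hypothetical; it only illustrates how a swing in solar output as a cloud passes can push a service voltage outside the roughly ±5 percent band (per ANSI C84.1) that utilities are expected to maintain.

```python
# Toy illustration (not GridLAB-D): voltage at the end of a feeder segment
# rises as local solar generation offsets load, and swings as clouds pass.
# All parameter values below are hypothetical.

NOMINAL_V = 120.0   # service voltage, volts
BAND = 0.05         # ANSI C84.1 allows roughly +/-5% at the service point

def service_voltage(load_kw, solar_kw, volts_per_kw=0.02):
    """Very rough linear model: net export raises voltage, net load lowers it.
    `volts_per_kw` is an invented sensitivity, volts per kW of net flow."""
    net_kw = load_kw - solar_kw  # positive: drawing power; negative: exporting
    return NOMINAL_V - volts_per_kw * net_kw

def in_band(v):
    """True if the voltage is within the +/-5% allowable range."""
    return abs(v - NOMINAL_V) <= BAND * NOMINAL_V

# Cloud transit: solar output drops and recovers over a few minutes.
for minute, sun_fraction in enumerate([1.0, 0.9, 0.4, 0.1, 0.3, 0.8, 1.0]):
    v = service_voltage(load_kw=200.0, solar_kw=600.0 * sun_fraction)
    flag = "OK " if in_band(v) else "OUT"
    print(f"t={minute} min  solar={600 * sun_fraction:5.0f} kW  V={v:6.1f}  {flag}")
```

    In this toy run, full sun pushes the voltage above the band and a passing cloud brings it back inside, which is the kind of minute-to-minute behavior the simulation is designed to expose.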

    GridLAB-D is one component of PNNL’s smart grid research and development program, under which researchers have been applying capabilities and expertise for more than 20 years to shape and deliver an electricity grid that is more resilient, secure and efficient.

    Funding for GridLAB-D was provided by the U.S. Department of Energy’s Office of Electricity Delivery and Energy Reliability. More information about GridLAB-D is available at www.gridlabd.org.

  • UCLA-led study finds direct correlation between hospital bedsores, patient mortality

    A new clinical study spearheaded by the dean of UCLA’s School of Nursing has found a direct correlation between pressure ulcers — commonly known as bedsores — and patient mortality and increased hospitalization.
     
    The research is believed to be the first of its kind to use data directly from medical records to assess the impact of hospital-acquired pressure ulcers on Medicare patients at national and state levels.
     
    According to the study, featured as the lead article in the current issue of the Journal of the American Geriatrics Society, seniors who developed pressure ulcers were more likely to die during their hospital stay, to have longer stays in the hospital, and to be readmitted to the hospital within 30 days of their discharge.
     
    To arrive at their findings, the researchers tracked more than 51,000 randomly selected Medicare beneficiaries hospitalized across the United States in 2006 and 2007.
     
    “Hospital-acquired pressure ulcers were shown to be an important risk factor associated with mortality,” said Dr. Courtney Lyder, lead investigator on the study and dean of the UCLA School of Nursing. “It is incumbent upon hospitals to identify individuals at high risk for these ulcers and implement preventive interventions immediately upon admission.”
     
    According to Lyder and his research team, individuals at the highest risk are those with existing chronic conditions, such as congestive heart failure, pulmonary disease, cardiovascular disease, diabetes and obesity, as well as those on steroids.
     
    In conducting the study, the researchers were challenged by the fact that there is no large single database to help determine the incidence of pressure ulcers among hospitalized Medicare patients. They therefore culled their data from Medicare’s claim history, a national surveillance system designed to identify adverse events — or “unintended harm” — within the hospitalized Medicare population. The researchers looked at this data to determine the cause and patterns of hospital-acquired pressure ulcers.
     
    The study found that 4.5 percent of the patients tracked acquired a pressure ulcer during their stay in the hospital. The majority of these bedsores were found on the tailbone or sacrum, followed by the hip, buttocks and heels. The study also revealed that of the nearly 3,000 individuals who entered the hospital with a pressure ulcer, 16.7 percent developed at least one new bedsore on a different part of their body during their hospitalization.
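    The patient counts implied by these percentages are easy to back out. A rough calculation using the rounded figures quoted above (illustrative only, not the study’s exact numbers):

```python
# Back-of-the-envelope counts implied by the figures quoted above.
tracked = 51_000             # "more than 51,000" Medicare beneficiaries
acquired_rate = 0.045        # 4.5% acquired a pressure ulcer in the hospital
hospital_acquired = round(tracked * acquired_rate)

admitted_with_ulcer = 3_000  # "nearly 3,000" entered with an existing ulcer
new_site_rate = 0.167        # 16.7% developed at least one new bedsore
new_site_cases = round(admitted_with_ulcer * new_site_rate)

print(f"Hospital-acquired ulcers: about {hospital_acquired:,} patients")
print(f"New ulcers among those admitted with one: about {new_site_cases} patients")
```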
     
    “This is a serious issue, and now we have data that can help the health care system address this ongoing problem,” Lyder said. “When individuals enter the hospital with the risk conditions that we’ve identified, it should send up an immediate warning signal that appropriate steps should be taken to minimize the chance of pressure ulcers occurring.”
     
    In addition to Lyder, clinical researchers on this study included Yun Wang of Qualidigm, the Centers for Outcomes Research and Evaluation at Yale University, and Yale–New Haven Health; Mark Metersky of Qualidigm and the division of pulmonary and critical care medicine at the University of Connecticut School of Medicine; Maureen Curry of Qualidigm; Rebecca Kliman of the Office of Clinical Standards and Quality at the Centers for Medicare and Medicaid Services; Nancy Verzier of Qualidigm; and David Hunt of the Office of Health Information Technology Adoption in the Office of the National Coordinator for Health Information Technology.
     
    The study was funded by the Centers for Medicare and Medicaid Services.
     
    The UCLA School of Nursing is redefining nursing through the pursuit of uncompromised excellence in research, education, practice, policy and patient advocacy. For more information, please visit nursing.ucla.edu.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • Washington science academy names Subhash Singhal its president-elect

    A Battelle Fellow Emeritus and retired engineer who continues to support research at the Department of Energy’s Pacific Northwest National Laboratory has been named president-elect of the Washington State Academy of Sciences.

    Subhash Singhal, who is also a member of the National Academy of Engineering, took on the state academy’s president-elect duties earlier this month. His one-year term runs through September 2013, after which he is expected to serve as president for a year. Singhal became a founding member of the state academy in 2008.

    Singhal retired from PNNL in October 2011, but still supports its research as an independent consultant. He is a world leader in solid oxide fuel cells and directed PNNL’s fuel cell research. Singhal has written more than 95 scientific publications, edited 17 books, received 13 patents and given more than 310 presentations. He also serves as an adjunct professor of materials science and engineering at the University of Utah and is a visiting professor at the China University of Mining and Technology-Beijing.

    The Washington State Academy of Sciences strives to increase the role and visibility of science in Washington state, as well as provide expert scientific and engineering analysis to help inform public policy decisions. Five other PNNL researchers are members of the state academy.

    More information about the academy’s new board members is available online.

  • Nickelblock: an element’s love-hate relationship with battery electrodes

    Anyone who owns an electronic device knows that lithium ion batteries could work better and last longer. Now, scientists examining battery materials at the nanoscale reveal how nickel forms a physical barrier that impedes the shuttling of lithium ions in the electrode, reducing how fast the materials charge and discharge. Published last week in Nano Letters, the research also suggests a way to improve the materials.

    The researchers, led by Chongmin Wang of the Department of Energy’s Pacific Northwest National Laboratory, created high-resolution 3D images of electrode materials made from lithium-nickel-manganese oxide layered nanoparticles, mapping the individual elements. These maps showed that nickel formed clumps at certain spots in the nanoparticles. A higher-magnification view showed the nickel blocking the channels through which lithium ions normally travel when batteries are charged and discharged.

    “We were surprised to see the nickel selectively segregate like it did. When the moving lithium ions hit the segregated nickel-rich layer, they essentially encounter a barrier that appears to slow them down,” said Wang, a materials scientist based at EMSL, the Environmental Molecular Sciences Laboratory, a DOE user facility on PNNL’s campus. “The block forms in the manufacturing process, and we’d like to find a way to prevent it.”

    Lithium ions are positively charged atoms that move between negative and positive electrodes when a battery is being charged or is in use. They essentially catch or release the negatively charged electrons, whose movement through a device such as a laptop forms the electric current.

    In lithium-manganese oxide electrodes, the manganese and oxygen atoms form rows like a field of cornstalks. In the channels between the stalks, lithium ions zip towards the electrodes on either end, the direction depending on whether the battery is being used or being charged.

    Researchers have known for a long time that adding nickel improves how much energy the electrode can hold, battery qualities known as capacity and voltage. But scientists haven’t understood why the capacity falls after repeated usage — a situation consumers experience when a dying battery holds its charge for less and less time.

    To find out, Wang, materials scientist Meng Gu and their collaborators used electron microscopy at EMSL and the National Center for Electron Microscopy at Lawrence Berkeley National Laboratory to view how the different atoms are arranged in the electrode materials produced by Argonne National Laboratory researchers. The electrodes were based on nanoparticles made with lithium, nickel, and manganese oxides.

    First, the team took high-resolution images that clearly showed rows of atoms separated by channels filled with lithium ions. On the surface, they saw the accumulation of nickel at the ends of the rows, essentially blocking lithium from moving in and out.

    To find out how the surface layer is distributed on and within the whole nanoparticle, the team used a technique called three-dimensional composition mapping. Using a nanoparticle about 200 nanometers in size, they took 50 images of the individual elements as they tilted the nanoparticle at various angles. The team reconstructed a three-dimensional map from the individual elemental maps, revealing spots of nickel on a background of lithium-manganese oxide.

    The three-dimensional distribution of manganese, oxygen and lithium atoms along the surface and within the particle was relatively even. The nickel, however, parked itself in small areas on the surface. Internally, the nickel clumped on the edges of smaller regions called grains.

    To explore why nickel aggregates on certain surfaces, the team calculated how easily nickel and lithium traveled through the channels. Nickel moved more easily up and down the channels than lithium. While nickel normally resides within the manganese oxide cornrows, sometimes it slips out into the channels. And when it does, this analysis showed, it flows much more easily through the channels to the end of the field, where it accumulates and forms a block.

    The researchers used a variety of methods to make the nanoparticles. Wang said that the longer the nanoparticles stayed at high temperature during fabrication, the more nickel segregated and the poorer the particles performed in charging and discharging tests. They plan on doing more closely controlled experiments to determine if a particular manufacturing method produces a better electrode.

    This work was supported by PNNL’s Chemical Imaging Initiative.


    Reference: Meng Gu, Ilias Belharouak, Arda Genc, Zhiguo Wang, Dapeng Wang, Khalil Amine, Fei Gao, Guangwen Zhou, Suntharampillai Thevuthasan, Donald R. Baer, Ji-Guang Zhang, Nigel D. Browning, Jun Liu, and Chongmin Wang. “Conflicting Roles of Ni in Controlling Cathode Performance in Li-ion Batteries,” Nano Letters, Sept. 17, 2012, doi: dx.doi.org/10.1021/nl302249v.

  • Battelle funding new Cybersecurity program at CBC

    Columbia Basin College is starting a new, comprehensive cybersecurity program that includes a one-year certificate, a two-year associate’s degree, and a Bachelor of Applied Science degree. The certificate and associate’s degree programs will start in fall 2012, and the bachelor’s program is planned for fall 2013. Battelle is contributing $118,000 for the development and implementation of the first year of the Bachelor of Applied Science program, since it will be funded solely through fees, grants, and donations.

    As CBC President Rich Cummins points out, “Government and industry have a growing need for specialized workers that can anticipate, defend, and protect against cyber attacks on data and network systems across the nation.  Although cybersecurity specialists are in high demand nationally, they are in even higher demand in the local region as a result of the advanced technical infrastructure and work performed in our community.” 

    The program will admit 18 students to its initial class. 

    “With cyber attacks on the rise nationally, PNNL and other organizations are in need of well-trained computer security specialists who can assess, detect, and protect data and infrastructure.  The new CBC degree program will help fill that workforce need while at the same time provide a critical service locally and nationally,” says Pacific Northwest National Laboratory Director, Mike Kluse.

    For more information, visit the Columbia Basin College website.

  • Fisher elected president of the Health Physics Society

    Darrell Fisher, a radiation scientist with the Department of Energy’s Pacific Northwest National Laboratory, has been elected president of the Health Physics Society, a 5,000-member, internationally recognized scientific organization of professionals who specialize in radiation safety. 

    Fisher leads the Isotope Sciences Program at PNNL, which focuses on isotope production and applications development for government and private industry.

    Fisher has been active in the Health Physics Society for 36 years, serving as board member, treasurer, parliamentarian, and chair of its major committees.  His three-year term includes one year as president-elect and one year as past-president of the society. 

    “In the coming year, our professional society will continue to seek congressional support for university teaching programs in health physics and nuclear engineering, and federal agency support for international standards development,” Fisher said. “We also seek a broader role in federal policy development for nuclear materials and waste management.”

    Fisher’s term began in July at the Society’s 57th Annual Meeting in Sacramento, Calif.  At that same meeting, PNNL health physicist Kathy Pryor concluded her year as the society’s president.

    “Kathy provided outstanding leadership and set a high standard,” Fisher said. “I look forward to serving and doing what I can to help advance the society and increase its influence for the nation, profession and membership.”

    Fisher joined PNNL in 1978 after earning a master’s degree and doctorate in nuclear engineering sciences from the University of Florida.  He earned a bachelor’s degree in biology from the University of Utah.

  • SEC Passes Natural Resource Transparency and Conflict Minerals Rules: The Glass is Fuller than Expected

    Over two years ago, Congress adopted Sections 1502 and 1504 of the Dodd-Frank Wall Street Financial Reform Act, which focus on conflict minerals and natural resource transparency. The Securities and Exchange Commission (SEC) was tardy in issuing the implementing regulations, but it passed both rules this past Thursday, more than 450 days past its April 2011 deadline.

    A lot is at stake for citizens in dozens of countries, for investors and for multinational companies. Section 1502 mandates that U.S. companies sourcing minerals from the Democratic Republic of Congo (DRC) and adjacent countries perform due diligence on the source and chain of custody of minerals and disclose whether they use conflict minerals…

    Section 1504 requires publicly traded oil, gas and mining companies to make project-level disclosures of payments made to governments around the world for the purpose of commercial development of natural resources. The aim of both provisions is to enhance corporate and government accountability. Yet, vague rules that allow for exemptions or do not require reporting on critical details would easily undermine the objective of effective transparency.

     Was the wait worth it? That, of course, depends on who you ask. The wait appears worth it in the case of rules on the disclosure of resource payments to foreign governments (Section 1504), while the results are somewhat mixed for rules mandating the disclosure of conflict minerals (Section 1502).

    The SEC first voted on disclosure rules for conflict minerals (Section 1502). The mere fact that after such a long delay the agency finally voted in favor of these regulations constitutes a step forward. The intent of Section 1502 of Dodd-Frank (and thus of SEC) was not to mandate penalties for sourcing minerals from mines controlled by armed groups in conflict-afflicted regions. Instead it relies on the adverse reputational effect of such disclosure. Reputable companies would want to avoid having their name associated with armed conflict, human rights violations, slavery and rape. Yet, an important segment of the industry opposed such disclosures on the basis that compliance costs would be high and that disclosure would be ineffective in addressing instability in the region.

    But following the SEC’s ruling on Section 1502, the glass is only half full because the industry managed to get some reprieve from full disclosure. For all companies there will be a two-year phase-in period, and for smaller companies a four-year phase-in period. Other companies, such as Wal-Mart and Target, will be exempted from disclosure because the SEC does not require disclosure for store brand products manufactured by third-party suppliers. Further, companies using recycled or scrap minerals would also avoid the disclosure rules.

    Thus, while human rights advocates and the industry (with the exception of some firms that were not opposed to Section 1502) were generally at odds about the provision, they agreed that the outcome of the SEC ruling was mixed. Many in the industry are displeased that the rules were passed, but are pleased that there will be significant implementation delays and exemptions. Civil society and human rights advocates are pleased that the SEC voted in favor of adopting rules but fear that the rules are relatively weak.

    Both sides do agree that disclosure alone will not solve the conflict in the eastern DRC. Still, human rights activists feel that the measures passed by the SEC may help mitigate conflict and deter human rights abuses, even though they believe broader governance reforms are needed. It remains to be seen how effective the actual implementation of these provisions will be and whether broader complementary measures to tackle misgovernance and conflict in the DRC will be implemented.

    In contrast to the “glass half full” ruling on Section 1502, the Section 1504 ruling on natural resource payment disclosure represented a much fuller glass. The American Petroleum Institute (API), big oil and several other extractive industry companies had lobbied heavily against rules that would require project-level disclosure and in favor of various exemptions, including the so-called “tyrant veto”, which would exempt companies from disclosing payments in countries where payment disclosure was prohibited by local law.

    In its ruling, the SEC rejected the “tyrant veto” exemption and exemptions in cases where contracts stipulate secrecy. Further, the SEC also mandated that companies file disclosures, rather than merely furnish them, which is important because the requirement to file enables investors to litigate in certain cases of false reporting. The SEC also specified that payments above $100,000 must be reported and disaggregated by category, rejecting the arguments put forth by the industry for a materiality approach or a threshold of $1 million.
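    As a rough sketch of what this reporting logic implies, the toy code below groups payments by project and category and keeps totals at or above the $100,000 level. The data layout, field names, and the choice to apply the threshold to aggregated totals are all invented for illustration; the SEC’s actual filing format is not described here.

```python
# Illustrative sketch of the Section 1504 reporting logic described above:
# payments at or above $100,000 are reported, disaggregated by category and
# by project. Field names and data layout are invented for illustration.

THRESHOLD = 100_000  # de minimis level in the SEC rule

payments = [
    {"project": "Block 7 lease", "category": "royalties", "amount": 250_000},
    {"project": "Block 7 lease", "category": "taxes",     "amount": 95_000},
    {"project": "Delta field",   "category": "bonuses",   "amount": 1_200_000},
]

def reportable(payments, threshold=THRESHOLD):
    """Group payments by (project, category); keep totals at/above threshold."""
    totals = {}
    for p in payments:
        key = (p["project"], p["category"])
        totals[key] = totals.get(key, 0) + p["amount"]
    return {k: v for k, v in totals.items() if v >= threshold}

for (project, category), total in sorted(reportable(payments).items()):
    print(f"{project} / {category}: ${total:,}")
```

    The point of the disaggregation requirement is visible even in this toy: a country-level or materiality-based rollup would have hidden which project and which category of payment the money flowed through.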

    A key question prior to the final ruling was how the SEC would define a “project”. Industry lobbyists pushed for a broad definition that would allow disclosures at as aggregate a level as possible. Some even tried to equate a project with all operations in a country. In its ruling, the SEC acknowledged that the term “project” is commonly understood by issuers and investors, and granted companies some latitude in defining what constitutes a project. But, thanks to the guidance issued by the SEC with the rules, the amount of discretion that companies will have is rather limited.

    Specifically, in its guidance, the SEC rejected several project definitions that were proposed by industry stakeholders and strongly opposed by civil society. It clarified that a project cannot be defined at the country level or following criteria driven by geological basin, reporting unit or materiality thresholds. At the same time, the SEC indicated that for the purposes of the rule, the notion of “project” should be guided by the relevant contract (since the payments made by companies to the government are usually stipulated in the contract). Thus, the ruling demarcated reasonable boundaries around what constitutes a project. As a result, reporting is expected to take place at a rather detailed and disaggregated level.

    Further, the SEC decision not to rule on a project definition may have been a clever move, both substantively and tactically. Substantively, giving companies latitude in defining a project rather than imposing a “one size fits all” definition may result in disclosure of payments for segments of the industry outside of exploration and production. Tactically, by sticking to the wording of the original Dodd-Frank law, the SEC may fend off a possible source of litigation by the industry (API).

    However, the SEC’s clever ruling on the project definition may not dissuade the API from litigation. If it does decide to litigate, the industry body may opt to focus on the costs associated with implementing transparency rules, which it claims will be huge, particularly with regard to compliance costs and loss of competitiveness. In fact, in issuing the rules, the SEC itself did acknowledge that some of these costs to industry may not be trivial.

    When discussing compliance costs, it is important to distinguish between the total costs of reporting and the additional costs resulting from the new disclosure requirements. The latter are particularly relevant in assessing the potential costs of 1504, and are likely to be much lower than some companies claim. Most companies already have extensive internal systems in place for recording payments, and already collect project level information to handle their current reporting requirements. Adjustments due to the new set of rules are thus likely to be relatively minor and could be done in a timely and cost-effective manner.

    Several companies also highlighted concerns that other market participants could use information disclosed by issuers to derive trade secrets such as contract terms, data on reserves, or other confidential information. These arguments have been rebutted by outside analysis and advocates of transparency. The SEC did not give them credence either, noting that the statute covers the amount of payments, not the manner in which payments are determined or other contract terms.

    Companies were also concerned that they would become less competitive relative to companies not subject to the reporting obligations under 1504. The American Petroleum Institute (API) and companies like ExxonMobil and Rio Tinto are concerned that by becoming more transparent they will lose contracts in countries where the government either legally prohibits disclosure or prefers to work with companies that are not subject to payment disclosure. In its ruling, the SEC rejected this flawed notion that implies that corrupt or opaque governments would drive the provision of exemptions from transparency of companies listed in the U.S.

     Furthermore, the impact on competitiveness would be minimal in the numerous jurisdictions where payment information is already publicly available, partly as a result of increased participation by governments and companies in the voluntary disclosure framework under the Extractive Industry Transparency Initiative (EITI).

    There is also a clear trend toward the globalization of mandatory disclosure of payments by extractive sector companies. The European Union is soon likely to adopt laws similar to those set forth by the U.S.  Together, U.S. and EU regulations would cover the vast majority of listed natural resource companies in the world. Moreover, mandatory rules were already adopted by the Hong Kong Stock Exchange and discussions are ongoing in other financial centers in Asia.

    More generally, it seems misplaced to equate, as API and some companies have tried to, competitiveness and the ability to keep payments secret. Yes, there are some companies in the world that benefit from rent-seeking, monopolistic behavior, bribery of foreign officials and tax avoidance or outright evasion.

    But, as previously argued, private companies around the world, including in dynamic sectors in the U.S., compete on the basis of efficiency, entrepreneurship, and high technical and innovation standards. A truly competitive firm would have little to gain from secrecy; to the contrary, it would benefit from the level playing field created by high levels of transparency.

    Over the past two years, the discussion of the potential costs of disclosure has been long and detailed. By contrast, there has been far less said about potential benefits. It is the case that the benefits of transparency are not easy to quantify. Yet, as we have noted before, a body of empirical work has found the benefits of transparency, good governance and corruption control to be quite large and to accrue to multiple stakeholders, including citizens, investors and competitive companies in the extractive sector. In their submissions to the SEC, some investors noted that new disclosure requirements would help them assess the risks faced by companies operating in resource-rich countries and thus possibly promote investment and capital formation.

    In fact, our own data and research suggest that in the long run there is a development dividend to citizens of up to 300 percent from increased transparency, accountability and improved governance. In particular, improved governance can contribute to a threefold rise in incomes and a two-thirds decline in infant mortality.

    Further, project-level disclosure will empower citizens to obtain information on how much their governments earn from natural resources, advocate for a fairer share of revenues, and verify government-published budget data. Once the data is disclosed and processed by analysts and civil society, citizens should also be able to monitor the flow of money from the central government to regional and local governments, thus helping ensure that they are receiving what is promised. Finally, more transparency in dealings between companies and governments may help companies sidestep attempts by some government officials to engage in unethical activities.

    More generally, it is also important to emphasize that extractive-intensive countries need not be subject to the resource curse. Countries with transparent and enlightened leadership, and with satisfactory standards of governance and corruption control (supported by good corporate governance practices among multinationals), can harness their natural resources to achieve robust and inclusive growth and development. As seen in figure 1, extractive-rich countries that do well in controlling corruption also have higher income levels, in contrast with poorly governed ones. The challenge in coming years is raising the governance standards of many resource-rich countries that are lagging in this area.    

    The robust implementation of the SEC rules on transparency in natural resources as mandated by Section 1504 of the Dodd-Frank Act will be an important step forward, but it will not be sufficient. In order to make extractive industry transparency a global norm, the EU and other financial centers need to follow the lead taken by the United States. Building on their success in promoting the U.S. rules on Section 1504, advocacy organizations such as the Publish What You Pay (PWYP) Coalition and its main NGO members, as well as key investors, now need to focus fully on the passage of a similarly strong set of transparency rules in the European Union.

    Engaging China on this issue will also be important. And extractive-rich countries around the world need to do their part, deepening their work on transparency through the EITI and other such mechanisms. And important dimensions of opacity that still prevail in natural resources, untouched by Dodd-Frank 1504, will need to be addressed separately, such as promoting contract transparency; tackling the challenge of obscure “beneficial ownership” (to ensure the public is aware of who the ultimate owners/beneficiaries are of natural resource extraction and exploration); and the further analysis and codification of the considerable payoff to transparency reforms.

    The original Dodd-Frank Section 1504 and the SEC rulings are a huge step forward toward transparency and are likely to resonate worldwide. But much of the concrete work remains ahead.

     

    Note:  This article was co-authored with Veronika Penciakova and was originally posted at Brookings.

  • SEC’s Day of Reckoning on Transparency: Dodd-Frank Section 1504 on Disclosure of Natural Resource Revenues

    Following a very lengthy delay, tomorrow, August 22nd, the Securities and Exchange Commission (SEC) will finally issue the detailed implementing rules on natural resource transparency in Section 1504 of the Dodd-Frank Wall Street Reform and Consumer Protection Act adopted by Congress in July 2010. Specifically, Section 1504 stipulated that companies in extractive industries listed on U.S. exchanges would be required to report payments made to governments around the world.

    This may sound clear enough, but as is often the case, the devil will be in the details. Tomorrow those details will be in the hands of the SEC and will determine whether ‘effective transparency’ is attained or continues to remain elusive. Namely, the SEC will determine whether the information that needs to be disclosed by companies is sufficiently detailed, relevant and accessible, enabling effective monitoring and analysis by civil society, investors and government reformists…

    Given the content of the 2-year-old Dodd-Frank legislation, the SEC has no choice but to mandate disclosure.  However, effective disclosure is by no means guaranteed as the SEC could issue weak rules, rendering disclosure ineffective.  Thanks to Dodd-Frank legislation mandating transparency, the main danger is no longer wholesale ‘transparency evasion’ by many companies, but the more nuanced risk of enabling ‘transparency elusion’ (or ‘transparency avoidance’) by companies that wish to skirt detailed disclosure, thereby masking possible misdeeds.

    In fact, in the aftermath of the financial crisis and in the increasingly sophisticated legal and business environment of the 21st century, outright and explicit opposition to some form of disclosure by corporations is seen as increasingly costly, particularly from a reputation standpoint. Thus, tactics have shifted to an extent, as they previously did in the tax compliance field, where some corporations ceased focusing exclusively on tax evasion and instead turned to tax elusion (or tax avoidance: eluding or avoiding taxes rather than outright evading them).

    But more concretely, how could the SEC possibly undermine the disclosure mandated by the Dodd-Frank Act in Section 1504, and permit ‘transparency elusion’ by corporations?

    There are several ways this watering down of the rules could take place, permitting corporations to elude full disclosure. First, the SEC could rule weakly, in favor of corporations that wish to elude disclosure, by minimizing the level of detail that companies are required to disclose about payments made to foreign governments.

    In particular, this would happen if the SEC fails to mandate that companies report disaggregated payments for each concession, lease or contract, and instead gives them latitude either to define for themselves what constitutes a ‘project’ or, possibly worse, to report only at the aggregate country level (even though the latter is unlikely, since Dodd-Frank specifies that project-level disclosure should take place).

    Second, the SEC could side with companies that wish to avoid effective transparency by granting them significant exemptions from reporting payments for medium-scale projects, say, ranging from $75,000 to $750,000 (there is already consensus that it is reasonable to exempt very small projects, such as those below $25,000).

    Third, although unlikely, the SEC could exempt companies from reporting payments made to opaque (and often authoritarian) governments whose domestic laws may ban disclosure (even though there is no evidence that companies would be hurt by disclosing payments in those settings).

    Finally, some oil companies represented by the American Petroleum Institute (API), and supported by Shell and others, have opted for an additional tactic to elude transparency: threatening to litigate against the SEC irrespective of how it rules tomorrow. The threat has been an overt effort to influence and weaken tomorrow’s rule-making by the SEC, while acting on that threat after tomorrow would aim at further delaying the implementation of the actual disclosure rules and at subsequently weakening the transparency rules themselves.

    If the SEC issues weak rules on some of the above-mentioned critical aspects, companies may be able to effectively skirt disclosing financial information, which in turn would jeopardize accountability to shareholder investors and impair analysis of tax compliance, of potential diversion of funds away from government treasuries, and of possible corruption or fraud.

    Here also lies the positive flip side: if the SEC issues strong and effective transparency rules and leaves little room for disclosure avoidance, then accountability to investors would be enhanced and a potent deterrent would be in place regarding tax evasion and tax elusion, as well as regarding bribery and corruption among companies and public officials.

    Growing evidence suggests that the benefits of transparency are sizeable in various dimensions, including incomes per capita and other social and political stability gains for the host country citizens, as well as gains for countries in terms of investments, macro-economic (fiscal) stability, financial sector development, and control of corruption.

    For instance, our own data and research from around the world suggest that in the long run, with increased transparency, accountability and improved governance, citizens could see up to a 300 percent development dividend – i.e. their incomes per capita could triple, while infant mortality could decline by two-thirds. Furthermore, some studies by other authors suggest a positive impact of transparency specifically in the natural resources sector.

    But how costly to the corporate sector would disclosure be?  It would be naïve to suggest that every corporation would gain (or have no costs) from full disclosure, at least in the short term.  This has little to do with the actual administrative costs of data collection for disclosure, because the incremental cost of new data collection over what data the companies already need to collect for tax and their own internal purposes would be small.

    Instead, the real reason that disclosure may be costly to some companies in the short term relates to a different strand of our research: there are two types of companies, those that focus on efficiency and innovation and can thrive in a competitive level-playing field, and those that derive gains from rent-seeking (and outright bribery), monopolistic behavior or tax avoidance. The latter group would have an interest in maintaining an opaque status quo and stand to lose from a more level playing field resulting from effective transparency, while the former group would stand to gain, since the playing field would be leveled across all companies, benefitting the entrepreneurial and competitive firms.

    Therefore, not surprisingly, the corporate sector remains divided regarding these transparency rules.  Some mining companies have come out publicly in favor of transparency, as have prominent investors and former top executives, while some of the big oil companies are strongly opposed.  In fact, some companies, such as the giant Statoil in Norway and Newmont Mining, already disclose payments voluntarily.

    Thus, if one assesses the transparency benefits against the legitimate company costs (not counting the private costs to some companies due to corrupt behavior), the net payoff of transparency could be very large. This not only applies to overall societal gains, but incipient evidence also suggests that the corporate sector as a whole would benefit from a transparent level-playing field (even as some opaque companies may lose out in the short term).

    And it is also noteworthy that highly reputable pro-market, pro-competition publications such as The Economist and the Financial Times have written prominent editorials supporting tough and detailed rule-making to attain effective disclosure by companies in the natural resources industries.

    But the expected large net benefits of transparency do not necessarily mean that the SEC will automatically rule effectively tomorrow. It is fresh in our collective memory that, in terms of its overall mandate on financial sector supervision and regulatory oversight, over the past decade the SEC failed to perform and was held partly responsible for contributing to the global financial crisis.

    The SEC was seen as, at best, afflicted by poor leadership and as an ineffective bystander while the excesses of financial overleveraging and deregulation occurred. At worst, it was seen as having been subject to regulatory capture by the corporate sector, ultimately leading to regulatory failure (while becoming further tainted by the Madoff fraud scandal).

    While some efforts to improve the SEC have taken place in recent years, the jury is still out on its current effectiveness in issuing and implementing regulations. Further, in the specific case of issuing rules mandating disclosure by companies in oil, gas and mining, the SEC may be overly influenced by the lobbying efforts and litigation threats of some big oil companies, fronted by the API, that oppose effective transparency.

    On the other hand, as the SEC aims to improve its performance and reputation, it could end up issuing effective transparency regulations in all the key dimensions, pleasantly surprising observers and transparency advocates. A good ruling would have important repercussions worldwide, including in the European Union, where similar regulations are being prepared and debated, and where the lead already taken by the U.S. on revenue transparency is being closely watched ahead of the finalization of the EU’s own legislation.

    Yet even if the SEC were to issue a strong set of rules, its role in promoting revenue transparency would not cease the day after tomorrow.  How effectively the SEC fends off challenges by big oil companies, and then implements its rules in the future will matter significantly as well.

    Even effective SEC implementation will not suffice in itself.  Financial centers around the world would need to follow suit; governments need to continue making progress on transparent revenue payments, working with the Extractive Industries Transparency Initiative (EITI) and similar programs; and civil society evidence-based monitoring and advocacy efforts need to expand further, through the initiatives of the Publish What You Pay (PWYP) coalition and its member organizations.

    Finally, as information from companies begins to flow more freely and transparently, analysts in NGOs, think tanks and academia would be encouraged to exploit more fully the ‘power of data’, so as to learn more about improving governance in natural resources, deter corrupt behavior, and benefit citizens and honest corporations worldwide.

  • U.S. Obsession with Guns, Uninterrupted: A Case Study on the Capture of Politicians?

    The terrifying massacre during the midnight opening of the Batman movie in Aurora, near Denver, is another reminder that guns kill.  It is also another reminder of the failure of U.S. politicians to act on it.  Unfortunately, those gruesome reminders are frequent in the U.S., making the impotence of politicians even more self-evident.  Most of the industrialized and emerging countries of the world, and their citizens, understand that guns, and semi-automatic assault weapons, do kill, of course; they have acted on it.

    I have written about the topic before, such as here on gun killings at U.S. universities, and here in another blog entry, among others, showing a table with data on the extent of gun ownership and on gun homicides in the U.S. compared with other countries.  So at this juncture let me just point to selected data provided by the Brady Campaign to Prevent Gun Violence, and offer one thought on political capture…

    Their website reminds us that since 1968, when Martin Luther King and Robert Kennedy were assassinated, over one million people have been killed with guns in the United States.  Annually, on average, almost 100,000 people in the United States are shot or killed with a gun, and well over 30,000 die from gun violence.  In other industrialized countries, only a tiny fraction of that number die by gunshots, since guns are not ubiquitous there.  The U.S. firearm homicide rate is about 20 times higher than in 22 other populous high-income countries combined, despite similar rates of non-lethal crime and violence.  Not surprisingly, then, among 23 populous, high-income countries, 80% of all firearm deaths occurred in the United States.

    Further, research shows that 94% of gun-related suicides would not occur had guns not been present. And keeping a firearm in the home increases the risk of homicide by a factor of three, so it is not surprising that guns are more likely to raise the risk of injury than to confer protection.  In fact, every year there are only about 200 legally justified self-defense homicides by private citizens.

    In short, there is substantial evidence that removing guns saves lives, because guns kill people.   It is of interest that the rates of assault with knives and guns in the United States are similar, yet there are five times as many deaths from guns.  And many of those lethal guns can be obtained in the U.S. without a background check to screen out criminals or the mentally ill: almost one-half of gun acquisitions occur in the secondary market, and sales between individuals do not require a background check.  Particularly lethal ‘assault weapons’ (semi-automatic firearms), with no known civilian-use benefit whatsoever (to the contrary), used to be subject to a ban in the U.S. – but that ban was lifted in 2004.

    Police working on the Aurora movie massacre have said that James Holmes, the alleged gunman, had three weapons: a Remington shotgun, a Smith & Wesson M&P assault rifle, and a Glock .40-caliber handgun. The semiautomatic assault rifle, a civilian version of the military’s M-16, can fire 50 to 60 rounds per minute and is designed to hold large ammunition clips.  Apparently the killer purchased thousands of rounds of ammunition over the internet.

    Those are just some of the stark facts.

    Yet again, there is a failure of leadership on this issue.  Most U.S. politicians, including the presidential candidates, are keeping mum about gun control.  At most they express shock and regret about (yet another) ‘senseless’ killing spree, this time in Aurora, and try to provide comfort to the grieving families.

    Money in politics, and the capture of politicians by the undue influence of special interests such as the National Rifle Association (NRA), goes on unabated. The few politicians who dare even to speak about the fact that guns kill people, and that stricter gun control laws are needed in the U.S. – such as Rep. Carolyn McCarthy, whose husband was killed by a deranged gunman, and Senator Dianne Feinstein, a longtime advocate of stricter gun restrictions – are painfully aware that those who speak up are targeted at re-election time by the NRA.  Money speaks so loudly that virtually all politicians and leaders are silent on the issue.

    The rest of the world looks on at this obsession with guns, and the perverse influence of money in politics, in disbelief.

     

  • Job Transition: heading (to) the Revenue Watch Institute

    I wanted to quickly share with fellow bloggers and readers the news of my upcoming job transition, which will take place in the early fall.   At that point I will head the Revenue Watch Institute (RWI) and will cease being a resident fellow at Brookings.

    I am mindful that nowadays improved governance of oil, gas and minerals is the critical development challenge for dozens of countries around the world.    So it is a privilege to be asked to lead Revenue Watch, an organization with great people that plays a key role in promoting policy reforms and transparency by governments and corporations in the natural resource sector…

    I am excited about this opportunity and challenge.  I am also aware of the sizeable shoes that I have been asked to fill.  Karin Lissakers, the current RWI President, has been a terrific leader of an organization already operating in dozens of countries and making a difference in policy reform advice and advocacy.

    RWI’s official announcement about this transition is here.   I will still contribute some new work from Brookings until mid-September, when I move to RWI.

  • Putin President Again: A Wake-Up Call to the World?

    Vladimir Putin is about to be re-elected, yet again, as President of Russia.  He already served as President twice, over the 2000-2008 period, before immediately easing himself into the Kremlin’s premiership for the past four years while awaiting his next term as President, which is about to begin.

    His new term is expected to last six years this time around, since the Russian constitution was amended to permit a longer presidency.  If he seeks and wins reelection in 2018, Putin could be president until 2024 and effectively rule Russia for over two decades.  He would have served longer than any Russian leader besides Stalin…

    Much will be written about the reasons for the comfortable margin by which Putin is likely to win his third presidential term today, in spite of the ‘Putin fatigue’ that has set in among the urban elite.  Articles will mention the craving among many Russians for the image of a strongman and for stability, while others may cry foul about fraud at the polls or related electoral corruption.  Yet this should not obscure three larger issues of significance for Russia and the world, transcending the current electoral event.

    First, Russia’s governance has been declining, rather markedly, for about a decade already.  This has been discussed in a recent entry and in a conference presentation.  The decline is seen in figure 1 here, in virtually every one of the six dimensions of governance (as measured by the Worldwide Governance Indicators, or WGI), notably including a marked decline in Voice & democratic Accountability.

    The current presidential elections, held in a less-than-free environment for the media and for political participation, and where the emergence of viable alternatives to Putin has been stymied, ought to be viewed as a continuation of this trend of declining governance.

    In fact, Russia’s governance standards nowadays rate extremely poorly compared with the rest of the world, as seen in Figure 2, which averages the six dimensions of governance in the WGI.  This rough composite indicates that Russia compares poorly with many countries.  Its peers in terms of poor governance, like Pakistan, are countries where transition has not been successful.

    Second, Russia has faced the huge challenge of endemic corruption for quite some time, and if anything such corruption has worsened over the past decade, as also seen in figure 1 above.  There is high corruption in politics, in the executive, in the judiciary, and in the interactions between the private and public sectors.

    As seen in figure 3 here, for every type of bribery a very high proportion of enterprise managers report that they bribe often, comparable with countries like Nigeria and Libya, and sharply contrasting with the much lower levels of bribery in many other countries. Cronyism plays an important role: those close to Putin in the Kremlin have benefitted handsomely.  And one source of high-level bribery is public procurement: the lion’s share of firms in Russia have to pay bribes to obtain contracts.

         Furthermore, various forms of bribery have gone up substantially. Figure 4 shows the increasing trend in procurement bribery, leading to the extremely high levels that currently prevail.

     

    Third, the troubling evolution of governance in Russia over the past decade is a wake-up call to the world, which at times has been naïve about Russia’s transition, and about other transitions.  Over two decades ago the Soviet Union collapsed, and a democratic era dawned in Russia and many other formerly Soviet states. Yet since then the progress in democratic governance has been halting in many countries, or, even worse, there have been some reversals over the past decade, such as in Russia. 

    These developments carry a warning to the Arab world.  Just because an old autocratic regime is discarded, the emergence of robust democratic institutions is by no means assured.  I have written about this subject in this brief article (here), presented and discussed in various countries, including in the Middle East.

    Take the case of Egypt, for instance:  the demise of the Mubarak regime may indeed have been salutary, and can be viewed as a necessary precondition for a democratic transition.  Yet the events being played out also suggest that Mubarak’s departure was, in itself, insufficient. A broader perspective is useful:  of the scores of initial transitions to democracy over the past fifty years, many have not been fully successful, either muddling through or even moving backwards, as in the case of Russia.

    Democratic transitions are fragile and require constant vigilance, hard work and democratic institution-building for decades after the initial democratic episode.  Short-term setbacks or even marked reversals are not uncommon. The euphoria of the moment when an old autocratic regime is replaced, coupled with the political expediency of the international community, ought not to blur the stark assessment of how each transition is actually progressing — or not.

  • Conviction of Spain’s Superjudge Garzon: An indictment of its own judiciary?

    The recent conviction (ostensibly for ordering jailhouse wiretaps) of Baltasar Garzón, the Spanish judge who took on corrupt officials, despots, terrorists and human rights violators of the Franco regime, casts a dark shadow on Spain’s judiciary and hints at a political witch-hunt.

    In October 1998, Judge Garzón catapulted to prominence when he broke with traditional international law and tried to extradite the former Chilean ruler Augusto Pinochet from the United Kingdom, where he was receiving medical treatment, to Spain…

    At the time Pinochet, like other former autocrats, was fully confident that as a former leader of a sovereign nation he was legally untouchable abroad, regardless of the crimes he had committed while in power. Through that legal challenge Garzón became a de facto architect of the principle of universal jurisdiction.  

    Judge Garzón has no small ego.  He has taken activist stances on sensitive issues and sought publicity.  This has not endeared him to Spain’s arch-conservative Supreme Tribunal, nor to other jurists and politicians in Spain, where he touched powerful vested interests by unearthing high-level political corruption and state-sponsored death squads. Further, professional and political envy at his national and international prominence (he has been dubbed the ‘Superjudge’) cannot be disregarded as a factor in his current predicament.

    Garzón may also have made some errors of judgment, such as ordering wiretaps in a political corruption and money laundering case when the law was unclear on the permissibility of such action.  According to Human Rights Watch he was not alone in approving these wiretaps; yet he was singled out.  Worse, even if he made some missteps, convicting him on criminal charges and barring him from the legal profession for 11 years (effectively terminating his judicial career) seems a wholly disproportionate sanction.

    As reported by the New York Times, Reed Brody, counsel for Human Rights Watch, said the “accumulation of the cases against Judge Garzón” suggested “reprisal for his past actions against vested interests.” “Unfortunately,” he added, “it certainly looks like his enemies now got what they wanted.”

    A travesty of justice appears to have been committed in Spain, with the fundamental principle of judicial independence becoming compromised. This may seem shocking and unlikely in a country like Spain, where impressive gains in governance and rule of law had been made in the post-Franco era.  In fact, over the past few decades many countries in Latin America have looked up to and learned from Spain’s rule of law and judicial institutions, benefitting from considerable technical collaboration with jurists and legal experts in this area.

     Figure 1:

    However, the evidence suggests that over the past decade something has changed in Spain’s governance, and not for the better. In 2000, shortly after Judge Garzón tried to have Pinochet extradited – over a decade after the Chilean dictator left power with immunity – Spain rated higher than Chile on the quality of its rule of law institutions according to the Worldwide Governance Indicators (WGI). By 2010 the countries’ respective positions had reversed, resulting from a decline in the quality of rule of law in Spain and a slight improvement in Chile (Figure 1).

    Figure 2: 

    Worse, by 2010 Spain’s performance on rule of law was mediocre by OECD standards. As we can see in Figure 2, Spain (ranked 29th) not only rated well below the Scandinavian countries, which rated among the best in the world, but also below many of its peers, including New Zealand (5th), Canada (9th), Ireland (13th), Hong Kong (20th) and Malta (22nd), among others.  There was nothing inexorable about a deteriorating rule of law — every country featured in Figure 2, with the exception of Spain, exhibited some improvement in its rule of law over the past decade.

    Spain’s quality of rule of law in 2010 was roughly at the level of Estonia (which improved markedly over the past decade), Cyprus, Bermuda, Guam and French Guiana. The declining and mediocre ratings for Spain may be symptomatic of a broader governance challenge: among OECD countries, Spain also rates near the bottom in government effectiveness, control of corruption and regulatory quality.
     
    In fact, it is a poignant irony that, years after Spain’s Judge Garzón contributed to challenging Pinochet’s immunity, Spain rates below Chile in virtually all six WGI governance indicators.  Deeper analysis is needed to unlock the factors behind such institutional decline in Spain over the past decade.
     
    Arguably, Spain has ceased to be an example for Latin American countries to emulate. In fact, the powerful vested interests that persecuted Judge Garzón are a stark reminder that governance failings are not the exclusive domain of emerging and developing countries, but are all too common in rich industrialized countries as well.
     
    Further, the ’current’ governance indicators presented above are actually based on data from 2010. New data is not yet available, but given the current turn of events in the Garzón case, Spain’s rule of law ratings for 2011 and 2012 are unlikely to pick up.    
     
    Although serious damage has been inflicted on the Spanish judiciary, experience shows that it is possible to reverse course, even if in this case it may take some nudging from international institutions like the European Court of Human Rights (and media outlets like the Financial Times) supporting Spain’s own voices for change.