Category: News

  • New From NAP 2012-11-01 07:45:01

    Prepublication Now Available

    Tobacco use is the leading cause of preventable death in the United States, causing more than 440,000 deaths annually and resulting in $193 billion in health-related economic losses each year–$96 billion in direct medical costs and $97 billion in lost productivity. Since the first U.S. Surgeon General’s report on smoking in 1964, more than 29 Surgeon General’s reports, drawing on data from thousands of studies, have documented the overwhelming and conclusive biologic, epidemiologic, behavioral, and pharmacologic evidence that tobacco use is deadly. This evidence base links tobacco use to the development of multiple types of cancer and other life-threatening conditions, including cardiovascular and respiratory diseases. Smoking accounts for at least 30 percent of all cancer deaths and 80 percent of lung cancer deaths. Despite widespread agreement on the dangers of tobacco use and considerable success in reducing prevalence from over 40 percent at the time of the 1964 Surgeon General’s report to less than 20 percent today, recent progress in reducing tobacco use has slowed. An estimated 18.9 percent of U.S. adults smoke cigarettes, nearly one in four high school seniors smoke, and 13 percent of high school males use smokeless tobacco products.

    In recognition that progress in combating cancer will not be fully achieved without addressing the tobacco problem, the National Cancer Policy Forum of the Institute of Medicine (IOM) convened a public workshop, Reducing Tobacco-Related Cancer Incidence and Mortality, June 11-12, 2012, in Washington, DC. In opening remarks to the workshop participants, planning committee chair Roy Herbst, professor of medicine and of pharmacology and chief of medical oncology at Yale Cancer Center and Smilow Cancer Hospital, described the goals of the workshop, which were to examine the current obstacles to tobacco control and to discuss potential policy, outreach, and treatment strategies that could overcome these obstacles and reduce tobacco-related cancer incidence and mortality. Experts explored a number of topics, including: the changing demographics of tobacco users and the changing patterns of tobacco product use; the influence of tobacco use on cancer incidence and cancer treatment outcomes; tobacco dependence and cessation programs; federal and state level laws and regulations to curtail tobacco use; tobacco control education, messaging, and advocacy; financial and legal challenges to tobacco control efforts; and research and infrastructure needs to support tobacco control strategies, reduce tobacco-related cancer incidence, and improve cancer patient outcomes. Reducing Tobacco-Related Cancer Incidence and Mortality summarizes the workshop.

    [Read the full report]

    Topics:

  • New From NAP 2012-10-30 23:00:00

    Final Book Now Available

    Medicare, the world’s single largest health insurance program, covers more than 47 million Americans. Although it is a national program, it adjusts payments to hospitals and health care practitioners according to the geographic location in which they provide service, acknowledging that the cost of doing business varies around the country. Under the adjustment systems, payments in high-cost areas are increased relative to the national average, and payments in low-cost areas are reduced.

    In July 2010, the Department of Health and Human Services, which oversees Medicare, commissioned the IOM to conduct a two-part study to recommend corrections of inaccuracies and inequities in geographic adjustments to Medicare payments. The first report examined the data sources and methods used to adjust payments, and recommended a number of changes.

    Geographic Adjustment in Medicare Payment – Phase II: Implications for Access, Quality, and Efficiency applies the first report’s recommendations in order to determine their potential effect on Medicare payments to hospitals and clinical practitioners. This report also offers recommendations to improve access to efficient and appropriate levels of care. Geographic Adjustment in Medicare Payment – Phase II: Implications for Access, Quality, and Efficiency underscores the importance of ensuring the availability of a sufficient health care workforce to serve all beneficiaries, regardless of where they live.

    [Read the full report]

    Topics:

  • Transforming America by redirecting wasted health care dollars

    The respected national Institute of Medicine estimates that $750 billion is lost each year to wasteful or excessive health care spending. This sum includes excess administrative costs, inflated prices, unnecessary services and fraud — dollars that add no value to health and well-being.
     
    If those wasteful costs could be corralled without sacrificing health care quality, how might that money be better spent?
     
    In a study published in the current online edition of the American Journal of Preventive Medicine, Frederick J. Zimmerman, professor and chair of the Department of Health Policy and Management at the UCLA Fielding School of Public Health, and colleagues outline some of the myriad ways that $750 billion could benefit Americans.
     
    “If cut from current health care expenditures, these funds could provide businesses and households with a huge windfall, with enough money left over to fund deficit reduction on the order of the most ambitious plans in Washington,” Zimmerman said. “The money could also cover needed investments in transportation infrastructure, early childhood education, human capital programs, rural development, job retraining programs and much more. And it could transform America with little to no reduction in the quality of, or access to, health care actually provided.”
     
    Zimmerman noted that while different observers would likely have different priorities regarding the alternative uses toward which the wasted expenditures could be directed, all would agree that the alternatives proposed in this study have inherent social value.
     
    “When the fastest-growing part of the economy is also the least efficient, the economy as a whole loses its ability over time to support our current living standards,” said Jonathan Fielding, a UCLA professor of health policy and management and director of the Los Angeles County Department of Public Health, who is a co-author of the study. “The U.S. has become irrationally attached to its inefficient health care system. Recognizing the opportunity costs of this attachment is the first step in repairing the system.”
     
    In the study, the research group, which also included Dr. Steven Teutsch, chief science officer of the Los Angeles County Department of Public Health, and first author Jeffrey C. McCullough, a graduate student at the UCLA Fielding School, presented one scenario of how that money could be used.
     
    For one, the authors propose that more than $410 billion per year — or 55 percent of the savings — could be returned to the private sector for individuals and companies to use as they please; another $202 billion (27 percent) could go toward deficit reduction, yielding a greater reduction than the congressional “super committee” sought and failed to achieve. An additional $104 billion (14 percent) could support additional investments in human capital and physical infrastructure.
     
    “For example,” Zimmerman said, “the Head Start program could be doubled in size, universal preschool could be provided, average class size could be reduced from 22–25 to 13–17 students. And trained nurses could conduct regular home visits for high-risk pregnancies.”
     
    Two percent of the savings, amounting to $18 billion, could promote urban and rural quality of life by improving the built environment surrounding schools, expanding and modernizing public libraries, improving wastewater treatment and providing rural development grants to every small town in the nation. Job-training opportunities would be affordable for nearly 50,000 unemployed persons. And under the research group’s scenario, the remaining 2 percent of the savings would be devoted to fully funding an extensive wish list of transportation projects to alleviate road congestion and promote mass transit alternatives.
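    The dollar figures and percentage shares quoted above can be checked against the $750 billion total with a quick back-of-the-envelope calculation (a minimal sketch; the amounts are as reported in the study summary above, and the category labels are paraphrased):

```python
# Sanity check of the proposed allocation of the estimated
# $750 billion in annual excess health care spending.
TOTAL = 750  # billions of dollars

# (category, amount in $B, stated share in percent), as reported above
allocations = [
    ("private-sector windfall",            410, 55),
    ("deficit reduction",                  202, 27),
    ("human capital and infrastructure",   104, 14),
    ("urban and rural quality of life",     18,  2),
]

for name, amount, stated_pct in allocations:
    share = amount / TOTAL * 100
    print(f"{name}: ${amount}B = {share:.1f}% (stated: {stated_pct}%)")

# The four named categories leave roughly 2 percent for the
# transportation wish list mentioned above.
remainder = TOTAL - sum(amount for _, amount, _ in allocations)
print(f"remainder for transportation: ~${remainder}B "
      f"({remainder / TOTAL * 100:.1f}%)")
```

    Each computed share rounds to the stated percentage, and the unspecified remainder (about $16 billion) matches the final 2 percent devoted to transportation projects.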
     
    Freeing up this money would be no easy task, Fielding warned. These excess expenditures will be difficult to reduce because the costs are spread across many groups, and the financial beneficiaries are coordinated, clear-minded and powerful, he said. Overcoming this resistance will require concerted collective action on the part of many economic sectors, governmental agencies and other organizations that are not used to seeing themselves as sharing interests with the others.
     
    But whatever one’s values and preferences, said Zimmerman, “eliminating excess medical care costs provides a monumental opportunity to reallocate those resources to strengthen our international competitiveness, enhance our well-being and build a healthier nation.”
     
    The result of redirecting some $750 billion per year, he said, could be transformative for Americans, and the potential uses for these funds are panoramic in both scope and possibility.
     
    “This will not be an easy fight,” Zimmerman said. “But we believe reconceptualizing our excess health care spending by looking at its opportunity cost to society is an important first step.”
     
    A video of the group’s research is available online at www.ajpmonline.org/content/video_pubcasts_collection.
     
    This research was not supported by external grants or funding. The authors report no conflict of interest.
     
    The UCLA Fielding School of Public Health is dedicated to enhancing the public’s health by conducting innovative research; training future leaders and health professionals; translating research into policy and practice; and serving local, national and international communities.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • New From NAP 2012-10-24 23:00:00

    Final Book Now Available

    Most water resources managers, scientists, and other experts would agree that nonpoint source pollution is a more pressing and challenging national water quality problem today than point source pollution. Nonpoint sources of pollutants include parking lots, farm fields, forests, or any source not from a discrete conveyance such as a pipe or canal. Of particular concern across the Mississippi River basin (MRB) are high levels of nutrient loadings–nitrogen and phosphorus–from both nonpoint and point sources that ultimately are discharged into the northern Gulf of Mexico (NGOM). Nutrients emanate from both point and nonpoint sources across the river basin, but the large majority of nutrient yields across the MRB are nonpoint in nature and are associated with agricultural activities, especially applications of nitrogen-based fertilizers and runoff from concentrated animal feeding operations.

    Improving Water Quality in the Mississippi River Basin and Northern Gulf of Mexico offers strategic advice and priorities for addressing MRB and NGOM water quality management and improvements. Although there is considerable uncertainty as to whether national water quality goals can be fully realized without some fundamental changes to the Clean Water Act (CWA), there is general agreement that significant progress can be made under existing statutory authority and budgetary processes.

    This book includes four sections identifying priority areas and offering recommendations to EPA and others regarding priority actions for Clean Water Act implementation across the Mississippi River basin. These sections are: USDA’s Mississippi River Basin Healthy Watersheds Initiative; Numeric Water Quality Criteria for the northern Gulf of Mexico; A Basinwide Strategy for Nutrient Management and Water Quality; and, Stronger Leadership and Collaboration.

    [Read the full report]

    Topics:

  • Going live with a smarter electric grid

    A project designed to make the region’s and nation’s electric grid more reliable and efficient will be showcased today during an event at the University of Washington where students will be able to view how they are using energy in real time.  The UW project is one of 11 projects across five Northwest states that comprise the Pacific Northwest Smart Grid Demonstration Project, a public/private demonstration launched in February 2010.

    Sponsored by the Department of Energy’s Office of Electricity Delivery and Energy Reliability, and co-funded by the participating utilities, the demo is beginning a two-year period of collecting energy use data.

    The 11 participating utilities will evaluate the benefits of smart grid technologies locally — in their respective cities — and at the regional level. The project team will look at how a smarter grid can help deliver electricity more efficiently to avoid congestion in the transmission system and how more wind power can be used. The project’s data collection and analysis efforts are expected to provide an unprecedented view into how smart grid concepts can provide regional benefits while improving consumer choice and reliability locally.

    “The two-way information exchange in the Pacific Northwest Smart Grid Demonstration Project allows grid operators to make the existing electric grid more efficient — while also exploring how using other technologies, such as energy storage devices, smart appliances and wind power, can bolster the reliability of our system,” said Carl Imhoff, Electricity Infrastructure Market Sector manager at Battelle in Richland. Battelle is leading the demonstration project for the DOE’s Office of Electricity Delivery and Energy Reliability.

    “The data we’ll gather during the next two years will enable us to evaluate the costs and benefits of a smart grid to consumers in all types of utilities in the Pacific Northwest. We’ll also evaluate how we can optimize our power system while, at the same time, adding more variable, renewable energy resources such as wind and solar,” he said.

    The UW’s Role
    The University of Washington has invested nearly $10 million in the project. Before the project began, the UW had seven meters on campus providing a limited view into the campus’ energy use. As Seattle City Light’s largest customer, the UW has worked with the utility to install more than 200 smart meters across campus in nearly every building. The meters give energy users real-time information and analysis on energy usage and will improve the UW’s understanding of how much energy it is using and how efficiently it is using it. At new residence halls on campus, students will have real-time access to their energy use data by way of in-room energy management devices. Graduate students also will be able to gather and study this data for use in classroom instruction.

    “The University of Washington is recognized as a national leader in sustainability within the higher education community,” said UW Provost Ana Mari Cauce. “The project provides an exciting opportunity for testing how 21st century technology can reduce energy consumption. Given our students’ keen interest in the environment, it is appropriate that much of our research on smart grids will occur within our residence halls and that the initial research will be conducted by students in our Program on the Environment.”

    Regional knowledge
    At the regional level, the Pacific Northwest Smart Grid Demonstration Project is testing an innovative concept called “transactive control” through information exchanges that connect the 11 participating utilities, each of which volunteered for the study, with the Electricity Infrastructure Operations Center located at Battelle’s facilities in Richland, Wash. There, information about electricity demand, the amount of wind power available and wind forecasts is translated into incentive signals, or prices, which are updated every five minutes and sent to participating utilities. This allows contributors to make local decisions on how their piece of the smart grid project can support local and regional grid needs.
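    The signal-generation step described above can be illustrated with a toy sketch. The real transactive signal design belongs to the project and is far more sophisticated; the formula, function name, and units below are made-up stand-ins for illustration only:

```python
# Illustrative-only sketch of "transactive control": regional
# conditions (demand, available wind) are folded into a single
# incentive signal that is refreshed every five minutes and sent to
# participating utilities. The formula below is a made-up stand-in.

def incentive_signal(demand_mw: float, capacity_mw: float,
                     wind_mw: float) -> float:
    """Higher when the grid is stressed, lower when wind is plentiful."""
    utilization = demand_mw / capacity_mw   # how loaded the grid is
    wind_share = wind_mw / capacity_mw      # share met by cheap wind
    return max(0.0, utilization - wind_share)  # toy "price" in [0, 1]

# A utility receives an updated signal every five minutes and decides
# locally whether to shed or shift load.
print(incentive_signal(demand_mw=900, capacity_mw=1000, wind_mw=200))
```

    The point of the design, as the article describes it, is that decisions stay local: the region broadcasts one price-like number, and each utility chooses its own response.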

    The Battelle-led project team is using the signal to test a variety of smart technologies in different geographies, in different weather, in different situations, to learn the most they can about how the grid can operate most efficiently.

    The move to “go-live” with the transactive control signal is the continuation of more than a decade of smart grid research in the Pacific Northwest.

    Learning from the past
    “Six years ago, in a similar project on the Olympic Peninsula in Washington state, we learned that a small group of consumers using a smart energy signal similar to transactive control could save about 10 percent on their monthly power bill and help their utility reduce peak demand by 15 percent,” said Imhoff. “Now, we’ll be able to see how a broader set of customers, from different climate and geographic regions, can save energy or money, or both,” he said.

    During the Olympic Peninsula GridWise® Demonstration Project, 115 consumers were outfitted with smart water heaters and electric dryers that responded to a smart signal and would intermittently turn their heating elements off, anywhere from 20 seconds to two minutes, during times of peak demand on the grid. Researchers found consumers didn’t notice because water would remain hot and the barrel of the dryer would continue to spin during these short periods of time. Researchers also found the impact from those smart devices made a difference in the amount of money consumers paid for power each month, and in the amount of energy saved during times of peak demand on the grid. Project participants are now looking to apply that intermittency on a broader scale with a diverse set of smart appliances and an even smarter signal.
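    The device-side behavior described above can be sketched as follows. This is a toy illustration, not the actual GridWise control logic; the threshold, function name, and signal scale are invented for the example, while the 20-second-to-2-minute bound comes from the article:

```python
# Toy sketch of the peak-shaving behavior described above: a smart
# water heater briefly drops its heating element when the peak signal
# is high, for a bounded interval (20 s to 2 min in the Olympic
# Peninsula demonstration). Threshold and signal scale are invented.

PEAK_THRESHOLD = 0.8            # illustrative: signal in [0, 1]
MIN_OFF_S, MAX_OFF_S = 20, 120  # bounded interruption, per the demo

def heater_action(signal: float, requested_off_s: int) -> str:
    """Decide whether the heating element runs for this interval."""
    if signal < PEAK_THRESHOLD:
        return "heat"  # no congestion: heat normally
    # Peak period: shed load, but never longer than the bound, so the
    # tank stays hot and the consumer notices nothing.
    off_s = max(MIN_OFF_S, min(requested_off_s, MAX_OFF_S))
    return f"off for {off_s}s"

print(heater_action(0.9, 300))   # long request clamped to the bound
print(heater_action(0.3, 300))   # off-peak: normal operation
```

    Bounding the interruption is what made the intervention invisible to consumers: the researchers found the water stayed hot and the dryer drum kept spinning through each short pause.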

    Building the business case
    As a primary partner in the project, the Bonneville Power Administration is leading a regional effort to develop a business case for smart grid — to show which major infrastructure and technology investments will provide the best value to Northwest ratepayers in the long run.

    “One of the main goals of this smart grid project is to develop the regional cost/benefit analysis,” according to BPA Deputy Director Bill Drummond.  “And that’s important because demonstrating that the benefits of smart grid outweigh the costs is crucial for any utility considering moving forward with these investments.”

    The “go live” event takes place at 10:30 a.m. at Alder Hall Commons, located at 1315 NE Campus Parkway (entrance on NE 40th St. between Brooklyn Ave. NE and University Way NE). U.S. Senators Patty Murray and Maria Cantwell; UW Provost Ana Mari Cauce; Bill Drummond, deputy director of the Bonneville Power Administration; and Battelle Senior Vice President and Pacific Northwest National Laboratory Director Mike Kluse will be on hand for the celebration.

    The Pacific Northwest Smart Grid Demonstration Project was co-funded by American Recovery and Reinvestment Act funding through the DOE, and the project’s utility and vendor partners. More information about the project is available at www.pnwsmartgrid.org.


    Project Participants
    Battelle leads a strong collaboration that includes the Bonneville Power Administration and the following 11 utility representatives based in the Pacific Northwest:

    • Avista Utilities – Spokane, Wash.
    • Benton PUD – Kennewick, Wash.
    • City of Ellensburg – Ellensburg, Wash.
    • Flathead Electric Cooperative, Inc. – Kalispell, Mont.
    • Idaho Falls Power – Idaho Falls, Idaho
    • Lower Valley Energy – Afton, Wyo.
    • Milton-Freewater City Light & Power – Milton-Freewater, Ore.
    • NorthWestern Energy – Butte, Mont.
    • Peninsula Light Company – Gig Harbor, Wash.
    • Portland General Electric – Portland, Ore.
    • University of Washington/Seattle City Light – Seattle, Wash.

    The demonstration also involves a diverse team of technology providers including: Alstom Grid, IBM/Netezza, 3TIER Inc., and Quality Logic Inc. Washington State University and Central Washington University also are directly involved.

  • New From NAP 2012-10-24 09:09:47

    Final Book Now Available

    The committee evaluated submissions received in response to a request for proposals (RFP) for Biomolecular Simulation Time on Anton, a supercomputer specially designed and built by D.E. Shaw Research (DESRES) that allows for dramatically longer molecular dynamics simulations than other currently available resources. During the past 2 years, DESRES has made node-hours available to the non-commercial research community on an Anton system housed at the Pittsburgh Supercomputing Center (PSC), based on the advice of previous National Research Council committees convened in the falls of 2010 and 2011.

    The success of the program has led DESRES to make the Anton machine housed at the PSC available for an additional 3.7 million node-hours during the 9 months following October 2012. DESRES has asked the National Research Council (NRC) to once again facilitate the allocation of time to the non-commercial research community. To undertake this task, the NRC convened a committee of experts to evaluate the proposals submitted in response to the aforementioned RFP. The committee members were selected for their expertise in molecular dynamics simulations, as well as their experience in the subject areas represented in the 52 proposals that were considered by the committee. They comprised a cross-section of the biomolecular dynamics field in academia, industry, and government, including an array of both senior and junior investigators.

    The goal of the third RFP for Biomolecular Simulation Time on Anton has been to continue to facilitate breakthrough research in the study of biomolecular systems by providing a massively parallel system specially designed for molecular dynamics simulations. These special capabilities allow multi-microsecond to millisecond simulation timescales, which previously had been unobtainable. The program seeks to continue to support research that addresses important and high-impact questions demonstrating a clear need for Anton’s special capabilities. Report of the Committee on Proposal Evaluation for Allocation of Supercomputing Time for the Study of Molecular Dynamics: Third Round is a summary of the proposals, research, and criteria set forth in the RFP for Biomolecular Simulation Time on Anton.

    [Read the full report]

    Topics:

  • New From NAP 2012-10-24 08:45:01

    Final Book Now Available

    Computing and information and communications technology (ICT) have dramatically changed how we work and live, have had profound effects on nearly every sector of society, have transformed whole industries, and are a key component of U.S. global leadership. A fundamental driver of advances in computing and ICT has been the fact that single-processor performance, until recently, increased steadily and dramatically year over year, based on a combination of architectural techniques, semiconductor advances, and software improvements. Users, developers, and innovators were able to depend on those increases, translating that performance into numerous technological innovations and creating successive generations of ever richer and more diverse products, software services, and applications that had profound effects across all sectors of society.

    However, we can no longer depend on those extraordinary advances in single-processor performance continuing. This slowdown in the growth of single-processor computing performance has its roots in fundamental physics and engineering constraints–multiple technological barriers have converged to pose deep research challenges, and the consequences of this shift are deep and profound for computing and for the sectors of the economy that depend on and assume, implicitly or explicitly, ever-increasing performance. From a technology standpoint, these challenges have led to heterogeneous multicore chips and a shift to alternate innovation axes that include, but are not limited to, improving chip performance, mobile devices, and cloud services. As these technical shifts reshape the computing industry, with global consequences, the United States must be prepared to exploit new opportunities and to deal with technical challenges. The New Global Ecosystem in Advanced Computing: Implications for U.S. Competitiveness and National Security outlines the technical challenges, describes the global research landscape, and explores implications for competition and national security.

    [Read the full report]

    Topics:

  • Autumn field school at Amarna

    The latest news update from Professor Barry Kemp, from the Amarna Project.

    On October 14th the current Amarna field school began to assemble, five overseas students and seven SCA inspectors (drawn mostly from the Middle Egypt region) converging on the Amarna expedition house. The theme of the field school is survey, both in the mapping and planning sense and through the appreciation of human impact on the landscape. Amarna is ideal for these purposes, offering a variety of reasonably clear examples and opportunities for instruction. This year’s venue for instruction in planning and profile drawing – using a total station, basic measuring and drawing tools, and aerial photography from the expedition’s helium photography balloon – is the Great Aten Temple, and more particularly one of the broad spreads of gypsum concrete on which are marked the outlines of large numbers of offering tables. The weekly exercises in archaeological landscape appreciation take in the South Tombs Cemetery, the area of the city around the house of Thutmose, the Stone Village and the tomb of Panehsy and surrounding territory. Evening lectures and visits to other sites complete the programme.

    The field school is run in conjunction with the Institute for Field Research (IFR) of California ([email protected]) and with the agreement of the Supreme Council of Antiquities of Egypt and in particular with the support of Dr Abd el-Rahman El-Aidi. We are grateful to Dr Hans Barnard, Gwil Owen and Miriam Bertram for volunteering their services as instructors.

    The clearing of the gypsum surface of the first court of the temple has revealed a surprising fact. The area is surrounded by an embankment of dusty sand, about 1 metre high, with a thick capping composed of gypsum concrete that is the remains of the floor that originally spread across the entire temple. The sides of the embankment have slumped over the years. In the course of cleaning back the edges it has become clear that it contains fragments of fine sculpture and pieces of carved inlays in darker stone.
    The fragments include pieces in indurated limestone (evidently from a large and ornate architrave), travertine, granite and quartzite. We must conclude that, before the final phase of the temple was completed, fine pieces of sculpture were no longer needed and were thoroughly broken up.

    On October 22nd and 23rd the expedition was honoured by a visit by the British ambassador, James Watt, and his wife, Amal.

    The field school is scheduled to run until November 15th. From October 28th, it will overlap with the start of the next season of excavation at the South Tombs Cemetery, directed by Dr Anna Stevens.

    The attached picture shows the group writing landscape description above the South Tombs Cemetery.

  • UCLA Health System statement on condition of patients involved in Inglewood shooting

    Gloria Jimenez, age 28, is currently in good condition, and her daughter, age 7, is in fair condition at Ronald Reagan UCLA Medical Center and Mattel Children’s Hospital UCLA. Both suffered gunshot wounds during an incident that took place in Inglewood, Calif., on Oct. 20.
     
    Ms. Jimenez wanted to release the following statement:
     
    “I would like to thank everyone in the community, the emergency responders and the doctors and nurses at UCLA for their ongoing support. It is a very sad time for our family and we appreciate your kind words and thoughts and quick response for our care. We feel blessed to have so many people there for us.”
     
    No other information or interviews are available at this time.

  • In vitro fertilization linked to increased risk of birth defects

     
    In vitro fertilization may significantly increase the risk of birth defects, particularly those of the eyes, heart, reproductive organs and urinary system, according to a new UCLA study.
     
    UCLA researchers presented findings from their abstract, “Congenital Malformations Associated With Assisted Reproductive Technology: A California Statewide Analysis,” on Oct. 20 at the American Academy of Pediatrics National Conference and Exhibition in New Orleans.
     
    Despite the increasing use of IVF in the United States, links between birth defects and IVF are poorly understood, the researchers said. The management of birth defects accounts for a large part of pediatric surgical care and demands significant health care resources. According to the Centers for Disease Control and Prevention, California has the highest rate of IVF usage in the United States. 
     
    For the study, the researchers looked at infants born in California from 2006 to 2007 following the use of assisted reproductive technologies — fertility treatments involving the manipulation of both eggs and sperm — primarily IVF. They examined the mother’s age, race and the number of times she had previously given birth, as well as the infant’s gender, year of birth, whether the infant was part of a multiple birth (twins, triplets, etc.) and the presence of major birth defects.
     
    “Our findings included a significant association between the use of assisted reproductive technology, such as certain types of in vitro fertilization, and an increased risk of birth defects,” said lead author Dr. Lorraine Kelley-Quon, a general surgery resident at Ronald Reagan UCLA Medical Center, who conducted the research at Mattel Children’s Hospital UCLA.
     
    Among the 4,795 infants born after IVF and 46,025 naturally conceived infants with similar maternal demographics examined in the study, the researchers identified 3,463 infants with major birth defects.
     
    They found that birth defects were significantly increased for infants born after IVF, compared with naturally conceived infants (9.0 percent vs. 6.6 percent), even after controlling for maternal factors. Specifically, IVF infants had greater rates of malformations of the eye (0.3 percent vs. 0.2 percent), heart (5.0 percent vs. 3.0 percent) and genitourinary system (1.5 percent vs. 1.0 percent).  
     
    Overall, IVF infants’ odds of having birth defects were 1.25 times greater than those of naturally conceived infants with similar maternal characteristics.
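    The 1.25 figure above is adjusted for maternal characteristics, so it differs from what the raw rates alone imply. As a rough check (a sketch using only the overall percentages reported above), the unadjusted odds ratio can be computed directly:

```python
# Crude (unadjusted) odds ratio from the overall defect rates reported
# above: 9.0% of IVF infants vs. 6.6% of naturally conceived infants.
# The study's 1.25 figure is adjusted for maternal characteristics,
# so it is expected to differ from this raw calculation.

p_ivf, p_natural = 0.090, 0.066

odds_ivf = p_ivf / (1 - p_ivf)
odds_natural = p_natural / (1 - p_natural)
crude_or = odds_ivf / odds_natural

print(f"crude odds ratio: {crude_or:.2f}")  # ~1.40 before adjustment
```

    The gap between the crude value (about 1.40) and the adjusted 1.25 reflects how much of the raw difference the maternal covariates account for.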
     
    The researchers also looked at infants born following fertility treatments that, unlike IVF and other assisted reproductive technologies, do not involve the manipulation of both eggs and sperm, including artificial insemination and ovulation induction. They found that the increase in risk of birth defects for these infants was not statistically significant.
     
    “For parents considering in vitro fertilization or other forms of assisted reproductive technology, it is important that they understand and discuss with their doctor the potential risks of the procedure before making a decision,” said Kelley-Quon.
     
    The study was funded by the Robert Wood Johnson Foundation Clinical Scholars Program.
     
    In addition to Kelley-Quon, UCLA co-authors included Chi-Hong Tseng, D. Carla Janzen and Dr. Stephen B. Shew.
     
    The authors have no financial ties to disclose.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • Journal outlines how UCLA researchers are redefining dentistry through ‘salivaomics’

    UCLA RESEARCH ALERT
     
    FINDINGS:
    Scientists from the UCLA School of Dentistry have been at the vanguard of research on human saliva in recent years, leading the way in the dynamic, emerging field of salivary diagnostics, which seeks to catalog the biological makeup of saliva to help screen for and detect both oral and systemic diseases.
     
    Now, the Journal of the American Dental Association, a leading publication for dental professionals, has published a special supplement to its October issue in which Dr. David Wong, the school’s associate dean of research, outlines the state of the science of salivary diagnostics, highlighting advances made by researchers at UCLA and other institutions and charting a path for future research and clinical applications.
     
    In the article, Wong’s research findings show that saliva is made up of complex sets of molecules — including genes, proteins, DNA and RNA — that help paint a picture of an individual’s biology. The study of the biological molecules in saliva is known as “salivaomics.” Findings show that by studying the “omics” in saliva — such as genomics, transcriptomics and proteomics — scientists can develop tests composed of many molecular measurements; the findings are then interpreted by a computational model to produce a clinically actionable result.
     
    Through collaborative work with scientists at other institutions, UCLA researchers have developed several informatics and statistical tools to help interpret biomarkers in saliva; these biomarkers can then be used for early detection of disease, treatment monitoring, recurrence prediction and other translational assessments.
     
    Research done at the UCLA School of Dentistry has shown that saliva, as a medium for health screening, is just as useful as blood and other bodily fluids and has vast potential for the early detection of cancers, autoimmune diseases, diabetes and other disorders.
     
    IMPACT:
    The ability to conveniently and inexpensively capture saliva samples in a clinical setting for diagnostic purposes would be a huge step forward for health care providers in the detection, treatment and prediction of recurrence of life-threatening diseases. Wong expects that future research in salivaomics will eventually translate into practical medical applications that will be administered in dentists’ and doctors’ offices.
     
    And because it has been shown that 20 percent more Americans visit their dentists regularly than visit their physicians, there will be substantial opportunities for dentists to engage in primary health care by taking saliva samples from their patients and, based on the medical findings of those samples, developing individualized treatment plans.
     
    AUTHOR:
    Dr. David Wong, a professor of oral biology and medicine who holds the Felix and Mildred Yip Endowed Chair in Dentistry at the UCLA School of Dentistry, is available for interviews.
     
    JOURNAL:
    Wong’s work appears in a special supplement to the October issue of the Journal of the American Dental Association. The full article can be read here.
     
    FUNDING:
    Wong’s research has been funded by the National Institute of Dental and Craniofacial Research and the National Cancer Institute.

  • UCLA faculty members elected to Institute of Medicine

    Two UCLA faculty members, Dr. Sherin U. Devaskar and Jack Needleman, have been elected to the prestigious Institute of Medicine of the National Academies, one of the highest honors in the fields of health and medicine.
     
    Devaskar and Needleman join 68 other new members and 10 foreign associates elected for their “outstanding professional achievement and commitment to service,” the institute announced Oct. 15.
     
    Devaskar, physician-in-chief at Mattel Children’s Hospital UCLA and UCLA’s assistant vice chancellor for children’s health, is a highly accomplished investigator renowned for unraveling the processes responsible for the fetal and neonatal origins of chronic childhood and adult disorders, including obesity and diabetes. She is a professor of pediatrics at the David Geffen School of Medicine at UCLA and holds UCLA’s Mattel Executive Endowed Chair.
     
    Needleman, a professor in the department of health services at UCLA’s Fielding School of Public Health and a faculty associate at the UCLA Center for Health Policy Research, has transformed the measurement of hospital patients’ quality-of-care linked to nurses, improved management practices to increase quality and safety, and helped propel passage of national legislation. Quality measures he developed have been adopted by the Agency for Healthcare Research and Quality, Medicare, the Joint Commission and the National Quality Forum.   
     
    “The Institute of Medicine is greatly enriched by the addition of our newly elected colleagues, each of whom has significantly advanced health and medicine,” said the institute’s president, Harvey V. Fineberg. “Through their research, teaching, clinical work and other contributions, these distinguished individuals have inspired and served as role models to others.”
     
    Established in 1970, the Institute of Medicine is one of the National Academies — along with the National Academy of Sciences, the National Academy of Engineering and the National Research Council — and serves as a resource for independent, scientifically informed analysis and recommendations on health issues.
     
    With the newly elected members, Institute of Medicine membership now totals 1,928.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • Science supports sex addiction as a legitimate disorder

    The idea that an individual might suffer from a sexual addiction is great fodder for radio talk shows, comedians and late night TV. But a sex addiction is no laughing matter. Relationships are destroyed, jobs are lost, lives ruined.
     
    Yet psychiatrists have been reluctant to accept the idea of out-of-control sexual behavior as a mental health disorder because of the lack of scientific evidence.
     
    Now a UCLA-led team of experts has tested a proposed set of criteria to define “hypersexual disorder,” also known as sexual addiction, as a new mental health condition.
     
    Rory Reid, a research psychologist and assistant professor of psychiatry at the Semel Institute for Neuroscience and Human Behavior at UCLA, led a team of psychiatrists, psychologists, social workers, and marriage and family therapists that found the proposed criteria to be reliable and valid in helping mental health professionals accurately diagnose hypersexual disorder.
     
    The results of this study — reported in the current edition of the Journal of Sexual Medicine — will influence whether hypersexual disorder should be included in the forthcoming revised fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), considered the “bible” of psychiatry.
     
    The importance of the study, Reid said, is that it suggests evidence in support of hypersexual disorder as a legitimate mental health condition.
     
    “The criteria for hypersexual disorder that have been proposed, and now tested, will allow researchers and clinicians to study, treat and develop prevention strategies for individuals at risk for developing hypersexual behavior,” he said.
     
    The criteria, developed by a DSM-5 sexual and gender identity disorders work group for the revised manual, establish a number of symptoms that must be present. These include a recurring pattern of sexual fantasies, urges and behaviors lasting a period of six months or longer that are not caused by other issues, such as substance abuse, another medical condition or manic episodes associated with bipolar disorder. Also, individuals who might be diagnosed with this disorder must show a pattern of sexual activity in response to unpleasant mood states, such as feeling depressed, or a pattern of repeatedly using sex as a way of coping with stress.
     
    Part of the criteria also states that individuals must be unsuccessful in their attempts to reduce or stop sexual activities they believe are problematic.
     
    “As with many other mental health disorders,” said Reid, “there must also be evidence of personal distress caused by the sexual behaviors that interfere with relationships, work or other important aspects of life.”
     
    In order to evaluate the criteria for hypersexual disorder, Reid and his colleagues conducted psychological testing and interviews with 207 patients in several mental health clinics around the country. All of the patients were seeking help for out-of-control sexual behavior, a substance-abuse disorder or another psychiatric condition, such as depression or anxiety.
     
    The researchers found that the proposed criteria for hypersexual disorder accurately classified 88 percent of hypersexual patients as having the disorder; the criteria were also accurate in identifying negative results 93 percent of the time. In other words, the criteria appear to do a good job of discriminating between patients who experience hypersexual behavior and those who don’t, such as patients seeking help for other mental health conditions like anxiety, depression or substance abuse.
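    In diagnostic-test terms, those two percentages correspond to sensitivity (88 percent of true cases flagged) and specificity (93 percent of non-cases correctly cleared). A minimal sketch, using hypothetical counts chosen to match the reported rates (the article does not give the raw classification counts):

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity: share of true cases the criteria flag.
    Specificity: share of non-cases the criteria clear."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts consistent with the reported 88% / 93% figures.
sens, spec = sensitivity_specificity(tp=88, fn=12, tn=93, fp=7)
print(sens, spec)  # prints 0.88 0.93
```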
     
    “The results lead us to believe that the proposed criteria tend not to identify patients who don’t have problems with their sexual behavior,” Reid said. “This is a significant finding, since many had expressed concerns that the proposal would falsely classify individuals.”
     
    Reid also noted that the ability of the criteria to accurately identify hypersexual disorder in these patients was quite high and compared favorably to other psychiatric diagnoses.
     
    Another significant finding of the study, he said, was that patients who met the criteria for hypersexual disorder experienced significantly greater consequences from their sexual activities, compared with individuals with a substance-abuse diagnosis or a general medical condition. Of the 207 patients examined, 17 percent had lost a job at least once, 39 percent had had a relationship end, 28 percent had contracted a sexually transmitted infection and 78 percent had experienced interference with healthy sexual functioning.
     
    “So an individual meeting the criteria for hypersexual disorder can experience significant challenges and consequences in their life,” Reid said. “Our study showed increased hypersexual behavior was related to greater emotional disturbance, impulsivity and an inability to manage stress.”
     
    Interestingly, the researchers found that 54 percent of the hypersexual patients felt their sexual behavior began to be problematic before the age of 18. Another 30 percent reported that their sexual behavior began to be problematic during their college-aged years, from 18 to 25.
     
    “This appears to be a disorder that emerges in adolescence and young adulthood, which has ramifications for early intervention and prevention strategies,” Reid said.
     
    The study also examined the types of sexual behavior that hypersexual patients reported. The most common included masturbation and excessive use of pornography, followed by sex with another consenting adult and cybersex. The study noted that hypersexual patients had sex with commercial sex workers, had repeated affairs or had multiple anonymous partners — amounting to an average of 15 sex partners in the previous 12-month period.
     
    “It’s not that a lot of people don’t take sexual risks from time to time or use sex on occasion to cope with stress or just escape, but for these patients, it’s a constant pattern that escalates until their desire for sex is controlling every aspect of their lives and they feel powerless in their efforts to change,” Reid noted.
     
    Other authors on the study included Heather McKittrick, Margarit Davtian, and senior author Dr. Timothy Fong, all of UCLA; Bruce N. Carpenter and Randy Gilliland of Brigham Young University; Joshua N. Hook of the University of North Texas; Sheila Garos of Texas Tech University; Jill C. Manning, in private practice; and Erin B. Cooper of Temple University. Dr. Fong has the following relationships: speaker’s bureau for Reckitt Benckiser, Pfizer Pharmaceuticals, and grant support from Psyadon Pharmaceuticals. The other authors report no conflict of interest.
     
    Most of the study was unfunded; the researchers donated their time. Some travel expenses were funded internally through the UCLA Department of Psychiatry.
     
    The Semel Institute for Neuroscience and Human Behavior is an interdisciplinary research and education institute devoted to the understanding of complex human behavior, including the genetic, biological, behavioral and sociocultural underpinnings of normal behavior, and the causes and consequences of neuropsychiatric disorders. In addition to conducting fundamental research, the institute faculty seeks to develop effective strategies for the prevention and treatment of neurological, psychiatric and behavioral disorders, including improvement in access to mental health services and the shaping of national health policy.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • Renewable Energy Law News – Week of October 15


    Obama Administration Approves Roadmap for Utility-Scale Solar Energy Development on Public Lands

    WASHINGTON, D.C. – As part of President Obama’s all-of-the-above energy strategy to expand domestic energy production, Secretary of the Interior Ken Salazar today finalized a program for spurring development of solar energy on public lands in six western states. The Programmatic Environmental Impact Statement (PEIS) for solar energy development provides a blueprint for utility-scale solar energy permitting in Arizona, California, Colorado, Nevada, New Mexico and Utah by establishing solar energy zones with access to existing or planned transmission, incentives for development within those zones, and a process through which to consider additional zones and solar projects.

    Today’s action builds on the Administration’s historic progress to facilitate renewable energy development. On Tuesday, with the authorization of the Chokecherry and Sierra Madre Wind Energy Project site in Wyoming, Interior reached the President’s goal of authorizing 10,000 megawatts of renewable power on public lands. Since 2009, Interior has authorized 33 renewable energy projects, including 18 utility-scale solar facilities, 7 wind farms and 8 geothermal plants, with associated transmission corridors and infrastructure. When built, these projects will provide enough electricity to power more than 3.5 million homes, and support 13,000 construction and operations jobs according to project developer estimates.

    “Energy from sources like wind and solar have doubled since the President took office, and with today’s milestone, we are laying a sustainable foundation to keep expanding our nation’s domestic energy resources,” said Secretary Salazar, who signed today’s Record of Decision at an event in Las Vegas, Nevada with Senator Harry Reid. “This historic initiative provides a roadmap for landscape-level planning that will lead to faster, smarter utility-scale solar development on public lands and reflects President Obama’s commitment to grow American made energy and create jobs.”

    The Solar PEIS establishes an initial set of 17 Solar Energy Zones (SEZs), totaling about 285,000 acres of public lands, that will serve as priority areas for commercial-scale solar development, with the potential for additional zones through ongoing and future regional planning processes. If fully built out, projects in the designated areas could produce as much as 23,700 megawatts of solar energy, enough to power approximately 7 million American homes. The program also keeps the door open, on a case-by-case basis, for the possibility of carefully sited solar projects outside SEZs on about 19 million acres in “variance” areas. The program also includes a framework for regional mitigation plans, and to protect key natural and cultural resources the program excludes a little under 79 million acres that would be inappropriate for solar development based on currently available information.


    Wind Energy Jobs, PTC Surface At Second Presidential Debate 

    The first presidential debate came and went without mention of the wind energy production tax credit (PTC) and hardly any discussion of renewables. The story was quite different, however, at the second debate between President Barack Obama and Republican presidential candidate Mitt Romney, held Tuesday night at Hofstra University in Hempstead, N.Y.

    In fact, energy was arguably one of the most contentious issues of the night, and sparked heated disputes between the two candidates, who traded jabs on policies that – as described Tuesday night – did not differ all that much.

    Obama has said he favors an “all of the above” energy approach, including oil, gas, wind, solar and biofuels – a position he stated in the first presidential debate and reiterated Tuesday night.

    “We’ve got to control our own energy, you know – not only oil and natural gas, which we’ve been investing in – but also, we’ve got to make sure we’re building the energy sources of the future,” he said at Tuesday’s debate. “Not just thinking about next year, but 10 years from now, 20 years from now. That’s why we’ve invested in solar and wind and biofuels, energy-efficient cars.”

    And despite the virtual absence of renewables from Romney’s official energy plan, released in August, this time, the former Massachusetts governor also expressed support for clean energy like wind and solar power.

    “Look, I want to make sure we use our oil, our coal, our gas, our nuclear, our renewables,” he said. “I believe very much in our renewable capabilities – ethanol, wind [and] solar will be an important part of our energy mix.”

    At the first debate, neither candidate mentioned the jobs being lost in the wind energy supply chain due to the looming expiration of the PTC.

    The PTC’s omission from the first debate may have seemed glaring to some in the wind industry, considering that the president had made the critical tax credit a cornerstone of his campaign efforts in Iowa and Colorado – two states that have lost hundreds of wind energy jobs over the past few months.

    This time, however, Obama came out swinging against Romney, who has stated he would let the PTC expire at the end of this year.

    “What I’m not for is us ignoring the other half of the quotation,” Obama said, referring to renewables. “So for example, on wind energy, when Gov. Romney says these are ‘imaginary jobs,’ when you’ve got thousands of people right now in Iowa, right now in Colorado who are working, creating wind power, with good-paying manufacturing jobs – and the Republican senator in that, in Iowa, is all for it, providing tax credits to help this work – and Gov. Romney says, ‘I’m opposed; I’d get rid of it’ – that’s not an energy strategy for the future.”

    Romney refuted the claims, saying he does, in fact, support wind jobs.

    “I don’t have a policy of stopping wind jobs in Iowa, and they’re not phantom jobs – they’re real jobs,” he said.

    “I appreciate wind jobs in Iowa and across our country,” he added. “I appreciate the jobs in coal and oil and gas. I’m going to make sure that taking advantage of our energy resources will bring back manufacturing to America. We’re going to get through a very aggressive energy policy, 3.5 million more jobs in this country.”  

    Louisiana’s Solar Tax Credit Under Review

    Louisiana, USA — The Louisiana Department of Revenue weighed the future benefits of solar energy at a public hearing last week in Baton Rouge. Under provisions of the Wind and Solar Energy Systems Tax Credit created by state legislation in 2007, Louisiana homeowners who choose solar-generated electricity can claim a one-time, refundable state income tax credit of 50 percent for the purchase and installation of the system.

    Louisiana’s investment in this incentive program is something the solar energy industry does not want to see fade away.

    Nearly 100 people, from all over the state and nation, filed into the hearing room to shed some light on LDR’s rules for the Income Tax Credits for Wind or Solar Energy Systems.

    The solar power industry generated more than just energy that day as more than a dozen people registered to speak.

    “It appears there was a lot of interest,” said Byron Henderson, press secretary for the Department of Revenue. “This is just a public hearing on proposed rule changes for the tax credits on the wind and solar energy systems.”

    Tucker Crawford, co-owner of a Louisiana-based solar energy company and president of the Gulf States Renewable Energy Industries Association – a nonprofit trade organization that represents solar and renewable energy firms in Louisiana, Mississippi and Alabama – told the committee that the entire Louisiana solar industry has far exceeded the state’s initial estimates.

    “That’s a good thing to Louisiana energy consumers,” Crawford said. “In 2007, Louisiana only had five licensed solar installers. Today, we have 196 and counting; many of them are represented here today.”

    Crawford said that the 2007 state law – which allowed income tax credits for wind or solar energy systems purchased and installed by taxpayers to cut costs on their homes or buildings – has created local jobs, increased the state’s energy independence and reduced or eliminated utility bills for more than 3,100 Louisiana households.

  • Taking race out of the equation in measuring women’s risk of osteoporosis and fractures

    For women of mixed racial or ethnic backgrounds, a new method for measuring bone health may improve the odds of correctly diagnosing their risk of osteoporosis and bone fractures, according to a UCLA-led study.
     
    Currently, assessing osteoporosis and the risk of fractures from small accidents like falls requires a bone density scan. But because these scans don’t provide other relevant fracture-related information, such as bone size and the amount of force a bone is subjected to during a fall, each patient’s bone density is examined against a national database of people with the same age and race or ethnicity.
     
    This approach, however, doesn’t work for people of mixed race or ethnicity because comparison databases can’t account for mixed heritage. A similar problem exists for those from smaller racial or ethnic groups for which there are no comparison databases.
     
    “All the current ways of determining your risk for fractures require knowing your race and ethnicity correctly, and they ignore the fact that racial and ethnic groups are not homogenous,” said study co-author Dr. Arun Karlamangla, a professor of medicine in the geriatrics division at the David Geffen School of Medicine at UCLA. “It also flies in the face of the current reality in Southern California, where so many people are of mixed ethnicity.”
     
    Given that osteoporosis and hip fractures are leading causes of injury in older people, alternative means of measuring risk are needed. Now, a UCLA-led team of researchers has found a way of assessing risk without knowledge of a person’s race or ethnicity. The method involves combining bone mineral density measures with body size and bone size to create composite bone strength indices.
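    One widely used index of this kind, the femoral neck compression strength index, multiplies bone mineral density by bone width and divides by body weight; the exact indices used in the study may differ, so treat this as an illustrative sketch only:

```python
def compression_strength_index(bmd: float, neck_width: float, weight: float) -> float:
    """Illustrative composite index: areal bone mineral density (g/cm^2)
    times femoral neck width (cm), scaled by body weight (kg).
    Higher values suggest bone strength better matched to the loads it bears."""
    return bmd * neck_width / weight

# Hypothetical example values: BMD 0.80 g/cm^2, neck width 3.0 cm, 60 kg.
csi = compression_strength_index(0.80, 3.0, 60.0)
print(round(csi, 3))  # prints 0.04
```

    The point of such indices is that bone density, bone size and body size enter the calculation directly, so no race- or ethnicity-matched reference database is required.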
     
    The findings are published in the October issue of the Journal of Clinical Endocrinology and Metabolism.
     
    The researchers studied data on nearly 2,000 women in the U.S. between the ages of 42 and 53 who were of Caucasian, African American, Japanese and Chinese heritage. The data came from the Study of Women’s Health Across the Nation; UCLA was one of seven sites that recruited women for the study.
     
    Using the new composite bone strength indices, the researchers tested how the method predicted fracture risk in this group of women over a period of 10 years. They found that when they did not take into account the women’s race or ethnicity, they were able to predict fracture risk using the indices just as accurately as they could using the traditional method of combining bone mineral density measures with race and ethnicity information.
     
    “The importance of bone size to fracture risk has been recognized by engineers and radiologists for some years now,” said the study’s lead investigator, Dr. Shinya Ishii, who started the research while a fellow in the UCLA Division of Geriatrics and is now at the University of Tokyo. “But no one, until now, has combined bone density, which is the traditional measure of osteoporosis, with bone size and body size to get at a more uniform way of assessing osteoporosis that applies across racial lines and does away with the need to know the person’s race or racial mixture.”
     
    The researchers noted that further study is needed to determine if the same strength measures will work in men. Their findings also do not show how well these measures will continue working as women age. The results of the study do, however, point toward a new approach for assessing fracture risk, they say.
     
    Other researchers included Drs. Gail Greendale, Carolyn Crandall and Mei-Hua Huang of UCLA and Drs. Jane A. Cauley and Michelle E. Danielson of the University of Pittsburgh.
     
    National Institute on Aging grants (NR004061; AG012505, AG012535, AG012531, AG012539, AG012546, AG012553, AG012554, AG012495) to SWAN; a National Institute on Aging grant (AG026463) to the Hip Strength Through the Menopausal Transition Substudy; the Veterans Affairs Greater Los Angeles Healthcare System Geriatric Research, Education and Clinical Center, and a Veterans Affairs Advanced Geriatrics Fellowship supported this research. 
     
    The UCLA Division of Geriatrics within the department of medicine at the David Geffen School of Medicine at UCLA offers comprehensive outpatient and inpatient services at several convenient locations and works closely with other UCLA programs that strive to improve and maintain the quality of life of seniors. UCLA geriatricians are specialists in managing the overall health of people age 65 and older and treating medical disorders that frequently affect the elderly, including memory loss and dementia, falls and immobility, urinary incontinence, arthritis, high blood pressure, heart disease, osteoporosis, and diabetes. As a result of their specialized training, UCLA geriatricians can knowledgeably consider and address a broad spectrum of health-related factors — including medical, psychological and social — when treating patients.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • UCLA’s heart transplant program ranked among nation’s best

    The heart transplant program at Ronald Reagan UCLA Medical Center has again been recognized as one of the highest-ranked in the nation by an agency of the U.S. Department of Health and Human Services.
     
    UCLA’s program is one of only seven heart transplant centers nationwide — and the only one in California — to be ranked at the silver level by the Health Resources and Services Administration (HRSA), which has federal oversight of the nation’s organ donation and transplantation network.
     
    The HRSA survey measures the performance of organ transplant centers by assessing transplant rates, post-transplant survival rates and mortality rates for patients after they are placed on organ-donation waiting lists. To earn silver status, a program must achieve better-than-expected performance in at least two of those categories. Only one center — a liver transplant program in Florida — earned a gold ranking for achievements in all three.  
     
    UCLA, which also earned silver status in 2010, when the HRSA organ transplant center survey was inaugurated, is the only heart transplant program in the U.S. to have earned a silver ranking twice.
     
    “As the only two-time silver-level heart transplant program, we are incredibly proud of our team’s hard work in providing the very best care for our patients who undergo this lifesaving treatment,” said Dr. Abbas Ardehali, a professor of cardiothoracic surgery and director of UCLA’s heart transplant program. “This recognition acknowledges that patients in the UCLA heart transplant program have a better chance of survival.”
     
    The HRSA awards were presented Oct. 4 at a ceremony held in Grapevine, Texas.
     
    In addition, Ronald Reagan UCLA Medical Center received the Department of Health and Human Services’ Bronze Medal of Honor for Organ Donation for achieving and sustaining national goals for organ donation, including a donation rate of 75 percent or more of eligible donors.
     
    “We are so proud of our health care professionals who reach out with great compassion and sensitivity to explain and inspire individuals and families to save lives as organ and tissue donors during emotionally difficult times,” said Dr. J. Thomas Rosenthal, the medical center’s chief medical officer. “Every life touched by organ and tissue donation crosses a bridge between death and life, grief and meaning, hope and healing. This award is a true honor.”
     
    The Medal of Honor awards were presented for work done between April 1, 2010, and March 31, 2012.
     
    The HRSA leads federal efforts to increase organ and tissue donation and transplantation and supports the Donation and Transplantation Community of Practice, which brings together donation and transplantation professionals, hospital staff and other professionals involved in the donation process to identify and share best practices. For more information, visit www.organdonor.gov.
     
    One of the nation’s busiest transplant centers, Ronald Reagan UCLA Medical Center offers heart, lung, liver, kidney, intestinal, pancreas, cornea, auto islet, bone marrow, hand and face transplant services. For more information, visit www.transplants.ucla.edu.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • New University of Wisconsin-led NSF Center for Chemical Innovation taps PNNL expertise

    Nanoparticles hundreds of times smaller than the width of a hair are an increasingly common part of people’s daily lives. Yet although they are used in everything from car coatings to clothes to cosmetics, little is known about their safety in the environment.

    Chemist Robert Hamers at the University of Wisconsin-Madison is leading a multi-institutional effort to gain new understanding about nanomaterials, especially how they get into cells and tiny organisms such as those found in freshwater lakes.

    Galya Orr, a scientist at the Department of Energy’s Pacific Northwest National Laboratory, has been using and improving high-resolution microscopy at EMSL, DOE’s Environmental Molecular Sciences Laboratory at PNNL. She will apply the expertise she has gained from studying how nanoparticles enter the lung cells that line our airways.

    Orr is also a member of PNNL’s NIEHS Center for Nanotoxicology, one of five NIH-funded groups nationally that seek to understand how manmade nanomaterials interact with biological tissues.

    Read more about the $1.75 million project here.

  • UCLA researchers reveal how ‘cleaving’ protein drives tumor growth in prostate, other cancers

    Researchers led by Tanya Stoyanova and Dr. Owen Witte of UCLA’s Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research have determined how a protein known as Trop2 drives the growth of tumor cells in prostate and other epithelial cancers.
     
    This discovery is important because it may prove essential for creating new therapies that stop the growth of cancer, the researchers said. The study is featured on the cover of the Oct. 15 issue of the journal Genes & Development.
     
    The Trop2 protein is expressed on the surface of many types of epithelial cancer cells — cells that form tumors that grow in the skin and the inner and outer linings of organs — but little was known about the protein’s role in the growth and proliferation of cancer cells. The UCLA researchers discovered that Trop2 controls those processes through a mechanism that leads to the protein being cleaved into two parts, one inside the cell and one outside. This Trop2 division promotes self-renewal of the cancer cells, resulting in tumor growth.
     
    “Determining the mechanism of this protein is important for planning treatments that stop the growth of prostate cancer, but it is also overexpressed in so many other types of cancer that it might be a treatment target for many more patients beyond that population,” said senior author Witte, director of the Broad Center and a professor in the department of microbiology, immunology, and molecular genetics at UCLA.
     
    The finding may have a critical clinical impact, the researchers said, since preventing the cleavage of Trop2 by mutating those sites on the protein where it splits eliminates the protein’s ability to promote tumor cell growth. Using this knowledge, they said, new therapy strategies can be developed that block Trop2 molecular signaling, thus stopping its ability to enhance tumor growth in a variety of epithelial malignancies, including prostate, colon, oral cavity, pancreatic and ovarian cancers, among others.
     
    “The reason I became interested in Trop2 was that it is highly expressed in many epithelial cancers but no one knew precisely how the protein worked to promote the disease,” said Stoyanova, the study’s first author and a postdoctoral scholar in the department of microbiology, immunology and molecular genetics at UCLA.
     
    Funding for the study was provided by the California Institute for Regenerative Medicine Training Grant (TG2-01169), the U.S. Department of Defense Prostate Cancer Research Program (PC110638) and the Howard Hughes Medical Institute.
     
    The Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research: UCLA’s stem cell center was launched in 2005 with a UCLA commitment of $20 million over five years. A $20 million gift from the Eli and Edythe Broad Foundation in 2007 resulted in the renaming of the center. With more than 200 members, the Broad Stem Cell Research Center is committed to a multidisciplinary, integrated collaboration among scientific, academic and medical disciplines for the purpose of understanding adult and human embryonic stem cells. The center supports innovation, excellence and the highest ethical standards focused on stem cell research with the intent of facilitating basic scientific inquiry directed toward future clinical applications to treat disease. The center is a collaboration of the David Geffen School of Medicine at UCLA, UCLA’s Jonsson Cancer Center, the UCLA Henry Samueli School of Engineering and Applied Science and the UCLA College of Letters and Science.
     

  • UCLA researchers’ discovery revives hope in promising lymphoma treatment

    Researchers at UCLA’s Jonsson Comprehensive Cancer Center have discovered the mechanism by which an experimental drug known as GCS-100 removes from lymphoma cells a protein that prevents the cells from responding to chemotherapy.
     
    The discovery revives hope in a drug that had been tested in clinical trials years before but whose development had since stalled indefinitely. The researchers hope GCS-100 can be combined with chemotherapy to create an effective treatment for diffuse large B-cell lymphoma (DLBCL), the most common and aggressive form of non-Hodgkin lymphoma, a cancer of the immune system.
     
    The findings are published in the advance online issue of the journal Blood and will appear in a forthcoming print issue of the journal.
     
    The UCLA researchers found that a protein called galectin-3 binds to an enzyme called CD45 on the surface of lymphoma cells. This protein–enzyme combination regulates the cancer cells’ susceptibility to chemotherapy, essentially protecting them from chemotherapy drugs.
     
    Derived from citrus pectin, GCS-100 works outside the cancer cells to remove the protective galectin-3. Once the galectin-3 is removed, a lymphoma cell can be effectively killed by chemotherapy drugs, part of a chain reaction of programmed cancer-cell death known as apoptosis.
     
    Although the researchers knew the drug had shown action against lymphoma cells, the finding that GCS-100 literally removed the barrier to the initiation of cell death by removing galectin-3 from the cell surface was unexpected.  
     
    “We let the results guide our ideas, and we were able to establish a mechanism for GCS-100,” said the study’s first author, Mary Clark, a graduate student researcher in pathology and laboratory medicine. “I am excited to follow the progress of GCS-100 and hope to see its use in the clinic as an adjunct therapy for lymphoma in the near future.”
     
    Dr. Linda Baum, a professor of pathology and laboratory medicine and senior researcher on the study, said, “This drug had been abandoned because of the vagaries of the economy. My hope would be to restart this drug in clinical trials and, using this new knowledge, to include it in a more targeted lymphoma therapy.”
     
    Early clinical trials of GCS-100 found no side effects other than a mild rash in some patients, which other research has shown results from the drug also promoting the development of T cells, the immune cells that help fight disease.
     
    Funding for the research was provided by the Ron and Maddie Katz Family Foundation and the National Institutes of Health.
     
    UCLA’s Jonsson Comprehensive Cancer Center has more than 240 researchers and clinicians engaged in disease research, prevention, detection, control, treatment and education. One of the nation’s largest comprehensive cancer centers, the Jonsson Center is dedicated to promoting research and translating basic science into leading-edge clinical studies. In July 2012, the Jonsson Cancer Center was once again named among the nation’s top 10 cancer centers by U.S. News & World Report, a ranking it has held for 12 of the last 13 years.
     

  • Earth sunblock only needed if planet warms easily

    An increasing number of scientists are studying ways to temporarily reduce the amount of sunlight reaching the earth to potentially stave off some of the worst effects of climate change. Because these sunlight-reduction methods would only temporarily lower temperatures, would do nothing for the health of the oceans and would affect different regions unevenly, researchers do not see them as a permanent fix. Most theoretical studies have examined this strategy on its own, without considering simultaneous attempts to reduce emissions.

    Now, a new computer analysis of future climate change that considers emissions reductions together with sunlight reduction shows that such drastic steps to cool the earth would only be necessary if the planet heats up easily with added greenhouse gases. The analysis, reported in the journal Climatic Change, might help future policymakers plan for a changing climate.

    The study by researchers at the Department of Energy‘s Pacific Northwest National Laboratory explored sunlight reduction methods, or solar radiation management, in a computer model that followed emissions’ effect on climate. The analysis shows there is a fundamental connection between the need for emissions reductions and the potential need for some sort of solar dimming.

    “It’s a what-if scenario analysis,” said Steven Smith with the Joint Global Change Research Institute in College Park, Md., a joint venture between PNNL and the University of Maryland. “The conditions under which policymakers might want to manage the amount of sun reaching earth depend on how sensitive the climate is to atmospheric greenhouse gases, and we just don’t know that yet.”

    The analysis started with computer-based virtual worlds, or scenarios, that describe different potential pathways to reduce greenhouse gas emissions, which limits the amount of heat in the earth system due to greenhouse gas accumulation. The researchers combined these scenarios with solar radiation management, a type of geoengineering method that might include shading the earth from the sun’s heat by either brightening clouds, mimicking the atmospheric cooling from volcanic eruptions or putting mirrors in space.

    “Solar radiation management doesn’t eliminate the need to reduce emissions. We do not want to dim sunlight over the long term — that doesn’t address the root cause of the problem and might also have negative regional effects. This study shows that the same conditions that would call for solar radiation management also require substantial emission reductions in order to meet the climate goals set by the world community,” said Smith.

    How much sun blocking might be needed depends on an uncertain factor called climate sensitivity. Much like beachgoers in the summer, the earth might be very sensitive to carbon dioxide, like someone who burns easily and constantly slathers on the sunscreen, or not, like someone who can get away with SPF 5 or 10.

    Scientists measure climate sensitivity by how many degrees the atmosphere warms up if the concentration of carbon dioxide doubles. Smith said if the climate has a medium sensitivity of about 3 degrees Celsius (5.4 degrees Fahrenheit) per doubling of carbon dioxide, “it’s less likely we’d need solar radiation management at all. We’d have time to stabilize the climate if we get going on reducing emissions. But if it’s highly sensitive, say 4.5 degrees Celsius (8.1 degrees Fahrenheit) per doubling, we’re going to need to use solar radiation management if we want to limit temperature changes.”

    According to NOAA’s August report, the earth’s temperature has already risen about 0.62 degrees Celsius (1.12 degrees Fahrenheit) since the beginning of the 20th century as the carbon dioxide in the atmosphere has grown from 290 parts per million to 379 parts per million.
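    As a rough back-of-the-envelope illustration of how the sensitivity figures above translate into committed warming, the sketch below assumes the standard approximation that equilibrium warming scales with the base-2 logarithm of the CO2 concentration ratio. It is not a calculation from the study itself, just the doubling relation applied to the CO2 rise cited above.

    ```python
    import math

    def equilibrium_warming(sensitivity_c, c0_ppm, c_ppm):
        """Equilibrium warming (deg C) for a CO2 rise from c0_ppm to c_ppm,
        assuming warming scales with the base-2 log of the concentration ratio."""
        return sensitivity_c * math.log2(c_ppm / c0_ppm)

    # CO2 rise cited above: 290 ppm -> 379 ppm
    for s in (3.0, 4.5):  # medium and high sensitivity, deg C per doubling
        print(f"sensitivity {s}: {equilibrium_warming(s, 290, 379):.2f} deg C")
    ```

    Under this simple relation, even the partial CO2 rise so far implies noticeably more equilibrium warming than the roughly 0.62 degrees Celsius already observed, consistent with the point below that the climate has not yet reached equilibrium.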

    But the atmosphere hasn’t reached equilibrium yet — even if humans stopped putting more carbon dioxide into the air, the climate would still continue to change for a while longer. Scientists do not know what temperature the earth will reach at equilibrium, because they don’t know how sensitive the planet is to greenhouse gases.

    Further, the study showed that, when coupled with emission reductions, the amount of solar radiation management needed could be far less than the amount generally considered by researchers so far.

    “Much of the current research has examined solar radiation management that is used as the sole means of offsetting a doubling of carbon dioxide concentrations. What we showed is that when coupled with emissions reductions, only a fraction of that amount of ‘solar dimming’ will be needed. This means that potential adverse impacts would be that much lower,” said Smith. “This is all still in the research phase. We do not know enough about the impacts of potential solar radiation management technologies to use them at this time.”

    The study will also help decision-makers evaluate solar reduction technologies side-by-side, if it comes to that. Smith and his coauthor, PNNL atmospheric scientist and Laboratory Fellow Phil Rasch, devised a metric to quantify how much solar radiation management will be needed to keep warming under a particular temperature change threshold. Called degree-years, this metric can be used to evaluate the need for potential sunlight dimming technologies.
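    The degree-years idea can be sketched with a toy calculation. The code below is an illustrative reading only, summing annual exceedances above a warming threshold; the metric’s exact definition in the paper may differ, and the temperature trajectory is invented.

    ```python
    def degree_years(temps_c, threshold_c=2.0):
        """Sum of annual temperature-change exceedances above a threshold --
        one plausible reading of a 'degree-years' style metric."""
        return sum(max(0.0, t - threshold_c) for t in temps_c)

    # Hypothetical yearly warming trajectory (deg C above preindustrial)
    trajectory = [1.8, 2.1, 2.4, 2.3, 1.9]
    print(round(degree_years(trajectory), 2))  # 0.1 + 0.4 + 0.3 = 0.8
    ```

    On such a scale, a dimming technology would be credited for the degree-years it shaves off a trajectory, giving decision-makers a common yardstick for comparing options.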

    Time will tell whether such technologies will be needed at all.

    This work was supported by the non-profit Fund for Innovative Climate and Energy Research.


    Reference: Steven J. Smith and Philip J. Rasch, 2012. The Long-Term Policy Context for Solar Radiation Management, Climatic Change, doi: 10.1007/s10584-012-0577-3. (http://www.springerlink.com/content/31674q46k61p86h7/)