Author: Serkadis

  • Microsoft Q2 2013 by the numbers: $21.5B, 76 cents EPS

    Late this afternoon, Microsoft answered a question oft-asked by investors this month: What’s up with Windows 8? The new operating system, which launched October 26, was supposed to lift sagging PC sales and demonstrate that Windows can successfully compete with so-called post-PC platforms like Android and iOS. Now we know more. Windows & Windows Live revenue passed the Business division’s, making the OS group Microsoft’s most valuable again.

    For fiscal second quarter, ended December 31, Microsoft revenue was $21.46 billion, up 3 percent year over year. Operating income: $7.77 billion, a 3 percent decrease. Net income was $6.38 billion, or 76 cents a share.

    Average analyst consensus was $21.53 billion revenue and 74 cents earnings per share for the quarter. Revenue estimates ranged from $19.94 billion to $23.32 billion, with estimated year-over-year growth of 3.1 percent — mighty modest for a holiday quarter when new PC and phone operating systems launched and Microsoft released its first tablet, Surface RT.

    Shares dipped by 2 percent in early after-market trading, falling to $27.06 from the $27.63 close. Like Apple yesterday, Microsoft beat earnings consensus but missed on sales.

    When adjusting for the impact of Office and Windows upgrade offers (meaning non-GAAP view), revenue grew by 5 percent to $22 billion, operating income by 4 percent to $8.3 billion, and EPS by 4 percent to 81 cents.

    “Our big, bold ambition to reimagine Windows as well as launch Surface and Windows Phone 8 has sparked growing enthusiasm with our customers and unprecedented opportunity and creativity with our partners and developers”, CEO Steve Ballmer boasts. “With new Windows devices, including Surface Pro, and the new Office on the horizon, we’ll continue to drive excitement for the Windows ecosystem and deliver our software through devices and services people love and businesses need”.

    Windows & Windows Live revenue rose 24 percent year over year to $5.88 billion, buoyed by a deferral from the previous quarter. Without the extra lift, revenue still increased by 11 percent.

    Microsoft’s Perception Problem

    As I’ve oft said, in business, perception is everything. To many people, Windows is Microsoft and the fate of one influences the other. Perception is a devil. Take Apple, for example, which reported $54.5 billion revenue and $13.06 billion net income yesterday. Today, Apple shares closed down 12.35 percent, in part on the perception that the growth days are over, despite simply huge quarterly numbers. Microsoft’s problem is by no means comparable, but Apple’s situation makes a point. If investors punish the company so severely for such a great quarter, what can negative or positive perceptions about Windows’ future do?

    Microsoft is no longer bound to Windows, despite marketing hype about “reimagining”. In October, CEO Steve Ballmer described the company’s new direction as “devices and services”. The Business division, with flagship Office, generally generates more revenue than Windows & Windows Live, and last quarter Server & Tools did, too. The company is in the process of removing its dependence on Windows as the top of its hugely successful vertical applications stack built around Office and server software, now extended through cloud services such as Office 365, Azure, Skype and SkyDrive, among many others. Windows is still enormously valuable, and anchors a huge ecosystem, but Microsoft can transcend the OS.

    The problem: Public sentiment says something else — that Windows can’t compete in the post-PC, what I call connected-devices, era. If Windows can’t, neither can Microsoft. I don’t agree. Microsoft’s apps, datacenter and server software already are primed to serve multiple devices — not just the PC — and that’s a longstanding development strategy now far advanced. Microsoft is ready to move beyond Windows, and the holiday quarter’s PC shipments show such a future is inevitable. Windows won’t go away but will stand alongside other platforms rather than being the overwhelmingly dominant one.

    Business and Server & Tools succeed for many reasons, and they will continue to do so as long as enterprises stay the course buying annuity contracts. Combined, more than half the two groups’ revenues come from volume-licensing contracts with Software Assurance.

    Companies get the license and annually pay 25 percent or 29 percent of the full price to get upgrades (or even to exercise downgrade rights) over two- or three-year periods. Software Assurance insulates Microsoft from economic ups and downs, and from swings in PC purchases. If businesses ever back away en masse from annuity or subscription contracts, that’s the day to seriously worry about Microsoft’s future.
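    The annuity math works roughly like this (a minimal sketch; the dollar figures below are hypothetical illustrations, not Microsoft’s actual price list):

    ```python
    # Hypothetical Software Assurance annuity: the customer buys a license,
    # then pays a fixed percentage of the license price every year of the term.
    def annuity_total(license_price, sa_pct, years):
        """Total spend: up-front license plus annual Software Assurance payments."""
        annual = license_price * sa_pct / 100
        return license_price + annual * years

    # Illustrative only: a $400 license with SA at 29 percent per year over 3 years.
    total = annuity_total(400, 29, 3)
    print(total)  # 748.0
    ```

    Those recurring payments arrive whether or not the customer buys new PCs that year, which is the insulation described above.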

    Whither Windows 8

    Where the flagship operating system matters most is where Microsoft tries to take it: touchscreen devices, such as hybrids and tablets, with Surface RT and Pro serving as reference designs for OEM partners to emulate. The Redmond, Wash.-based company announced plans to port Windows to ARM processors in January 2011, then followed up with the tile-based Modern UI that unifies the ARM and x86 operating systems, including Windows Phone. Pundits pooh-pooh PC shipments, which stank in Q4, as evidence Surface and Windows 8 are failures. I ask: By what measure? Seems to me, Microsoft already changed Windows’ course to embrace a broader range of devices, with a unifying UI. Transitions like this take time to succeed, or fail.

    There is need. Had Microsoft not made over Windows, the problem wouldn’t be perception but a very real crisis. Three legs support the profit center, and a Windows bound to traditional PCs would be one of them cut off. Instead, Windows 8 holds future device promise. Much depends on the devices the company and its partners produce, and the apps and services supporting them. Honestly, looking at the holiday PC lineup, Surface RT is about the only thing that looks good. OEMs failed to deliver compelling products that get people buying.

    PCs continued their more-than-year-long collapse during fourth quarter. Windows 8 gave no meaningful lift. Shipments fell 4.9 percent year over year, according to Gartner. For all 2012: down 3.5 percent. Manufacturers shipped 90.3 million and 352.7 million units for the respective time periods. IDC offers grimmer perspective: PC shipments fell 6.4 percent for Q4 — two points more than forecast — and 3.2 percent for the year.

    Consumers aren’t buying Windows PCs like they used to, and their infatuation with iPad, other tablets and smartphones is spreading. “Tablets have dramatically changed the device landscape for PCs, not so much by ‘cannibalizing’ PC sales, but by causing PC users to shift consumption to tablets rather than replacing older PCs”, Mikako Kitagawa, Gartner principal analyst, says, speaking about Q4 PC shipments.

    She no longer believes that PCs and tablets will coexist for a meaningful time. “There will be some individuals who retain both, but we believe they will be the exception and not the norm. Therefore, we hypothesize that buyers will not replace secondary PCs in the household, instead allowing them to age out and shifting consumption to a tablet”.

    Combine that with the bring-your-own-device (BYOD) to work movement, and the future looks grim. Or does it? Microsoft already has its apps, cloud and server businesses primed for BYOD, as I’ve written here before. Then there is the broader context, for those calling Windows 8 a flop because PC shipments fell in Q4. Apple got hit, too. Analyst consensus was for 5.2 million Macs shipped, but only 4.1 million did. Yesterday, Apple CEO Tim Cook laid blame on the late delivery of new iMacs. He said Mac shipments would otherwise have been higher.

    I don’t find that credible. But let’s assume for a moment he’s right. iMac is an expensive beast, starting at $1,299 and selling for as much as $1,999 in standard config. If Apple can command such selling strength, why can’t Microsoft OEMs? They should, by releasing innovative hybrid desktop and portable designs that capitalize on Windows 8’s best features. I contend they did not in fourth quarter.

    Something else to consider when looking at the PC market and future of Windows. In a compelling rebuttal to claims the PC is dying, Derrick Wlodarz, who owns a computer repair business, makes an observation often ignored: “Whereas customers of mine were getting 3-4 years out of machines back in the early 2000s, they now push their PCs to 4-6 year replacement cycles without much sweat”. Major reason: Older hardware has more than enough processor and graphics power to meet modern needs.

    “Now, a Windows Vista or newer PC could likely run on for five, six, seven or possibly more years without much issue. And this is the untold trend that I see in my customer base”, he asserts. “Solid, secure operating systems installed onto well-engineered computer hardware equals a darn long system life”.

    So if the PC in the den or office is good enough and not in need of replacement, why not buy a new smartphone or tablet? Remember: Microsoft pushes ahead with solutions for both these categories. Then there is fourth quarter to consider, where Windows delivered solid growth — now if only the broader ecosystem could capitalize upon it.

    Division Highlights

    Microsoft reports revenue and earnings results for five divisions: Windows & Windows Live, Server & Tools, Business, Online Services and Entertainment & Devices.

    Windows & Windows Live. Revenue soared 24 percent year over year despite weak PC sales, surely buoyed by the low-cost Windows 8 Pro upgrade offer that ends January 31. A $622 million deferral helped lift revenue to $5.88 billion. Without it, revenue would have grown by 11 percent.
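    The two growth figures are mutually consistent, as a quick back-of-the-envelope check shows (the year-ago base is derived from the reported numbers, not stated in the release):

    ```python
    # Reported: $5.88 billion revenue including a $622 million deferral, up 24 percent.
    reported = 5.88e9
    deferral = 622e6

    prior_year = reported / 1.24       # implied year-ago revenue, about $4.74 billion
    ex_deferral = reported - deferral  # revenue without the deferral, about $5.26 billion

    growth_ex_deferral = ex_deferral / prior_year - 1
    print(round(growth_ex_deferral * 100, 1))  # 10.9, i.e. the roughly 11 percent cited
    ```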

    To date, Microsoft has sold 60 million Windows 8 licenses.

    OEM revenue grew by 17 percent, outpacing the broader PC market.

    Server & Tools. “We see strong momentum in our enterprise business”, Microsoft COO Kevin Turner says. “With the launch of SQL Server 2012 and Windows Server 2012, we continue to see healthy growth in our data platform and infrastructure businesses and win share from our competitors. With the coming launch of the new Office, we will provide a cloud-enabled suite of products that will deliver unparalleled productivity and flexibility”.

    Revenue rose 9 percent, or $347 million, to $5.19 billion. As previously mentioned, the division is insulated against economic maladies, because about 50 percent of revenues come from contractual volume-licensing agreements.

    New bookings increased by 15 percent. Meanwhile System Center revenue grew by 18 percent and SQL Server by 16 percent.

    Business. “We saw strong growth in our enterprise business driven by multi-year commitments to the Microsoft platform, which positions us well for long-term growth”, Microsoft CFO Peter Klein says. “Multi-year licensing revenue grew double-digits across Windows, Server & Tools, and the Microsoft Business Division”.

    Despite the touted growth, revenue fell by 10 percent year over year to $5.69 billion. However, when adjusting for the Office upgrade offer and pre-sales, revenue grew by 3 percent.

    Bookings increased by 18 percent and multi-year licensing by 10 percent. However, consumer revenue fell by 2 percent.

    Like Server & Tools, the Business division is largely insulated against sluggish PC sales. Sixty percent of revenue comes from annuity licensing to businesses.

    Online Services. Revenue rose by 10 percent, or $109 million, to $823 million. However, the division remains unprofitable. Search and display ads drove up online advertising revenue by 15 percent.

    Entertainment & Devices. Microsoft shipped 5.2 million Xbox consoles, down from 8.2 million a holiday quarter earlier. As a platform, Xbox 360 revenue fell 29 percent, or $1.1 billion.

    Xbox Live subscriptions now exceed 40 million.

    Windows Phone sales are up fourfold year over year, which is a polite way of saying they’re not good enough. If they were, Microsoft would say how many.

    Users made 138 billion Skype calls, up 59 percent year over year.

  • PNNL awarded $2.8M to keep troops cool while using less fuel

    A new, energy-efficient air chilling system could keep troops on the front lines cool while using about half as much diesel as current systems. The system’s decreased fuel consumption could also save lives by reducing attacks on American soldiers who deliver fuel to field operations.

    The Department of Energy’s Pacific Northwest National Laboratory will receive up to $2.8 million over three years to develop the system, the Department of Defense, Navy and DOE’s Advanced Research Projects Agency-Energy, also known as ARPA-E, announced Wednesday. PNNL’s project was among five awarded a total of $8.5 million to improve the efficiency of battlefield heating and air conditioning systems by 20 to 50 percent.

    “PNNL is looking forward to adapting its ongoing research into advanced, energy-efficient cooling technologies and apply it toward important military needs,” said PNNL Laboratory Fellow and project leader Pete McGrail. “Our team has a strong emotional connection to the success of this project, as it could help prevent American soldiers from being injured or killed while moving fuel in dangerous supply convoys around the battlefield.”

    PNNL is partnering with Oregon State University and Power Partners, Inc. of Athens, Ga., on the project.

    PNNL’s system will be a next-generation adsorption chiller that is specially designed to be smaller, lighter, more efficient and operate under the extreme temperatures experienced at bases on the frontlines, also called forward operations. The chiller will use a novel nanomaterial called a metal organic framework, or MOF. MOFs are crystal-like compounds made of metal clusters connected to organic molecules, or linkers. Together, the clusters and linkers assemble into porous 3D structures. PNNL developed a MOF that can hold up to three times more water than the silica gel used in today’s adsorption chillers. This helps make PNNL’s test adsorption chiller system much smaller and lighter. This project will build on advances in adsorption cooling technology PNNL has already made under ARPA-E’s Building Energy Efficiency Through Innovative Thermodevices, or BEET-IT, program.

    Further improvements for this project will include breakthroughs in microchannel heat exchanger technology and improvements in the MOF’s thermal properties. Both advances will help reduce the size and weight of the chiller further and squeeze out more cooling efficiency.

    “This will be the most advanced adsorption cooling system ever developed, and these advances are needed to meet very demanding military requirements,” McGrail said.

    PNNL’s military system will run off waste heat coming from a diesel generator. This could reduce the diesel fuel needed to cool military field installations by up to 50 percent. The planned 3-kilowatt unit will weigh about 180 pounds and take up about 8 cubic feet.

    This isn’t the first time these systems have received support. PNNL began developing its MOF adsorption chiller for commercial buildings in 2010, when PNNL received ARPA-E funding through the BEET-IT program. PNNL also received ARPA-E funding in 2011 to adapt the adsorption chiller to heat and cool electric vehicles with minimal impact on driving distance.

  • eBay Reportedly Bans Django Unchained Toys

    Last week, news came out that NECA’s Django Unchained action figures had been discontinued after drawing criticism from groups like the National Action Network and Project Islamic Hope.

    Since then, naturally, the toys became instant eBay shopper bait. Today, however, TMZ reports that eBay has banned the toys from sale on the site, citing a statement from the company that they were removed because they violate its Offensive Materials Policy. Yet you can go to eBay right now and find numerous listings for the toys. Here’s one:

    Django Toy Listing on eBay

    Apparently they haven’t been able to keep the toys off the site. This could end up hurting some sellers, as the company “cautioned sellers not to re-list the items,” according to TMZ, which also shares this quote from an email eBay has been sending sellers:

    “Since the manufacturer of this product has discontinued the item’s sale due to its potentially offensive nature, we are not allowing it to be sold on eBay.”

    Here are some things that are still perfectly acceptable to sell on eBay, according to its Offensive Materials Policy:

    – KKK memorabilia pricing guides
    – News and magazine articles about the KKK
    – Documentaries about the KKK
    – Books about the KKK
    – The film “Birth of a Nation” and the book it is based on, “The Clansman”
    – Stamps, letters and envelopes displaying Nazi postmarks
    – Currency issued by the Nazi German government
    – Replica or novelty stamps or currency of Nazi Germany

    Listings for the toys are also showing up on other big ecommerce sites. You can currently find them listed on Amazon, Bonanza, and even in Google Shopping, which is now based solely on product listing ads.

  • Yandex Launches Social Search App Wonder Aimed At US

    Russian search engine company Yandex has launched a new social search app for the iPhone and iPod Touch for people in the U.S. It’s called Wonder, and taps into Facebook, Twitter, Instagram, Foursquare, iTunes, and Last.fm to provide answers to questions based on data from your friends, as well as location and music info and options (such as previewing and purchasing songs).

    Take a look:

    Wonder by Yandex Labs from Yandex Labs on Vimeo.

    The app uses natural language voice search first and foremost, but includes a keyboard input option. Right now, it only works in English and understands a few types of questions pertaining to places, music and news. It utilizes speech recognition and text-to-speech technology from Nuance Communications. Here are some examples Yandex provides for the types of questions it works for:

    – If you are looking for a proven sushi place in New York, you can just ask: what sushi restaurants do my friends go to in New York?

    – When you are looking for coffee shops in a new area, you can ask: coffee shops nearby.

    – If you need to catch up with your friends on a Friday night, just ask: where do my friends party?

    – You know your friend John has a good taste for music, ask: what music does John listen to?

    – Feel like listening to electronic music, ask: I wonder what electronic music are my friends listening to?

    – Want to catch up on news, ask: news shared by my friends.

    Wonder’s launch comes at an interesting time, amidst a slow roll-out of Facebook’s own attempt at social search. Of course this is a mobile app, and Facebook’s launch does not include mobile (though that will come in time). It’s unclear whether or not Yandex intends to release Wonder on Android.

  • Intel Makes Mobile Push Into Africa Via Partner Safaricom, Releases Android-Powered Yolo Smartphone


    It’s no secret that Intel is gunning to gain some mobile traction in emerging markets, and the chipmaker doesn’t seem to be wasting any time in 2013. Kenyan wireless operator (and Intel partner) Safaricom has just officially revealed Africa’s first Intel smartphone, the Android-powered Yolo, at an event in Nairobi.

    Yeah, you read that right: the Yolo.

    Now, whatever you make of its name, the phone isn’t actually encouraging young, tech-savvy Africans to make poorly considered life decisions. Instead, it seems more like the continuation of some weird existing naming practices — Intel’s first Atom-powered Android smartphone, for instance, was dubbed the XOLO X900 when it made its debut in India in April 2012.

    The announcement doesn’t come as much of a surprise since Intel’s Mobile and Communications VP Mike Bell pointed out at CES that Safaricom (among other carriers in developing regions) would release smartphones based on the company’s value-oriented smartphone reference design in Q1 2013. That focus on highly cost-sensitive markets means that the Yolo and its ilk don’t exactly have a spec sheet that will set your world on fire — the Yolo sports a 3.5-inch touch display, and its Atom Z2420 processor can hit speeds of up to 1.2GHz, encode and decode 1080p video, and support HSPA+ data speeds. Naturally, that sort of performance is reminiscent of the sorts of devices you can find on domestic store shelves a few years ago (a sentiment Engadget echoed when they got some brief hands-on time at CES), but it’s still a pretty compelling package considering the competition in Kenya.

    Of course, there’s always the issue of cost to deal with. Safaricom will soon begin selling the Yolo (with 500MB of free data access) for Kshs. 10,999 (roughly $126) — sounds like a pretty sweet deal, but companies like Huawei are already waging a price war with devices like the $80 IDEOS smartphone on the front line. Really, with the explosion of even less expensive smartphones in Kenya and beyond, one has to wonder how much of a market Intel will actually be able to carve out in Africa.

  • New supercomputer coming to EMSL this summer, supplied by Atipa Technologies

    A new supercomputer expected to rank among the world’s fastest machines will be ready to run computationally intense climate and biological simulations along with other scientific programs this summer. This computational work will aid research in climate and environmental science, chemical processes, biology-based fuels that can replace fossil fuels, new materials for energy applications and more.

    Chosen by a competitive process, Atipa Technologies in Lawrence, Kan., will provide the machine to EMSL, the Department of Energy’s Environmental Molecular Sciences Laboratory. EMSL is a national user facility on DOE’s Pacific Northwest National Laboratory’s campus that provides experimental and high performance computing capabilities to enable users to address environmental and energy challenges through molecular-level theory and experiment. It is also home to the new supercomputer’s predecessor, Chinook. As a national user facility resource, the new system will be available to scientists everywhere, who will be able to apply on a competitive basis to use it. Currently, about 400 scientists use Chinook.

    “We’re developing a supercomputer that will aid energy, environment and basic science missions important to DOE,” said PNNL computational scientist Bill Shelton, the associate director at EMSL who manages high performance computing. “Enhanced computing power will benefit our users who conduct experiments and want to verify them with modeling. Integrating computational theory with experiment is critical to accelerating scientific discovery.”

    Funded by DOE’s Office of Science, the new $17 million machine will likely peak at 3.4 quadrillion — 3.4 million billion — calculations per second and be more than 20 times faster than the four-year-old Chinook. The new supercomputer’s capacity and speed are expected to rank it among the world’s top 20 fastest machines when it comes online. Peaking at 3.4 petaflops, the new computer will be able to do in one hour what would take a typical laptop more than 20 years to do.
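    The laptop comparison holds up under a simple assumption (mine, not the lab’s) that a typical 2013 laptop sustains on the order of 20 gigaflops:

    ```python
    # One hour of supercomputer time expressed in laptop-years.
    supercomputer_flops = 3.4e15  # 3.4 petaflops, the stated peak
    laptop_flops = 20e9           # assumed: roughly 20 gigaflops for a typical laptop

    speedup = supercomputer_flops / laptop_flops  # 170,000x
    laptop_hours = 1 * speedup
    laptop_years = laptop_hours / (24 * 365)
    print(round(laptop_years, 1))  # 19.4
    ```

    A slightly slower laptop pushes the figure past the article’s “more than 20 years”.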

    Atipa Technologies has been providing high performance computers to DOE and its labs for more than a decade.

    “We’re excited to have the opportunity to provide the new supercomputer with a theoretical peak performance of 3.38 petaflops and 2.7 petabytes of usable storage. It will be built and deployed by Atipa Technologies in collaboration with Supermicro,” said Mike Zheng, president of Atipa Technologies.

    As EMSL’s flagship high performance computer, it will be usable by researchers from around the world. The EMSL team designed it for researchers who typically need resources of this scale but don’t generally have access to such a powerful computer. This wide availability makes it stand out from other supercomputers.

    “Its uniqueness is that it will be optimally configured for climate and chemistry simulations and biological analyses,” said Shelton.

    For example, the new machine will offer added speed for improved climate models. “The new computer provides a wonderful opportunity for climate scientists to get more work done and get each simulation done more quickly,” said PNNL climate scientist Phil Rasch. “It is a huge jump in the computing power available to us.”

    And it will produce more details about how organisms work. “I’m excited because with the amount of data researchers are generating in biology, this supercomputer will open up new avenues for our users,” said EMSL biology science lead Scott Baker. “More computing power is like having more pixels in a picture. We’ll be able to look at proteins and complex biological interactions more realistically. This will allow us to better understand and control organisms like microbes so that we can develop new renewable fuels.”

    The design’s 196,000 processing units are Intel processors combined with Intel Phi many integrated core (MIC) accelerator cards. The accelerator cards will ratchet up the power. They work with the conventional processors and memory and allow up to 120 extra calculations per node to be performed simultaneously rather than one at a time. (Anyone with a graphics card in their personal computer has taken advantage of a hardware accelerator.)

    The system’s 23,000 Intel processors have 184,000 gigabytes (184,000 billion bytes) of memory available, about four times as much memory per processor as other supercomputers. The additional memory will allow scientists to use the processors more efficiently for biology, climate research, chemistry and materials science.

    Atipa will deliver the computer’s components by July 2013 and assemble it at EMSL. The EMSL team will spend a few months installing and configuring the system and getting it up to speed. They expect to have it running for national and international researchers in October 2013. In the meantime, EMSL will be sponsoring a naming contest among EMSL users and friends.

    The New Supercomputer’s Fast Facts:

    • Theoretical peak processing speed of 3.4 petaflops
    • 42 racks
    • 195,840 cores
    • 1440 compute nodes with conventional processors and Intel Xeon Phi “MIC” accelerators
    • 128 GB memory per node
    • FDR Infiniband network
    • 2.7 petabyte shared parallel filesystem (60 gigabytes per second read/write)
    • Working reference: HPCS4A
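    The fast facts are internally consistent, as a quick check shows (the per-node core count is derived, not listed):

    ```python
    # Sanity-check the published specs against each other.
    cores = 195_840
    nodes = 1_440
    mem_per_node_gb = 128

    cores_per_node = cores // nodes          # cores per node, conventional plus Phi
    total_mem_gb = nodes * mem_per_node_gb   # total memory across the machine
    print(cores_per_node, total_mem_gb)      # 136 184320
    ```

    The 184,320 GB total matches the roughly 184,000 gigabytes of memory cited earlier, and 136 cores per node would square with the accelerators’ 120 extra calculations per node plus 16 conventional cores.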

    DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit the Office of Science website.

  • First Clip From jOBS Hits The Web [Video]

    Entertainment Tonight has posted the first clip we’ve seen from the upcoming Steve Jobs biopic jOBS, starring Ashton Kutcher and Josh Gad as Jobs and Steve Wozniak. The clip features the duo in a scene talking about Wozniak’s operating system.

    Kutcher and Gad will appear at Macworld/iWorld next week to talk about their experiences playing the roles of Apple’s co-founders.

    The film also got an official release date this week: April 19.

    [via Slashgear]

  • AMC Releases New Mad Men Season 6 Cast Photos (Premiere Date: April 7)

    The best TV show about advertising of all time (to be fair, I haven’t seen many others) returns on April 7. AMC announced the news on Wednesday, and now, the network has released a handful of new cast photos from the upcoming season.

    Mad Men Season 6

    It will be interesting to see if the show can generate another viral hit this season on par with last season’s Fat Betty Draper.

    Credit for all photos: Frank Ockenfels/AMC (via blogs.amctv.com)

  • Twitter Launches Six-Second Video App Vine for iPhone

    As previously reported, Twitter CEO Dick Costolo previewed what an upcoming Twitter video-sharing experience would look like when he tweeted, using Vine, which the company acquired last year.

    Today, the company officially launched the offering in the form of a downloadable iOS app. From the company blog:

    Today, we’re introducing Vine: a mobile service that lets you capture and share short looping videos. Like Tweets, the brevity of videos on Vine (6 seconds or less) inspires creativity. Now that you can easily capture motion and sound, we look forward to seeing what you create.

    I understand Twitter’s long-standing philosophy of brevity, but I’m not sure it works as well in video form. Is there really a great deal of demand to share six-second video clips? I’m sure you can point to any number of potential use cases, but I just can’t see this being a huge thing among Twitter users. Of course, there’s always the strong possibility that I’m completely wrong. Plenty of similar sentiments were expressed by many when Twitter itself came out.

    More from the Vine blog:

    Posts on Vine are about abbreviation — the shortened form of something larger. They’re little windows into the people, settings, ideas and objects that make up your life. They’re quirky, and we think that’s part of what makes them so special…We also believe constraint inspires creativity, whether it’s through a 140-character Tweet or a six-second video.

    Twitter makes no mention of an Android app, or an app for any other mobile platform, but it stands to reason these will come in time.

    It should be noted that you don’t need a Twitter account to use Vine. Perhaps Vine will find its own user base, completely independent of Twitter.

  • Joyent Offers Hadoop Solution for Big Data Challenges

    Joyent announced a new Apache Hadoop-based solution, built using the Hortonworks Data Platform (HDP), that allows companies to run enterprise-class Hadoop on the high-performance Joyent Cloud.

    As a new entrant into the big data landscape, Joyent is addressing industry demand to reduce costs and decrease query response times. Software product development services company Altoros Systems said that Hadoop clusters on Joyent Cloud produced nearly 3X faster disk I/O response times versus identically sized infrastructure. Through the use of Joyent’s operating system virtualization and CPU bursting technology, Joyent says it is able to extract better response times and deliver results to data scientists and analysts faster.

    “We are pioneering a new era of big data and our Hadoop offering is just the start of our 2013 agenda,” said Jason Hoffman, CTO and Founder, Joyent. “We intend to continue bringing our technical expertise to the market and reverse the typical understanding of big data implementations – that they’re expensive and hard to use. We’re committed to meeting the insatiable demand for faster analytics and data retrieval, changing how computing functions for the enterprise.”

    Global telecom Telefonica was an early adopter of Joyent’s Hadoop solution. “Joyent technology powers our service – Instant Servers – and is providing Telefonica Digital an advantage to deliver the fastest performing Hadoop big data solution in our marketplace,” said Carlos Morales Paulin, Global Managing Director, M2M, Cloud Computing and Apps, Telefonica. “Joyent has inversed the big data cost equation while at the same time innovating how computing on large distributed and unstructured data can be accomplished for large enterprises. Our customers can now get insight from their data quicker than ever before without the massive cost that’s typically associated with high-performance big data solutions.”

    The Apache Hadoop solution is available immediately for Joyent customers.

  • TechCrunch Makers: An Evening At The Van Brunt Stillhouse


    What do you do when you already have the coolest job in the world? You start a business where you can have another amazing job on evenings and weekends.

    Daric Schlesselman is an editor for The Daily Show in Manhattan who lives in deepest Red Hook, a small, cool community on the edge of Brooklyn. There he rents a former paint factory and storage facility where he’s set up the Van Brunt Stillhouse, making some of the nicest grappa, whiskey, and rum this side of the Gowanus.

    Schlesselman started out as a homebrewer but has taken investment to build a small, artisanal distillery in Red Hook. He makes booze to match the season – rum in the summer, whiskey in the fall, and grappa anytime – and he’s the perfect example of someone who followed his dream to sweet fruition.

    The stillhouse is compact and well-appointed with plenty of barrels of delicious whiskey aging in white oak. He may not be making 3D printers or electronic eyes, but the Van Brunt Stillhouse shows us that even a mild-mannered TV editor can, with a little time, energy, and perseverance, build a real business making some amazing stuff.

    TechCrunch Makers is a video series featuring people who make cool stuff. If you’d like to be featured, email us!

  • Tanzania’s pass/fail roller coaster

    A primary school teacher answering questions in her class. Picture: Neema Kambona/DFID

    You know that heart-stopping feeling when you crest the first peak of a big roller coaster as it goes into free fall? That feeling of dread is perhaps only equalled by the torture of opening your exam results – at the time it seems your whole life might depend on the hidden grades inside!

    In the UK last year, GCSE (Grade 10, for 16-year-olds) pass rates were finally reported as having ‘dropped’ for the first time ever – by an ‘underwhelming’ half of one percentage point – reversing a decades-long upward trend. Many have commented that exams, and the increasingly interwoven coursework, have become easier to pass – ‘grade inflation’ – potentially to allow more students to enter tertiary education. There were howls of protest and legal challenges this year over how the pass mark for English GCSEs was being adjusted and its effect on grades and students’ career prospects.

    Over the Christmas holidays, Tanzanians were shocked and bemused to receive the outcomes of the Primary School Leaving Examinations (PSLE), taken by students around 14–15 years old and usually considered necessary to enter secondary school. National pass rates (grades A–C) were reported as having plummeted from 57% in 2011 to 30% in 2012 – almost halved, not half a percentage point. It was reported that in two rural western regions, 48 schools had no students pass at all. However, not all failing students face ruin: it appears that entry requirements into secondary school will be relaxed as the government continues to expand access to secondary education (enrolment rates have tripled since 2005).
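    The UK and Tanzanian drops differ not just in size but in kind: the UK’s fall is about half a *percentage point*, while Tanzania’s 57% to 30% is a 27-point drop – a relative decline of nearly half. A quick illustrative sketch of the distinction (the function names are mine, not from any of the reports):

    ```python
    def point_drop(old_rate, new_rate):
        """Absolute change, in percentage points."""
        return old_rate - new_rate

    def relative_drop(old_rate, new_rate):
        """Decline as a fraction of the old rate."""
        return (old_rate - new_rate) / old_rate

    # Tanzania PSLE pass rate: 57% (2011) -> 30% (2012)
    print(point_drop(57, 30))               # 27 percentage points
    print(round(relative_drop(57, 30), 2))  # 0.47 -- "almost halved"
    ```

    The same half-point fall that made UK headlines would be `point_drop(69.5, 69.0)` – two orders of magnitude smaller than Tanzania’s.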

    Secondary school enrolment since 2000

    Exam results can serve different purposes: to filter students for a limited intake into more advanced levels of education, or as an absolute measure of competence. Major changes in pass rates are not that unusual if one looks at Tanzanian results in past years, but this one does seem unexpectedly large and has left many people scratching their heads for explanations. If failing students are being sent en masse to secondary schools, is the problem merely being shunted up the system?

    Did the switch to automated marking of multiple-choice questions cause confusion, or did it prevent cheating? For an exam taken by close to a million students, the benefits of automation are clear: in previous years, teacher training colleges stopped lessons for weeks as trainee teachers were co-opted for marking by hand. Were the questions or curriculum made harder, or the grade boundaries adjusted? If you have any ideas, please let me know; we are also discussing possible explanations with government colleagues.

    MDG 2: Achieve Universal Primary Education

    Over the past decade, the emphasis in developing countries has evolved from education expansion – ‘bums on seats’ in pursuit of MDG 2 on access – to all children learning at school (or elsewhere). Clearly, examination pass rates are one measure of learning; as posted earlier, other approaches, such as civil-society-led testing of children on basic literacy and numeracy skills, now provide useful alternative measures that demonstrate disturbingly low levels of ability among children in Africa and South Asia. We face a real challenge to determine how best to support Tanzania’s children to learn. Primary School Leaving Exam (PSLE) pass rates are one of the key indicators agreed to measure progress between the UK, other development partners and the government, but in this instance it appears our tape measure or stopwatch may have malfunctioned!

  • Backed By Y Combinator And Google Ventures, CircuitHub Aims To Be A One-Stop Shop For Electrical Part Info


    Say you’re building a gadget. You’ll probably need several widgets, gizmos and electronic thingamabobs. CircuitHub is now here to help. The startup launched today and is attempting to build the world’s first free, online, collaborative parts library. Best of all, it works seamlessly with popular design programs.

    This tool is aimed squarely at makers. By offering a comprehensive and detailed parts library, CircuitHub hopes to be the main resource for finding electronic components. But CircuitHub will only be successful if it can build this massive database. The entire system is open for group collaboration. Spend a few minutes and add some parts to the database.

    Using Dropbox for cloud storage, CircuitHub integrates nicely with Altium, Eagle, OrCAD and Allegro. Use CircuitHub’s library with your design software. That’s the genius here. CircuitHub isn’t attempting to disrupt a maker’s workflow; the startup is trying to improve it.

    By sourcing the right part from the start, makers will experience less hassle when approaching manufacturing.

    “Kickstarter is the largest crowd-funding site, where anyone can help fund ideas proposed by anyone else,” explained Andrew Seddon, CircuitHub’s co-founder, in a released statement. “The single biggest project and the highest-funded category are both dominated by electronics. Yet 84 percent of the top physical product-based projects were severely delayed, primarily due to problems with interfacing design data into and through factories. This problem is exactly what the CircuitHub library is designed to address.”

    CircuitHub is backed by Y Combinator, with investments from Google Ventures and notable angel investors including Paul Buchheit (the inventor of Gmail), Matt Cutts (creator of Google SafeSearch), Alexis Ohanian (cofounder of Reddit), Harj Taggar (cofounder of Auctomatic), and Garry Tan (cofounder of Posterous), among others.

  • Right target, but missing the bull’s-eye for Alzheimer’s

    Alzheimer’s disease is the most common cause of late-life dementia. The disorder is thought to be caused by a protein known as amyloid-beta, or Abeta, which clumps together in the brain, forming plaques that are thought to destroy neurons. This destruction starts early, too, and can presage clinical signs of the disease by up to 20 years.
     
    For decades now, researchers have been trying, with limited success, to develop drugs that prevent this clumping. Such drugs require a “target” — a structure they can bind to, thereby preventing the toxic actions of Abeta.
     
    Now, a new study out of UCLA suggests that while researchers may have the right target in Abeta, they may be missing the bull’s-eye. Reporting in the Jan. 23 issue of the Journal of Molecular Biology, UCLA neurology professor David Teplow and colleagues focused on a particular segment of a toxic form of Abeta and discovered a unique hairpin-like structure that facilitates clumping.
     
    “Every 68 seconds, someone in this country is diagnosed with Alzheimer’s,” said Teplow, the study’s senior author and principal investigator of the NIH-sponsored Alzheimer’s Disease Research Center at UCLA. “Alzheimer’s disease is the only one of the top 10 causes of death in America that cannot be prevented, cured or even slowed down once it begins. Most of the drugs that have been developed have either failed or only provide modest improvement of the symptoms. So finding a better pathway for these potential therapeutics is critical.”
     
    The Abeta protein is composed of a sequence of amino acids, much like “a pearl necklace composed of 20 different combinations of different colors of pearl,” Teplow said. One form of Abeta, Abeta40, has 40 amino acids, while a second form, Abeta42, has two extra amino acids at one end.
     
    Abeta42 has long been thought to be the toxic form of Abeta, but until now, no one has understood how the simple addition of two amino acids made it so much more toxic than Abeta40.
     
    In his lab, Teplow and his colleagues used computer simulations in which they looked at the structure of the Abeta proteins in a virtual world. The researchers first created a virtual Abeta peptide that only contained the last 12 amino acids of the entire 42–amino-acid-long Abeta42 protein. Then, said Teplow, “we just let the molecule move around in a virtual world, letting the laws of physics determine how each atom of the peptide was attracted to or repulsed by other atoms.”
     
    By taking thousands of snapshots of the various molecular structures the peptides created, the researchers determined which structures formed more frequently than others. From those, they then physically created mutant Abeta peptides using chemical synthesis.
     
    “We studied these mutant peptides and found that the structure that made Abeta42 Abeta42 was a hairpin-like turn at the very end of the peptide of the whole Abeta protein,” Teplow said.
     
    The hairpin turn structure was not previously known in the detail revealed by the researchers, “so we feel our experiments were novel,” he said. “Our lab is the first to show that it is this specific turn that accounts for the special ability of Abeta42 to aggregate into clumps that we think kills neurons. Abeta40, the Abeta protein with two less amino acids at the end of the protein, did not do the same thing.”
     
    The work of the Teplow laboratory may present the most relevant target yet for the development of drugs to fight Alzheimer’s disease, the researchers said.
     
    Other authors on the study included Robin Roychaudhuri, Mingfeng Yang, Atul Deshpande, Gregory M. Cole and Sally Frautschy, all of UCLA, and Aleksey Lomakin and George B. Benedek of the Massachusetts Institute of Technology.
     
    Funding for the study was provided by grants from the State of California Alzheimer’s Disease Research Fund, a UCLA Faculty Research Grant, the National Institutes of Health (AG027818, NS038328) and the James Easton Consortium for Alzheimer’s Drug Discovery and Biomarkers.
     
    The Mary S. Easton Center for Alzheimer’s Disease Research at UCLA is part of the UCLA Department of Neurology, which encompasses more than 20 disease-related research programs, along with large clinical and teaching programs. These programs cover brain mapping and neuroimaging, movement disorders, Alzheimer’s disease, multiple sclerosis, neurogenetics, nerve and muscle disorders, epilepsy, neuro-oncology, neurotology, neuropsychology, headaches and migraines, neurorehabilitation, and neurovascular disorders. The department ranked first among its peers nationwide in National Institutes of Health funding (2002–09). 
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • T-Mobile Has The Nexus 4 In Stock! You Guys! Hurry!


    If you’re quick, you can snag a Nexus 4 from T-Mobile right now for $199 on contract. Ever since its launch, the phone has been rather hard to purchase. Blame Google. Blame LG. But it doesn’t matter now, ’cause you can buy one right this very second.

    The Nexus 4 launched on the Google Play store late last year. It sold out almost immediately. T-Mobile then started selling the phone in some retail locations last week. Now, right on schedule, the Nexus 4 is available through its website as long as you’re willing to sign a contract — which is kind of a bummer.

    Part of the Nexus 4’s breakout success comes from its original pricing. Google cut the wireless carrier out of the picture and sold the phone at a fair price without requiring a new contract. At $349, the phone was slightly more than a comparable iPhone, but owners weren’t locked into a two-year service agreement.

    So, if you’re willing to lock yourself into a two-year agreement, here’s the link to the hottest Android phone currently on the market. If not, keep on refreshing the Nexus 4’s Google Play product page. It’s bound to be in stock sometime in 2013.

  • UCLA study first to image concussion-related abnormal brain proteins in retired NFL players

    (For more video clips, visit http://bit.ly/XUGLFI.)
     
    Sports-related concussions and mild traumatic brain injuries have grabbed headlines in recent months, as the long-term damage they can cause becomes increasingly evident among both current and former athletes. The Centers for Disease Control and Prevention estimates that millions of these injuries occur each year.
     
    Despite the devastating consequences of traumatic brain injury and the large number of athletes playing contact sports who are at risk, no method has been developed for early detection or tracking of the brain pathology associated with these injuries.
     
    Now, for the first time, UCLA researchers have used a brain-imaging tool to identify the abnormal tau proteins associated with this type of repetitive injury in five retired National Football League players who are still living. Previously, confirmation of the presence of this protein, which is also associated with Alzheimer’s disease, could only be established by an autopsy.
     
    The preliminary findings of the small study are reported Jan. 22 in the online issue of the American Journal of Geriatric Psychiatry, the official journal of the American Association for Geriatric Psychiatry.
     
    Previous reports and studies have shown that professional athletes in contact sports who are exposed to repetitive mild traumatic brain injuries may develop ongoing impairment such as chronic traumatic encephalopathy (CTE), a degenerative condition caused by a buildup of tau protein. CTE has been associated with memory loss, confusion, progressive dementia, depression, suicidal behavior, personality changes, abnormal gait and tremors.
     
    “Early detection of tau proteins may help us to understand what is happening sooner in the brains of these injured athletes,” said lead study author Dr. Gary Small, UCLA’s Parlow–Solomon Professor on Aging and a professor of psychiatry and biobehavioral sciences at the Semel Institute for Neuroscience and Human Behavior at UCLA. “Our findings may also guide us in developing strategies and interventions to protect those with early symptoms, rather than try to repair damage once it becomes extensive.”
     
    Small notes that larger follow-up studies are needed to determine the impact and usefulness of detecting these tau proteins early, but given the large number of people at risk for mild traumatic brain injury — not only athletes but military personnel, auto accident victims and others — a means of testing what is happening in the brain during the early stages could potentially have a considerable impact on public health.
     
    For the study, the researchers recruited five retired NFL players who were 45 years of age or older. Each had a history of one or more concussions and cognitive or mood symptoms. The players represented a range of positions, including linebacker, quarterback, guard, center and defensive lineman.
     
    “I hope that my participation in these kinds of studies will lead to a better understanding of the consequences of repeated head injury and new standards to protect players from sports concussions,” said Wayne Clark, a player in the study who had normal cognitive function.
     
    For the study, the UCLA scientists used a brain-imaging tool they had developed previously for assessing neurological changes associated with Alzheimer’s disease. They employed a chemical marker they created called FDDNP, which binds to deposits of amyloid beta “plaques” and neurofibrillary tau “tangles” — the hallmarks of Alzheimer’s — which they then viewed using a positron emission tomography (PET) scan, providing a “window into the brain.” With this method, researchers are able to pinpoint where in the brain these abnormal proteins accumulate.
     
    After the players received intravenous injections of FDDNP, researchers performed PET brain scans on them and compared the scans to those of healthy men of comparable age, education, body mass index and family history of dementia.
     
    The scientists found that compared to the healthy men, the NFL players had elevated levels of FDDNP in the amygdala and subcortical regions of the brain. These regions control learning, memory, behavior, emotions, and other mental and physical functions. Those players who had experienced a greater number of concussions were found to have higher FDDNP levels.
     
    “The FDDNP binding patterns in the players’ scans were consistent with the tau deposit patterns that have been observed at autopsy in CTE cases,” said study author Dr. Jorge R. Barrio, a professor of molecular and medical pharmacology at the David Geffen School of Medicine at UCLA.
     
    Each of the research volunteers also received a standard clinical assessment to gauge their degree of depression (Hamilton Rating Scale for Depression, or HAM-D) and cognitive ability (Mini-Mental State Examination, or MMSE). The players had more depressive symptoms than the healthy men and generally scored lower on the MMSE test, demonstrating evidence of cognitive loss. Three players had mild cognitive impairment, one had dementia and another had normal cognitive function.
     
    Elevated levels of FDDNP have been shown in studies to be associated with cognitive symptoms in normal aging, mild cognitive impairment and dementia, according to Barrio. The FDDNP signals appear to reflect a range of mental symptoms that have been observed in CTE cases, he noted.
     
    Although the FDDNP marker also binds to another abnormal brain protein called amyloid beta, previous autopsy studies have shown the amyloid plaques are observed in less than a third of CTE cases in retired football players, suggesting that the FDDNP signal in the players represents mostly tau deposits in the brain. 
     
    “Providing a non-invasive method for early detection is a critical first step in developing interventions to prevent symptom onset and progression in CTE,” said Small, director of the UCLA Longevity Center. “FDDNP is the only imaging marker currently available that can provide a measure of tau in living humans.”
     
    According to Small, a recent study of more than 3,400 retired professional football players showed that they had a higher-than-average risk of dying from Alzheimer’s disease. Small’s team also is studying lifestyle interventions for delaying the onset of Alzheimer’s symptoms. His new book “The Alzheimer’s Prevention Program,” released in paperback this month, features the latest research on this topic and offers the public practical strategies for protecting brain health.
     
    Research into CTE and the long-term effects of mild traumatic brain injuries such as sports-related concussions has been picking up momentum.
     
    “It is the holy grail of CTE research to be able to identify those who are suffering from the syndrome early, while they’re still alive. Discovering the effects of prior brain trauma earlier opens up possibilities for symptom treatment and prevention,” said study author Dr. Julian Bailes, director of the Brain Injury Research Institute and the Bennett Tarkington Chairman of the department of neurosurgery at NorthShore University HealthSystem, based in Evanston, Ill.
     
    The study was funded by the Brain Injury Research Institute; the Fran and Ray Stark Foundation Fund for Alzheimer’s Disease Research; the Ahmanson Foundation and the Parlow-Solomon Professorship.
     
    UCLA owns three U.S. patents on the FDDNP chemical marker. Small and Barrio are among the inventors. Disclosures are listed in the full study.
     
    Additional study authors included Vladimir Kepe, Ph.D.; Prabha Siddarth, Ph.D.; Linda M. Ercoli, Ph.D.; Dr. David A. Merrill; Natacha Donghue, B.A.; Susan Y. Bookheimer, Ph.D.; Jacqueline Martinez, M.S.; and Dr. Bennet Omalu.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • UCLA Health System chosen as a Medicare Shared Savings Program accountable care organization

    The Centers for Medicare and Medicaid Services (CMS) has announced that the UCLA Health System has been selected to participate in the federal government’s Medicare Shared Savings Program as an accountable care organization.
     
    As a participant in the program, the UCLA Health System will work with CMS to provide high-quality service and care to Medicare fee-for-service beneficiaries while reducing the growth in Medicare expenditures through enhanced care coordination.
     
    “Through participation in the Medicare Shared Savings Program and other initiatives, UCLA is taking an innovative approach to health care, focusing on high-value, high-quality care that is truly patient-centered,” said Dr. Molly Coye, chief innovation officer for the UCLA Health System. “UCLA aims to be a leader in transforming health care and reining in uncontrolled health care costs.”
     
    The Medicare Shared Savings Program was created under the Affordable Care Act to help health care providers better coordinate care for Medicare fee-for-service beneficiaries through accountable care organizations, or ACOs — groups of doctors, hospitals and others who collaborate to provide high-quality service and care for their patients.
     
    “UCLA Health System has truly outstanding, high-quality, evidence-based medical programs, and the Medicare Shared Savings Program provides us with an important framework to better coordinate care for our Medicare fee-for-service beneficiaries,” said Dr. David Feinberg, president of the UCLA Health System, CEO of the UCLA Hospital System and associate vice chancellor for health sciences at UCLA.
     
    By creating its own ACO, the UCLA Health System was able to submit an application to participate in the program. UCLA was chosen specifically by CMS to create incentives for health care providers to work together to treat individual patients across care settings — including doctors’ offices, hospitals and other health care facilities.
     
    “Accountable care organizations save money for Medicare and deliver higher-quality care to people with Medicare,” said U.S. Secretary of Health and Human Services Kathleen Sebelius. “Thanks to the Affordable Care Act, more doctors and hospitals are working together to give people with Medicare the high-quality care they expect and deserve.”
     
    The Shared Savings Program will reward ACOs that lower the growth of health care costs while meeting performance standards for quality of care and putting patients first. The participation of health care providers in an ACO is purely voluntary.
     
    ACOs must meet quality standards to ensure that savings are achieved through improved care coordination and the provision of care that is appropriate, safe and timely. CMS has established 33 quality measures on care coordination and patient safety, the appropriate use of preventive health services, improved care for at-risk populations, and patient and caregiver experience of care. Federal savings from this initiative could be up to $940 million over four years.
     
    “UCLA Health System is one of only a few academic medical centers to participate in this program,” said Dr. Samuel A. Skootsky, chief medical officer of the UCLA Faculty Practice and Medical Group. “UCLA Health System has a strong foundation in primary care, including a ‘medical home’ initiative to improve care coordination for our patients, in addition to exemplary specialty care. This Medicare Shared Savings Plan challenges hospitals and doctors, together with their patients, to reevaluate and redesign patient care to be more patient-centered and efficient — across all care settings, including at home.”

    The recent announcement was the culmination of a comprehensive selection process that began in the fall of 2011 with the national release by CMS of the Notice of Intent to Apply and application form. The UCLA Health System was selected based on rigorous eligibility criteria and program requirements.
     
    “We have a successful primary care network,” said Dr. Patricia Kapur, CEO of the UCLA Faculty Practice Group. “The framework we currently have in place provides us with a perfect opportunity to work with the federal government’s Shared Savings Program to transform the delivery of excellent medical care.”
     
    The Shared Savings Program is not a Medicare Advantage plan or an HMO. Beneficiaries with fee-for-service Medicare will still have the right to use any doctor or hospital that accepts Medicare, at any time. Find out more about the Shared Savings Program and see a list of the program’s 106 new ACOs announced Jan. 10. 
     
    The UCLA Health System will continue to serve all Medicare fee-for-service beneficiaries, including those in the Medicare Shared Savings Program. The program will support UCLA’s efforts to improve the quality of care it provides, in return for the opportunity to benefit from reduced growth in health care costs.
     
    The UCLA Health System, which comprises the UCLA Hospital System and the UCLA Medical Group and its affiliates, has provided a high quality of health care and the most advanced treatment options to the people of Los Angeles and the world for more than half a century. Ronald Reagan UCLA Medical Center, the Resnick Neuropsychiatric Hospital at UCLA, Mattel Children’s Hospital UCLA, and UCLA Medical Center, Santa Monica (which includes the Los Angeles Orthopaedic Hospital) deliver hospital care that is unparalleled in California. Ronald Reagan UCLA Medical Center is consistently ranked one of the top five hospitals in the nation and the best in the western United States by U.S. News & World Report. UCLA physicians and hospitals continue to be world leaders in the full range of care, from maintaining the health of families to the diagnosis and treatment of complex illnesses.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • Why We’re Raising the Signature Threshold for We the People

    When we launched We the People, none of us knew how popular it would be, but it's exceeded our wildest expectations. Over the past year, interest in We the People exploded, and we're closing in on 10 million signatures.

    When we first raised the threshold — from 5,000 to 25,000 — we called it "a good problem to have." Turns out that "good problem" is only getting better, so we're making another adjustment to ensure we’re able to continue to give the most popular ideas the time they deserve.

    Starting today, as we move into a second term, petitions must receive 100,000 signatures in 30 days in order to receive an official response from the Obama Administration. This new threshold applies only to petitions created from this point forward and is not retroactively applied to ones that already exist.
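    The new rule itself is simple: 100,000 signatures within 30 days of a petition's creation. A minimal sketch of that check (the function and field names here are hypothetical, not the actual We the People implementation):

    ```python
    from datetime import date, timedelta

    THRESHOLD = 100_000          # signatures required for an official response
    WINDOW = timedelta(days=30)  # counted from the petition's creation date

    def qualifies(created, cumulative_signatures):
        """Return True if the threshold was reached inside the 30-day window.

        cumulative_signatures maps a date to the running signature count on that day.
        """
        deadline = created + WINDOW
        return any(count >= THRESHOLD
                   for day, count in cumulative_signatures.items()
                   if day <= deadline)

    # Example: a petition created Jan. 15, 2013
    created = date(2013, 1, 15)
    counts = {date(2013, 2, 10): 104_500, date(2013, 3, 1): 130_000}
    print(qualifies(created, counts))  # True -- 104,500 by Feb. 10 is within 30 days
    ```

    Under the same sketch, a petition that only crossed 100,000 after the deadline would not qualify, matching the "in 30 days" condition stated above.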

    In the last two months of 2012, use of We the People more than doubled. In just that time, roughly 2.4 million new users joined the system, 73,000 petitions were created and 4.9 million signatures were registered.


  • Simple intervention helps doctors communicate better when prescribing medications

    When it comes to prescribing medications to their patients, physicians could use a dose of extra training, according to a new study led by a UCLA researcher.
     
    In previous studies, Dr. Derjung Tarn and her colleagues found that when doctors prescribed medicines, the information they provided to patients was spotty at best: they rarely addressed the cost of medications, and they didn’t adequately monitor their patients’ medication adherence.
     
    The logical next step, Tarn said, was to devise an intervention aimed at improving how physicians communicate to their patients five basic facts about a prescribed medication: the medication’s name, its purpose, the directions for its use, the duration of use and the potential side effects. And it appears to have worked.
     
    Tarn and her co-researchers found that physicians who completed the training demonstrated a significant improvement in how they communicated this crucial information. Compared to a control group that didn’t receive the training, these doctors discussed at least one additional topic out of the five — and they sometimes went beyond the basics, touching on other pertinent facts about medications that are important for patients to know.
     
    The intervention is described in the January issue of the journal Annals of Family Medicine.
     
    “We were pleasantly surprised to see that a simple intervention was effective in improving the content of discussions,” said Tarn, the study’s lead author and assistant professor of family medicine at the David Geffen School of Medicine at UCLA.
     
    The researchers conducted a controlled clinical trial between February 2009 and February 2010 with 27 primary care physicians and 256 patients. The training consisted of a one-hour interactive educational session that encouraged doctors to communicate the five basic facts about prescribed medications. The researchers also gave participating patients a flier listing the five facts. In addition, they recorded the audio of the physician–patient interactions. The success of the physicians’ communication of the key facts to patients was measured using the Medication Communication Index, or MCI.
     
    The researchers found that the mean MCI for the physicians in the intervention group was 3.95 out of five, compared with 2.86 for those physicians who didn’t receive the training. The intervention-group doctors also received higher ratings from their patients on how they communicated information about medications than did the physicians in the control group.
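    The MCI figures above are consistent with a simple index: one point for each of the five basic facts discussed, averaged across encounters. The study's actual scoring may differ; this is only an illustrative sketch with made-up encounter data:

    ```python
    # The five basic facts the training targeted, per the study description.
    FACTS = ("name", "purpose", "directions", "duration", "side_effects")

    def mci_score(topics_discussed):
        """Count how many of the five basic facts came up in one encounter."""
        return sum(1 for fact in FACTS if fact in topics_discussed)

    def mean_mci(encounters):
        """Average the per-encounter scores across a group of recordings."""
        return sum(mci_score(e) for e in encounters) / len(encounters)

    # Toy data: intervention-group doctors tend to cover one more topic.
    intervention = [{"name", "purpose", "directions", "side_effects"},
                    {"name", "purpose", "duration", "side_effects"}]
    control = [{"name", "purpose", "directions"},
               {"name", "purpose", "duration"}]
    print(mean_mci(intervention))  # 4.0
    print(mean_mci(control))       # 3.0
    ```

    On this toy data the gap is about one topic per encounter, which is the scale of difference the study reports (3.95 vs. 2.86).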
     
    And, significantly, the training resulted in more than just better communication about the medications the physicians prescribed, according to the study.
     
    “Interestingly, higher MCI scores also were associated with more reports of communication about topics not directly included in the intervention,” the researchers write. “For example, the intervention encouraged physicians to discuss potential medication side effects with patients, but patients also reported better communication about the risk of experiencing side effects and what to do if side effects occurred.”
     
    The study has some limitations. Patients were predominantly white, most had at least some college education, and there were more Hispanics than African Americans. Also, having an audio recorder in the examination room may have enhanced communication for physicians in the intervention group more than for those in the control group, who were unaware of what the researchers were studying. In addition, the researchers didn’t examine the doctors’ style of communication, and they don’t know if any additional time spent talking about new prescriptions might have detracted from conversations about other topics.
     
    Still, the study suggests “that a brief, practical intervention can improve physician communication about newly prescribed medications in ways that affect patients,” the researchers write. “The intervention should be tested for its clinical impact.”
     
    Tarn’s co-researchers on the study were Chi-hong Tseng and Neil S. Wenger of UCLA, Debora A. Paterniti of UC Davis, and Deborah K. Orosz of Harvard University.
     
    A grant from the National Institute on Aging (5K12AG001004) funded the study.
     
    The UCLA Department of Family Medicine provides comprehensive primary care to entire families, from newborns to seniors. It provides low-risk obstetrical services and prenatal and inpatient care at UCLA Medical Center, Santa Monica, and outpatient care at the University Family Health Center in Santa Monica and the Mid-Valley Family Health Center, located in a Los Angeles County Health Center in Van Nuys, Calif. The department is also a leader in family medicine education, for both medical students and residents, and houses a significant research unit focusing on health care disparities among immigrant families, minority communities and other underserved populations in Los Angeles and California.
     
    For more news, visit the UCLA Newsroom and follow us on Twitter.

  • Childhood obesity linked to more immediate health problems than previously thought

    While a great deal of research on childhood obesity has spotlighted the long-term health problems that emerge in adulthood, a new UCLA study focuses on the condition’s immediate consequences and shows that obese youngsters are at far greater risk than had been supposed.
     
    Compared to kids who are not overweight, obese children are at nearly twice the risk of having three or more reported medical, mental or developmental conditions, the UCLA researchers found. Overweight children had a 1.3 times higher risk.

    “This study paints a comprehensive picture of childhood obesity, and we were surprised to see just how many conditions were associated with childhood obesity,” said lead author Dr. Neal Halfon, a professor of pediatrics, public health and public policy at UCLA, where he directs the Center for Healthier Children, Families and Communities. “The findings should serve as a wake-up call to physicians, parents and teachers, who should be better informed of the risk for other health conditions associated with childhood obesity so that they can target interventions that can result in better health outcomes.”

    With the dramatic rise in childhood obesity over the past two decades, there has been a parallel rise in the prevalence of other childhood-onset health conditions, such as attention deficit–hyperactivity disorder, asthma and learning disabilities. But previous studies on the topic have been limited by a narrow focus on a specific region of the country, a small sample size or a single condition.
     
    The new UCLA research, a large population-based study of children in the United States, provides the first comprehensive national profile of associations between weight status and a broad set of associated health conditions, or co-morbidities, that kids suffer from during childhood.

    Overall, the researchers found, obese children were more likely than those who were classified as not overweight to have reported poorer health; more disability; a greater tendency toward emotional and behavioral problems; higher rates of grade repetition, missed school days and other school problems; ADHD; conduct disorder; depression; learning disabilities; developmental delays; bone, joint and muscle problems; asthma; allergies; headaches; and ear infections.

    For the study, the researchers used the 2007 National Survey of Children’s Health, analyzing data on nearly 43,300 children between the ages of 10 and 17. They assessed associations between weight status and 21 indicators of general health, psychosocial functioning and specific health disorders, adjusting for sociodemographic factors.

    Of the children in the study, 15 percent were considered overweight (a body mass index between the 85th and 95th percentiles), and 16 percent were obese (a BMI in the 95th percentile or higher).

    The study, which is currently available online, will be published in the January–February print issue of the journal Academic Pediatrics.

    The UCLA researchers speculate that the ongoing shift in chronic childhood conditions is likely related to decades of underappreciated changes in the social and physical environments in which children live, learn and play. They propose that obesity-prevention efforts should target these social and environmental influences and that kids should be screened and managed for the co-morbid conditions.

    The researchers add that while the strength of the current study lies in its large population base, future studies will need to examine longitudinal data to tease out causal relationships, which cannot be inferred from a cross-sectional study.

    “Obesity might be causing the co-morbidity, or perhaps the co-morbidity is causing obesity — or both might be caused by some other unmeasured third factor,” Halfon said. “For example, exposure to toxic stress might change the neuroregulatory processes that affect impulse control seen in ADHD, as well as leptin sensitivity, which can contribute to weight gain. An understanding of the association of obesity with other co-morbidities may provide important information about causal pathways to obesity and more effective ways to prevent it.”

    Halfon’s co-authors on the study included Kandyce Larson and Dr. Wendy Slusser, both of UCLA. 

    The study was supported by funding from the Maternal and Child Health Bureau of the Health Resource Services Administration.

    The authors have no financial ties to disclose.

    For more information on the UCLA Center for Healthier Children, Families and Communities, please visit www.healthychild.ucla.edu.

     