Author: Serkadis

  • Nexus 4 is back in stock, but for how LONG?

In the United States, Google Play has both Nexus 4 models available for sale, after nearly two months out of stock. The bumper is available, too. If you’re one of the gadget geeks looking for this smartphone, get it while you can, and that might not be for long.

Google launched Nexus 4 on November 13, but the phone sold out in just hours. It reappeared on November 27. A day later, Google Play redefined “sold out” by listing the shipment date as 8-9 weeks. From a retail distribution perspective, Nexus 4 is a pure disaster. It’s anyone’s guess how many could have sold over the holidays, but greedy gadget geeks couldn’t get the phone short of paying extortion-like prices.

    But for those who waited, sales resumed today. The 8GB model is $299 and the 16 gigger $349. That’s unlocked, which is timely considering that unlocking is now illegal in the United States.

T-Mobile is another option: $200 with a two-year contract. On the carrier’s no-contract plan, Nexus 4 is a $150 down payment plus 20 bucks per month thereafter, for a total of $500. Obviously, Google Play is a bargain by comparison.

Nexus 4 specs: 4.7-inch display, 1280 x 768 pixel resolution, 320 pixels per inch; Qualcomm Snapdragon S4 Pro processor; 2GB RAM; 8GB or 16GB storage (depending on model); 8-megapixel rear-facing and 1.3MP front-facing cameras; GSM/EDGE/GPRS (850, 900, 1800, 1900 MHz), 3G (850, 900, 1700, 1900, 2100 MHz), HSPA+ 21; WiFi N; wireless charging; Bluetooth; NFC; SlimPort HDMI; accelerometer; ambient-light sensor; barometer; compass; GPS; gyroscope; microphone; 2100 mAh battery; unlocked; Android 4.2. Measures 133.9 x 68.7 x 9.1 mm and weighs 139 grams. Sorry, there is no LTE.

As I write, Google Play lists the phone as “ships soon”, which isn’t exactly the same as “in stock”, but it’s as close as you’re going to get considering availability so far. The device “ships in 1-2 weeks”.

I ordered one for my wife. She takes lots of photos, and after months of use I must say the N4 camera is better than the Galaxy Nexus, which she has. I’ll Craigslist her phone as soon as the new one arrives. I hope that with the Galaxy Nexus in excellent condition, plus spare battery, charger and extra back cover, the selling price will cover most of the cost of buying her Nexus 4.

    Nexus 4 isn’t my favorite Google phone, but it has grown on me since my first-impressions review. I’ll do a follow-up sometime soon.

I’ve got to ask: Will you buy Nexus 4?

    Photo Credit: Joe Wilcox

  • New iOS 6.1 Adds Fandango Movie Purchasing To Siri’s Capabilities

Back in November, developers who were beta testing Apple’s iOS 6.1 indicated that in the then-upcoming version of Apple’s operating system, Siri would let users purchase movie tickets through Fandango. Now, Apple has released the iOS update to all users. Sure enough, Fandango has announced the functionality.

    Here’s a bit from Fandango’s announcement:

    iOS 6.1 makes it convenient to purchase tickets through Fandango’s award-winning mobile app. Using iOS devices, moviegoers simply ask Siri to find a specific movie, nearby theaters or desired showtimes. Siri then offers the option to “Buy Tickets” and launches the Fandango app for customers to complete their ticket purchase. For added convenience, moviegoers can add their Paperless Mobile Tickets to Passbook, which lets them scan their iPhone or iPod touch to get into the movie at select theaters.

“Fandango is committed to innovating across all platforms and helping shape the future of moviegoing,” said Paul Yanover, president of Fandango. “With this new Siri feature, movie fans can quickly and easily discover the nearest theaters, find the most convenient showtimes, and buy tickets through Fandango to help make movie night perfect.”

    Siri movie ticket purchases

    The feature can be used on iPhone, iPad and iPod Touch devices (though we may see Siri make her way to Macs at some point).

    image via 9to5Mac

  • Microsoft really doesn’t want you to buy Office 2013

They’re here! Today Microsoft released new versions of the flagship productivity suite alongside cloud companions. But if you look closely, all the chatter is about Office 365. The software giant wants your head in the cloud, and tidy, easy-to-account-for subscription revenue with it. CEO Steve Ballmer and team endlessly blather about “reimagining” Windows, but Office gets the bigger makeover — not just how people work, but how they pay to do it.

Subscription revenue is Microsoft’s Holy Grail, and one it has sought since the mid-1990s, because it smooths out revenue and locks in customers. New Office releases come about once every three years. Office 2007 launched six years ago tomorrow, and its successor in May 2010. The company can’t depend on consistent sales, which tend to spike around new releases. Subscription — how Microsoft sells Office 365 — is smoother.

    Holy Grail

Ballmer started the subscription Grail quest by putting profits before customers, and pissing off business and tech decision-makers in the process. In May 2001, Microsoft announced a radically new licensing method that removed off-the-shelf upgrades and pushed businesses to a subscription-like model. Software Assurance raised the cost of upgrades by as much as 107 percent, according to Gartner.

Today, most organizations either pay full price or give Microsoft 29 percent of Office’s full price over two or three years. They pay the per-license cost upfront plus this additional fee annually. About 60 percent of Office revenue comes from annuity contracts, which is money in the bank and a commitment that discourages switching to other products — at least through the contract period.

Businesses are on the hook, but consumers swim free, and that’s not good for Office, which faces increasing competition from services like Google Apps. There is the impact of slowing PC sales, too. During calendar fourth quarter (fiscal 2013 second for Microsoft), Business division consumer revenue declined 2 percent year over year, which is a consistent trend. By comparison, bookings rose 18 percent, with “near historical high renewal rates for Office”, Chris Suh, general manager of Microsoft investor relations, said during last week’s earnings call. Businesses bought or renewed licensing contracts.

There is no volume-licensing plan for consumers or small businesses. Microsoft pushes real subscriptions instead. Marketing emphasis around Office 365, and pricing for subscriptions and suite, reveal how Ballmer hopes to get the Holy Grail.

    Pay-More Principle

If anything, Office 2013 pricing discourages suite sales. Let’s start with the traditional product — how the 2013 versions are priced compared to their predecessors — direct from Microsoft Store. The operation no longer offers Office 2010, but the prices below were valid before today:

• Home and Student: $139.99 for 2013; $119.99 for 2010
• Home and Business: $219.99 for 2013; $199.99 for 2010
• Professional: $399.99 for 2013; $349.99 for 2010

From one perspective, the new versions cost about 10 percent to 16 percent more, which says much about what Microsoft execs think of the newer suite’s value but also can be interpreted as an intentional barrier to sales. However, Microsoft Store prices are for downloads, which, by the way, is in most markets the only way to get Office 2013. There are no DVDs, which is another sign the company seeks to dramatically change how consumers and small businesses consume and pay for the suite.

    The older version was available on DVD and with more generous licensing rights. The 2010 Home and Student sold for $149.99 (up to three PCs); Home and Business, $249.99 (up to two PCs); and Professional, $499.99 (up to two PCs). So from a different perspective, the new pricing is substantially less, except Microsoft takes away licenses.

Like the direct-download 2010 versions previously available, the 2013 retail replacements come with one PC license. That works out to a hidden, and quite substantial, price increase. License for license: a 180 percent increase between the Office Home and Student 2010 and 2013 versions, and 76 percent for Home and Business. When Microsoft introduced Software Assurance nearly a decade ago, prices increased mainly by taking away choices larger businesses had. Now it’s the consumer market’s turn.

Looked at differently, Microsoft nearly trebles the price of Office Home and Student 2013 for anyone wanting rights for three PCs (from $149.99 to $419.97). For many consumers and small businesses, installing Office on two or more PCs for a lower price hugely appeals.
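To make the per-license arithmetic concrete, here is a quick Python sketch (illustrative only) using the Microsoft Store prices cited above; the printed increases match the license-for-license figures in this analysis.

```python
# Per-PC price comparison: Office 2010 multi-PC retail terms vs.
# Office 2013 single-PC downloads, prices as cited in this article.
editions = {
    # edition: (2010 price, PCs licensed, 2013 price, PCs licensed)
    "Home and Student":  (149.99, 3, 139.99, 1),
    "Home and Business": (249.99, 2, 219.99, 1),
    "Professional":      (499.99, 2, 399.99, 1),
}

for name, (old_price, old_pcs, new_price, new_pcs) in editions.items():
    old_per = old_price / old_pcs
    new_per = new_price / new_pcs
    change = (new_per / old_per - 1) * 100
    print(f"{name}: ${old_per:.2f} -> ${new_per:.2f} per PC ({change:+.0f}%)")

# Home and Student:  $50.00  -> $139.99 per PC (+180%)
# Home and Business: $125.00 -> $219.99 per PC (+76%)
# Professional:      $250.00 -> $399.99 per PC (+60%)
```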

    Sun and Clouds

But the choice Microsoft takes away with one hand, it offers with the other. Multiple-device rights are still available for anyone buying into the subscription model, meaning Office 365, which includes desktop software and cloud services.

    Microsoft really wants consumers choosing Office 365 Home Premium — just look at what greets them today at Office.com. The marketing is all about the cloud service, which costs $99 per year per household and comes with five Office licenses for Windows or OS X versions. That’s where Microsoft gives out the licenses, to subscribers. (Note: Microsoft refers to the suite available with Office 365 as one license for up to five devices, mainly PCs and Macs, in the household. For simplicity’s sake and so there is no confusion about semantics, I refer to it as multiples.)

    What a bargain, right? That depends. Buying software grants the user a perpetual license. Technically, Microsoft still owns Office retail but buyers have rights to use it forever. It’s like you own the software. Office 365 is a subscription product that allows the user access to the software as long as he or she pays. If you don’t renew the service, that’s the end of Office.

So that $150 price for Office Home and Student 2010 is one time, for three licenses. By the second year, an Office 365 buyer has paid about $50 more than that to keep using the product; by year three, about $150 more: $99.99 x 3 = $299.97, against $149.99 for Office Home and Student 2010. The one-time payment covers you indefinitely, while Office 365 is another $99.99 every year, and that’s assuming Microsoft doesn’t increase the subscription price later.

For Ballmer and team, who want to smooth out Microsoft revenue and generate more of it, Office 365 is gold. Most buyers will turn the licensing comparison round the other way. The subscription suite costs $99.99 for five devices. A family wanting just two copies would pay $279.98 for Office Home and Student 2013 outright, or $699.95 for five.

But the math isn’t that simple. The Office version included with 365 is equivalent to Professional, which adds Access, Outlook and Publisher to Excel, OneNote, PowerPoint and Word. That version, with a single perpetual license, sells for $399.99, or 300 percent more than one year of Office 365. Then there are added incentives for the subscription version, such as Office app cloud access via browser on any PC, 20GB of SkyDrive storage and 60 minutes of Skype calls per month.
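Running the subscription against the perpetual license makes the crossover plain. A minimal sketch, using the $399.99 Professional price and the $99.99 annual subscription cited above (and ignoring the bundled SkyDrive storage and Skype minutes):

```python
# Cumulative cost: Office 365 Home Premium vs. one perpetual
# Office Professional 2013 license, prices as cited above.
perpetual = 399.99     # single PC, usable indefinitely
subscription = 99.99   # per year, up to five household devices

for year in range(1, 6):
    total = subscription * year
    cheaper = "Office 365" if total < perpetual else "perpetual"
    print(f"Year {year}: 365 total ${total:.2f} (cheaper: {cheaper})")

# The subscription stays cheaper through year four (just barely), and
# it covers five devices where the perpetual license covers one.
```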

    So from another perspective, Office 365 is comparatively a helluva bargain, as long as the buyer doesn’t care about having a perpetual license. To be honest, I wouldn’t. The point: Microsoft really doesn’t want you to buy Office 2013 but subscribe to Office 365 instead.

    Photo Credit: Vasiliy Koval/Shutterstock

  • Here’s Eric Schmidt’s Entire Cambridge Speech [Video]

Google executive chairman Eric Schmidt continues to make headlines. In fact, though he handed the CEO reins to Larry Page in 2011, he may still grab more headlines than Page, thanks to things like his imaginative speeches (like the one where he talked about replacing himself with a robot at parties) and a recent trip to North Korea.

His newest speech took place at Cambridge, and offers his latest look at the future. Business Insider referred to Schmidt’s vision as “a chilling drone-filled future.”

The speech is about 45 minutes long, so if you have some time to spend, and want to have your mind blown by one of the most powerful executives of one of the most powerful companies in the world, hit the play button. By the way, he also talks a little about North Korea again and about Genghis Khan.

  • Demand Media’s eNom Teams Up With Parallels On TLDs

    Demand Media announced today that its domain registrar eNom’s Top Level Domain (TLD) program is included in the latest version of Parallels Plesk Panel and in Parallels Domain Name Network.

    “The Internet is undergoing an historic change, and our valuable relationship with eNom enables us to provide solutions and information to educate customers who want to be on the forefront of this revolution,” said John Zanni, VP of marketing and alliances for Parallels. “Access to the new TLD space will tremendously benefit our customers, and eNom makes this possible.”

    Parallels customers will be able to participate in “all aspects of the initial phases” of the new TLD launch, the two companies say. That means during the “sunrise” and “landrush” periods.

    “The excitement around new TLDs is growing, and our integration with Parallels helps validate this,” said Chris Sheridan, VP of business development for eNom. “Through these integrations, we’re opening up the opportunity for a new group of businesses and consumers to become actively involved in the innovative new TLD market. eNom’s session on new TLDs is aimed at educating attendees on this exciting time for the web.”

The service, called “eNom New TLD Extension for Parallels Plesk Panel”, will be available through the Server Management Extensions in the Service Provider interface or from the Parallels Partner Products site.

    Sheridan will be speaking about the new TLDs program at Parallels Summit next week in Vegas.

  • Mistrust of government often deters older adults from HIV testing

    One out of every four people living with HIV/AIDS is 50 or older, yet these older individuals are far more likely to be diagnosed when they are already in the later stages of infection. Such late diagnoses put their health, and the health of others, at greater risk than would have been the case with earlier detection.
     
    According to the Centers for Disease Control and Prevention, 43 percent of HIV-positive people between the ages of 50 and 55, and 51 percent of those 65 or older, develop full-blown AIDS within a year of their diagnosis, and these older adults account for 35 percent of all AIDS-related deaths. And since many of them are not aware that they have HIV, they could be unknowingly infecting others.
     
    Various psychological barriers may be keeping this older at-risk population from getting tested. Among them are a general mistrust of the government — for example, the belief that the government is run by a few big interests looking out for themselves — and AIDS-related conspiracy theories, including, for example, the belief that the virus is man-made and was created to kill certain groups of people.
     
    Now, a team of UCLA-led researchers has demonstrated that government mistrust and conspiracy fears are deeply ingrained in this vulnerable group and that these concerns often — but in one surprising twist, not always — deter these individuals from getting tested for HIV. The findings are published Jan. 29 in the peer-reviewed journal The Gerontologist.
     
    “Our work suggests that general mistrust of the government may adversely impact peoples’ willingness to get tested for HIV/AIDS,” said Chandra Ford, an assistant professor of community health sciences at the UCLA Fielding School of Public Health and the study’s primary investigator. “HIV/AIDS is increasing among people 50 and older, but there’s not a lot of attention being paid to the HIV-prevention needs of these folks. Older adults are more likely to be diagnosed only after they’ve been sick, and as a result, they have worse prognoses than younger HIV-positive people do.
     
    “Also, the CDC recommends that anyone who’s in a high-risk category should be tested every single year,” she said. “These findings mean that the CDC recommendations are not being followed.”
     
    The researchers sought to test the association between mistrust of the government, belief in AIDS conspiracy theories and having been tested for HIV in the previous year. For the cross-sectional study, they worked with data from 226 participants ranging in age from 50 to 85. Participants were recruited from three types of public health venues that serve at-risk populations: STD clinics, needle-exchange sites and Latino health clinics.
     
    Of the participants, 46.5 percent were Hispanic, 25.2 percent were non-Hispanic blacks, 18.1 percent were non-Hispanic whites and 10.2 percent were of other races or ethnicities. The data were collected between August 2006 and May 2007.
     
    The researchers found that 72 percent of the participants did not trust the government, 30 percent reported a belief in AIDS conspiracy theories and 45 percent had not taken an HIV test in the prior 12 months. The more strongly participants mistrusted the government, the less likely they were to have been tested for HIV in the prior 12 months.
     
    Several of the findings surprised the researchers — for example, the fact that HIV testing rates among this population were not higher at the locations where the participants were recruited, given that these locations attract large numbers of people with HIV.
     
    “This finding is concerning because the venues all provide HIV testing and care right there,” Ford said.
     
    And there was an even bigger, perhaps counterintuitive surprise. The more strongly participants believed in AIDS conspiracy theories, the more likely they were to have been tested in the previous 12 months.
     
    “We believe they might be proactively testing because they believe it can help them avoid the threats to personal safety that are described in many AIDS conspiracies,” Ford said. “For instance, if I hold these conspiracy beliefs and a doctor tells me I tested negative, I might get tested again just to confirm that the result really is negative.”
     
    By contrast, individuals who reported mistrusting the government may not have been tested because the venues where they were recruited were, in fact, government entities, Ford said.
     
    The study has some weaknesses. For instance, the study design did not allow the researchers to determine whether the participants held their beliefs before or after being tested; thus, the researchers couldn’t tell what prompted their mistrust of the government or conspiracy beliefs. Also, it’s possible that the prevalence of these theories is higher in this group than it is in the general public and that some participants may have been afraid to tell the truth.
     
    The next step in the research is to study other groups of older adults to determine if these views are more widely held than just among the at-risk population the researchers studied.
     
    Steven P. Wallace, Sung-Jae Lee and William Cunningham, all of UCLA, and Peter A. Newman of the University of Toronto co-authored the study.
     
    The National Institute of Mental Health (5 RO1 MH069087, 5K01MH085503, R34MH089719); the UCLA Resource Centers for Minority Aging Research Center for Health Improvement of Minority Elderly (RCMAR/CHIME), under a grant from the National Institute on Aging (P30-AG02-1684); the UCLA AIDS Institute; the UCLA Center for AIDS Research (CFAR); the California Center for Population Research (5R24HD041022); and the National Institute on Drug Abuse (R01 DA030781) funded this study.
     
    The UCLA Fielding School of Public Health is dedicated to enhancing the public’s health by conducting innovative research; training future leaders and health professionals; translating research into policy and practice; and serving local, national and international communities.
     
    The Resource Centers for Minority Aging Research Center for Health Improvement of the Elderly (RCMAR/CHIME) is part of the effort to reduce health disparities between minority and non-minority older adults. It does so by increasing the number of researchers who focus on the health of minority elders; enhancing the diversity in the professional workforce by mentoring minority academic researchers for careers in minority elders health research; improving recruitment and retention methods used to enlist minority elders in studies so that research can accurately identify and work toward solutions to health disparities; and creating culturally sensitive health measures that assess the health status of minority elders with greater precision and increase the effectiveness of interventions designed to improve their health and well-being. A central coordinating center provides logistical support to the RCMAR centers, facilitates communication and collaboration, and oversees dissemination activities designed to reach the larger research and health professional communities, public policymakers and consumers. The coordinating center is also the national clearinghouse for measurement tools, instruments, publications, community activity, pilot research and other resources developed by RCMAR investigators.
     

  • PNNL smart grid management technology licensed to Calico

    The Department of Energy’s Pacific Northwest National Laboratory and Calico Energy Services of Bellevue, Wash., today announced that Calico has licensed a portfolio of advanced energy management intellectual property developed by PNNL. The technology was licensed by Battelle, which manages PNNL for the Department of Energy.

    The technology was developed in response to the critical challenges facing electric utilities today, including the need to improve reliability, reduce costs and integrate renewable energy. It coordinates large numbers of smart grid assets, including demand response, distributed generation, and distributed energy storage, typically owned and controlled by customers, to form a virtual control system with the smooth, stable, predictable response required by utility operators.

    “PNNL’s technology represents a major leap forward in our nation’s ability to manage grid reliability, balance the ever-expanding complexities of our electricity distribution system, integrate renewables and engage consumers in energy savings programs,” said PNNL engineer Rob Pratt, who led the team that developed the licensed technology. “We look forward to seeing utilities and consumers benefit from this technology.”

    PNNL’s development of the technology was funded by DOE’s Office of Electricity Delivery and Energy Reliability and the American Recovery and Reinvestment Act.

    The innovative technology portfolio is based on a single, integrated smart grid model that uses an economic signal to automatically balance supply and demand at the lowest possible cost. Sophisticated algorithms enable a variety of intelligent devices within a distribution system to address electricity imbalances in real-time using automated demand response and real-time bidding that is orders of magnitude faster than human operators. These devices include generation, storage, renewable energy generation, and end-point controls such as thermostats, hot water heaters, and large load controllers.
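As a toy illustration of that bidding mechanism (emphatically not PNNL's patented implementation), picture each device submitting a price/quantity bid every interval, with a clearing price chosen where offered supply covers willing demand:

```python
# Toy market-clearing sketch: devices bid (price, kW) each interval,
# and the grid clears at the lowest price where supply meets demand.
# Real transactive systems are far more sophisticated than this.
def clearing_price(supply_bids, demand_bids):
    """Return the lowest price at which offered supply covers
    the demand still willing to pay that price."""
    for price, _ in sorted(supply_bids):
        supplied = sum(kw for p, kw in supply_bids if p <= price)
        demanded = sum(kw for p, kw in demand_bids if p >= price)
        if supplied >= demanded:
            return price
    return None  # demand exceeds all available supply

supply = [(0.05, 100), (0.08, 50), (0.12, 50)]   # generators, storage
demand = [(0.20, 80), (0.10, 40), (0.06, 60)]    # thermostats, heaters
print(clearing_price(supply, demand))  # 0.08: 150 kW offered, 120 kW bid
```

At the $0.08 clearing price, loads that bid less simply defer until the price falls, which is how the economic signal trims demand automatically.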

    “PNNL’s patent portfolio is a breakthrough that allows an electric power system to virtually balance itself,” said Jesse Berst, founder and chief analyst at SmartGridNews.com. “The traditional method uses centralized manual dispatch to coordinate supply and demand. But manual methods will never keep up with our new systems, which will have hundreds of thousands of distributed resources scattered throughout. To manage that kind of complexity, you must distribute and automate the process, as PNNL has now made possible.”

    “PNNL’s technology will be commercialized into a module of our Energy Intelligence SuiteTM, or EIS, and will be an excellent complement to the energy management platform we deliver to our utility customers today,” said Mike Miller, president and CEO of Calico Energy Services. “EIS serves as a unified operations center that integrates disparate data, devices, software engines, and applications. It allows utilities to make informed decisions and to precisely control energy resources and grid assets.  The capacity to leverage distributed automation provides a unique capability and adds substantial value to our solutions.”

    PNNL’s technology has already proven highly effective in real-world installations. For example, it was a key part of the Pacific Northwest GridWiseTM Demonstration Project, which PNNL led on Washington state’s Olympic Peninsula from 2006 to 2007. A related version of the technology is also being used in the Battelle-led Pacific Northwest Smart Grid Demonstration Project, a large-scale demonstration project designed to help bring the nation’s electric transmission system into the information age. It has shown the ability to provide a market mechanism to reward electricity consumers, while reducing energy consumption where and when it is needed using real-time demand and pricing signals.

    For utility operations teams, the intelligence of PNNL’s technology – particularly its ability to provide automated demand management and price bidding – will also reduce administrative complexity while providing far faster control over loads.

    “At one time, the Soviet Union used a centrally planned economy, but the complexities of the modern world forced it to switch to a market-based approach,” Berst said. “In the same fashion, the electric power system is still trying to get by with centralized dispatch and control. Thanks to these breakthroughs from PNNL, it can now adopt a market-based approach that is far faster and more precise.”

  • Is Yahoo Poised For A Search Comeback?

Yahoo released its earnings report for Q4 and the full year 2012. The report was better than many analysts had expected, helped significantly by better-than-expected search performance from the company, which has outsourced its search back-end to Bing.

Here are the search highlights from the release, with a quick sanity check of what they imply after the list:

    • GAAP search revenue was $482 million for the fourth quarter of 2012, a 4 percent increase compared to $465 million for the fourth quarter of 2011. GAAP search revenue was $1,886 million for the full year of 2012, a 2 percent increase compared to $1,853 million for the prior year.
• Search revenue ex-TAC was $427 million for the fourth quarter of 2012, a 14 percent increase compared to $376 million for the fourth quarter of 2011. Search revenue ex-TAC was $1,611 million for the full year of 2012, a 9 percent increase compared to $1,478 million for the prior year.
    • Paid clicks, or the number of clicks on sponsored listings on Yahoo! Properties and Affiliate sites, increased approximately 11 percent compared to the fourth quarter of 2011 and increased approximately 8 percent compared to the third quarter of 2012.
    • Price-per-click increased approximately 1 percent compared to the fourth quarter of 2011 and decreased approximately 2 percent compared to the third quarter of 2012.
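Two of those figures combine into a number the release doesn't state directly: traffic acquisition cost (TAC), the gap between GAAP search revenue and revenue ex-TAC. A quick back-of-the-envelope with the Q4 numbers above:

```python
# Implied TAC = GAAP search revenue minus revenue ex-TAC ($ millions,
# Q4 figures from the release quoted above).
gaap = {"Q4 2011": 465, "Q4 2012": 482}
ex_tac = {"Q4 2011": 376, "Q4 2012": 427}

for quarter in gaap:
    print(f"{quarter} implied TAC: ${gaap[quarter] - ex_tac[quarter]}M")

# Q4 2011: $89M -> Q4 2012: $55M. Falling TAC is why ex-TAC revenue
# grew 14 percent while GAAP revenue grew only 4 percent.
```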

    During a conference call following the report’s release, CEO Marissa Mayer, a major player in Google’s search efforts over the years, indicated that search is a big priority for the former king of search engines. Wired quotes her:

    “Overall in search, it’s a key area of investment for us,” Mayer said. “We need to invest in a lot of interface improvements. All of the innovations in search are going to happen at the user interface level moving forward and we need to invest in those features both on the desktop and on mobile and I think both ultimately will be key plays for us.”

    “We have a big investment we want to make and a big push on search. We have lost some share in recent years and we’d like to regain some of that share and we have some ideas as to how.”

Despite rumors that have been whispered throughout the industry from time to time, there’s nothing here to suggest that Yahoo and Microsoft will be breaking their search deal off prematurely, as Mayer seems much more concerned with the front end. It will be interesting to see what becomes of it.

    The company has already been pushing out a redesigned home page to users (though we’ve seen more complaints than praise).

    Can Yahoo make a comeback in search? What do you think?

  • Amazon Launches Elastic Transcoder In Beta

Amazon announced the launch of Amazon Elastic Transcoder, an Amazon Web Services offering for transcoding video files between different digital media formats. You can use the service, which Amazon says is highly scalable, to convert video files from their source format into versions that will play on smartphones, tablets and PCs.

    “For example, customers can use Amazon Elastic Transcoder to convert their large high resolution ‘master’ video files into smaller versions that are optimized for playback on websites, mobile devices, connected TV’s and other video platforms,” the company says. “Amazon Elastic Transcoder removes the need to manage infrastructure and transcoding software, providing scalability and performance by leveraging AWS services. The service manages all aspects of the transcoding process transparently and automatically. It also supports pre-defined transcoding presets that make it easy to transcode video for smartphones, tablets, web browsers and other devices. With Amazon Elastic Transcoder, customers can create enterprise, training, user-generated, broadcast, or other video content for their applications or websites.”

    On the product page, Amazon says:

    Amazon Elastic Transcoder manages all aspects of the transcoding process for you transparently and automatically. There’s no need to administer software, scale hardware, tune performance, or otherwise manage transcoding infrastructure. You simply create a transcoding “job” specifying the location of your source video and how you want it transcoded. Amazon Elastic Transcoder also provides transcoding presets for popular output formats, which means that you don’t need to guess about which settings work best on particular devices. All these features are available via service APIs and the AWS Management Console.
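For a sense of what that job workflow looks like in code, here is a minimal sketch using the boto3 AWS SDK for Python. The pipeline ID and S3 keys are hypothetical placeholders; a pipeline (wired to S3 input/output buckets and an IAM role) must exist first, and preset IDs come from the Elastic Transcoder console or documentation.

```python
# Minimal sketch: submit one transcoding job to an existing pipeline.
import boto3

transcoder = boto3.client("elastictranscoder", region_name="us-east-1")

response = transcoder.create_job(
    PipelineId="1111111111111-abcde1",                # hypothetical pipeline
    Input={"Key": "masters/product-demo-1080p.mp4"},  # source file in S3
    Outputs=[{
        "Key": "web/product-demo-720p.mp4",           # transcoded output key
        "PresetId": "1351620000001-000010",           # a pre-defined preset
    }],
)
print("Submitted job:", response["Job"]["Id"])
```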

    Amazon Elastic Transcoder explained

    There are no contracts or monthly commitments to use the service. You pay based on the minutes you need to transcode and the resolution of the content.

    Check out Elastic Transcoder here.

  • Using a tweet to get the power back on faster

Your power just died — what’s the first thing you do? No, not go get candles. If you’re like many of us, you probably grab your phone and tweet, or write a Facebook message, about how supremely annoying and inconvenient losing power is (ugh, you were in the middle of Downton Abbey!). But it turns out, bitching publicly on social media could actually be helpful to your local utility, if it’s using new big data software recently launched by GE.

    This week at Distributech — it’s like the CES for the power grid sector — GE is formally unveiling its big data analytics and visualization software called Grid IQ Insight. It sucks in data from everywhere — grid sensors, smart meters, weather reports — including public social media data that consumers write about their electricity. The analytics can find and determine if the data is relevant (“My power’s been out for an hour, PG&E sucks!”) and can use the location data from the phone tweet to get a picture of an outage in a certain area.

    The idea is to give utilities a better window into when outages occur before they start getting phone calls from angry customers. They’ll still get those, but if they see an explosion of social media messages coming from a certain neighborhood, they can potentially reach out to those customers first and let them know they’re working on the problem. It’s about better customer service and quicker fixes to power outages.
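To make the idea concrete, here is a toy sketch (nothing to do with GE's actual analytics, which would use trained classifiers and proper geospatial clustering) that flags a likely outage when enough outage-sounding, geotagged messages pile up in one area:

```python
# Toy outage detector: crude keyword relevance check, then a count of
# geotagged reports per area against a threshold.
from collections import Counter

def looks_like_outage(text: str) -> bool:
    text = text.lower()
    return "power" in text and ("out" in text or "outage" in text)

def flag_areas(messages, threshold=5):
    """messages: iterable of (text, area) pairs from geotagged posts."""
    counts = Counter(area for text, area in messages
                     if looks_like_outage(text))
    return [area for area, n in counts.items() if n >= threshold]

sample = [("My power's been out for an hour, PG&E sucks!", "Noe Valley")] * 6
print(flag_areas(sample))  # ['Noe Valley']
```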

During a demo on Monday night, GE execs showed a visualization of a grid in a neighborhood, with tweets coming into the system in real time. The social media messages were color-coded — red for negative, blue for neutral and green for positive. “They’re usually red,” joked a GE exec.

The grid visualization can show all sorts of other data in real time, not just social media messages. Importantly, utility workers can see when connected grid systems like substations and transformers are having problems. The big data analytics platform uses Cassandra and MySQL databases under the hood.

    GE isn’t the first to do this. A startup called Space Time Insight has built a grid data visualization tool that organizations like the California Independent System Operator Corporation use to watch California’s grid in real time. For Distributech — or DTech as the industry likes to call it — Space Time Insight launched the latest version of its software and also announced Canadian utility Hydro One as a new customer.

Developing tools to help utilities manage the massive influx of big data from the power grid is a hot space. (Make sure to check out our 13 energy data startups to watch in 2013.) GE’s big data software is also part of its efforts to sell technology for the “Industrial Internet,” or bringing digital technologies to sectors like transportation, aviation, locomotives, power generation, oil and gas development, and other industrial processes.

  • Google Makes AdWords API Usage Free

Google announced that it is going to start making AdWords API usage free starting March 1, doing away with the preferred pricing model.

    There will be two levels of API access once the changes take effect: basic and standard. The former will be the default, allowing for up to 10,000 operations per day. The latter will be available to qualified developers with no daily limit on operations. Neither option comes with a charge.
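Developers on basic access will presumably want to track their own burn against that 10,000-operation ceiling. A trivial client-side guard (purely illustrative, not part of any Google client library) might look like this:

```python
# Client-side daily quota guard for the basic-access operation limit.
from datetime import date

class DailyQuota:
    def __init__(self, limit=10_000):
        self.limit = limit
        self.day = date.today()
        self.used = 0

    def charge(self, operations: int) -> bool:
        """Record operations; return False if they would bust the cap."""
        if date.today() != self.day:      # new day, counter resets
            self.day, self.used = date.today(), 0
        if self.used + operations > self.limit:
            return False
        self.used += operations
        return True

quota = DailyQuota()
if quota.charge(250):
    pass  # safe to send this batch of API operations
```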

    “If you have an approved AdWords API token and plan to execute fewer than 10,000 operations per day, there’s no action needed. You’re covered with basic access,” says AdWords product manager S. Srikanth Belwadi.

    “Based on your history with the AdWords API and details you’ve shared with us, you might be pre-qualified for standard access,” says Belwadi. “If so, we will contact you within the next week and let you know. Please keep your contact email address up-to-date in the My Client Center (MCC) account associated with your developer token. If you haven’t been contacted or if you haven’t applied for standard access by February 28th 2013, your token will only have basic access starting March 1st 2013.”

    If you think you need standard access, you can apply for it here.

    There’s an FAQ section in Google’s help center further discussing the changes here.

  • AWS launches transcoding service a week after Microsoft goes after media biz

Amazon Web Services now offers transcoding services in the cloud, a product launch for the cloud computing giant that follows a week after Microsoft announced a similar (but more expansive) service in its Windows Azure cloud. AWS Elastic Transcoder will benefit companies that want to adapt their video files to a variety of consumer devices, from smartphones to big-screen TVs.

    Transcoding traditionally has been done on dedicated hardware located inside the data centers and head ends of telecommunications providers and cable operators, or in the data centers of content companies and CDNs. For example, Netflix encodes each movie it has 120 times to meet the needs of all the devices it supports. But as online video becomes more popular and devices proliferate, transcoding becomes an issue for everyone, from small blogs that want to do video to Disney.

Now, instead of buying dedicated hardware and software, they can go to Amazon, which will offer folks 20 minutes of transcoding each month for free. After that, it will charge between $0.015 and $0.036 per minute, depending on whether the customer wants high definition or standard definition, and where in the world the transcoding will occur.
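Using the cited rates, a rough monthly cost sketch (assuming, for simplicity, that the 20 free minutes come straight off the top):

```python
# Rough cost model from the rates cited above; actual billing varies
# by resolution and region.
FREE_MINUTES = 20
RATE_SD, RATE_HD_TOP = 0.015, 0.036   # dollars per output minute

def monthly_cost(minutes, rate):
    return max(0, minutes - FREE_MINUTES) * rate

print(f"${monthly_cost(600, RATE_SD):.2f}")      # 10 hours SD: $8.70
print(f"${monthly_cost(600, RATE_HD_TOP):.2f}")  # 10 hours HD: $20.88
```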

    From the Amazon release:

    In addition, Amazon Elastic Transcoder provides pre-defined presets for popular devices that remove the trial and error in finding the right settings and output formats for different devices. The service also supports custom presets (pre-defined settings made by the customer), making it easy for customers to create re-useable transcoding settings for their unique requirements such as a specific video size or bitrate. Finally, Amazon Elastic Transcoder automatically scales up and down to handle customers’ workloads, eliminating wasted capacity and minimizing time spent waiting for jobs to complete. The service also enables customers to process multiple files in parallel and organize their transcoding workflow using a feature called transcoding pipelines. Using transcoding pipelines, customers can configure Amazon Elastic Transcoder to transcode their files when and how they want, so they can efficiently and seamlessly scale for spikey workloads. For example, a news organization may want to have a “high priority” transcoding pipeline for breaking news stories, or a User-Generated Content website may want to have separate pipelines for low, medium, and high resolution outputs to target different devices.

Amazon isn’t the first in the cloud encoding/transcoding market, but it does have the largest customer base in the cloud, including Netflix, which clearly delivers a lot of video. As I mentioned earlier, Microsoft has launched a Media platform service that will include transcoding, aimed at giving customers all the tools they need to deliver streaming video content online. Microsoft’s service uses the same tools it used to host the London Olympics last year. Other companies such as Encoding.com provide cloud encoding services as well.

  • Palm creator’s brain-mimicking software helps manage the smart grid

    Jeff Hawkins, the man who brought us the Palm Pilot, is back with a new streaming analytics company that’s now being used by energy-management company EnerNOC to predict the future for the institutions running our electrical grids. Hawkins’s new company, called Numenta, processes data as it streams off sensors, servers and other machines, and then quickly recognizes patterns so it’s able to predict in real time what happens next.

If you visit the web site for Numenta, which was founded in 2005 but just recently emerged from stealth mode, you’ll see lots of images of neurons, synapses and dendrites, and lots of text explaining neurological processes. Don’t be intimidated. Long story short: Numenta’s software, called Grok, is able to recognize patterns (e.g., temporal and spatial) from streaming data and then automatically build models that allow it to predict what will happen next.

    The goal isn’t necessarily to be as intelligent as the human brain, but to be as fast as the human brain when it comes to processing data that Grok understands. People love to talk about “big data,” VP of Marketing Joe Hayashi explained, but “our mission is to help people act on fast data.” In Numenta’s largely machine-to-machine world, where the data half-life might be measured in seconds, the human-driven process of big data is just too slow.

    “They can only go as fast as the data scientists can build models and really understand it,” Hayashi said.

The simplest explanation of Grok you’ll ever see

    Grok, on the other hand, is continuously learning from every new data point that hits the system, and it’s always readjusting its models to account for any changes it sees in the patterns of data. Not only does this help it make predictions faster and more accurately, but it also helps Grok spot anomalies that could cause problems. Ideally, Hayashi said, the software will be part of a machine-to-machine system that makes decisions on its own, in real time, without human intervention.
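Grok's internals are an HTM-style learning algorithm far beyond this, but a simple streaming detector shows the general shape of the idea: update the model on every arriving point, and flag values that don't fit what has been learned so far.

```python
# Streaming anomaly detection sketch (a plain z-score model, not
# Numenta's HTM): learn online from each point, flag misfits.
class StreamingAnomalyDetector:
    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # running sum of squared deviations
        self.threshold = threshold

    def update(self, x: float) -> bool:
        """Return True if x looks anomalous, then learn from it."""
        anomalous = False
        if self.n > 1:
            std = (self.m2 / (self.n - 1)) ** 0.5
            if std > 0 and abs(x - self.mean) / std > self.threshold:
                anomalous = True
        # Welford's online update: the model adjusts with every point
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

detector = StreamingAnomalyDetector()
for value in [10, 11, 9, 10, 10, 11, 9, 10, 45]:
    if detector.update(value):
        print("anomaly:", value)   # flags 45
```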

    For a customer such as EnerNOC, which helps energy suppliers operate more efficiently, Grok will help the company’s frequency-reserve service called DemandSMART optimally draw power from customers that are part of the program. Frequency reserve markets rely on a network of customers voluntarily (although for compensation) reducing power usage during peak times in order to ensure grid integrity. Grok could also help EnerNOC predict potential mechanical failures by identifying and flagging behavior it hasn’t seen before, or by discovering patterns that lead to failure.

Actually, Hayashi explained, EnerNOC is a really good example of where Numenta and Grok fit into the data-processing ecosystem. EnerNOC, like many Numenta users, already has a system in place for processing real-time data, but that system only lets the company see what’s happening now. Introducing Grok into the environment will let them “know what’s going to happen,” he said.

    All of Numenta’s detailed comparisons to how the brain works might be an impressive way to describe the technology, but it might also bury the importance of what the company is trying to do. As we’ll discuss at various sessions during our Structure: Data conference in March, the advent of ubiquitous sensors, webscale server farms and just an abundance of machines everywhere is generating more data, and at faster speeds, than human beings could ever hope to make sense of on their own. If we’re going to keep up, we’re going to have to learn to let software shoulder a lot of the analytical load.

    Feature image courtesy of Shutterstock user pixeldreams.eu.

  • Google Launches Updated Maps For North Korea

    Just a couple weeks after executive chairman Eric Schmidt returned from North Korea, Google has published more detailed maps of the country.

    “The goal of Google Maps is to provide people with the most comprehensive, accurate, and easy-to-use modern map of the world,” said Google Map Maker senior product manager Jayanth Mysore. “As part of this mission, we’re constantly working to add more detailed map data in areas that traditionally have been mostly blank. For a long time, one of the largest places with limited map data has been North Korea. But today we are changing that with the addition of more detailed maps of North Korea in Google Maps.”

“To build this map, a community of citizen cartographers came together in Google Map Maker to make their contributions such as adding road names and points of interest,” added Mysore. “This effort has been active in Map Maker for a few years, and today the new map of North Korea is ready and now available on Google Maps. As a result, the world can access maps of North Korea that offer much more information and detail than before.”

    Google Maps North Korea

    Google acknowledges that the map is not perfect, and it is encouraging people to continue using Map Maker to help it improve.

The updated maps are now live.

  • Apple Actually Announces 128GB iPad

    Rumor was going around that Apple was working on a 128GB version of its iPad. Some were skeptical. In an interesting turn of events, Apple actually came out and announced the device without a big event talking about how amazing and magical it is.

The device is a 128GB version of the fourth-generation iPad with Retina Display. It comes in both the Wi-Fi and Wi-Fi + Cellular models. Previous versions topped out at 64GB of storage.

“With more than 120 million iPads sold, it’s clear that customers around the world love their iPads, and every day they are finding more great reasons to work, learn and play on their iPads rather than their old PCs,” said Apple SVP of Worldwide Marketing, Philip Schiller. “With twice the storage capacity and an unparalleled selection of over 300,000 native iPad apps, enterprises, educators and artists have even more reasons to use iPad for all their business and personal needs.”

    The new devices will be available starting Tuesday, February 5 in black and white. Oh, by the way, they cost $799 and $929 for the Wi-Fi and Wi-Fi + Cellular models, respectively.

    On Monday, Apple launched iOS 6.1. More on that here.

  • Startup Gridco wants to build a next-gen power grid that looks like the Internet

Will tomorrow’s power grid, with decentralized and distributed networking, look much more like the Internet than it does today? That’s the idea behind Gridco Systems, a startup founded by Naimish Patel, a serial entrepreneur who previously was the co-founding CTO of optical networking company Sycamore Networks. Patel and his team are using digital solid state transformers and software that ingests data in real time to create a new type of distributed control and power electronics networking product for utilities, one that looks far more like an Internet network product than a utility tool.

Three-year-old Gridco is selling this networking gear and software to utilities to give them greater control over their networks and a greater degree of reliability, closer to the reliability Internet companies currently enjoy and utilities rarely do. The promise is self-healing, smarter grids: when one section of a grid goes down, other areas can route around that section and the overall system can maintain function.

With old-school electromechanical grid equipment, which is the dominant form of grid power electronics today, it’s hard to create this type of resilient grid. The sector needs digital control systems, says Patel, to enable the installation of widespread solar panels and wind turbines. The solid state transformers are new, and so is the combination of using them with more sophisticated digital networking tools.

Gridco has raised a little over $20 million so far, from investors including General Catalyst, North Bridge Venture Partners, Lux Capital, and RockPort Capital. This is a Series A round, so you can imagine that the company might raise quite a few more rounds before it matures.

Patel tells me he wanted to create a product in the power grid industry because of the “massive opportunity” for “an industry that sorely needs it.” He likes “solving big problems,” says Patel, as do his investors. The team is still a small group, though Patel wouldn’t say how many employees the company has. The company also was mum on all details of its technology.

“The network could embody the best of telco networks, delivering higher degrees of reliability to the distribution network,” says Patel. He wouldn’t tell me which utilities were trialling its technology, but said the company is engaged in pilot talks with a variety of vendors.

The clear drawback of this technology is that, because it’s a replacement product, meant to replace the current old mechanical transformers, utilities — and their rate approval boards — might take a while to justify the investment.

  • There’s something you should know about Apple

    Panic in Cupertino: Headless chickens run around smacking into one another, because they don’t know they’re dead.

    That’s the fundamental problem with Apple, and this situation is largely independent of recent stock price declines that analysts, bloggers, reporters and other writers can’t opine enough about. Falling shares are part of a necessary correction, as reality displaces perception. To understand what’s happening now, you need to look into the past — three years, which by Internet counting is like a lifetime.

Three years. I want you to repeat “three years” like a mantra while reading this analysis. That’s all it took for Apple’s recent rise and about all it could take for the fall. The company isn’t going away and surely will remain successful for a long time — just more as a niche brand, as it was before. That is, unless CEO Tim Cook and company do something dramatic, like apply the “David Thinking” that spurred success in the past, while giving up the status quo approach for the future (something I don’t expect).

    History Lesson

    In September 2009, the Financial Accounting Standards Board changed reporting rules that greatly benefitted Apple. Beforehand, the company deferred a portion of iPhone revenue over 24 months, rather than put it all on the books at once. Reasoning: iPhone buyers commit to two-year contracts for subsidized pricing (they pay less than devices cost) and Apple delivers iOS updates over time. The new rules let Apple realize revenue immediately, and the company adopted the change with fiscal 2010 first quarter results. Three years ago this month.
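For illustration only (made-up numbers, not Apple's actual schedule), the difference between the two rules looks like this:

```python
# Old subscription accounting vs. the post-2009 rule, sketched with a
# hypothetical $600 handset sale over eight quarters.
def deferred(sale, quarters=8):
    """Old rule: recognize the sale evenly over roughly 24 months."""
    return [sale / quarters] * quarters

def immediate(sale, quarters=8):
    """New rule: book everything in the quarter of sale."""
    return [sale] + [0.0] * (quarters - 1)

sale = 600.0
print(deferred(sale))   # [75.0, 75.0, ...]  $75 a quarter for two years
print(immediate(sale))  # [600.0, 0.0, ...]  all of it up front
```

Multiply that per-handset difference across millions of iPhones, and the scale of the restatement becomes clear.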

The change was dramatic. Apple beat analyst revenue consensus by more than $3.5 billion, reaching $15.6 billion. The company didn’t stop there, but revised earnings reports going back two years, essentially raising revenue after the fact. “Not since the Soviet Union, have I seen any entity so brazenly try to rewrite history”, I accused. The restatement, in its breadth and suddenness, was blatant (although not illegal) manipulation.

    The past revision was huge. For example, for fiscal fourth quarter 2008, Apple reported $7.9 billion revenue and net profits of $1.14 billion. The company shipped 6.9 million iPhones, but only reported revenue of $806 million. The revised figures raised revenue to $11.52 billion and net profit to $2.25 billion. The difference: $3.62 billion revenue and $1.11 billion net income. Apple didn’t report the revised results in those past quarters. You can change the historical record, but not the past or decisions Apple investors made three months or eight quarters earlier.

The accounting change, along with iPad and iPhone 4 sales success, hugely lifted Apple revenue and profit in 2010. So much so that in April 2011, I posted a chart showing the gains. During calendar 2010, Apple revenue rose from $13.5 billion in the first quarter to $26.7 billion in the last, and profit from $3.1 billion to $6 billion. So, both nearly doubled. iOS revenue more than doubled — and then some — to $15.9 billion from $6.4 billion.

    Margin Call

Since then, Apple has been nothing short of an amazing money-making machine, and that’s about more than accounting changes. For calendar 2012, revenue reached $164.7 billion. The difference ($88.46 billion) is more than Apple’s total revenue for calendar 2010 ($76.24 billion). Profit more than doubled, from $16.7 billion to $39.58 billion. If things are so good now, why is the share price so bad?

Apple’s problem isn’t past or present, but the future, and whether its decline will be as fast as the meteoric rise. Three years. The stock chart above shows a steady overall climb from early 2009 through November 2011, when a steep ascent started. Apple shares soared 94 percent before reaching a record high in September 2012. As Apple shares rose, analysts, bloggers, reporters and other writers chattered about the stock reaching $1,000 (the all-time high was $705.07) and market cap reaching $1 trillion. I looked askance as usual, but that’s just me. So given the hype, Apple’s decline since is quite shocking. At close of market today, shares are down 36 percent.

While shares fall, revenues rise. Except Apple missed Wall Street consensus two quarters in a row, which makes some people nutty about the stock. Meanwhile, during calendar fourth quarter, gross margin plummeted to 38.6 percent from 44.7 percent a year earlier, and from 40 percent in Q3. The company forecasts gross margin between 37.5 percent and 38.5 percent for calendar first quarter.

iPad already tugs margins downward, as iPad mini cannibalizes larger slate sales. By my math, iPad category average selling prices fell 12.7 percent quarter on quarter — from $535 to $467. Year over year, iPad ASPs are down $101, Apple’s CFO admits. Only the company’s inability to manufacture enough minis to meet demand kept matters from being worse. “Our iPad units grew faster than our iPad revenue in the December quarter”, he says. “We would expect iPad ASPs to be down quite a bit in the March quarter on a year-over-year basis for the same reasons”.

    Feel the Pinch

Something else risks margins. In my November 2011 analysis “Apple is the new Dell”, I recounted how real-time manufacturing gave the Windows PC maker a supply chain advantage over competitors. Apple has accomplished a similar feat by using economies of scale and sheer influence to lock in lucrative component price deals that lock out competitor access.

    As I explained then about these competitors: “They’ll adapt and improve Apple’s supply-chain recipe, just like Dell’s PC competitors did a decade ago. As significant, as more manufacturing capacity comes online, Apple won’t as easily get the best prices or, by monopolizing supply, shut out competitors producing smartphones, tablets and other connected mobile devices. As components become more readily available and for lower costs, competitors can improve margins and still lower selling prices against products like iPhone and iPad”. That situation absolutely is underway right now.

But there’s something more: All the bad buzz about Apple fosters perceived weakness that suppliers and other partners will exploit. You must understand market retribution dynamics. The powerful are often punished when weakened. Microsoft and Nokia are examples. As Apple’s reputation and share price fall, so will its muscle. Apple’s success makes it unpopular in many quarters, and nowhere more than among partners that feel they gave away more than they wanted, or received less of the gravy than they deserved. The situation makes way for competitors like Samsung, which already is a major component supplier, to lock in some good deals of its own.

All this happens as Apple’s most important product, iPhone, faces increased risk. During calendar fourth quarter, iOS smartphone share fell to 22 percent from 23.6 percent a year earlier, according to Strategy Analytics. Meanwhile, Android rose to 70.1 percent from 51.3 percent. For all of 2012, iOS nudged up to 19.4 percent from 19 percent share, while Android reached 68.4 percent, up from 48.7 percent. The differences between the quarter and the year strongly suggest a late sales surge for Android, which bodes poorly for Apple, considering iOS got a big lift from iPhone 5’s recent launch.

During calendar Q4, iPhone revenue rose to $30.67 billion from $16.25 billion three months earlier. The device accounted for 56.3 percent of all Apple revenue, up from 52.7 percent a year earlier. Any risk to iPhone hurts the whole company.

    What’s a Year?

All this leads me back to the three years theme, how quickly a company’s fortunes can change, and how quick the collective human brain is to forget. Hell, not even three years but one. Or less. As previously mentioned, in 10 months Apple shares rose 94 percent (leading to predictions about how high they could go) only to fall nearly 40 percent in another five (leading to speculation about how low the stock will fall).

Apple rival Samsung is a startling example, too. In fourth quarter 2011, Apple actually shipped more smartphones than Samsung — 23 million and 22.5 million, respectively, according to IDC. A year later, Samsung shipments rose 76 percent, with 63.7 million smartphones to Apple’s 47.8 million.

But Samsung claimed glory sooner, in Q1 2012, beating out Nokia in global handset shipments and Apple in smartphones, according to Strategy Analytics. The South Korean manufacturer’s rise was fast, ahem, like Apple’s. In just 10 quarters, Samsung went from a bottom-feeding 5 percent smartphone share to top-dog 29 percent.

Nokia invented the smartphone in the mid-1990s and was the undisputed global handset leader for years, even after Apple released iPhone. But in just three years, the Finnish phone maker’s fortunes collapsed. Smartphone share in Q4, according to Strategy Analytics: 3 percent. Three years earlier: 39.2 percent. Many of the markets Nokia dominated three years ago belong mostly to Samsung and somewhat to Apple.

Research In Motion is another example, with smartphone share ahead of Apple in Q4 2009 — 20.2 percent to 16.4 percent, respectively. Three years later, Strategy Analytics lumps RIM in with “Other”.

We forget how fast fortunes are made or lost. Apple’s rise is a three-year story, and its fall could be just as fast — as impossible as such decline might seem to many. Cloud-connected devices form a highly complex and dynamic market. Anything can happen. Apple won’t go down easy, but I promise that there will be a pileup of competitors, and even partners, seeking to do just that.

    Photo Credit: Joe Wilcox

  • Get iOS 6.1 NOW

    Apple today released iOS 6.1, which is available via over-the-air download. The update extends 4G LTE support to 36 additional carriers, bringing the total to 70, and iPad gains high-speed data support from 23 more carriers. Apple already supports LTE in Australia, Canada, Japan, South Korea, the United Kingdom and the United States, among others. New coverage extends to Denmark, Finland, Italy, the Philippines, Switzerland and some Middle Eastern countries, to name a few.

    “If you look at the total of all of these and the incremental subscribers that are in those countries, that’s over 300 million”, Apple CEO Tim Cook boasts. But availability and adoption aren’t the same thing. IHS iSuppli sees global LTE subscribers reaching 198.1 million this year, up 115 percent from 92.3 million in 2012. The analyst firm forecasts 139 percent compound annual growth rate through 2016, when it expects 1 billion subscribers.
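
    As a quick back-of-envelope check on that 115 percent figure, using iSuppli’s own subscriber counts (my arithmetic, not the firm’s):

    \[
    \frac{198.1 - 92.3}{92.3} \approx 1.15, \quad \text{or roughly 115 percent year-over-year growth}
    \]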

    In that respect, LTE is more an investment in the future. By contrast, with Nexus 4 smartphone and Nexus 7 tablet, Google chose HSPA+, which is more broadly deployed, allowing the release of lower-cost devices. For example, the Nexus 7 with cellular radio sells for $299, compared to $459 for an iPad mini with half the storage; the comparable model is $559. Nexus 4 with 16GB storage sells for $349 from Google, unlocked. The comparable, unlocked iPhone is $649.

    Apple claims 500 million cumulative iOS device sales through the end of 2012. There are “nearly 300 million iPhone, iPad and iPod touch devices on iOS 6 in just five months”, Phil Schiller, Apple’s senior veep of global marketing, says. That’s 60 percent on the newest version, and likely more, considering not all devices sold are still in use.
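
    The 60 percent figure is simple division of Schiller’s numbers, and it understates true adoption to whatever degree those 500 million devices include ones since retired:

    \[
    \frac{300 \text{ million on iOS 6}}{500 \text{ million sold}} = 60 \text{ percent}
    \]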

    By stark contrast, only 10.2 percent of Android users are on the newest version, Jelly Bean. Apple’s advantage is controlling OS updates itself, rather than leaving them to carriers or manufacturers.

    Since I don’t own any iOS devices, I have nothing to say about using the software. So, please, give us your report.

    Photo Credit: Joe Wilcox

  • Suh Saves Man In Pool (Man Happens To Be Comedian Louie Anderson)

    Ndamukong Suh, the famous (or infamous, if you prefer) defensive tackle for the Detroit Lions, reportedly saved comedian Louie Anderson from drowning, with help from famed diver Greg Louganis. You couldn’t make this up.

    Suh is set to appear on Splash, a reality show about diving, a move even his teammates have reportedly spoken out against. I guess it’s a good thing for Anderson that he was there. TMZ reports:

    Sources close to the production tell TMZ, Louie — who’s a contestant on the show — was practicing his dives on Wednesday when he became a little too bushed to pull himself up the ladder … falling back into the water again and again.

    We’re told Louie eventually had to be rescued by Suh and divemaster Greg Louganis — who physically lifted the foundering funnyman to the poolside. Louie then sat coughing up water for several seconds.

    So, there’s at least one more person who probably doesn’t have a problem with Suh, despite his bad reputation as a dirty player.

    The episode on which Suh will appear will air on Tuesday, March 19 at 8:00 PM on ABC.

    Suh recorded two tackles in the Pro Bowl on Sunday.

    Image: Ndamukong Suh (img.ly)

  • After a rise and fall, BlackBerry 10 is RIM’s last, best comeback attempt

    The year was 1999 when Research In Motion unveiled its first BlackBerry email pager, the beginning of a strong product brand that continues to this day. Since then, the ubiquitous BlackBerry has grown beyond a simple email machine into capable smartphones, starting in 2003, gathering a cult-like following of “crackberry” users.

    Given that success, it once seemed unfathomable that RIM wouldn’t easily remain one of the world’s top five smartphone makers, yet in 2012 it barely held on to the fifth spot as Samsung, Apple, Nokia and HTC sold more smartphones. Put in perspective: RIM’s 32.5 million smartphones sold all year were easily trumped by Apple’s 47.8 million handsets sold in the final quarter of 2012 alone.

    As quickly as RIM’s BlackBerry rose to the top in the first half of the last decade, it fell behind the touchscreen smartphone revolution Apple started in 2007. Now, after several years of losing market share and stalled growth of its BlackBerry subscriber base, RIM is rebooting the product line this week with the debut of BlackBerry 10.

    Not the first time for a comeback effort

    This isn’t the first of RIM’s attempts to compete with the current crop of smartphones. 2008 saw RIM debut the BlackBerry Storm, an all-touch device that created little more than a drizzle in the market. In 2010, the company launched the BlackBerry Torch 9800 along with the BlackBerry 6 operating system and a WebKit browser. But after using an evaluation device, I felt — as did others, based on meager sales — that the Torch was an evolution of the same old BlackBerry experience, not the revolution RIM really needed at the time.

    Aware that it needed something new for the future, RIM that same year purchased QNX Software Solutions from Harman International. At the time, QNX powered many in-car information and entertainment systems. RIM’s BlackBerry PlayBook tablet was the first RIM product to use a QNX-based platform, and while it was good at what it did, the slate was missing key features: a native email client, for one, and direct access to RIM’s popular BBM messaging service.

    Amid those feature misses and lackluster sales, I suggested that RIM made a mistake by putting QNX on a tablet before using it to power BlackBerry smartphones. In hindsight, however, it appears that RIM had little choice: It was reportedly having problems getting the BBM service on PlayBooks because the service is limited to a single device per user. And it took nearly a year to get a native email app on the tablet. It appears to me now — as it did then — that RIM was simply trying to beat others into the nascent and quickly growing tablet market that began in earnest with the iPad in 2010. As a result, it launched a product well before perfecting the experience.

    So why is this time different?

    We’ll know more after Wednesday’s BB 10 launch in New York City, but hints of potential success for RIM are popping up all over the web. First up is the hardware, expected to be two initial handsets: one with a physical keyboard and one, dubbed the Z10, without. From the various leaked images and video of what’s likely a developer model in use, the devices appear perfectly capable and comparable in performance to other high-end smartphones available today.

    What about the operating system? Considering that RIM originally planned to have a new platform on phones by early 2012, it has had an extra year to work on BB 10. That year proved tumultuous, with the co-CEOs stepping down, market share dropping, pleas to developers to stay the course and barely any growth in BlackBerry subscribers. From the little bit of BB 10 I’ve seen so far, however, the wait may be worth it.

    Expect to hear much about BlackBerry Flow at the launch event: this is RIM’s tightly integrated method of navigating quickly and consistently through the operating system. BlackBerry Hub is the centralized communications center, while the BlackBerry software keyboard should provide fast, accurate entry.

    There’s more to smartphone success than hardware and the OS, however. RIM seems to have learned that mobile apps and content deals matter, too. On the app side, the company has put enormous effort into courting developers, even poking fun at itself in a music video. (Hear what RIM’s Alec Saunders has to say about that in our podcast interview.)

    As a result, tens of thousands of apps are likely to be available once the new devices launch. And just today, RIM shared details of its unified content store, listing all of its media partners, along with news of next-day television content availability. Add in support from carriers — all four major US operators plan to sell the new BlackBerry devices — and the puzzle pieces of potential are all there.

    The most likely outcome

    What are the odds that Research In Motion has a hit with the new BlackBerry 10 devices? I’ll have a better idea when I attend the launch event, of course — and I’ll be live-blogging from there — but based on the limited information I have so far, RIM should at least stay in the smartphone game.

    As I’ve said to many over the past few months, the new devices should appeal to current BlackBerry owners. My unanswered question: Will they be enough to sway people away from iOS and Android phones? Until we know more, I think it’s a safe bet that RIM keeps its current user base happy and possibly steals some market share from its peers.

    Either way, if RIM delivers what I expect in BB 10, it stays relevant in a market where nearly 6 billion people don’t yet have a smartphone. There’s much growth to be found yet, even if BB 10 doesn’t unseat the current smartphone incumbents. But even with the right recipe and ingredients, there are no guarantees for RIM. Challenges still loom for the company as a whole, and maintaining a sliver of market share may not be enough for the long road ahead.