Author: David Meyer

  • OnApp to add compute to its expanding federated cloud portfolio

    London’s OnApp closed a new round of financing last month, taking its total funding to $20 million. So what’s it going to do with the (undisclosed) new tranche of cash? Add yet another string to its bow, that’s what.

    Bear in mind that OnApp was only spun out of British hosting provider UK2 a couple of years ago, with software that lets other providers build their own public clouds. The idea there is to help these other hosting providers – OnApp now counts more than 500 of them as customers – ward off the threat that is Amazon, but in the process the company has steadily used that growing federation to diversify into new lines of business.

    In 2011, OnApp launched a content delivery network (CDN) based on those service providers’ spare network capacity. There are around 130 points of presence (PoPs) in that network across 40 countries – each provider gets paid for the traffic going over its own PoP, and OnApp gets a 10 percent cut. In 2012, the company took on EMC by doing pretty much the same thing with OnApp Storage, using its customers’ commodity servers to support a distributed storage system that’s controlled by OnApp.
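    As a back-of-the-envelope illustration of that CDN revenue split – each provider paid for traffic over its own PoP, with OnApp taking a 10 percent cut – here is a minimal sketch. The traffic volume and per-gigabyte rate are entirely hypothetical; only the 10 percent figure comes from the article.

```python
# Hypothetical settlement for one PoP's monthly CDN traffic.
# The 10 percent cut is from the article; rates/volumes are made up.
ONAPP_CUT = 0.10

def settle(traffic_gb: float, rate_per_gb: float) -> tuple[float, float]:
    """Return (provider_payout, onapp_cut) for one PoP's traffic."""
    gross = traffic_gb * rate_per_gb
    cut = gross * ONAPP_CUT
    return gross - cut, cut

provider, onapp = settle(traffic_gb=10_000, rate_per_gb=0.05)
print(provider, onapp)  # 450.0 50.0
```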

    All that is made possible through OnApp’s marketplace and now, flush with fresh funding, OnApp is going to use that marketplace to do the same thing with compute capacity, chief commercial officer Kosten Metreweli told me:

    “Adding compute is the most immediate thing. The end customer could now go to [OnApp’s customer] and say, ‘I want to spin this up in Tokyo and Moscow’. They can come to our marketplace, buy compute capacity in those locations and also have the application automatically replicated across those locations as well. It makes it much simpler to roll out true global cloud applications.”

    There are of course other marketplaces for compute capacity, such as SpotCloud. On that subject, Metreweli drew a comparison with OnApp CDN competitor XDN, pointing out that OnApp already has a huge customer base brimming with capacity. “The trouble is, they were setting up a market stall in the middle of an empty street,” he suggested.

    And he’s not just blowing hot air. In terms of CDN scale, OnApp remains behind market leader Akamai and Limelight, but it’s way out in front of Amazon CloudFront and has roughly the same number of PoPs as CDNetworks. OnApp Storage is a newer product, but the company gets to draw on the same customer base there. And those customers can’t hang around these days — not with Amazon breathing down their necks.

    “For the majority of service providers, they’d much rather get going with their cloud service, then put their differentiation on top of that,” Metreweli said.

    Apart from its compute play, OnApp also intends to use its newfound funding for market expansion – 40 percent of its business is in North America and it really wants to invade non-English-speaking territories. It also intends to turn its storage play, currently bundled with OnApp Cloud, into a standalone product.


  • UK carriers may all be able to roll out 4G using existing spectrum

    The UK currently has only one major 4G network, but that situation may now change even without the upcoming spectrum auction.

    The reason EE has been able to roll out LTE first is that the regulator, Ofcom, gave it permission to ‘refarm’ its existing 2G and 3G spectrum for the super-fast new breed of mobile broadband. Its rivals are now bidding alongside EE for newly-freed-up spectrum in the 800MHz and 2.6GHz bands, which will allow them to deploy 4G networks around the middle of this year.

    But that’s not good enough, apparently. On Friday, Ofcom said it was responding to complaints from Vodafone and Three in which those carriers said they also wanted to be able to refarm their existing 2G and 3G spectrum. Telefonica (O2) and Vodafone have also asked to be allowed to turn up the power on their 2G base stations for 3G use.

    Ofcom already has to allow all this due to a directive from the European Commission, but until now it’s been granting ‘liberalization’ licenses on a case-by-case basis. If the consultation launched today (PDF warning) doesn’t run into big difficulties – and the operators’ rare unity suggests it won’t – this will change very soon.

    According to Ofcom, the proposed changes will “align the permitted technologies across all mobile spectrum licences, including the existing licences at 900MHz, 1800MHz and 2100MHz and the licences to be awarded by auction in the 800MHz and 2.6GHz bands”.

    “This will meet a long standing objective to liberalise all mobile licences so that there are no regulatory barriers to the deployment of the latest available mobile technology,” the regulator said.

    It will be interesting to see how this affects the bidding in the spectrum auction. The consultation closes on 29 March, by which time that auction process should be over.


  • Huawei finds favor at CERN: researchers sign up for more UDS cloud storage

    China’s Huawei may find business tough in the U.S. due to suspicions over its motives, but its cloud efforts are clearly appreciated elsewhere. A year after it started working with CERN on cloud storage – something of a priority for a research organization that generates more than 25 petabytes of physics data each year – Huawei has become an official CERN openlab partner, with at least three more years’ collaboration now assured.

    The new arrangement was announced on Thursday, along with confirmation of Russia’s Yandex becoming an openlab associate in the field of data processing. Huawei’s involvement is a bigger deal than that, as it puts the Chinese firm on a par with Intel, HP, Oracle and Siemens, all of which work particularly closely with CERN to see how their technologies can help with the Large Hadron Collider experiments.

    In Huawei’s case, the company is contributing its self-healing UDS cloud storage system for use and validation. UDS is targeting the upcoming exascale (an exabyte is roughly a million terabytes) era with a mass object-based storage infrastructure that uses ARM’s energy-efficient processor architecture alongside cheap SATA disks. It also offers Amazon S3 API compatibility and claims eleven-nines (99.999999999 percent) reliability, so users theoretically don’t need to back up data stored in a UDS-toting cloud.
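    To put that eleven-nines claim in perspective, a quick sketch of what it implies in expected losses. The object count below is an assumption for illustration (CERN’s 25 PB a year sliced into hypothetical 100 MB objects); only the durability figure comes from Huawei’s claim.

```python
# What eleven-nines annual durability implies in expected object losses.
# The object count is a hypothetical illustration, not a Huawei or CERN figure.
def expected_annual_losses(num_objects: int,
                           durability: float = 0.99999999999) -> float:
    """Expected number of objects lost per year at the given durability."""
    return num_objects * (1.0 - durability)

# 25 PB/year in 100 MB objects -> roughly 250 million objects.
print(expected_annual_losses(250_000_000))  # ~0.0025 objects per year
```

    In other words, at that durability even an LHC-scale data producer would expect to lose an object only once every few centuries, which is the basis for the "no backups needed" pitch.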

    UDS provides a bit of insight into how openlab works. Huawei first delivered a 384-node version of UDS to CERN in early 2012, after which the researchers played around with it for three months. In September of that year, Huawei released UDS to the general enterprise market (in more normal eight-node configurations). The benefits for both sides of this partnership are clear: CERN has to push technological limits in order to handle the very big data generated by the LHC, and Huawei gets both valuable feedback from the researchers and a glowing report card to show off to the wider world.

    As for the next steps in this partnership, CERN has now hired two computer scientists to work with Huawei on its implementation there, and more UDS storage systems will be deployed at the Swiss facility in the next few months.


  • Why is OpenStack adoption slower in Europe?

    The vendor-led OpenStack convoy is gathering pace. As Chris Kemp, NASA’s former CTO and now the head of OpenStack-based appliance firm Nebula, laid it out today at Cloud Expo Europe in London, the infrastructure-as-a-service project has more than half a million downloads and can count thousands of members from more than 850 companies in 88 countries.

    Chinese adoption of OpenStack is growing particularly quickly, and the United States and India are doing well too, he said. But Europe? Not so much — yet.

    Why is that? Well, one of the answers is entirely predictable: Europe is just a bit behind the curve when it comes to cloud adoption.

    “My sense is that there’s probably a bit more of a conservative attitude towards change and adoption of new technology here,” Kemp told me. “If you look at folks that are leading IT at a lot of America’s largest companies, there’s a lot of competition, a lot of folks encouraging people to take risks. We’re seeing more people in U.S. companies understand how to make apps work in a very reliable way, even on unreliable infrastructure, because the big internet companies there haven’t had a choice.

    “There’s a cultural desire here to have more control over infrastructure. I think private cloud will be bigger in Europe than in the U.S. in the medium term.”

    But that’s not the only reason. There’s another factor that seems backward given the first: it appears OpenStack, just a couple of years old, is feeling the effects of being a relative latecomer to this particular market.

    “It’s a function of some of the earlier cloud technologies getting an earlier start here,” Kemp said. “Eucalyptus made an early run at Europe, and then there’s the OpenNebula project.”


    (L to R) Jo Maitland, GigaOM; Chris Kemp, CEO Nebula and co-founder, OpenStack; Sameer Dholakia, Group VP and GM, Citrix; Marten Mickos, CEO, Eucalyptus Systems
    (c)2012 Pinar Ozger [email protected]

    That’s not to say Kemp, who’s naturally very bullish on OpenStack, thinks the market isn’t ripe for takeover. He was particularly scornful of OpenNebula, the one big European contender in this space, and its attempt to target the enterprise by adding an open-source service layer on top of its core product.

    “You want interoperability, portability and a large ecosystem of tools that all work together at the end of the day – that’s especially what enterprises want,” he said. “The world doesn’t have enough attention for five cloud ecosystems. If you’re EMC or NetApp, are they working on an OpenNebula driver? Where are the OpenNebula conferences? If it’s not there, they’ve already lost this round.”

    But let’s pull back here and consider whether this rivalry really matters. To a certain extent, according to cloud strategy researcher Simon Wardley, it doesn’t – he sees deeper issues facing the putative cloud service provider industry in Europe.

    “The issues about public or private, or which stack to adopt, are all whats, hows and whens. There’s not enough of the why,” he told me. “These are implementation details and they are, to me, secondary to strategy.”

    “In Silicon Valley there’s a lot more thinking about how you manipulate the value chain to compete. For example, if I’m a bank, should I be providing banking as a cloud? It’s that level of strategic play which is important. Most people [in Europe] are thinking about using the cloud because everyone else is doing it: they’re not thinking strategically about using IT as a weapon against others.”

  • OpenNebula open-sources service management layer with enterprise in mind

    OpenNebula, the European answer to the likes of Eucalyptus and OpenStack that counts CERN and China Mobile among its customers, is moving to differentiate itself from competitors by freely releasing OpenNebulaApps, a suite of cloud application management tools that sit on top of its traditional infrastructure management toolkit.

    The OpenNebulaApps tools were previously available only to OpenNebulaPro customers but, according to project director Ignacio Llorente, OpenNebula realized there was more value in opening them up:

    “Most customers are interested in our enterprise support – they want us to provide them with commercial support and a service-level agreement. These components weren’t so important for them, so we realized it was more important for us to release these components to the community, to compete [with OpenStack, Eucalyptus etc].

    “As we are an open-source community, it is much easier for us and our customers to be fully open-source and not to have special add-ons only available for customers. We have a quality assurance process for all open-source technology and also have the community as testers.”

    There are three tools in the OpenNebulaApps collection: AppStage allows automated software stack installation and configuration for virtual machines (VMs); AppFlow is for automatically executing and managing multi-tiered applications that consist of interconnected VMs; and AppMarket lets users build and deploy private marketplaces, so that users can share virtual appliances across multiple OpenNebula instances.

    The suite is being released under the Apache license and will become part of the main OpenNebula distribution. It’s not the first move OpenNebula has made recently to boost enterprise uptake by opening up functionality to more users: a couple of weeks ago, sponsor company C12G said the community would get access to every maintenance release and service pack.

    Llorente described the target users of this latest release as enterprises that see cloud computing as an extension of data center virtualization and that want to, for example, use the VMware hypervisor while avoiding the vCloud VMware component because OpenNebula is “more cost-effective” and supports other hypervisors as well. He suggested that this was a different type of customer from those who want to build an Amazon Web Services-like cloud on-premises.

    “While OpenStack and Eucalyptus can be seen as an open source incarnation of the Amazon cloud model, OpenNebula can be seen as an open source incarnation of the VMware vCloud cloud model,” he explained.

    The open-sourcing of OpenNebulaApps will have some casualties in OpenNebula’s own ecosystem – after all, there’s overlap with projects such as RIM’s Carina environment manager that were designed to run on top of OpenNebula.

    “Yes, this is going to be a problem,” Llorente said. “[Various users] are providing functionality on top of OpenNebula and we are now releasing components with similar functionality, but this is an open ecosystem. Users can decide which solution they want to use.”

  • With 1M contributors, OpenStreetMap claims most detailed maps in some countries

    The crowd-sourced mapping project OpenStreetMap has amassed a million contributors since its inception in 2004 and, according to navigation app maker Skobbler, boasts greater accuracy in England, Russia and Germany than rivals such as Google Maps.

    Berlin-based Skobbler has a tool called GeOS to help developers more easily incorporate OpenStreetMap (OSM) maps into their services. Given that, and since its own apps such as GPS Navigation 2 also use OSM, it is not surprising to see the company talking up the platform. That said, it has some good points to make about the OSM model’s success.

    The key point is that traditional mapping services such as Navteq come out of the automotive GPS market. Google also collects most of its data by driving around. OSM, by way of contrast, is generally better at collecting details that are to be found off-road or by pedestrians.

    However, for a service that’s often compared with Wikipedia, contributing to OSM is still a specialist business, limiting the pool of active contributors to those with the technical know-how. Indeed, while the million milestone is notable, the number of active monthly contributors is fewer than 20,000. (This is not that surprising – a very small proportion of Wikipedia’s 18.3 million registered users make regular edits.)

    “While reaching one million [contributors] is a major milestone for OSM, it’s still early days in terms of fulfilling its potential,” Skobbler co-founder Marcus Thielking said in a statement. “With 90 percent of the population still realistically unable to participate, we’re expecting easier OSM access to allow everyone and anyone to help increase the success of this amazing modern alternative to conventional mapping.”

    The question is, where is that easier access going to come from? Unsurprisingly, Thielking reckons the solution lies in more commercial products using OSM, a shift that would spur demand for more – and more standardized – contributions.

    That would benefit Skobbler but it would also be good news for those that already use OSM data. The biggest name there is Apple – the maps in iOS 6 use OSM data for some parts of the world – but others include Foursquare, Craigslist and, of course, Wikipedia.