Author: Jordan Novet

  • Long a cloud kingpin, Amazon now fighting back against AWS competition

    Amazon Web Services Senior Vice President Andy Jassy didn’t refer to any competitor by name when he pointed out AWS’ advantages before a crowd of around 4,000 at the AWS Summit in San Francisco on Tuesday. But it’s not hard to guess whom he was talking about. With Microsoft hyping its Windows Azure Infrastructure as a Service (IaaS), Amazon is trying to persuade people — Amazon faithful or not — that Azure just doesn’t compare.

    The Amazon experience

    Amazon has a fair amount of experience in the public-cloud realm, having launched AWS in 2006 and, Jassy said, having envisioned it a decade ago. Since then, Amazon’s S3 offering has grown to encompass 2 trillion objects stored on behalf of third-party developers who can’t necessarily afford to run their own infrastructure or don’t want the management hassles.

    As for Microsoft, it made Windows Azure generally available as an IaaS cloud earlier this month. As my colleague Barb Darrow reported, Microsoft initially went for Platform as a Service (PaaS) instead of IaaS, and developers kept moving toward AWS for all of its standard services. But that’s a different story.

    Amazon doesn’t want to rest on its history: Jassy talked up the smorgasbord of AWS technologies and services, from the Elastic MapReduce Hadoop implementation to the Redshift data warehouse. “What (customers) don’t want to settle for is an AWS 2009 type of platform,” Jassy said. “As a lot of other companies are just launching their solutions, we have much better technology than anybody else. We are iterating and innovating at a very fast clip.” Since the beginning of 2013, Jassy said, AWS has introduced 71 services and features. Compare that with, say, 82 new services and features Amazon rolled out in the entirety of 2011.

    The virtuous price cycle

    Jassy also talked up the “virtuous cycle” of adding AWS customers, increasing usage, creating economies of scale and, in turn, getting new customers. Through its Trusted Advisor feature, the company can suggest to customers that they scale down compute instances if they’re sitting idle, and at the same time AWS keeps lowering prices again and again — 31 times since 2006, Jassy said. In other words, Amazon’s pitch is that AWS can be good at a low price. (To be fair, other cloud providers can outperform Amazon’s EC2 computing service in certain instances when it comes to running an application and serving up the result.)

    Conference attendees walk the exhibition floor at the AWS Summit 2013 in San Francisco on April 30.


    On top of those traits, AWS already has built up an ecosystem of customers, from startups such as Mailbox to enterprises such as Shell. A marketplace has sprung up for products that customers can run on top of AWS. And with operations in nine regions, AWS is global, permitting its customers to become global, too.

    Is public better?

    Jassy took a minute to talk about how “old-guard tech companies” are pitching private clouds as secure, even though private clouds might not be able to match the benefits of sharing Amazon infrastructure. But he did say he understands that some companies need to keep certain operations on premise, and in that case companies should consider services such as Direct Connect to bridge the divide between an AWS deployment and a local data center.

    How did all of this play with conference attendees? Many people I spoke with said Amazon was ready when people needed elastic compute and storage services and now, even if Google, Microsoft, Rackspace and others can match up, they’ve already committed to Amazon, at least for core features such as S3 and EC2, and are looking at paying for other services. One entrepreneur working at a startup said he relies on AWS to run his applications, and his fallback is to deploy in a second AWS region. Only if that doesn’t work would he look at other public-cloud providers.

    Of course, this is an Amazon event, so it’s not surprising to hear such things. But perhaps as Google, Microsoft and Rackspace get their own virtuous cycles going, the story will be different in a year or so.

    Related research and analysis from GigaOM Pro:
    Subscriber content. Sign up for a free trial.


  • New Relic sees revenue boost, enterprise growth and mobile-monitoring interest

    The application-performance management (APM) world has seen lots of action lately. Providers such as AppDynamics, AppNeta and New Relic have taken on hefty sums of venture funding this year.

    That’s why it’s no surprise that New Relic — homing in on a public offering as soon as next year, CEO Lew Cirne has said — has strong revenue growth, customer growth and enterprise adoption in particular to report.

    In the first quarter of 2013, the company posted 130 percent more revenue than in the first quarter of 2012, according to figures it provided to GigaOM, but it did not disclose actual dollar figures. Customer growth was up 134 percent, and enterprise customer growth specifically came in 65 percent ahead year over year, it said. New Relic can also now call Comcast, General Electric and Saks Fifth Avenue its customers.

    The mobile APM capability New Relic released last month has proven compelling to customers, judging by the early adoption that’s happened so far. More than 1,000 iOS and Android mobile apps are now being monitored through the New Relic mobile tool, including the Nike Running app and an app from the Wanelo social shopping site, said Patrick Moran, New Relic’s vice president of marketing.

    Will New Relic add network monitoring, like competitor AppNeta? Will AppNeta, AppDynamics, Compuware and others with APM offerings introduce new products, forcing New Relic’s hand? Or will New Relic just keep on keeping on with its current feature set? With more companies recognizing that real-time insight into performance helps devops respond more quickly and keep customers happy, there are plenty of topics for discussion as this market grows.


  • 10gen introduces a backup option for MongoDB

    There’s no question that MongoDB is popular among developers. 10gen, the company behind the NoSQL database, has been building out its executive team. Now 10gen is adding a support mechanism that could give users some assurance that they won’t lose their data in the event of a disaster.

    The MongoDB Backup Service, now in limited release with general release slated for the summer, lets customers determine how often they want to back up their databases at colocation facilities 10gen uses. If a user backs up every six hours, for example, that user can restore a database to its state from six, 12, 18 or 24 hours ago. Restores require two-factor authentication and work across multiple shards. Customers pay only for the amount of backup that they use.
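    The restore-point arithmetic described above is simple enough to sketch. The function and parameter names below are invented for illustration and are not part of 10gen’s actual service:

```python
from datetime import datetime, timedelta

def restore_points(now, interval_hours=6, retention_hours=24):
    """Return the snapshot times a user could roll back to, given a
    backup interval and a retention window (illustrative only)."""
    return [now - timedelta(hours=h)
            for h in range(interval_hours, retention_hours + 1, interval_hours)]

# A six-hour interval yields snapshots from 6, 12, 18 and 24 hours ago.
points = restore_points(datetime(2013, 5, 1, 12, 0))
```

    A shorter interval would simply yield more, finer-grained restore points within the same retention window.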

    10gen, based in New York and Palo Alto, Calif., expects the service to be a hit not necessarily with big companies but with small and medium-sized businesses. “It allows them to focus on building out applications instead of worry about this operational part of the infrastructure,” said Kelly Stirman, director of product marketing at 10gen. Regardless of company size, the feature could be valuable for anyone working in Mongo with larger data sets.

    Beyond that, backing up means users can move data from a production environment into a testing environment to look for issues so their production environment won’t be affected.

    While many MongoDB users already back up their databases, the systems are typically homemade, Stirman said. The MongoDB Backup Service, by comparison, is more reliable.


  • Novell, eyeing brand revival, aims at on-premise sharing space

    There’s a new contender in the document-sharing space, with security features that should endear it to businesses and differentiate it from Dropbox and other cloud-storage providers. But the name of the vendor is not new.

    Novell — the company that dominated the network operating system market with NetWare before running into the Microsoft Windows NT/Windows Server buzzsaw and coming out with Open Enterprise Server and identity and access management software — is releasing Filr. It’s the first Novell product since the Attachmate Group acquired Novell that doesn’t build on existing products.

    Bob Flynn, Novell’s president and general manager, wants to reposition the company as lively and innovative but also reliable in its core strengths in networking and file management. Flynn sees the company’s software assets as strong but said the brand still suffers in the aftermath of the company’s rough patch. Whenever he hears from a company that doesn’t want to buy from Novell anymore, he asks why. It’s usually not the technology or the user experience that gets in the way. Rather, it’s the company itself. “Well, I haven’t heard from you guys in years, and you’re sitting right in the middle of my infrastructure, running mission-critical stuff for me. Where are you going? What do you expect me to do?” he said, representing the sentiments of an unhappy customer. That’s what Flynn is trying to change.

    So here comes Filr, which will be generally available on Tuesday. On the front end, browser, desktop, iPad, iPhone and Android applications keep files arranged in neat areas: those available inside an enterprise’s network, those that people have shared with a user and those that are exclusive to the user. On the back end, the software runs as a virtual appliance, files can sync up, and deployments can be made to comply with security standards a customer has in place.

    Novell Filr iPhone and iPad apps.


    Companies that already pay Novell for maintenance of Open Enterprise Server software will get complimentary access to Filr. Others can get the product for $45 per user per year.

    Bringing a safe solution to the bring-your-own-device party isn’t the most surprising move. Arguably it should have come sooner. Flynn said the idea was on the drawing board soon after the Attachmate Group closed on its acquisition of Novell in 2011, but executives wanted to focus on rolling out additions to existing products first. Now that Novell has introduced Filr, it, along with a dozen other companies, will try to become the Dropbox of the enterprise, and with an on-premise option it will compete with Microsoft SharePoint, FileReflex, mobilEcho and others.

    Novell might not succeed at this — the company shuttered its Vibe Cloud collaboration offering a couple of years ago. Then again, maybe the company is different now. Flynn certainly seems up for leading the charge of a turnaround.


  • Brocade unveils network virtualization software and gear

    Brocade on Tuesday unveiled new software following on its November acquisition of Vyatta, showing that the network vendor, like Cisco and others, is indeed going after higher layers of the networking stack.

    It sounds as if Brocade wants to bill itself as the company that can generate network resources on the fly, only when and where they’re needed. It’s also investing heavily in software, a move that could bring it into competition with giants such as VMware.

    Among the new products are a couple intended for network-function virtualization. Brocade’s Vyatta vRouter can virtually set up and configure networks on the fly. It’s available through Amazon Web Services’ marketplace. The Virtual ADX is intended for fast application delivery and application management.

    Brocade also has decided to make its VCS fabric for connections across hardware available as a plug-in for OpenStack, so users can scale out their networks across multiple clouds.

    While software is in the spotlight, Brocade also has a new four-port, 40 GbE card for its MLXe router. The card’s ports support both OpenFlow and traditional routing protocols.

    I expect more vendors to follow suit as hardware makers continue to push their software lineups; hype surrounding software-defined networking and network virtualization continues, and companies wonder what they should try. That means we’ll see a lot of new products hit the market, even as customers try to figure out how to wade through all the FUD.


  • Cloud benchmarks show smaller providers coming out ahead — but they’re still benchmarks

    Big-name cloud service providers capture the tech-press headlines day after day, but lesser-known players might deserve more recognition, at least based on new benchmarks out from Compuware.

    The company found that, although Rackspace and Windows Azure were consistently fast in taking an application request, processing it and delivering it back, a handful of competitors are actually faster and more consistent, while other providers are available more often. The figures from Compuware represent average response times over a year of monitoring performance at 38 cloud facilities, gathered as part of Compuware’s Global Provider View cloud-benchmarking application.

    Comparing average response time, consistency (as a standard deviation of response time) and availability, Compuware has found that three of the top five data centers are run by Layered Tech (in Illinois, Missouri and Texas). There’s also Qube with its New York data center and Connectria’s Missouri data center. Altogether, Connectria came out looking the best. It ranked fourth for speed, eighth in consistency and second in availability.
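    Compuware’s exact weighting isn’t disclosed, but a rank-sum across the three metrics reproduces the shape of such a comparison. Everything below (the provider names, the numbers and the equal weighting of metrics) is invented for illustration:

```python
import statistics

# Hypothetical monitoring samples: response times (seconds) and availability (%).
# Equal weighting of the three per-metric ranks is an assumption; it is not
# Compuware's published methodology.
samples = {
    "Provider A": ([0.71, 0.73, 0.70, 0.74], 99.95),
    "Provider B": ([0.55, 0.90, 0.60, 1.10], 99.99),
    "Provider C": ([0.80, 0.81, 0.79, 0.82], 99.90),
}

# Lower is better for each metric, so availability is negated.
metrics = {name: (statistics.mean(times),    # average response time
                  statistics.stdev(times),   # consistency
                  -availability)             # availability
           for name, (times, availability) in samples.items()}

def rank_sum(name):
    """Sum of the provider's rank (1 = best) on each of the three metrics."""
    return sum(sorted(metrics, key=lambda n: metrics[n][i]).index(name) + 1
               for i in range(3))

ranked = sorted(metrics, key=rank_sum)
```

    Under this toy scheme, a provider that is merely decent on every metric can outrank one that wins a single category, which matches how Connectria topped the list without leading any one metric.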


    Cloud providers ranked from left to right, when factoring in average response time, consistency (standard deviation of response time) and availability. Source: Compuware Global Provider View

    Rackspace took sixth place, while Azure was No. 9. As for Amazon Web Services, its EC2 service from Northern California and Northern Virginia data centers came in 26th and 29th overall, respectively.

    Benchmarks don’t guarantee that companies will get the same results. Far from it — application type, price and other variables are in play, and interested parties would do well to run their own tests, as my colleague Derrick Harris explained in a 2011 GigaOM Research article that explores similar benchmark reports (subscription required). Another moving part is the number of features available in each cloud. AWS, for example, has DynamoDB, Redshift, CloudFront and others, which makes it more like a platform than just a bunch of servers to use.

    Even with these variables, Stephen Pierzchala, application-performance management technology strategist at Compuware, drew some insight from the data analysis. “If you’re just looking at pure performance, you can look at other vendors, rather than simply the names that you know,” Pierzchala said. “There are other cloud providers that seem to be delivering a standard application with pretty good performance, and it may fit into our budget, or may fit into our business needs a little bit more in terms of delivery.”


  • Facebook relies on natural-language processing to power Graph Search

    Since Facebook debuted its Graph Search function in January, the social network has given access to only a small percentage of users — millions of people, out of the 1.06 billion monthly active users Facebook counted at the end of 2012. The feature aggregates people, places and things based on user input and quickly provides interesting and sometimes surprising content. That only happens thanks to nifty natural-language processing work that goes on behind the scenes. And it only works in English — for now. Engineers are trying to figure out how to make the product available in other languages.

    In an article set to be posted to Facebook’s engineering blog on Monday, research scientist Maxime Boucher and Xiao Li, engineering manager on the natural-language team in Graph Search, provide detailed information on the ways in which Graph Search calls on natural-language processing to guess what users want.

    • Graph Search breaks down search strings into multiple components that serve as commands with which the system can query the database. For instance, “my friends who live in San Francisco” would be run like this: pulling up the user, grabbing that person’s list of friends, calling on the filter for people who currently live in a place, and filtering out only those friends who have San Francisco in that field. Graph Search considers that search query “intersect(friends(me), residents(12345)).” And if that’s exactly what the user had in mind, that query gets converted into language for the Unicorn search engine to chew on.
    • Search terms sometimes include words Graph Search has no use for. At other times, words for guiding queries are missing. And users might plug in terms in the wrong order. Say a user types in “friends San Francisco.” Graph Search might offer “my friends who live in San Francisco” as a good option. If it sees “San Francisco friends,” it could respond with “my friends who live in San Francisco,” which is more in accord with the correct sequence for a query.
    • Graph Search analyzes words users enter to look for possible entities that users are referring to in the database, across more than 20 entity categories, such as cities, employers and schools. Using statistics for the entity categories, the tool identifies sequences of words that could be more applicable for certain entities than others. If “san” precedes “francisco,” the user likely is referring to a city, not a person.
    • The system recognizes slang, nicknames for places, misspellings, the many ways of expressing particular types of data and other peculiarities that users type into the search box and swaps out each of those for terms that actually exist in the database. That means, for example, that subject-verb agreement isn’t necessary for the system to serve up query options that might lead to what users want to see. And words such as “besties” get interpreted as “friends.”
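    A toy version of that normalization step can be sketched in a few lines. Only the intersect(friends(me), residents(12345)) shape comes from the article; the entity table, the ID 12345 as a lookup key and the matching logic are invented for illustration:

```python
# Toy sketch of normalizing loose input into a structured query.
# The intersect/friends/residents shape mirrors the article's example;
# everything else here is hypothetical.
CITY_ENTITIES = {"san francisco": 12345}

def parse(query, me="me"):
    """Map several phrasings of a friends-in-a-city search to one query."""
    q = query.lower()
    for city, entity_id in CITY_ENTITIES.items():
        if "friends" in q and city in q:
            return f"intersect(friends({me}), residents({entity_id}))"
    return None  # no suggestion for unrecognized input

queries = ["my friends who live in San Francisco",
           "friends San Francisco",
           "San Francisco friends"]
results = [parse(q) for q in queries]
```

    The point of the sketch is that all three phrasings, whatever their word order or missing connectives, collapse into the same canonical query before it is handed off to the search engine.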

      Graph Search is visible on one of the most popular social networks in the world and therefore needs to be satisfying for its users. As Boucher and Li write, “The challenge for the team was to make sure that any reasonable user input produces plausible suggestions using Graph Search. To achieve that goal, the team leveraged a number of linguistic resources for conducting lexical analysis on an input query before matching it against terminal rules in the grammar.”

    Graph Search still has a long to-do list for engineers to address. One of the biggest challenges is to construct and deploy a language-agnostic Graph Search system, so Facebook users all over the world will be able to do what English speakers can do with the tool. It will be difficult to produce a tool that can adjust for unusual spellings, handle incorrect grammar and otherwise optimize search strings entered in any language. “In Russian, there’s so many inflections around words and a lot of language-specific things we haven’t encountered in English,” Li told me in an interview on Friday. Engineers are now looking at different ways to make the tool available for other languages, Li said. One option? A whole lot of drop-down menus.

    While there is still work to do in letting more people try Graph Search, it’s clear that the simple interface for navigating hundreds of millions of objects required engineers to produce a bunch of systems and models. It’s no Google, Siri or DataPop, but, because it contains elements tailored to the data set at hand and common use cases, and because it’s getting better over time, Graph Search is worth keeping an eye on.


  • Amazon, Facebook want to hire software-defined networking engineers

    While software-defined networking (SDN) seems to be still stuck in the hype cycle, use cases from service providers and a few enterprises were on display at the Open Networking Summit last week. And now more could be on the way, this time from businesses operating at webscale: Facebook and Amazon.

    According to a job description posted on the careers section of Facebook, the social-networking company looks like it wants to deploy software-defined networks at production scale. The right person for the open software-engineer job will head up efforts around “designing and implementing control plane systems for our network.” The job “may involve evaluating third party and open source software,” among other possible tasks. That could mean Facebook wants to explore existing proprietary controllers and other software components from vendors such as Big Switch, alongside the code that will emerge from the OpenDaylight Project vendor-led consortium.

    The Facebook SDN posting suggests the company is now ready to go beyond the early-stage work it was doing at the beginning of the year and apparently had under way as early as March 2011.

    As for Amazon, the retailer and Amazon Web Services operator posted earlier this month openings for an “SDN Software Development Engineer” and a “Systems Engineer” focused on “EC2 High Performance Network Virtualization.”

    Amazon has a couple of ideas up its sleeve, judging by the job descriptions. First, it wants to improve the performance of its virtualized networks, to make them perform as well as networks running on bare metal. Then, Amazon wants to make multiple SDN features available to AWS customers. And because AWS is so popular and continues to grow, that could have some neat implications for customers interested in making their bandwidth as scalable as their compute and storage resources.

    Amazon did not respond to a request for comment, and a Facebook spokesman was not able to provide any additional detail. But Jay Parikh, Facebook’s vice president of infrastructure engineering, will be talking at GigaOM’s Structure conference in San Francisco on June 19, and maybe then we’ll hear more information.

    If the new Facebook and Amazon SDN hires move quickly, next year’s Open Networking Summit could be filled with some of the biggest use cases yet. Even so, wide-scale use of SDN among enterprises might come later next year, if not in 2015.


  • RightScale sees uptick in cloud adoption and multi-cloud use

    Companies larger than 1,000 employees appear to be a bit ahead of smaller companies when it comes to adopting cloud computing. Of those larger companies, 77 percent have adopted clouds in some way, compared with 73 percent of companies with fewer than 1,000 employees, according to a recent survey of 625 business, development and IT staffers from cloud-management provider RightScale.

    The interesting area is the rise in the provisioning of resources on multiple clouds, which includes private-public combinations as well as multiple public clouds. Of the more than three-quarters of larger companies that are adopting clouds, 77 percent are deploying across multiple clouds. Last year’s survey, which did not break out companies by size, found that 68 percent had deployments spanning more than one cloud.

    It’s great news for RightScale, which helps Zynga and other companies keep track of all their clouds from a single pane of glass. It also bodes well for competitors, such as Enstratius (formerly named enStratus) and Server Density.

    Michael Crandell, CEO of RightScale, speaks at the RightScale Compute conference in San Francisco on April 26


    At a meeting with reporters and representatives of RightScale customers, RightScale CEO Michael Crandell said that although he doesn’t have survey data to back up the assertion, he believes the use of multiple clouds has been steadily rising. “I think it (multi-cloud use) was lower (before the first survey),” Crandell said. “That’s my gut instinct.” Whether it’s to save money, to have different applications running on different hardware or to focus on core competencies, companies have a variety of motivations to try running their operations on clouds.

    Still, compliance with regulations and concerns about security keep some companies from trying out public clouds or, in some cases, even private clouds. Crandell said it can come down to legal issues: businesses don’t want to put their own customers at risk of data breaches. “It’s their attorneys who are dealing with that as well as ours,” Crandell said, describing the thinking of some businesses.

    Even with those conditions, it does seem that cloud and multi-cloud adoption will keep going up, and that means full speed ahead for management companies such as RightScale.

    Feature image courtesy of Shutterstock user Rechitan Sorin.


  • Missed out on a WWDC ticket? Try an alternative conference, one block away

    Some Apple developers might be bummed if they didn’t score tickets to the Worldwide Developers Conference in San Francisco, which sold out Thursday in record time — less than five minutes. But there is an alternative.

    Targeting Apple developers and designers and other interested people, regardless of whether they are card-carrying iOS or Mac developers, the second-ever AltWWDC will go down on June 10-14 at San Francisco State University’s downtown campus. That’s a block away from Moscone West, the site of the official Apple event. The schedule is in flux, although the event does have a few speakers locked in, including Victor Agreda Jr. of the Unofficial Apple Weblog and Mac developers Mike Lee, Saul Mora and Brent Simmons.

    The conference will go beyond code and design to also include business and legal issues and lifestyle topics, said an organizer, Rob Elkin. Still, organizers will display live blogs from bloggers covering the official conference keynote for all to see.

    The first AltWWDC event was held last year, after Elkin wanted a place to work and talk with people while he was in San Francisco around the time of last year’s conference, even without a ticket to the official show. He and a friend, Judy Chen, put on the event, which attracted 70-80 people at the busiest times, Elkin said. It happened to be one of a few events going on at the same time as the official Apple conference. This year the alternative efforts, including IndieDevLab, are joining forces. Elkin expects many more people this time around. “As you can imagine, it’s a little bit hectic right now,” he said.

    This year’s AltWWDC attendees won’t be there in person to see Apple folks reveal the nitty-gritty coding details about future platforms that could be announced, such as iOS 7 and OS X 10.9. But there will be talks and plenty of like-minded people to mingle with. Lunch and working space will be available, too, on a first-come, first-served basis.

    And it’s free! That sure beats WWDC’s $1,599 sticker price.


  • Box gets hip to HIPAA, adds health-record apps

    Think some cloud-storage options are no good for privacy-sensitive applications like health care? Box wants you to think again. Keen on boosting its enterprise customer base and prepping for an IPO, the company said Wednesday it’s now HIPAA-compliant, enabling Box to handle personal health information.

    Compliance with the Health Insurance Portability and Accountability Act means that Box provides file redundancy to prevent data loss in a disaster, restrictions on employees’ access to documents, a breach-notification policy, data encryption and other features.

    Beyond talking about meeting regulatory standards, Box is also promoting 10 new partner applications in its marketplace, including the drchrono iPad application for viewing electronic health records and the TigerText Software-as-a-Service (SaaS) for texting and sharing documents among health care providers.

    It’s not that Box lacks health care business already. It’s got hundreds of paying health care customers, said Whitney Bouck, general manager of Box Enterprise. Customers include the Garden City Hospital and the Henry Ford Health System, both in Michigan, according to a Box statement. Still, Bouck said that because of the HIPAA compliance and application partnerships, the company expects a much higher annual revenue growth rate in the health care area than the companywide figure, which stands at 160 percent.

    Adding creature comforts to entice customers in health care and other sectors is important for the cloud-storage contenders such as Box, Dropbox and at least a dozen other storage providers, all of which want to become the Dropbox of the enterprise. Box is following Salesforce.com, Microsoft and other cloud collaboration providers by connecting with apps catering to industries. At the same time, those software giants are adding Box-like cloud storage capabilities of their own.


  • Fusion-io puts down $119M to pick up storage maker NexGen and spur enterprise adoption

    In hopes of going way beyond webscale computing and spurring wide enterprise adoption of flash memory, Fusion-io on Wednesday announced in a filing that it has paid around $119 million in total for NexGen Storage.

    The deal, which includes $114 million in cash and around $5 million in stock, expands the Fusion-io product line. NexGen makes hybrid storage arrays that incorporate solid-state storage and hard disk drives and can provide up to 192TB of total storage capacity. Fusion-io already provides the flash in the NexGen gear, and now it can move beyond the original-equipment-manufacturer relationship and generate revenue directly by selling these boxes.

    Jim Bagley, senior analyst at Storage Strategies NOW, thought it was a good move for Fusion-io. The market for hybrid storage arrays will see “several billion” in sales this year, compared with less than $1 billion for all-flash arrays, he said.

    The NexGen buy comes at a critical time for Fusion-io. Fusion-io’s revenue in the first quarter of the year — the company’s fiscal third quarter — declined 7 percent year on year to $87.7 million. Its net loss was $20 million, compared with $4.7 million a year earlier. Take those as signs that Fusion-io needs to expand its market.

    While Fusion-io can say that Apple and Facebook are big customers, it could face new challenges in widening adoption of its flash memory in the next year or two as big players move in to flash. IBM has gone off on a $1 billion flash campaign, and storage heavyweight EMC is coming out with a few new PCI-Express flash memory cards.

    Fusion-io knows this. It has added to its software holdings with IO Turbine in 2011 and with ID7 just last month. With more companies coming up with options for software-defined storage, Fusion-io might boost its software portfolio further this year while at the same time rolling out new lower-end hardware in the coming months thanks to the NexGen buy. At least Fusion-io knows it needs to act.


  • Jeda Networks promises software-defined storage controller to come soon

    Jeda Networks, a startup that’s talked about its intention to provide software for virtualizing storage networks, is a few steps closer to delivering on its vision. The company said Wednesday it will start offering its Fabric Network Controller in an early-ship program next month and make it generally available over the summer.

    As is the case with software-defined networks, the Jeda controller software will separate the control plane from the data plane. It will run on a virtualized server and take charge of the intelligence that would otherwise reside on a switch. SDN can have a wide range of benefits; chief among them, it permits more programmability and elasticity of networks. For Jeda, virtualizing storage-area networks could allow customers to create and disable those networks in response to changes in demand.
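
    The control/data-plane split can be sketched in a few lines of Python; the class names, switch IDs and rule format below are invented for illustration and are not Jeda's actual design:

```python
# Minimal, hypothetical sketch of control/data-plane separation:
# a software controller holds the forwarding policy centrally, and
# switches consult only the rules it has pushed down to them.
class Controller:
    def __init__(self):
        self.tables = {}  # switch_id -> {destination: output port}

    def push_rule(self, switch_id, dest, port):
        # Control plane: decide where traffic for `dest` should go.
        self.tables.setdefault(switch_id, {})[dest] = port


class Switch:
    def __init__(self, switch_id, controller):
        self.switch_id = switch_id
        self.controller = controller

    def forward(self, dest):
        # Data plane: a pure table lookup, no local intelligence.
        table = self.controller.tables.get(self.switch_id, {})
        return table.get(dest, "drop")


ctrl = Controller()
sw = Switch("sw1", ctrl)
ctrl.push_rule("sw1", "10.0.0.5", "port3")
```

    Tearing down a storage network in response to demand would then amount to clearing the relevant table entries in the controller.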

    Companies other than Newport Beach, Calif.-based Jeda are likewise looking to make software-defined storage a reality, including Convergent.io, ScaleIO and SwiftStack. Approaches differ among those entities, but they share the goal of bringing storage up to the level of programmability of compute resources and, increasingly, networks.


  • 3scale gets $4.2M to help companies manage their APIs

    Boatloads of companies are constructing application-programming interfaces (APIs) to stream their data out to other sites, but developers don’t always have an easy way to secure or monetize these streams. Hence the emergence of API-management companies, which have been making lots of news lately. Now investors are putting $4.2 million behind another one of them, 3scale.

    The new funding, which comes from Costanoa Venture Capital and Javelin Venture Partners, brings the total 3scale has raised to $5 million. The company, founded in 2007, will use the new funds to introduce “a whole bunch of product extensions” and add customers internationally, said Steve Willmott, CEO and a co-founder. Customers can already create subscriptions, give and take away access, observe traffic, set up alerts for usage violations, manage payments and use other functions.
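
    As a rough illustration of the access-control and usage-alert chores listed above, here is a minimal, hypothetical sketch; the keys, quotas and status codes are invented and this is not 3scale's actual API:

```python
# Hypothetical sketch of two API-management chores: validating a
# caller's key and enforcing a per-key usage quota. Invented values.
QUOTA = {"key-abc": 3}  # allowed calls per billing window
usage = {}

def handle_request(api_key):
    if api_key not in QUOTA:
        return 403  # unknown or revoked key: access denied
    usage[api_key] = usage.get(api_key, 0) + 1
    if usage[api_key] > QUOTA[api_key]:
        return 429  # over quota: this is where a usage alert would fire
    return 200  # request allowed through to the backing API
```

    A production gateway would layer metering for billing, traffic dashboards and time-windowed limits on top of this skeleton.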

    The company targets startups releasing their first APIs as well as enterprises looking for full support. 3scale has more than doubled its customer count in the past year, with more than 200 now, including LiveOps, Skype and the U.S. Department of Energy, Willmott said. The API area has seen exponential growth: there are now more than 13,000 APIs through which to pull and push data. Over the next five years, that number is expected to surpass 1 million, according to a 3scale statement.

    As application developers get their APIs in place, API-management companies are competing to become known as the go-to sources customers can use to track, control and monetize their APIs. And many of those players have been making waves in recent weeks.

    News surfaced last week about Intel’s acquisition of Mashery. Less than three weeks after API hub MuleSoft said it picked up $37 million in Series E funding, that company on Tuesday said that it is buying ProgrammableWeb, an API directory and news outlet. Also on Tuesday, CA Technologies made news with its purchase of Layer 7 Technologies.

    Because more companies are building out these rivers of content, the competition among the API-management providers, which also includes Apigee and Alcatel-Lucent’s open-source apiGrove, should remain lively for a while.


  • Salesforce.com seeks more advertising, marketing revenue from social

    At a glitzy Salesforce.com event in San Francisco on Tuesday, it became clear that the coming connection between customer-relationship management tools and social-listening and social-advertising features isn’t just a neat upgrade for Salesforce users. It’s an attempt by Salesforce to get business from advertising and marketing agencies that want to do a better job of targeting specific customer segments.

    The big-picture goal is to carry out an “incredible new vision for what it means to market and how to transform your company and just get much closer to your customers,” to use the words of Salesforce CEO Marc Benioff.

    At least for one marketing and advertising company on hand to talk up the Salesforce news on Tuesday, the integration between the CRM and social publishing is a welcome improvement.

    “What Salesforce talked about today is something we had to do a little bit more manually before but just one more step in the evolution of our craft,” said Jonathan Nelson, CEO of Omnicom Digital.

    It does seem that contacting leads on social media in response to what people say could bring about more deals and turn whiny users into advocates. What Salesforce is trying to do here is make the most of its CRM service — and also grow it — to help advertisers and marketers do their jobs better. Salesforce has already signed up WPP Group, Mindshare, Resolution and other big companies, and if results beat out other options, the more than $1 billion Salesforce spent on the two social listening and publishing companies enabling the new CRM-publishing connection could pay off.


  • VMware boosts quarterly revenue and sees a good year ahead

    VMware executives told investors Tuesday that they were pleased with the company’s performance in the first quarter of the year, boasting $1.19 billion in revenues, up 13 percent year over year, even as profits slipped 9 percent to $174 million. Earnings per share of 74 cents exceeded the average analyst estimate by 4 cents.

    Adoption of products slated for release later this year has executives feeling hopeful about this year’s total revenues, which should come in 14 percent to 16 percent ahead of last year’s when taking into account the removal of revenues and costs related to the Pivotal Initiative, Chief Financial Officer Jonathan Chadwick said on a call with investors. Last year’s revenue came in at $4.6 billion.

    Following on VMware’s $1.26 billion acquisition of network-virtualization player Nicira, VMware will ship its NSX software, drawing on elements of Nicira software, in the second half of the year. NSX will lower customers’ capital and operational expenditures and “transform network operations in a non-disruptive manner,” said President Carl Eschenbach.

    Eschenbach also said the vCloud Hybrid service will launch on May 21.

    Rather than expecting a negative impact from the OpenStack movement, VMware CEO Pat Gelsinger said he sees OpenStack as offering “an expanding addressable market for VMware.”


  • Database startup MemSQL adds scale to speed with distributed version

    In-memory database startup MemSQL has come out with a distributed version of its database, enabling customers to work with much larger data sets stored in memory while ensuring that speeds stay high. It’s a nod toward the fact that users still want answers to queries right away, even as the amount of data companies store keeps on growing.

    With this release, users will be able to scale MemSQL data sets across multiple commodity nodes to enable processing of big workloads at hyperscale. Previously, the MemSQL database was limited to implementations on a single box. The new version is really just the original one modified to scale out to more machines, said Eric Frenkiel, the company’s CEO and co-founder.

    The new version also comes with the MemSQL Watch dashboard to keep track of the performance of the database cluster.

    MemSQL Watch dashboard

    The sweet spot for MemSQL, which takes SQL queries and converts them to C++, is comparing fresh data with recent historical data. What does that look like? One company that has been using a beta version of the distributed MemSQL database in recent months, Zynga, looks at how games perform from week to week. Gradually, it expanded its cluster to check across larger time frames.
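
    The week-over-week comparison described above can be expressed in ordinary SQL. The sketch below uses SQLite purely for illustration, with an invented table of weekly game sessions; since MemSQL speaks standard SQL, a similar self-join would apply there:

```python
import sqlite3

# Invented sample data: sessions per game per week.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plays (game TEXT, week INTEGER, sessions INTEGER)")
conn.executemany("INSERT INTO plays VALUES (?, ?, ?)", [
    ("farmville", 1, 120), ("farmville", 2, 150),
    ("poker", 1, 300), ("poker", 2, 270),
])

# Compare the latest week against the week before it via a self-join.
rows = conn.execute("""
    SELECT cur.game, cur.sessions, prev.sessions
    FROM plays AS cur
    JOIN plays AS prev
      ON cur.game = prev.game AND cur.week = prev.week + 1
    WHERE cur.week = 2
""").fetchall()
```

    Widening the time frame, as Zynga did, just means loosening the WHERE clause to cover more weeks.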

    MemSQL might not be the answer for everyone — different companies have different needs when it comes to databases — but it should at least pique the interest of a lot of companies. No one has ever asked for a slower, less-scalable database.


  • New Salesforce.com features meld social media, marketing and CRM

    Salesforce.com is connecting its widely used system for tracking leads in the sales process with social-listening and social-media marketing tools, enabling users to tailor their social-marketing dollars to core customers and observe the resulting comments.

    Starting this summer, customers paying to use the Salesforce customer-relationship-management (CRM) system and the Social.com publishing and monitoring tools will be able to combine both with the new functions. Gordon Evans, vice president of product marketing for the Salesforce Marketing Cloud, said he expects customers to be primarily advertising and marketing agencies, big companies with in-house marketing and advertising operations, and gaming companies that run lots of advertising campaigns.

    Once the new features are in place, if an ad agency wants to run a targeted Facebook or Twitter ad, it will be possible to aim the ad at people in the client’s lead pipeline. Those people are already documented in the CRM, waiting to be contacted or otherwise encouraged to close on a deal; it’s just a matter of using the existing intelligence in conjunction with social-ad publication tools to try to reach them in new ways. If more potential leads get thrown into the CRM, the ad will be able to target them, too. Users will be able to view tweets emerging in real time and perhaps uncover new leads. They can also manage multiple campaigns and compare them all to figure out which ones generate the highest click-through rates and cost per click.

    Salesforce.com is adding customer-relationship management tailoring to social publishing tools.

    The combined capability relies on pieces of software resulting from Salesforce’s acquisitions of Radian6 and Buddy Media, which together cost Salesforce more than $1.01 billion. More new features are surely on the way. Salesforce CEO Marc Benioff said in February that he wants to make more acquisitions this year, particularly in marketing — and adding more mobile functionalities is a focus this year, too.


  • Infer takes $10M to find the sales leads most likely to pay off

    On a good day, a sales executive can direct a salesperson to home in on the best lead in a customer-relationship-management system such as Salesforce.com, and a deal might or might not come through. But getting the most out of sales and marketing staffers every day is the sweet spot. A startup named Infer is emerging from stealth mode with $10 million in venture funding to help more companies get to that sweet spot with a tool for identifying the most promising leads based on a user’s historical deal-making tendencies and external data about potential leads.

    Redpoint Ventures led the Series A round, and Andreessen Horowitz, the Social+Capital Partnership, Sutter Hill Ventures and others also contributed.

    Based in Palo Alto, Calif., the company focuses on sales operations that keep their leads in Salesforce and other cloud-based tools such as Eloqua and Marketo, although Infer Co-founder and CEO Vik Singh said it’s also possible for the software to hook in to on-premises appliances.

    Once signed up with Infer — Box, Jive Software, Tableau Software, Yammer, Zendesk and other companies are already paying customers — the system inspects historical sales information to check which deals have been sealed and which fell apart. That becomes training data for a model that scans “hundreds of signals of external data” to get a sense of which potential leads the company stands a chance of closing. Inputs include news articles, social-media accounts, website-traffic data, industry data, financial data, legal data, trademark data — “anything we can get that can give us more of a complete picture on who the customer is,” Singh said. Users can determine the weight of certain types of information.

    Users also set priorities for the scoring of leads. “Do you want a model where the higher the score, the more likely (you are) to win (a deal), or do you care more about conversion, or do you care more about lifetime revenue, or deal size? We build the model based on that,” Singh said. It’s not just a neat way to prioritize leads; Singh said many Infer customers have boosted conversion rates with the tool.
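
    As a rough sketch of how weighted external signals might feed a lead score, consider the toy logistic model below; the signal names, weights and bias are invented and bear no relation to Infer's real model:

```python
import math

# Invented signal weights; in a real system these would be learned
# from the historical won/lost deals described above.
WEIGHTS = {"news_mentions": 0.8, "site_traffic": 0.5, "headcount_growth": 1.2}
BIAS = -2.0

def score_lead(signals):
    """Squash a weighted sum of external signals into a 0-1 lead score."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in signals.items() if k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# A lead with lots of positive signals versus one with none.
hot = score_lead({"news_mentions": 3, "site_traffic": 2, "headcount_growth": 1})
cold = score_lead({"news_mentions": 0, "site_traffic": 0, "headcount_growth": 0})
```

    Optimizing for conversion versus lifetime revenue, as Singh describes, would amount to training the weights against a different label.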

    Infer data integrated in Salesforce.com

    Before starting Infer, Singh spent some time at Google, where he focused on machine-learning methods for automatically providing answers to questions users type into the search box. Another former Googler, one-time chief information officer Douglas Merrill, co-founded a different company that uses lots of external data to make determinations: ZestFinance, formerly known as ZestCash, extracts information from 70,000 sources to figure out if a lender should make a loan.

    The broader strategy the two companies have in common — making predictions based on data — has become more popular in recent years, as companies merge and grow data sets to create more than the sum of their parts. In this case, it seems that the approach could garner wide adoption as a few companies optimize the time of their sales and marketing employees and pull ahead of their competitors, and other companies might want to do the same thing to catch up.

    Of course, if Salesforce or Oracle acquires Infer, then adoption could come even faster.


  • Who uses Bitcoins? It’s not exactly the hedge-fund set

    Rather than speculate on the future of Bitcoin, the online-audience-describing people over at Quantcast had a novel idea: figure out who actually uses the international digital currency.

    In case you were wondering, it’s not just the super-rich who appear to use Bitcoin. While 26 percent of Bitcoiners rake in $150,000 or more in annual income, the biggest group, 52 percent, makes less than $50,000, according to findings visualized Friday on Quantcast’s I.Q. blog.

    Income of Bitcoin users, via Quantcast I.Q. blog

    Age and income of Bitcoin users, via Quantcast I.Q. blog

    Bitcoin users are overwhelmingly male — 88 percent. The biggest age group is 25-34. It’s possible they attended engineering-heavy schools like the Massachusetts Institute of Technology and the Rochester Institute of Technology. Tech terms such as “Raspberry Pi,” “open source” and “command line” are popular with Bitcoiners.

    But that’s now. Perhaps the core Bitcoin audience will shift if the currency’s volatility ever levels out.
