Author: Jordan Novet

  • ForgeRock grabs $15M to push access and identity management software

    When Oracle finished buying Sun Microsystems in 2010, it got Sun’s identity and access management software, among many other technologies, but Oracle already had its own versions in its Fusion line of middleware.

    Within a few months, many of the Sun employees who worked on the identity and access management software — for authenticating and keeping track of the permissions of a given website’s users — started working on their own software based on what was already available in open source, focusing not on enterprises’ internal employees but on end users from all over the world. They started a company of their own, ForgeRock.

    Since then, the company has signed up more than 200 customers, including BSkyB, McKesson and the Vatican. It has opened offices in the U.S., France, Norway and the U.K. Now it’s taken in $15 million in a Series B round, bringing total venture funding to $22 million. Foundation Capital led this round, with previous investor Accel Partners also participating. The company will use the new cash to hire employees and add customers in the U.S. and India, its largest markets, as well as in other countries.

    Oracle remains a competitor, as does CA Technologies. Microsoft also has Active Directory, although AD is more focused on internal uses, said Daniel Raskin, ForgeRock’s VP of marketing. Developers certainly can roll their own identity and access management software, but that takes time, and integrating the many access and identity management pieces from Oracle and CA can be complex. ForgeRock wants to keep it simple with open-source code and support through subscriptions. The software lets developers set up single sign-on, authentication, a directory for tracking who has access to which files, and a system for keeping passwords updated across multiple applications, all at scale, Raskin said.
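
    Conceptually, single sign-on hinges on one trusted party issuing a token that many applications can verify without re-prompting the user for credentials. Below is a minimal stdlib-only sketch of that flow; the key, claim names and helper functions are illustrative, not ForgeRock’s actual API.

```python
import hashlib
import hmac
import json
import time

# Illustrative only: a toy single sign-on flow, not ForgeRock's API.
# One identity provider signs a token; any participating app verifies it
# with the shared key instead of asking the user to log in again.
SECRET = b"shared-signing-key"  # held by the identity provider

def issue_token(user, ttl=3600):
    """Authenticate once, then hand back a signed, expiring token."""
    payload = json.dumps({"sub": user, "exp": int(time.time()) + ttl})
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def validate_token(token):
    """Return the user if the signature checks out and the token hasn't expired."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered, or signed with a different key
    claims = json.loads(payload)
    if claims["exp"] < time.time():
        return None  # expired
    return claims["sub"]

print(validate_token(issue_token("alice")))  # alice
```

    Production systems layer directory lookups, revocation and federation protocols on top of this basic issue-and-verify pattern.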

    CA and Oracle are hefty competitors to contend with. But if ForgeRock can keep adding customers and ensuring easy scalability, it might have itself a nice little niche.

    Related research and analysis from GigaOM Pro:
    Subscriber content. Sign up for a free trial.

  • OpenDaylight could threaten SDN startups, or new alliances could crumble

    As 18 networking hardware and software vendors offer up developer time and sponsorship money to standardize software-defined networking (SDN) through open-source code in the newly established OpenDaylight Project, industry observers are watching to see how things might change and how the project will affect the rest of the networking industry.

    Midokura, an SDN startup with overlay-based software for creating load balancers and other virtual devices in Infrastructure-as-a-Service (IaaS) clouds, is closely watching the vendor-led consortium and could join depending on what happens, said Midokura’s chief strategy officer, Ben Cherian. IBM is contributing to OpenDaylight a version of the architecture underlying its virtual overlay product for connecting virtual compute and storage resources, according to the OpenDaylight announcement.

    That could pose a problem for Midokura if it’s approved for inclusion in OpenDaylight, particularly if OpenDaylight code is able to do what Midokura’s does but for free. “Are they (OpenDaylight) going to be an eventual competitor? Maybe. But it’s too early to tell until we see how some of these things map out,” Cherian said.

    There are similar questions about competition around OpenDaylight’s expected controller code, which Cisco and Big Switch Networks are contributing. The Cyan SDN product runs applications on top of its Blue Planet controller. Rafael Francis, senior director of service provider solutions at Cyan, said he views OpenDaylight as less of a threat and more of a validation of an open, vendor-neutral approach. At the same time, Cyan could get involved with the consortium if customers demand it, he said. Beyond that, OpenDaylight controller code could turn out to work best with Cisco gear, said Mike Bushong, vice president of technical marketing at Plexxi. In that case, developers at multiple participating companies would have to work on maintaining code.

    Relations among the many companies involved in OpenDaylight looked copacetic on Monday, with no parties quitting the consortium and with even the Open Networking Foundation feeling good about the OpenDaylight premiere. Dan Pitt, its executive director, said in a statement that the organization is glad to hear that OpenDaylight will support the OpenFlow protocol.

    But the relative calm reflects only what has happened so far. Jason Edelman, a solutions architect at New York-based channel partner Presidio, pointed out that only Cisco and Citrix appear to have contributed code, while others seem to be planning to do so. Unity could fracture as companies start unveiling their proposals. For example, what will VMware put forth? The official announcement doesn’t say. When that information comes to light, executives at Cisco and other sponsors might change their minds about participating.

    Meanwhile there are questions about the fairness of OpenDaylight’s current organizational structure. Jo Maitland, research director for cloud and infrastructure at GigaOM Research, noted in an email that OpenDaylight “will need an elected board, much like the OpenStack Foundation, to offset the competing interests of the more powerful vendors who will throw lots of money and people to make sure the ‘standard’ evolves to suit their position.”

    Although the formation of OpenDaylight seems like a significant development on its own, it ought to be seen as the beginning of a story. At this rate, the middle and end will be dramatic to witness.


  • Private cloud mania drives data center expansion, survey says

    Lots of large companies in North America are expanding their data centers this year or next, and the desire to run internal private clouds is a major motivating factor, according to the results of a new survey of IT decision makers. Despite all the flak private clouds have taken, it’s clear they have a piece of the cloud market, and that piece appears to be growing.

    Of the 300 IT executives surveyed, 98 percent expect to expand their data centers in 2013 or 2014, and 61 percent cited establishing internal clouds as an extremely important reason, according to the study, commissioned by data center builder Digital Realty Trust and conducted by Campos Research and Analysis. Better security, energy efficiency and new applications and services are among the other stated reasons for expansion.

    The hankering for private clouds is fascinating. It shows that objections to the concept could be fading. Critics say private cloud can’t replicate the cost savings that can derive from going with massively scaled shared-resource public clouds exemplified by Amazon Web Services. Others see private cloud deployments as unduly influenced by vendors trying to parlay their dominance in the current server and software realm into cloud. Then again, regulatory or compliance concerns can still rule out the use of public clouds, as GigaOM Research analyst David Linthicum wrote in February (subscription required).

    The wider availability of the OpenStack cloud platform has surely made a difference in the rise of private clouds. It’s helped plenty of companies build private clouds, including eBay, Intel and Yahoo. That trend could keep up, but so could the rise in the adoption of public clouds.


  • AppMesh says its mobile apps will help salespeople get out in front of email and deals

    San Francisco startup AppMesh is emerging from stealth mode with iPad and iPhone apps that bring together and optimize salespeople’s inboxes, calendars and ongoing deal data sets. With the apps, getting lost in email and manually updating the sales process are things of the past, the company vows.

    Co-founders Leo Tenenblat and Tom Tobin both worked in product management on analytics at Software-as-a-Service (SaaS) giant Salesforce.com, where they saw the value of pushing out clear information to salespeople’s mobile devices.

    With AppMesh, updates sync quickly between the iPad app and iPhone app. And if an app goes offline, it will sync and replicate to the Amazon Web Services public cloud once it goes back online. Users can export Salesforce data to the apps, although importing back to Salesforce is not currently possible. Android versions are planned. The apps automatically take note of salespeople’s emails and phone calls to potential clients. They also arrange email in different ways: by the size of the deal, by the time the deal opportunity closes and so on. The apps are free for teams of up to five people, and prices are determined on a case-by-case basis for larger user groups.
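
    The email-arranging behavior described above amounts to sorting deal records on different keys. A quick sketch with hypothetical deal data (AppMesh’s actual schema is not public):

```python
from datetime import date
from operator import itemgetter

# Hypothetical deal records; field names are illustrative.
deals = [
    {"client": "Acme", "amount": 50_000, "closes": date(2013, 6, 1)},
    {"client": "Globex", "amount": 120_000, "closes": date(2013, 5, 15)},
    {"client": "Initech", "amount": 8_000, "closes": date(2013, 4, 30)},
]

# Two of the orderings mentioned in the article: by deal size, and by
# when the opportunity closes.
by_size = sorted(deals, key=itemgetter("amount"), reverse=True)  # biggest first
by_close = sorted(deals, key=itemgetter("closes"))               # soonest first

print([d["client"] for d in by_size])   # ['Globex', 'Acme', 'Initech']
print([d["client"] for d in by_close])  # ['Initech', 'Globex', 'Acme']
```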

    The AppMesh application for iPad lets salespeople track meetings, sales opportunities and emails.

    The product is similar in some respects to Tylr Mobile, a startup that’s still in stealth mode. Meanwhile, some startups, such as Crushpath and Selligy, offer mobile apps for tracking sales relationships, and others, such as Yesware, deal in optimizing salespeople’s email boxes but not with mobile apps. There are also startups that aim to make the best of email but don’t draw from sales apps, such as Taskbox and Mailbox, which Dropbox acquired.

    On top of that, it’s possible Salesforce itself could roll out more sophisticated features. After all, Salesforce plans several announcements around mobile offerings this year. Then again, Salesforce could move to acquire AppMesh or Tylr Mobile. The deal would make sense, because the products are intended to solve a real problem for salespeople.


  • Network vendors launch open-source OpenDaylight Project to standardize SDN

    Cisco, Juniper, Big Switch Networks, Nuage Networks, VMware and several other network hardware and software vendors are jumping into the open-source code-development pool with the establishment on Monday of the OpenDaylight Project inside the Linux Foundation.

    Rumored in recent months, the project begins more than two years after the establishment of the Open Networking Foundation (ONF), which counts customers such as Facebook and Google as board members and has nurtured the OpenFlow protocol. OpenDaylight organizers describe the vendor-led consortium as a nice complement to the ONF that will support OpenFlow, but it’s hard to predict how nicely the organizations will really dovetail with one another.

    OpenDaylight is taking proposals for code from engineers working inside and outside the sponsoring companies. The software line will include an open controller, plugins, applications, a virtual overlay and interfaces to bring all those elements together, said Jim Zemlin, executive director of the Linux Foundation. Those elements will sit on top of OpenFlow, as well as vendor-specific interfaces and other standard protocols, which work on top of virtual and physical switches.

    Elements of the OpenDaylight Project

    Different sponsors are proposing different parts of the total OpenDaylight product line. All contributions must be approved by a steering committee. The first OpenDaylight code is slated to ship in the third quarter of this year.

    OpenDaylight says it intends to play nice with the ONF. But it’s unclear if that will actually happen. There are competing visions, for example, for the northbound API. The ONF believes there should not be a standard, while OpenDaylight will be shipping standard northbound APIs. Meanwhile, there are lots of SDN startups not sponsoring OpenDaylight, so it’s hard to tell what those companies will do.

    OpenDaylight’s stated goal — widening the adoption of and sparking more innovation around software-defined networking — is noble. Vendors’ acknowledgment that the network indeed must become as dynamic and programmable as compute and storage is right on point. Whether OpenDaylight will cripple SDN startups’ efforts to help companies swap out expensive brand-name gear for cheaper commodity equipment is an open question. To be sure, though, the establishment of OpenDaylight is a turning point, and the SDN hype ensures that lots of people will be eager to see what actually comes to pass.


  • On the quest to data ownership, lots of questions lie ahead

    Companies are collecting ever more data on end users, through mobile devices, connected devices, sensors and other inputs. While some people appreciate what companies are doing with the data, end users don’t necessarily know what companies are collecting. In a discussion on data science in San Francisco on Thursday, some panelists thought out loud about what it might look like if more data were shared.

    “What does it mean to own data?” said Andreas Weigend, a lecturer at Stanford University and formerly chief scientist at Amazon.com. “… Does it mean I can do with it whatever I want to do with it?”

    Weigend went on to ask if people would be able to rent out their data and make some money off it. Weigend has been thinking a lot about the subject of data ownership and expects to address that topic and others in a forthcoming book, “Our Data.” Different industries have different standards, and those could shift, Weigend told me later.

    After the talk, I couldn’t help but wonder about what Weigend called “a cloud-like store of person-level data,” or what some people refer to as a data locker or simply a personal cloud. Here are some questions that came to mind:

    • Should companies go beyond the data they already share (purchases, bank transactions, phone calls and so on) and disclose what they’re silently tracking? Weigend likens that sort of data to crude oil, which requires complex processing before consumers can use it to drive their cars, but some people might like to see what companies are collecting.
    • Should companies — take insurers, for example — have to tell customers what indicators they look for as they make decisions, so customers could learn how to change their behavior to get lower rates or prices? Or should this be proprietary?
    • If data is going to be made available, where should it be kept? Should governments have to make a certain amount of online storage available for each person, or should private companies offer that service?
    • How quickly would data be updated in a personal locker or repository?
    • To take a step back, would enough consumers want this sort of information enough for businesses to feel compelled to spend time and money making it possible? If people don’t speak out about this, the window of opportunity for setting standards could close.

    It might not be the easiest thing in the world to get businesses into the habit of disclosing to customers the data they keep. But as the internet of things gets bigger, it’s a good time for the dialogue to get louder.

    Feature image courtesy of Flickr user aweigend.


  • Caves, ships and aging gasometers: 3 unlikely homes for data centers

    Add gasometers to the list of strange places for data centers.

    A gasometer — a place to store different kinds of gases — in Stockholm has sat unused for decades, and now a Swedish cloud service provider has put forth a couple of proposals to turn the facility into a data center, the Royal Pingdom blog reported Thursday.

    The structure was built in 1893 as part of a coal gas plant. Another gasometer was built right next to it in 1900. The city of Stockholm is requesting that the public have access to at least part of the data center that’s being proposed for the 1893 gasometer. One proposal would make the data center look a bit like a panopticon, except instead of people the building would hold floors and rows chock-full of server racks. The other proposal puts forward barracks-like structures to store the server racks, alongside lots of open space.

    Gasometer data center proposal from industrial design firm Splitvision

    Gasometer proposal from architect Albert France-Lanord

    Aging 19th-century buildings aren’t the only odd places to get a 21st-century makeover. Data center providers in Hong Kong are going underground in their hunt for space. Just a couple of weeks ago came news of the Hong Kong government’s apparent interest in building out rock caverns for data centers. Earlier, WikiLeaks went underground when it located its servers in a nuclear-weapon-proof bunker in Stockholm. Incidentally, the company WikiLeaks worked with to get its data center going, Bahnhof, is the same one that’s proposing the gasometer data center.

    Then there was Google’s plan to build out floating data centers and use waves as an energy source. As far as we know, the concept has not led to an actual floating data center yet.

    Other places where site selectors might want to look: abandoned coal mines, mountains, chapels and outer space. And don’t forget about our houses.

    While the United States government is consolidating its IT footprint to become more efficient, companies are moving more toward cloud computing. And as that happens, more companies will want to run data centers to host those services. It can be cheaper to set up data centers in existing structures, and that’s why it shouldn’t be surprising to see people dreaming up data centers inside more and more places, all weirdness aside.


  • IBM releases Hadoop box and database technology for quicker data insights

    IBM talked up the latest ways in which it has sped up databases and introduced a Hadoop appliance at a press and analyst event in San Jose, Calif., on Wednesday. The developments aim to bring enterprises closer to running analytics on more types and greater quantities of data as close to real time as possible — a higher and higher priority as big-data projects proliferate.

    In the long run, as more and more data piles up and in greater varieties, IBM wants to help to prevent its customers from drowning in the deluge of data and instead give them tools to get better results, such as more revenue, said Bob Picciano, general manager of information management at IBM Software. That’s why tools have to be fast, capable of working on huge data sets and easy to use.

    Toward that end, IBM announced BLU Acceleration. When a user of an IBM database such as DB2 runs a query, BLU quickly slims down a big set of data to the amount needed for analysis and spreads tiny workloads across all available compute cores to give a result. One feature of BLU — data skipping — essentially fast-forwards over the data that’s not needed and homes in on the small area that is. And with BLU, data can stay compressed for almost the entire duration of the analysis. IBM claimed that in some tests BLU produced results a thousand times faster than a previous version of DB2 without BLU.
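
    Data skipping generally works by keeping lightweight min/max metadata for each block of rows and reading only the blocks whose range could contain the queried value. A toy sketch of the idea (illustrative only, not IBM’s implementation):

```python
# Toy illustration of data skipping: record min/max per block of rows,
# then scan only blocks whose range can contain the predicate value.
BLOCK = 4

def build_blocks(values):
    """Split a column into fixed-size blocks with min/max summaries."""
    blocks = []
    for i in range(0, len(values), BLOCK):
        chunk = values[i:i + BLOCK]
        blocks.append({"min": min(chunk), "max": max(chunk), "rows": chunk})
    return blocks

def query_equals(blocks, target):
    """Find matching rows, counting how many blocks actually get scanned."""
    hits, scanned = [], 0
    for b in blocks:
        if target < b["min"] or target > b["max"]:
            continue  # skip: this block cannot contain the value
        scanned += 1
        hits.extend(v for v in b["rows"] if v == target)
    return hits, scanned

blocks = build_blocks([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8])
hits, scanned = query_equals(blocks, 9)
print(hits, scanned)  # [9] 1  (two of three blocks skipped outright)
```

    Real column stores apply the same trick to compressed data, which is why queries can avoid decompressing most of the table at all.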

    The IBM PureData System for Hadoop appliance.

    IBM also unveiled another IBM PureData box tailored for big-data purposes, this time around Hadoop. Previous boxes in the line include the PureData System for Analytics. The IBM PureData System for Hadoop appliance will become available later this year. It enables customers to start loading data in 90 minutes, compared with the two or three weeks it can take to stand up a company’s own Hadoop instance in a data center, said Phil Francisco, vice president of Netezza product management and product marketing at IBM Software. The appliance can also store data processed in Hadoop right in the box, a perk for companies facing retention requirements.

    Look for IBM to offer more big-data hardware and software. The company has spent $16 billion on big-data and analytics acquisitions, and it wants to “spend as much organically as well as inorganically to figure out what clients need in this space,” said Inhi Cho Suh, vice president of information management product strategy at IBM Software. Meanwhile, Picciano said IBM will soon come out with a way to do for many servers what the BLU Acceleration does with the processors inside a single server.

    The new IBM products sound like they could speed up analytics. If enterprises don’t believe the need is there now, they will as data gets bigger.


  • On Kaggle, GE finds data science solutions for patients and pilots

    Even GE respects the wisdom of the crowd. The manufacturer joined up with Alaska Airlines, the Ochsner Health System and Kaggle in November to ask outside data scientists and designers to help give pilots actionable data and make hospital visits and subsequent care more efficient.

    The organizers of the first Industrial Internet Quests have since received more than 3,000 submissions and were expecting to announce on Wednesday the contestants who will receive a total of $600,000. One submission for the flight competition has earned $100,000 for its developers, a five-person team from Singapore.

    Kaggle has hosted data-science competitions for several other brand-name companies, from Facebook to Ford. Its publicly available leaderboards make data science a bit like a spectator sport, and open-source education on machine learning and natural-language processing makes it possible for lots of people to compete.

    Demand is sky-high for data scientists and application developers, and farming out one-off projects is a common practice in all sorts of industries. That’s why it’s not surprising to see even big companies like GE turning to the crowd for data science solutions. And it’s why this sort of news could become more common in the future.


  • Deep Information Sciences scores $10M for its general-purpose database

    Investors continue to bet on databases that can handle large swaths and wide varieties of data. The latest proof point: Deep Information Sciences on Tuesday said it has raised $10 million in new Series A funding to fuel what it calls a high-performing transactional and analytic database. Funding sources include Stage 1 Ventures, Robert Davoli and angel investors.

    Based in Portsmouth, N.H., Deep rejects the usual SQL, NewSQL, NoSQL, columnar, streaming and in-memory terminology, preferring the term “general-purpose database.” It handles structured and unstructured data and claims to keep latency low for writes, reads and queries. It aims to efficiently use all the cores of available processors, whether on premises or in the cloud.

    One customer, Global Relief Technologies, used DeepDB to speed up the work of updating its database with data from employees who log information on their tablets. A process that once took more than a day now takes 17 minutes, according to a Deep spokeswoman.

    Other companies offer databases that mix transactional and analytic capability, including SAP, with its HANA database, and JustOneDB. Deep, which has two commercial customers, responds to the differentiation question by claiming DeepDB performed better in tests for many use cases. In one test, it reportedly blew through 1.72 million transactions per second, compared with 32,000 per second in MySQL using the InnoDB storage engine.

    The willingness to spend on big data has set the stage for a large pool of database providers, and many claim they have unique products. At the end of the day, it could be that enterprises will want multiple types of databases for multiple purposes. If that’s the case, Deep will need to add customers and use cases demonstrating that DeepDB can beat existing options in the transactional market as well as the hot analytic space.


  • DataRPM scores $250K, introduces Google-like big data searching

    More companies are realizing that analyzing their big data can lead to insights that increase revenue and produce other business breakthroughs. But getting good answers isn’t always easy, often requiring IT administrators to take charge and leaving all but a handful of business executives equipped to use software. One startup has a nice and simple idea for big data analytics: Google-like search.

    Fairfax, Va.-based DataRPM on Tuesday announced that it has raised $250,000 from angel investors and rolled out a new feature for its business-intelligence Software as a Service (SaaS) called Instant Answers. The feature uses natural-language processing to figure out what users want to see, based on typed or spoken queries, and displays the visualization that the software thinks is the best fit. Users can filter and comment on the results.
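
    At its simplest, mapping a typed question to a visualization can be sketched as keyword rules. Real products like DataRPM’s rely on fuller natural-language processing; the rules and chart names below are hypothetical.

```python
import re

# Hypothetical keyword rules: each pattern suggests a chart type.
RULES = [
    (r"\b(trend|over time|by month|by year)\b", "line chart"),
    (r"\b(share|percentage|breakdown)\b", "pie chart"),
    (r"\b(compare|top|by region|by product)\b", "bar chart"),
]

def pick_visualization(question):
    """Return the first chart type whose keywords appear in the question."""
    q = question.lower()
    for pattern, chart in RULES:
        if re.search(pattern, q):
            return chart
    return "table"  # safe default when no rule matches

print(pick_visualization("Show revenue trend by month"))  # line chart
print(pick_visualization("Compare top products"))         # bar chart
print(pick_visualization("List all open orders"))         # table
```

    The kind of learning the article speculates about would amount to refining these rules from observed queries rather than hand-writing them.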

    While methods and purposes vary, the idea of making software or a site respond to limited user input isn’t new. The approach reminds me of Facebook’s Graph Search, which rapidly delivers several options for search results based on likes, friends and other user information. Software from BeyondCore also comes to mind, as it quickly displays graphs and audibly speaks out its findings to show the biggest drivers of, say, revenue. BeyondCore CEO Arijit Sengupta took a few minutes of stage time at GigaOM’s Structure:Data conference in New York last month to show off the software.

    More natural-language processing and machine-learning technology could make DataRPM’s Instant Answers tool a better choice in a crowded market. Perhaps the SaaS could keep tabs on which data users call up and how users might modify their searches if they don’t get the data or visualizations they want the first time around. Later, it could predict what users want. The original Instant Answers is nevertheless a good start.

    Feature image courtesy of Shutterstock user anaken2012.


  • Nuage Networks SDN aims for easy connections in the rack and beyond

    Nuage Networks, a startup inside French network vendor Alcatel-Lucent, has come out with an overlay product for software-defined networking (SDN) inside and outside the data center and a plan for SDN-enabled wide-area networks for the enterprise. Carrying out that vision would show that the SDN concept can spread far beyond the data center.

    Nuage’s Virtualized Services Platform, which executives hinted at in January, incorporates a controller, virtual routing and switching, and a virtualized services directory. It operates in layers two through four of the network stack, building tunnels between virtual machines running in the same server rack or even in different racks at different data centers as easily and quickly as one cell phone can connect to another one, said Lindsay Newell, head of portfolio marketing for Alcatel-Lucent’s networks group. It works with cloud-management software from OpenStack, CloudStack and VMware.

    Trials of the Nuage software begin this month. Trial customers include telecommunications service providers SFR and Telus, as well as the University of Pittsburgh Medical Center. Nuage (French for “cloud”) will make the product commercially available within three months or so, Newell said.

    As soon as next year, he said, Nuage will follow up with software for establishing software-defined virtual private networks. It should be able to connect multiple enterprise network locations, including private or public data centers, through a virtual, programmable wide-area network, avoiding the need for expensive and time-consuming manual provisioning.

    That’s later. For now, Nuage will have to prove that its overlay product can deliver on its promises and do more than what it calls first-generation SDN offerings from Big Switch, Nicira and others.


  • Enterprises want SDN even if they aren’t sure what it is

    Most network administrators at enterprises in North America are talking about software-defined networking. Most think they have a pretty good handle on it. But not every one of them knows what SDN actually is, even though the term has been thrown around for three years or more. Far from it.

    Most network admins know what they want to do with SDN, which virtualizes physical networking gear so administrators can tweak their networks without unplugging boxes and moving cables. Admins want to deploy applications and services more quickly. Many also want to use it to cut down on mistakes and get new customers running as soon as possible. But problems loom. Admins expect SDN to cost a lot of money and possibly lead to security concerns. And they don’t think their colleagues know about SDN in the first place.

    Such are the contradictions that resulted from a February and March survey of 237 enterprise network decision makers, sponsored by the Swedish SDN company Tail-f Systems. The results point to the lack of real knowledge about what SDN is and how to implement it. There’s still no shortage of hype, though, and with vendors using different definitions and trying to drive their own standards, it’s hard for everyone to agree on common terms.

    So far, 65 percent of survey respondents said their companies are in SDN trials, have implemented SDN or are implementing it now. That number will keep going up. A February GigaOM Research report (subscription required) forecast that the global SDN market will swell from $320 million in 2014 to $2.45 billion in 2018. But how soon the hype will go away and genuine understanding will replace it is anyone’s guess.


  • Keen on proving its SDN with IaaS providers, Midokura raises $17.3M

    In 2010, the founders of Midokura set out to become the top provider of public Infrastructure as a Service (IaaS) clouds in Japan, but they realized networking challenges stood in the way. So they decided to focus on networking, and in the meantime the software-defined networking (SDN) space got hot. Now Midokura is trying to rack up more large-scale customers, particularly existing IaaS clouds, to validate its software’s capabilities.

    Toward that end, the company has secured $17.3 million in Series A funding, bringing the total the company has raised to $18.6 million. The Innovation Network Corporation of Japan led the round, which also included participation from NEC Capital Solutions and NTT Docomo Innovations Ventures.

    Midokura’s software virtualizes the network to let OpenStack cloud-management software rapidly create virtual networks and manage virtual routing and switching. Unlike some SDN startups, Midokura’s software does not use the OpenFlow networking protocol. Midokura executives see Nicira as a major competitor, and to a lesser extent Big Switch, even though those companies incorporate OpenFlow in varying degrees. Since releasing a beta version of its overlay in October, the company has found a customer in one large unnamed technology company in the United States, said Midokura’s chief strategy officer, Ben Cherian.

    Midokura executives are changing roles. Dan Mihai Dumitriu, chief technology officer and a co-founder, is becoming CEO, while the CEO until this point, co-founder Tatsuya Kato, is becoming board chairman.

    Midokura has managed to develop a product and take on venture funding after realizing its IaaS dreams weren’t going to pan out — not immediately, anyway — and changing course. Now, with Microsoft, Rackspace and especially Amazon becoming larger IaaS providers, the move looks like a good one. If Midokura can sign up a few IaaS customers, it could capture a foothold in the SDN market, which does not yet have clear leaders.


  • Welcome to the golden age of enterprise IT — and get used to it: It’ll be here for a while

    In case you didn’t notice, it’s the golden age of enterprise startups. And some investors believe it will continue for another seven years or more. No one is completely sure what will come after that, but it’s worth reflecting on what brought us here and what the dynamics are of this enterprise golden age.

    The pattern has become clear with recent funding deals for application-performance management (APM) provider New Relic ($80 million), APM player AppDynamics ($50 million) and data-copy reducer Actifio ($50 million), to name a few. Last year, VCs made 164 investments in big data startups (more than ever before), passing out $1.39 billion in total.

    Earlier this month, a big-data startup, SiSense, demonstrated its product by releasing figures — from CrunchBase, Wikibon, Nasdaq and other sources — showing that enterprise startups captured 40 percent more venture dollars per round in 2012 ($9.85 million) than they did in 2011. Meanwhile, consumer-focused web startups captured 45 percent less per round last year ($5.6 million) than they did in 2011.

    “What we found is enterprise startups definitely are more in vogue per round than in 2011,” said David Feinleib, a GigaOM Research analyst who looked at the findings. “Web startups are out of favor or falling out of favor.”

    Causes of the golden age

    Venture capitalists disagree about the exact set of disruptions or trends that set the stage for the golden age. But most agree on the big ones: bring your own device (BYOD), the adoption of cloud computing, and the growth of data sets and the need to make sense of it all.

    There are smaller trends, too, including software-defined networking (SDN), flash storage and cyberthreats. Any one of these alone would be enough to open the door to a wave of new investment. But add them together and mash them up — such as BYOD and cyberthreats, or big data plus BYOD — and you have entirely new sets of funding opportunities.

    The rise of the enterprise accelerator

    A related development is the slow but steady rise of accelerators specializing in preparing enterprise startups for growth and investments. Current programs include Acceleprise, the Citrix Startup Accelerator, the Microsoft Accelerator for Windows Azure, a similar program in Israel and TechStars Cloud. Plus, Upstart Labs, the startup accelerator based in Portland, Ore., is starting to focus mostly on enterprise this year and now has venture backing.

    In the past few months I’ve attended the first and second demo days of the Alchemist Accelerator, an enterprise accelerator in the San Francisco Bay Area. Ravi Belani, its managing director, said enterprise-focused VCs are excited about the events because until now, there hasn’t been a place for them to go when their consumer-oriented colleagues have attended Y Combinator events. “Finally there’s a place they can go where they can fund and meet entrepreneurs,” said Belani, a former associate at DFJ. With programs like Belani’s, it’s easier for VCs to find companies that are more prepared to take on funding, so they can spend more time making bigger deals and meeting their own objectives.

    The programs also have helped founders who have loads of talent but less knowledge about business strategy and fundraising. Of the nine startups that went through the first class, five have made venture deals, he said.

    How long will it last?

    Belani and most other investors I spoke with believe the enterprise golden age started in or around 2010 and will last until about 2020.

    It’s only natural to expect the trend to last for at least a decade. “The 1980s was about the whole PC era,” said Navin Chaddha, managing director of the Mayfield Fund and the lead in the firm’s $6.1 million Series A investment in software-defined storage company SwiftStack. “The 1990s was about the internet. The 2000s was about virtualization.”

    Now Chaddha sees Software-as-a-Service (SaaS) applications taking over, mobile devices entering the workplace, IT moving from on-premise hardware to cloud computing and big data adding up and needing strong analysis. With those megatrends, IT needs to spend. And so VCs are excited about funding enterprise startups “for a decade or the next two decades,” Chaddha said.

    What comes next?

    Lots of people want to know what will come after the golden age of enterprise startups. Shirish Sathaye, a general partner at Khosla Ventures, thinks it’s possible that blends of consumer and enterprise technology will become the next big thing. “We already see that happening with the consumerization of enterprise IT,” he said.

    But really, he can’t be sure. The year 2020 is a long time from now. Call him in 2017, and he might have a better idea.

    Feature image courtesy of Shutterstock user f9photos.


  • 3 shades of latency: How Netflix built a data architecture around timeliness

    Like other companies operating at webscale, Netflix knows that processing and serving up lots of data — some to customers, some for use on the backend — doesn’t have to happen either right away or never. It’s more like a gray area, and Netflix detailed the uses for three shades of gray — online, offline and nearline processing — in a post on its tech blog on Wednesday.

    The Netflix way of processing data online, offline and nearline.

    The whole point of its data architecture is to tackle latency by pointing workloads and tasks toward systems designed to work at their speed. People love to think about Hadoop when they think about web data, but the reality is that relying solely on batch processing means data can get stale and applications probably don’t include the newest user input.

    Netflix uses online processing for receiving information from users in real time and serving up responses right away, such as using a new rating or some other customer action to change the set of movies shown to the customer. Real-time processing works best when algorithms are relatively simple and when data is on the smaller side. The data feeding into computations must also be available right away.

    Nearline processing happens when the data needs to be computed in real time but can be stored for serving up at a later point in time. This option makes sense when computations are more complex and are amenable to a more-traditional database-oriented approach. Netflix uses a variety of databases, including MySQL, the NoSQL Cassandra database and its own homemade EVcache system.

    Offline processing in Netflix’s world might also be called batch processing — think bigger and longer-term Hadoop jobs. It also fits for compute-heavy projects to train new models that will come into use at a later date. And it’s a backup for situations when real-time processing isn’t possible.
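    The three tiers can be thought of as a routing decision driven by each task’s latency budget and computational weight. The sketch below is purely illustrative — it is not Netflix’s actual code, and the tier thresholds, task names and handler structure are all assumptions made for the example.

    ```python
    from dataclasses import dataclass
    from enum import Enum

    class Tier(Enum):
        ONLINE = "online"      # compute and respond to the user immediately
        NEARLINE = "nearline"  # compute now, store the result for later serving
        OFFLINE = "offline"    # defer to long-running batch (e.g., Hadoop) jobs

    @dataclass
    class Task:
        name: str
        max_latency_ms: float  # how long the caller can wait for a result
        complex_model: bool    # heavy computation better suited to batch

    def route(task: Task) -> Tier:
        # Cheap computations with tight deadlines stay online.
        if task.max_latency_ms <= 100 and not task.complex_model:
            return Tier.ONLINE
        # Results needed soon but not instantly: compute now, serve later.
        if task.max_latency_ms <= 60_000:
            return Tier.NEARLINE
        # Everything else (model training, backfills) goes to batch.
        return Tier.OFFLINE

    print(route(Task("rerank-on-new-rating", 50, False)).value)      # online
    print(route(Task("update-similar-titles", 5_000, True)).value)   # nearline
    print(route(Task("train-recommender", 86_400_000, True)).value)  # offline
    ```

    The point of a dispatcher like this is that batch processing becomes a deliberate choice for latency-tolerant work rather than the default for everything.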

    This online-nearline-offline approach is fairly common among web companies that understand that different applications can tolerate different latencies. LinkedIn has built its data infrastructure with the same general theory in mind. Facebook, too, has thought deeply about this. The social network recently detailed a new memcached-like data store called McDipper that forgoes DRAM for flash in order to cut costs for tasks that can live with slightly higher latency.


  • Startup Chronon looks to replace logging and record apps instead

    Look out, Splunk. Chronon, a startup that just picked up $30,000 in seed funding through the second class of the Alchemist Accelerator, wants to take a different approach to keeping and reviewing log files — it wants to ditch them in favor of something that takes much less time to deploy, allowing for quicker data review.

    Like a TiVo or other digital video recorder, Chronon records live data on the use of Java programs and lets developers and business people check out the resulting databases and performance charts later.

    Companies can use Chronon on top of log-monitoring software such as Splunk, Sumo Logic and Loggly, but founder and CEO Prashant Deva expects customers of both products to look at logs less and less. Gradually, Deva believes his sort of push-to-use code for recording and replaying will replace logging altogether.

    Chronon, based in San Mateo, Calif., has already signed up Disney, HSBC, Nokia, Sony, Pearson and other companies as customers. It wants to take on more funding and further build out its product. But Splunk has gone public and gotten lots of attention. Persuading many enterprises to try another, different model won’t be easy.


  • Wal-Mart is arming itself for fierce retail battles with better search, social streams and more

    Wal-Mart doesn’t develop its digital initiatives and products in the recesses of a 200,000-square-foot Supercenter or at corporate headquarters in northwest Arkansas. Innovations for the Walmart.com website come out of San Bruno, Calif., and Wal-Mart executives and media handlers didn’t try to hide a foosball tournament, a pingpong game and a hip-looking workforce during a media tour on Tuesday of the Wal-Mart eCommerce San Bruno facility. The company is proud of those status symbols and is actively pushing the startup narrative.

    Wal-Mart eCommerce is the largest startup in the world, executives said. Employees participate in hack days and work in small teams to launch digital products in months. But with 10,000 stores internationally, Wal-Mart is the largest retailer by revenue in the world — Amazon’s annual net sales are about 13 percent of Wal-Mart’s — and therefore has much to lose. Fierce online retail competition from Amazon, Google and others poses challenges, as customers have no problem buying from a different company if the price and the product are right.

    As same-day shipping and other retail breakthroughs become possible, the Wal-Mart eCommerce crew is taking steps to meet customers’ retail needs online as well as or better than it does in store, and it’s also trying out the sorts of projects that could secure repeat business from a more up-market crowd.

    How? Employees in San Bruno are adding features to desktop, tablet and mobile web presentations. They’re letting internal buyers take advantage of real-time social-media activity, sales information and other data. And they’re building out an international development platform that will be common to all the tech stacks under the Wal-Mart Stores umbrella, enabling quicker deployment of more useful searching and browsing tools and other online features.

    What customers see online

    Last year Wal-Mart premiered a fine-tuned search engine called Polaris. The search engine has since increased sales resulting from search engine use by 20 percent, said Joel Anderson, president and CEO of the United States Walmart.com site.

    If customers want to browse instead of search, they will soon have more ways to do so. A new homepage being rolled out lets customers check out hot items in different categories. A more personalized experience showing the top buys in geographical regions could be coming, too. (Amazon, for its part, surfaces new and popular items in categories and makes suggestions based on customers’ product-browsing history and stated preferences.)

    Category-specific pages that came out on Walmart.com last year display the products available through the site that got the most pins on Pinterest. Later this year the model will become more powerful and widely available. Customers will be able to see the top products and colors being pinned on Pinterest, for example.

    The Pinterest method shows site users the sorts of products they might not necessarily think of when they think of buying from Wal-Mart. (A different @Walmart Labs initiative, the Goodies subscription service for delivery of cool gourmet foods, could also appeal to a different demographic of customers than the usual Wal-Mart shoppers, as my colleague Ki Mae Heussner wrote last year.)

    Wal-Mart has been throwing resources behind search-engine optimization, too, to make sure consumers can spot its websites and products in the first place.

    A forthcoming Walmart.com mobile app will let customers speak into their mobile devices or scan barcodes on products at home to start shopping lists. When customers enter stores, the app will display the aisles where desired products sit on shelves. It also will allow customers to summon associates for assistance, apply coupons, scan items for prices and store digital copies of receipts. The existing Scan & Go feature, which entails scanning everything while walking around a store and then using a smartphone to transfer the order to a register, has caught on with customers, and Wal-Mart is tripling the number of stores where it’s possible, to more than 200. @Walmart Labs acquisition of Grabble laid the groundwork for this technology, and now Wal-Mart is building on it.

    What Wal-Mart’s buyers see

    Wal-Mart’s buyers can get a sense of what they should stock online and in stores by checking out pins on Pinterest. Top pins feed into a new social-media analytics dashboard for buyers. So do reports from Twitter that engineers have carved out from the entire Twitter firehose, allowing for better insight into consumers’ thoughts on products. Buyers can see when the number of tweets on, say, gel nail polish peaked and see which colors were the most popular in which locations.

    On top of that, Wal-Mart’s buyers have access to sales data, which they can overlay on top of Twitter graphs. That way, they can anticipate demand and respond to it with supply.

    The social feeds show buyers the desires of more social media-savvy customers, so buyers can venture beyond traditional top sellers with a bit more assurance of what customers say they want.

    Integrating the back end

    For years, engineers on the back end of Walmart.com, Sam’s Club and other online entities of Wal-Mart Stores subsidiaries have worked with a wide variety of software, and now they are unifying it all on one simple in-house platform called Pangaea.

    Engineers are “trying to set up systems that will connect every product in the world with every customer in the world,” said Jeremy King, senior vice president and chief technology officer of Wal-Mart Global eCommerce. And that means technology needs to be uniform for desktop sites, mobile sites and stores.

    As online and offline orders come in around the globe, Wal-Mart needs to be able to track inventory and move around products on the fly. That ability has been introduced to Sam’s Club and other properties, and it’s now being expanded.

    Different sites have different methods of handling shopping carts, checkout, payments, fraud detection, taxes, personalization and wish lists, often relying on third-party offerings. “All these sort of core capabilities … are being built,” King said. But there needs to be wiggle room. In some countries, customers don’t plunk down the total amount for a big purchase, such as a television; rather, they pay in installments. Sites need to be able to handle a variety of payment structures.

    Pangaea will also enable Wal-Mart subsidiary sites around the world to take advantage more easily of the algorithms and user-experience improvements coming out on Walmart.com.

    The advantages can benefit operations internally, and customers can get better service, too, said Sri Subramaniam, vice president of @Walmart Labs.

    Online work apparently paying off

    At first, the whole startup vibe evident at Wal-Mart eCommerce is jarring, because it’s so different from the conventional Wal-Mart retail experience. But the Wal-Mart annual report released Tuesday suggests that the company is heading in the right direction. The company posted $466.1 billion in net sales in its fiscal year that ended on Jan. 31. The United States portion, comprising 59 percent of net sales, was up almost 4 percent over the previous year, beating the roughly 1.5 percent rise from the fiscal year ending Jan. 31, 2011, to the one ending in 2012. To ensure that Wal-Mart Stores doesn’t lose customers to Amazon and others, the company will have to keep improving its digital front and back ends. Scalability challenges make it interesting to watch on the tech side, and Wal-Mart competitors could pick up some ideas, too. The question is what customers will think of it all.


  • Tidemark beefs up visualizations with infographics instead of bar charts and line graphs

    Tidemark is putting a fresh spin on its cloud-based enterprise performance management applications with infographic-like visualizations of financial planning data, showing that the consumerization trend popular with IT is fit for finance and operations staffers, too.

    Sidestepping static dashboards, Tidemark’s shareable Storylines feature lends itself to adoption by people beyond the chief financial officer, according to founder and CEO Christian Gheorghe. The idea was born a few months ago, Gheorghe said.

    He was “frustrated with the fact that we have had all these years to understand businesses better with all kinds of analytical tools, and yet most people can create a fantasy football pool to see (who can) win faster than they can understand how their company is doing from a performance perspective,” he said. The result — infographic formats based around company health, workforce, profitability and other areas — is a “home experience brought to work, not the other way around,” Gheorghe said.

    Tidemark takes less time to implement than legacy offerings from IBM and Oracle — 90 days instead of six to nine months, Gheorghe said. Tidemark, which has raised $35 million from Andreessen Horowitz and Greylock Partners, among others, competes for market share in the cloud enterprise performance management space alongside Adaptive Planning, Anaplan and Host Analytics. Whether or not infographics are the best means, the appealing presentation calls to mind the cool visualizations more common in business intelligence, and Tidemark’s competitors might have to come up with their own answers to keep up.


  • Wal-Mart to try out locker system to let customers buy online, pick up offline

    Wal-Mart Stores Inc. executives know that more people want to buy stuff outside of stores. As part of its multi-pronged Wal-Mart eCommerce strategy, the retailer with 4,000 stores in the United States and 10,000 worldwide has been taking orders online and letting customers pick up their purchases in store for the past few years. Now it’s adding another option that other large online retailers have been working on, too: pick it up from a locker at the store, with no Wal-Mart employee involved.

    The locker method will debut this summer at fewer than a dozen stores, probably in one geographic region in the United States. “Really, it’s to test and learn,” said Jeff McAllister, senior vice president of Wal-Mart U.S. Innovations, during a media day at the company’s Wal-Mart eCommerce facility in San Bruno, Calif., on Tuesday. It’s also a way the retailer can match its online rival, Amazon.com. Amazon has been offering customers lockers in several cities, to serve customers who order goods but might not have doormen or a front porch where they can receive packages.

    With this program, Wal-Mart will box up the items a customer purchases and squeeze the box into a locker that fits just right, McAllister said. Then Wal-Mart will send the customer a message with a code that will allow the customer to open the locker. The customer will initially have a couple of weeks to pick up the purchased items.
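    Wal-Mart hasn’t described how it actually picks lockers, but the “fits just right” step suggests a smallest-locker-that-fits policy. The sketch below is a hypothetical illustration of that idea — the locker IDs, volumes and code format are all made up for the example.

    ```python
    import secrets

    def assign_locker(package_volume, free_lockers):
        """Pick the smallest free locker that holds the package and issue a pickup code.

        free_lockers maps a locker ID to its capacity (same units as package_volume).
        Returns (locker_id, pickup_code), or None if nothing fits.
        """
        candidates = [(vol, lid) for lid, vol in free_lockers.items()
                      if vol >= package_volume]
        if not candidates:
            return None  # no locker fits; fall back to counter pickup
        _, locker_id = min(candidates)  # smallest adequate locker wins
        code = secrets.token_hex(3)     # 6-hex-digit code sent to the customer
        return locker_id, code

    lockers = {"A1": 10.0, "B2": 25.0, "C3": 60.0}
    locker_id, code = assign_locker(18.5, lockers)
    print(locker_id)  # B2 — smallest locker with capacity >= 18.5
    ```

    Tight packing matters here: assigning the smallest adequate locker keeps the larger ones free for bulkier orders.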

    This is clearly a bid to burnish its online experience, because most people are close enough to visit a physical Wal-Mart store if they want to. Two-thirds of people in the United States are within five miles of a Wal-Mart store.

    Like other Wal-Mart eCommerce initiatives, if it works, it will expand. If it doesn’t, it won’t. Like similar locker experiments from Amazon, Overstock and others, it’s worth watching as a shot at making retail fit the digital age.
