Author: Jordan Novet

  • Druva makes endpoint backup software available for private clouds

    Druva is launching a private-cloud version of its cloud-based InSync backup service for corporate documents on desktop computers, laptops, tablets and smartphones.

    Like the company’s main public cloud backup version for enterprises, the private cloud offering of InSync includes automatic backup of customer documents, secure file sharing, data loss prevention, remote wipe and other features. It’s been in beta for four or five months, said Jaspreet Singh, a co-founder and the company’s CEO.

    Druva’s software incorporates a de-duplication feature that can save precious space in cloud storage. To illustrate: a file, say a Druva customer’s spreadsheet of potential sales leads, might already be stored in the customer’s cloud. If one employee then modifies a few cells on a smartphone, the de-duplication function on the device detects the change and updates only the altered parts of the document in the cloud. The company claims 90 percent bandwidth and storage savings from de-duplication.
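
    To make the mechanics concrete, here is a minimal sketch of block-level de-duplication, assuming fixed-size blocks and SHA-256 fingerprints. It illustrates the general technique only; Druva’s actual implementation is proprietary, and the block size and function names here are hypothetical.

    ```python
    import hashlib

    BLOCK_SIZE = 4096  # bytes per block; real products tune this or use variable-size chunking

    def block_hashes(data: bytes) -> list[str]:
        """Split a file into fixed-size blocks and fingerprint each one."""
        return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
                for i in range(0, len(data), BLOCK_SIZE)]

    def changed_blocks(old: bytes, new: bytes) -> list[int]:
        """Return indexes of blocks whose fingerprints differ.

        Only these blocks need uploading; identical blocks are already in the
        cloud, which is where the bandwidth and storage savings come from.
        """
        old_hashes, new_hashes = block_hashes(old), block_hashes(new)
        return [i for i, h in enumerate(new_hashes)
                if i >= len(old_hashes) or h != old_hashes[i]]

    if __name__ == "__main__":
        stored = b"A" * 20_000                 # stand-in for the spreadsheet already in the cloud
        edited = bytearray(stored)
        edited[10_000:10_005] = b"BBBBB"       # a small edit, like changing a few cells
        print(changed_blocks(stored, bytes(edited)))  # [2]: one of five blocks re-uploads
    ```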

    After signing up with Druva, one unnamed customer dropped its storage footprint from an average of 13 or 14 petabytes to 1.5 petabytes, Singh said, a reduction of roughly 90 percent, in line with the company’s claim.

    Why start offering a private-cloud option? It’s a matter of appealing to businesses that want the security and financial advantages of keeping data on premises but also want the flexibility of the cloud, Singh said. The company plans to add more features to the private-cloud version of InSync in the coming months.

    Druva, which has offices in Sunnyvale, Calif., London and Pune, India, faces competition from the Connected software from Autonomy, which Hewlett-Packard acquired in 2011, and Symantec’s Backup Exec 2012. Dropbox and Box also play in the space.

  • Think Big Analytics wants to help companies make the most of Hadoop

    A big-data-analytics consulting company called Think Big Analytics formally launches Tuesday with $3 million in seed funding. Established in 2010, Think Big seeks to help companies make the most of their data in the most cost-effective ways.

    Former Cisco executive Dan Scheinman led the round of funding for Think Big, which is based in Mountain View, Calif. WI Harper Group joined in the round, too. The company will primarily use the seed funding to add people to its data science and data engineering teams.

    When Think Big first started, many customers didn’t know what big data was, said Ron Bodkin, Think Big’s CEO and a co-founder. Use cases have evolved since then, he said, and demand for big data analytics has grown.

    Think Big has already worked with NetApp and Quantcast, as well as a U.S. telecommunications company. Use cases vary from early-stage adoption to quests for better efficiency. Think Big helped a large retailer build a big-data architecture that delivers tailored recommendations to customers based on their input during in-store visits, phone calls and mobile interactions. It also guided a different retailer as it migrated from a legacy Teradata Corp. appliance to a Hadoop cluster, dropping query times from six hours to four minutes.

    Think Big employees have a few ways to respond to the demand. They could assist a client in prioritizing use cases before working with a Hadoop vendor such as Cloudera or Hortonworks, Bodkin said. A client could also send as many as 20 developers to a three-day hands-on Think Big course covering Hadoop, MapReduce, Hive, Pig and other topics. Courses for non-developers and executives are also available.
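
    For a flavor of what the hands-on coursework covers, here is the canonical word-count job written as a Hadoop Streaming mapper and reducer in Python, a common first exercise in Hadoop training. This is a generic example, not material from Think Big’s actual curriculum.

    ```python
    #!/usr/bin/env python3
    # mapper.py -- Hadoop Streaming feeds input lines on stdin; emit "word<TAB>1" pairs.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")
    ```

    ```python
    #!/usr/bin/env python3
    # reducer.py -- Hadoop sorts mapper output by key, so all counts for a word arrive together.
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")
    ```

    Tested locally, the pipeline is just `cat input.txt | ./mapper.py | sort | ./reducer.py`; under Hadoop, the same two scripts are passed to the hadoop-streaming jar via its `-mapper` and `-reducer` flags.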

    Think Big’s biggest obstacle isn’t other companies so much as clients’ reliance on tried-and-true methods for analyzing data, said Bodkin, who will speak at our Structure:Data conference in March.

  • IBM adds analytics-specific boxes to PureSystems line

    IBM is again expanding its PureSystems line of converged hardware, this time with machines built specifically for analytics. When the company debuted the first PureSystems, which combine compute, networking and storage, in April, Big Blue said it had spent three years and $2 billion developing the line.

    The company’s new PureData System for Analytics features Netezza technology, which enables analytics to run inside the database. The New York Stock Exchange is already using the system to spot trading peculiarities that might warrant investigations, said Phil Francisco, vice president of big data product management at IBM.

    “They keep track of every element of trading activity on their trading floors daily,” Francisco told me. “… And they’re able to do analysis on seven years of data.” The system also monitors systems-level data and shows if there’s enough capacity to handle major changes in trading volumes.
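
    To illustrate the in-database idea in general terms: rather than exporting seven years of trades to an analysis client, the heavy computation is expressed in SQL and executed where the data lives, so only flagged rows cross the network. The sketch below uses a hypothetical table, hypothetical column names and a generic Python DB-API connection; it is not IBM’s or the exchange’s actual query.

    ```python
    # Hypothetical illustration of in-database analytics: the aggregation runs
    # inside the warehouse, and the client fetches only the anomalies.
    FLAG_UNUSUAL_VOLUME = """
    WITH stats AS (
        SELECT symbol, AVG(volume) AS avg_vol, STDDEV(volume) AS sd_vol
        FROM daily_trades
        GROUP BY symbol
    )
    SELECT t.symbol, t.trade_date, t.volume
    FROM daily_trades t
    JOIN stats s ON s.symbol = t.symbol
    WHERE ABS(t.volume - s.avg_vol) > 4 * s.sd_vol  -- flag volumes far outside the norm
    """

    def find_peculiar_trading(conn):
        """Run the heavy lifting in-database; only flagged rows cross the wire."""
        cur = conn.cursor()          # any DB-API-compliant connection works here
        cur.execute(FLAG_UNUSUAL_VOLUME)
        return cur.fetchall()
    ```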

    To capitalize on growth markets and do business with companies that have smaller IT infrastructure budgets, IBM is also releasing a miniature version of the PureApplication system for quickly deploying applications.

    Also new is a PureFlex converged system, with compute, networking and storage all in one, targeting managed service providers. The system eliminates much of the setup and system administration work, which can cut costs. It also simplifies the process of adding infrastructure capacity in the data center, Francisco said.

    IBM arrived at the converged-hardware party well after Oracle unveiled its Exadata Database Machine, a database appliance, and several other specialized boxes, including the Big Data Appliance. EMC is also in this market with the Greenplum Data Computing Appliance.

    In October, following the introduction of the PureData line, my colleague Stacey Higginbotham questioned whether big data needs a specialized box. The real development, she wrote, was not the technological achievement but the acknowledgment that providing easy-to-use services is important.

  • Monetizing the personal cloud could involve a coffee shop — and banks

    Could you host your data in a personal cloud and make money off it, too? A San Francisco-based company called The Respect Network is building the foundation to do just that, by letting you permit businesses to access certain personal information.

    For example, at a coffee shop, as you pay for a latte, you’d use your smartphone to scan the QR code on a sticker by the cash register and sign up for custom offers in exchange for your email address.

    The Respect Network plans to start operating its platform in the first quarter of next year, said Drummond Reed, the company’s managing director.

    “Everyone wins,” Reed said. “You get this incredible service, (the coffee shop) gets this deeper, trusted relationship, and away we go.”

    Respect Network Managing Director Drummond Reed

    When I heard about the Respect Network at the Personal Data Ecosystem Consortium’s San Francisco meetup last week, the personal-privacy angle intrigued me. Instead of being pulled into data sharing by signing up for, say, Facebook, the Respect Network wants individual users to have control and give the OK for each data grab. That’s neat.

    But as I talked with Reed, the company’s business model sounded just as original. It resembles that of credit card networks such as Visa and MasterCard. They receive processing fees for moving money from buyers to sellers and eventually pass on rebates to consumers. Similarly, the coffee shop and other Respect Network business clients will pay an annual fee for access.

    The Respect Network starts off as a peer-to-peer network on which people can share documents on the fly that can be updated in real time, sort of like Google Drive. Each user’s documents will stay on a personal cloud, whether he or she operates it or someone else does. The idea is to host user data free of charge, just as Google Drive has always been free for consumers. But instead of agreeing to a large company’s terms of service and privacy policy, the personal-cloud network will run on the Respect Trust Framework, which mandates that a user’s personal information and data cannot be shared without his or her permission.
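
    As a rough sketch of what “no sharing without permission” could look like in code, the snippet below models a personal cloud that releases a field only to a business the owner has explicitly authorized. Every class, field and business name here is a hypothetical illustration; the Respect Network has not published an API like this.

    ```python
    class PersonalCloud:
        """Toy model: data leaves the cloud only with an explicit, per-business grant."""

        def __init__(self, owner: str):
            self.owner = owner
            self._data: dict[str, str] = {}          # e.g. {"email": "alice@example.com"}
            self._grants: dict[str, set[str]] = {}   # business -> fields it may read

        def store(self, field: str, value: str) -> None:
            self._data[field] = value

        def grant(self, business: str, field: str) -> None:
            """The owner opts in, e.g. by scanning the coffee shop's QR code."""
            self._grants.setdefault(business, set()).add(field)

        def read(self, business: str, field: str) -> str:
            if field not in self._grants.get(business, set()):
                raise PermissionError(
                    f"{self.owner} has not granted {business} access to {field!r}")
            return self._data[field]

    cloud = PersonalCloud("alice")
    cloud.store("email", "alice@example.com")
    cloud.grant("corner_coffee", "email")
    print(cloud.read("corner_coffee", "email"))  # allowed: the shop can send its offers

    try:
        cloud.read("ad_network", "email")        # never granted
    except PermissionError as err:
        print(err)
    ```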

    The business model will work on top of the peer-to-peer network. I will be able to go to a Respect Network-enabled business — a coffee shop, a grocery store or some other consumer-facing business — and agree to let it send me the custom offers and see certain personal data. The business pays the Respect Network, which pays some money to the company maintaining my personal cloud, which in turn can give me a rebate for posting data in the first place.

    “It fundamentally boils down to the merchant paying for the value of having these very convenient electronic payment options for you, the customer,” Reed told me later.

    It turns out banks are interested in getting involved with the Respect Network, possibly as providers of the personal cloud, Reed said. They see commonalities between credit-card networks and the Respect Network.

    To be sure, other companies have been looking to monetize the personal cloud, but methods vary. Instead of paying a publisher for access to exclusive content, consumers can opt to let Enliken send publishers data about their internet searches and other information. An application called Xenapto lets entrepreneurs host business documents and transfer them to potential investors, and it can also facilitate investments. Personal, which stores users’ passwords and credit-card information to accelerate the sign-up process, is planning a premium service for consumers and businesses alike. Meanwhile, my colleague Derrick Harris has argued that Facebook could pay users for sharing their data, in response to the notion that it profits off user data.

    Despite the complexity of the Respect Network’s business model, it could make for a whole new business opportunity for cloud infrastructure providers. If banks or other companies decide to host Respect Network users’ personal clouds, they won’t be able to do it without hardware.

  • IBM picks up another big-data analytics company, Star Analytics

    IBM is buying still more analytics expertise, picking up the software portfolio of Star Analytics, a company that specializes in automating the movement and management of workloads in hybrid computing environments.

    The company’s Star Command Center is certified to run on Amazon Web Services, Oracle on Demand and Microsoft Azure infrastructure.

    According to IBM’s statement announcing the deal on Friday, Star’s software helps “automatically integrate essential information, reporting applications and business intelligence tools across their enterprises, on premise or from cloud computing environments.”

    As my colleague Derrick Harris wrote last month, the buzz around big data could be more about automation than about discovering insights in the data.

    Since 2005, IBM has spent $16 billion in 35 acquisitions of companies that deal in big data or analytics, a spokeswoman wrote in an email. Terms of this deal were not disclosed.

    In 2011, IBM picked up the supply-chain and contract-management analytics company Emptoris. And last year it bought Varicent Software, which analyzes sales data, and Vivisimo, a provider of federated discovery and navigation software.

    Currie Boyle, an IBM distinguished engineer who works on analytics of unstructured content, and other data luminaries will speak at GigaOM’s Structure:Data conference in New York in March.

  • Data centers haven’t just changed computing, they’ve changed communities

    The technology inside data centers sparks immense industry analysis and speculation. After the media leave, though, data center towns themselves can be ignored, even as they change.

    I saw this up close while working as a business reporter at the Bulletin daily newspaper in Bend, Ore., a 45-minute drive from Prineville, where Facebook and Apple have been constructing data centers and where other companies have considered building similar structures.

    To me, the arrival of the data centers turned Prineville’s folksy mayor, Betty Roppe, into an advocate and a celebrity. What’s more, her city has become known as a destination for cloud computing.

    Like Prineville, data center meccas Loudoun County, Va., and Quincy, Wash., have also gone through changes — an economic-development focus, a housing boom and at least one environmental issue.

    Prineville, Ore.

    Nine thousand two hundred fifty-three people called Prineville home in 2010, according to U.S. Census data, and Mayor Betty Roppe doesn’t make it seem bigger than it is. She’s proud of her city for attracting Apple and Facebook, and at the same time she’s not afraid to be honest about not being tech-savvy.

    Sen. Jeff Merkley of Oregon, left, and Prineville Mayor Betty Roppe

    “I’m kind of in that senior-citizen group that’s not as comfortable with computer systems,” said Roppe, adding that while she has signed up for Facebook, she doesn’t use cloud-storage products such as Dropbox and Evernote.

    But on Facebook, Roppe is a friend of Ken Patchett, the manager of Facebook’s Prineville data center. She’s even taking care of Patchett’s border collie now that Patchett lives in a place where dogs are not allowed, she said.

    In 2011 she appeared alongside a Facebook executive to promote Facebook business profiles. She came on stage at the Facebook data center’s grand opening that year, suggesting local approval of the social networking site’s presence. And she’s gone to Washington, D.C., three or four times to show support for federal legislation that would ensure water and hydroelectric power access for the city — two resources that data centers covet.

    The city of Prineville is considering the construction of a roundabout to ease traffic near the Apple and Facebook data centers.

    Reporters from the New York Times, Dow Jones Newswires and Data Center Knowledge have quoted her, while The Economist, Wired, GigaOM and other media outlets have visited her city.

    In addition to the sudden media interest, the city has seen other changes. Prineville city planning staffers have dealt with legal matters stemming from executives’ concerns about regulatory issues in play in California. And additional traffic on nearby state Highway 126 has prompted the Oregon Department of Transportation to lower the speed limit from 55 mph to 45 mph east and west of the roads leading to the Facebook and Apple data centers. Prineville officials are even discussing the construction of a roundabout to ease traffic (see map).

    Quincy, Wash.

    When people in Prineville talk about adding more data centers, they are often thinking of Quincy, Wash.

    Quincy, a five-hour drive north of Prineville, is home to about 7,000 people and five data centers — Microsoft, Yahoo, Intuit, Dell and Sabey Corp. — with a sixth from Vantage Data Centers in development.

    Quincy Mayor Jim Hemberry

    Over the years, the increased sales tax revenue has helped the city build a library, purchase a new ladder truck for the fire district serving Quincy and add a bevy of additional equipment for the Quincy Police Department, said Mayor Jim Hemberry.

    Property tax revenue from the data center operations has allowed the city to add employees, even through the economic recession, the mayor said.

    The data center cluster “hasn’t been an issue that has affected the (Quincy) population in any negative way, in my opinion,” Hemberry said.

    If anything, the rise of Quincy as a data center hub has brought attention from other industries, contributing further to the city’s employment base and economic diversity.

    Plus, he said, “We’ve had a lot of new housing starts.” In a typical year, five to 10 homes are built. Now, it’s more like 400 to 500.

    But just because a data center goes up in a city doesn’t mean the mayor becomes an advocate. Patty Martin, a former mayor of Quincy, has become a prominent critic of data centers’ nearby backup diesel generators, which appear to cause air pollution. She has challenged Washington’s Department of Ecology on its decision to grant permits for Microsoft to build more generators. Hemberry declined to comment on the issue.

    Loudoun County, Va.

    AOL was the first company to construct a data center in Loudoun County, Va., in 1997. Then came Equinix Inc. and MCI Worldcom, which Verizon Communications Inc. acquired. But only in the past six years has northern Virginia become a hot spot for data centers, and colocation specifically, said Buddy Rizer, assistant director of the county’s economic-development department. (North Carolina, which shares a border with Virginia, has also seen an influx of data centers, as my colleague Katie Fehrenbacher reported in a four-part series.)

    Buddy Rizer, assistant director of Loudoun County’s economic-development department

    Rizer himself has gone from a general-purpose economic-development staffer with an IT bent to focusing nearly exclusively on data center retention and recruitment. Rather than send out county supervisors to communicate with data center operators, the elected officials have Rizer take care of it.

    Today the county boasts 8 million square feet of data centers in operation or under construction and sees as much as 70 percent of all internet traffic passing through its facilities, according to a county fact sheet. Big cloud providers such as Amazon Web Services and Rackspace keep servers in Ashburn, among other places.

    Given all of that progress, Rizer said he travels to other states and countries to talk about the county’s achievements in data-center development.

    When asked where the data centers lie inside the county, Rizer said, “Primarily they are in Ashburn, but, even more concentrated than that, they’re in a place that we call Data Center Alley, up and down Loudoun County Parkway and in the area of Waxpool.” And yes, he did come up with the name Data Center Alley.

    Over the years, the county has streamlined the process of building a data center there with a “Fast Track for Priority Commercial Development.” Staffers have lined up the right zoning for potential development sites, and county supervisors have shown support for expanding exemptions of Virginia’s sales and use taxes on new computer equipment.

    On top of it all, the county has moved most of its documents to a private cloud, Rizer said.

    “If you’re out selling yourself as a technology location, you want to make sure that you can walk the walk and talk the talk,” he said.

  • Facebook experiments with small-scale software-defined networking

    Martin Casado, co-founder and chief technology officer of Nicira, which VMware bought last year for $1.26 billion, has gotten tired of people discussing and agreeing on the importance of virtualizing networks. He wants to get out there and start solving networking problems with network virtualization.

    He said as much on Wednesday at a Churchill Club forum entitled “Is Software-Defined Networking the Next Revolution?” at Ericsson’s office in San Jose, Calif.

    It turns out Facebook wants to make networking more efficient, too. Najam Ahmad, director of network engineering at Facebook, said so while sharing the stage with Casado and others at the event. Ahmad is looking for potential solutions to problems such as getting applications to send packets onto servers’ network-interface cards and getting confirmation that that’s happened. No more lost packets that you don’t know are lost.
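
    The lost-packet problem Ahmad describes comes down to visibility: the sender should know which packets were never confirmed rather than losing them silently. As a generic illustration only, not Facebook’s design, a sequence-number-and-acknowledgment scheme looks roughly like this:

    ```python
    import time

    class TrackedSender:
        """Tag each packet with a sequence number and remember it until acknowledged."""

        def __init__(self, transport):
            self.transport = transport           # anything with a send(bytes) method
            self.next_seq = 0
            self.unacked: dict[int, float] = {}  # seq -> time handed to the network

        def send(self, payload: bytes) -> int:
            seq = self.next_seq
            self.next_seq += 1
            self.transport.send(seq.to_bytes(8, "big") + payload)
            self.unacked[seq] = time.monotonic()
            return seq

        def on_ack(self, seq: int) -> None:
            self.unacked.pop(seq, None)          # confirmed: the packet made it

        def known_lost(self, timeout: float = 1.0) -> list[int]:
            """Packets with no ack after `timeout` seconds are lost -- and we know it."""
            now = time.monotonic()
            return [s for s, t in self.unacked.items() if now - t > timeout]

    class NullTransport:
        def send(self, frame: bytes) -> None:
            pass                                 # pretend to put the frame on the wire

    sender = TrackedSender(NullTransport())
    sender.send(b"hello")
    time.sleep(1.1)
    print(sender.known_lost())                   # [0]: unacknowledged, and now known-lost
    ```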

    “We’re just starting out (with software-defined networking),” Ahmad said. “I wouldn’t say by any means that we’re using it wide-scale. We’re looking at use cases and developing that more prototype stage in that sense.”

    But use cases are hard to come by. Even though AT&T, eBay, Fidelity Investments, NTT and Rackspace have implemented Nicira’s Network Virtualization Platform, as my colleague Stacey Higginbotham reported, software-defined networking and network virtualization haven’t exactly gone mainstream. As Casado himself put it, the conversation needs to move toward actual use cases and how network virtualization can change people’s lives, just as server virtualization has changed people’s lives — or at least IT.

    For now, the hype around SDN continues. We’d like to see some network virtualization use cases, too.

  • Ex-Googler launches security startup NetCitadel

    NetCitadel, a company based in Mountain View, Calif., emerges from stealth mode today with a network-security virtual appliance intended to simplify security in complex cloud computing environments.

    NetCitadel Co-Founder and Chief Engineer Vadim Kurland’s experience on Google’s network-operations team inspired the new product, said Mike Horn, NetCitadel’s CEO and another co-founder. As is the case with other companies operating large data centers, Google’s cloud infrastructure was often in flux. “They’re really these dynamic environments that are changing frequently,” Horn said. As Kurland developed network security to fit the Google infrastructure, he figured other companies faced similar challenges and might want external providers to take care of that responsibility, Horn said.

    The process of rolling out a security change across a network is tedious. For example, a system administrator must learn of a firewall change request, such as a request to add another server. Then he or she needs to assess the impact of the change, update the firewall or firewalls, deploy the change and verify that everything was done correctly.

    Enter NetCitadel’s Security Orchestration Platform, a virtual appliance that automates that process, Horn said. As a result, network-security staff can focus on more critical matters than manual, time-consuming policy changes. Besides firewalls, the Security Orchestration Platform can also manage a network’s routers and switches.
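
    A toy sketch of that orchestration loop, find the affected devices, push the rule, then verify, might look like the following. Every name here is a hypothetical illustration; NetCitadel has not published its API in this form.

    ```python
    from dataclasses import dataclass, field
    from ipaddress import ip_address, ip_network

    @dataclass
    class Firewall:
        name: str
        zone: str                                    # network this device guards, e.g. "10.0.3.0/24"
        rules: list[dict] = field(default_factory=list)

        def protects(self, address: str) -> bool:
            return ip_address(address) in ip_network(self.zone)

        def add_rule(self, **rule) -> None:
            self.rules.append(rule)                  # a real system would call the vendor's API here

    def apply_change(address: str, port: int, firewalls: list[Firewall]) -> list[str]:
        """Automate the steps a sysadmin would do by hand: assess impact,
        update each affected device, then confirm the rule actually landed."""
        targets = [fw for fw in firewalls if fw.protects(address)]       # assess impact
        for fw in targets:
            fw.add_rule(dst=address, port=port, action="allow")          # update and deploy
        for fw in targets:                                               # verify
            assert any(r["dst"] == address and r["port"] == port for r in fw.rules), \
                f"rule missing on {fw.name}"
        return [fw.name for fw in targets]

    fws = [Firewall("edge-1", "10.0.3.0/24"), Firewall("edge-2", "10.0.7.0/24")]
    print(apply_change("10.0.3.17", 443, fws))   # ['edge-1']: only the affected firewall changes
    ```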

    Comparable offerings from Cisco and Juniper Networks support cloud instances but not virtualization, or vice versa, and they don’t accommodate network-security devices from other suppliers, Horn said.

    NetCitadel was founded in 2010 and raised capital from the venture-capital firm NEA in 2011; Horn wouldn’t say how much. It has since grown to 25 employees. Current customers include financial-services institutions, retailers and a university, Horn said. An annual subscription starts at $25,000.

  • Building for scale: Boundary processes 5 terabytes of data daily

    Looking back on his first year at the helm of the network monitoring company, Boundary CEO Gary Read sees progress by all sorts of metrics, but still has his eye on the future.

    Whereas application-performance-management providers such as AppDynamics and New Relic help clients zero in on problematic code inside applications, Boundary looks out for issues slowing down the network, including the application itself.

    Based in San Francisco, Boundary has more than doubled its workforce, going from 12 employees to 28 since Read joined the company last January. Since launching in November 2011, Boundary has grown its clientele to 600 customers, 76 of which pay for the company’s services. And the amount of data flowing into the company’s infrastructure from clients has grown substantially, from less than half a terabyte to more than 5 terabytes daily. For comparison, that daily intake is roughly 1 percent of the amount Facebook stores every 24 hours, according to a November post from the social networking company.

    Amount of network and application data Boundary has processed per day in recent months

    Annual revenue has gone up, too, although Read declined to provide figures.

    Some infrastructure hiccups have accompanied all that growth.

    “So definitely as you continue to scale the system, it’s very difficult to test a system like this to unlimited scalability, and so as you continue to push more and more data then it will show itself in different parts of our platform, and different tiers in the application may start to run out of horsepower, or we may start to hit certain limitations in particular areas,” Read said. “We’ve seen that twice already … . In one case, we had to use solid-state disks instead of physical disks. … In another case, we’ve had to add more servers and more processing through that infrastructure, to deal with us starting to get to capacity limits in what we could be processing.”

    More challenges could lie ahead, Read said.

    As more clients sign up for monthly or yearly contracts, more data enters the equation. Boundary could open a second data center, exclusively for paid users, to offer better service, Read said. A version of the service that processes one gigabyte of data is available free of charge.

    “With just the sheer volume of data being dealt with so very, very quickly, and so much insight given quickly into that data, this is something where we’re really starting to move into ground that’s never been attacked before,” Read said. So far, Read has been able to scale Boundary’s infrastructure as that data grows. He said the amount the company spends on infrastructure is declining as a percentage of revenue, which is to be expected, since the company over-provisioned its hardware in the early days.

    As time passes and Read contemplates adding more capacity, he’s confident that Boundary can continue to scale both the infrastructure and its business model. Given how useful network monitoring can be for cloud-based applications, Boundary should prepare for more customers and more competition, such as the newly launched Lyatiss.