Author: David Meyer

  • Academic social network ResearchGate raises $20M, filing shows

    The academic collaboration startup ResearchGate has picked up $20 million in equity-based funding, an SEC filing from last week shows. The news was first reported by the German startup blog Gruenderszene.

    ResearchGate, based out of Berlin and Cambridge, Mass., is one of a handful of large academic social networks that are trying to help researchers around the world connect and collaborate. Another example is Mendeley, which was bought by Elsevier (see disclosure) a month ago, to the consternation of many users.

    ResearchGate has previously raised Series A and B funding rounds where we knew who was involved (Benchmark Capital and Accel Partners typically feature) but not how much was invested. This time, we know the amount but not who bought the equity.

    While Mendeley is now part of Elsevier’s evil empire — we’re talking popular perception here — ResearchGate has more of a rebellious tinge to it. The company has a rather inspired approach to overcoming the copyright restrictions that so irk academics: having realised that researchers are allowed to publish their papers on their personal websites without breaking the copyright terms of the big academic publishers, ResearchGate tweaked its terms a while back so that users’ profiles count as their personal websites.

    In other words, not only is ResearchGate a forum for connecting academics, but it is also increasingly serving as an open access repository for the sort of research that should, given the public funding that generally makes it possible in the first place, be freely accessible. Users can also share experiment-derived raw data with one another.

    ResearchGate recently started trying to make money, offering the eyeballs of its 2.7 million users to recruiters and conference promoters. It’s a safe bet that the money raised in the last week or two will at least partly go towards boosting the company’s sales force.

    Disclosure: Reed Elsevier, the parent company of science publisher Elsevier, is an investor in GigaOmniMedia, the company that publishes GigaOM.

  • Intel’s McAfee buys Finnish firewall specialist Stonesoft for $389M

    McAfee has bought Finnish network security outfit Stonesoft for $389 million in cash. It’s the biggest purchase the U.S. giant has made since it was itself bought by Intel for $7.68 billion back in 2010.

    Although it has always been important, network security seems to be attracting an increasing amount of attention these days, largely due to high-profile hacks. The Stonesoft acquisition, should it clear the usual regulatory hurdles, will give Intel a boost in the areas of firewalls, evasion prevention systems and secure VPN services.

    Here’s how McAfee president Michael DeCesare put it in a statement on Monday morning:

    “With the pending addition of Stonesoft’s products and services, McAfee is making a significant investment in next-generation firewall technology. These solutions anticipate emerging customer needs in a continually evolving threat landscape.”

    McAfee will blend Stonesoft’s services with its own existing portfolio, in particular its IPS Network Security Platform and its Firewall Enterprise product, and it looks like Stonesoft’s “next-generation” firewall will continue to be a product in its own right. It’s not yet clear what will happen to the parts of Stonesoft’s portfolio that weren’t mentioned in the statement, such as its intrusion prevention system and management center software.

    In the statement, Stonesoft CEO Ilkka Hiidenheimo noted that “the combination of the two companies allows Stonesoft to benefit from McAfee’s global presence and sales organization of over 2,200 employees, best-in-class threat research and technology synergies.”

  • Amazon taps Germany for cloud and machine learning engineers

    Amazon has announced the launch of a new development center for cloud technologies in Germany, with locations in both Berlin and Dresden.

    According to a statement from the company, the 70-plus engineers that Amazon will hire will work on technologies for supporting various hypervisors, management tools and operating systems. This is effectively a major expansion of the development team Amazon has already had in Germany since buying Berlin-based Peritor last year – a purchase that led to the release of the OpsWorks devops toolkit this February.

    The engineers, who will be hired over the next year, will also develop machine learning technologies to be used across Amazon’s business.

    “Locating the development of key parts of the Amazon Web Services cloud in Germany speaks to the broad set of talent here and the investment we are making in the country,” the managing co-directors of the new Amazon Development Center Germany, Ralf Herbrich and Chris Schlaeger, said in a statement.

    Amazon’s big European data center is located in Dublin, Ireland, although it also has a couple of edge locations in Germany (Frankfurt, to be precise) for content delivery purposes. The company also already has teams of AWS sales and business personnel in Germany.

  • Ericsson trials HetNet-friendly ‘City Site’: Would you like ads with your base station?

    As cities get more populated and data usage increases, cracks start to show in traditional mobile network layouts – they just can’t handle the load. Many see the solution in so-called heterogeneous networks, or HetNets, which involve a range of different cell types rather than simply relying on the macro-cells we know and love (or loathe, depending on whose skyline they’re ruining).

    Ericsson is a keen HetNet proponent, and the Swedish networking giant has just launched a commercial trial of what it calls the City Site “integrated solution” in Nanning, the capital of China’s Guangxi region, alongside China Mobile. In this case, the four-meter-high (13-foot) package includes a standard Ericsson base station along with an integrated multidirectional antenna.

    The “Omni Antenna” in question is rather short-range (up to a couple of hundred meters) and relatively close to the ground, which fits in nicely with what Ericsson is trying to achieve here: network densification, a central tenet of HetNet architecture.

    HetNets will need to involve not only a variety of cell sizes and types – from macro-cells to pico-cells to Wi-Fi offload points — but also cells at different levels and layers, in order to solve the challenges presented by specific locations. Tall buildings are a challenge when you’re trying to serve thousands of people at street level, and this kind of thing may be part of the solution.

    But densification isn’t the only thing that’s going on here. Ericsson’s City Site design also allows add-on modules for video ad screens, clocks, touchscreen real-time information displays and so on. The company told me this could “provide high performance broadband coverage together with fulfilling a city’s needs for de-clutter, aesthetics and add-on applications like information or advertising.”

    This is a good indicator of how we can expect to see our ever-increasing mobile broadband requirements change the cityscapes around us. I’m not sure it really amounts to de-cluttering, though — ads aside, there’s something to be said for discreetly sticking cells on lampposts and other existing street furniture.

  • Jolla swaps out its CEO yet again, this time bringing in a logistics veteran

    Jolla, the Finnish company that hopes to make big things out of the also-ran MeeGo operating system, has announced its second change of CEO in just seven months.

    When I first sat down with Jolla in September of last year, I was talking to CEO Jussi Hurmola. A month later, Hurmola was out, moving to a strategic role around Sailfish, Jolla’s MeeGo-derived OS. He was replaced by Marc Dillon, who went on to lead the official unveiling of Sailfish OS.

    As of Monday, Dillon will be Jolla’s new head of software development (a role he was already carrying out, anyway). The new CEO, Tomi Pienimäki, was previously CTO then CIO at Itella Corporation, a Finnish logistics outfit.

    At first glance, it looks like Jolla has opted to go for a more business-centric leader – Dillon is quite a developer evangelist type – now that it’s signed serious deals with the likes of Chinese handset distributor D.Phone and Finnish carrier DNA, and is preparing to reveal its handset later this month.

    Here’s what Pienimäki said in a statement on Friday:

    “Jolla is a great company with an exciting and promising future. I truly believe we can make a difference and bring something unique to the consumers. My task is to listen very closely to our customers and further build the collaboration network. I also want to ensure that our team can fully concentrate on the most important task: bringing the first device to the market this year.”

    Jolla Chairman Antti Saarnio thanked Dillon for his “inspirational leadership in the CEO role during the past months” and Dillon himself said he was excited to “be able to give 100 percent attention to what I love – working on the product with the Jolla team.”

  • Intel banks on enterprise mobile app development again, leading $9M FeedHenry round

    The Irish mobile app development and deployment outfit FeedHenry (which was one of GigaOM’s Mobilize Launchpad finalists a couple years back) has just scored a respectable $9 million, Intel-led funding round.

    VMware was already an investor, along with Kernel Capital and Enterprise Ireland, and all three have participated in the new round as well. This time, however, Intel has taken the lead, with ACT Venture Capital also joining in.

    FeedHenry serves business customers that want to develop and deploy in-house mobile apps. In February the company partnered up with Telefonica, allowing the bundling of FeedHenry’s platform with the telco’s infrastructure-as-a-service platform, Instant Servers, for the benefit of European customers.

    According to Marcos Battisti, Intel Capital’s managing director for Western Europe and Israel, the investment will help FeedHenry expand internationally:

    “The mobile application market segment for enterprise is at a tipping point and those companies delivering a comprehensive solution that provide both an end to end mobile development strategy and a way to implement applications easily and securely will be at the forefront of the market segment.”

    FeedHenry’s rivals include firms such as Antenna Software and SAP. The Irish firm’s particular selling point is flexibility, allowing deployment of its Mobile Application Platform to public, private and hybrid clouds. Apps developed on the platform can also be built once then rolled out to iOS, Android, BlackBerry and Windows Phone devices.

    The company also provides “backend-as-a-service” functionality, with server-side code based on Node.js, as well as app management tools and analytics. There’s been considerable activity in the BaaS market, with Salesforce.com and Rackspace adding mobile backend capabilities and Facebook buying Parse just last week.

    It should be noted that this is far from the first investment Intel Capital has made in this space. Just this January, it also put $4.6 million into enterprise mobile app deployment firm Apperian. And, in February, parent company Intel bought the mobile app development tools division of AppMobi.

  • Skobbler’s online-offline ForeverMap 2 app is now available on iOS

    Skobbler, the Berlin-based outfit that largely provides tools for incorporating OpenStreetMap-based functionality into other companies’ apps, has launched a major revamp of its own iOS play, ForeverMap.

    ForeverMap provides both online and offline mapping functionality, based on OpenStreetMap. The first iteration used a much older version of Skobbler’s technology and required the user to download a rather hefty 1.5GB of data covering all territories, regardless of where they intended to use the app.

    The new version, ForeverMap 2, has been available for Android since November, but is now also out there for iPhone and iPad. It aims to provide the best of both online and offline worlds – to allow proper offline search and routing, unlike Google Maps, and to allow online functionality, unlike travel apps such as CityMaps2Go. Its coverage is global, and users can choose to download offline maps on a per-country basis.

    “We know that modern users want one map that handles all of their needs, and we believe we’ve delivered this with a genuinely innovative hybrid solution based on the popular, rich and dynamic OpenStreetMap,” Skobbler CTO Philipp Kandal said in a statement.

    However, while ForeverMap 2 is a consumer play, it is also a showcase for Skobbler’s GeOS SDK and NGx mapping engine. As co-founder Marcus Thielking told me, it gives the company’s prospective white-label clients – who are largely in the automotive business – “a good template” for what they could build on top of the product.

    “It’s the perfect pitching tool – we just give it to them and they can see what we can do,” he said. Previously announced customers for Skobbler’s technology include the city guide app outfit Triposo.

  • Salesforce finally solidifies European data center plans

    Salesforce.com will set up its first European data center in the UK next year, the enterprise software-as-a-service firm said on Thursday.

    The company has come under criticism in the past for not having a European data center, largely due to compliance issues – Salesforce is part of the EU-U.S. Safe Harbor framework, which means it’s allowed to handle European citizens’ personal data, but many customers would prefer the certainty that a locally sited data center provides. (We will be discussing such issues at our Structure:Europe conference in London on 18-19 September, by the way.)

    Salesforce said last year that it hoped to open a data center in the UK in 2013, but this appears to have been pushed back a little now. According to a statement today, the new data center – the firm’s sixth — will be completed in 2014 in partnership with NTT Communications’ local arm, NTT Europe.

    In a statement, Salesforce CEO Marc Benioff said Europe had provided the greatest revenue growth – 38 percent — for the company in the 2013 fiscal year:

    “We are doubling down on Europe with the announcement of our new data centre in the UK, which will support continued customer success in EMEA.”

    Robin Balen, NTT Europe’s wholesale data center business chief, added that the new facility would be “powered 100 percent by renewable energy sources.”

    Innovation Challenge

    Meanwhile, Salesforce has also teamed up with a group of European venture capital firms – Notion Capital, Octopus Investments and MMC Ventures – to launch a €5 million ($6.6 million) Innovation Challenge for startups.

    Startups are invited to pitch enterprise cloud apps that could run (surprise!) on Salesforce’s platform. There will be pitching events across Europe between September and November, and the winners will get seed funding. Apps will need to be at least in the beta stage, with demonstrable “traction, customer success and user adoption.”

    “This is a unique opportunity for innovative start-ups in the enterprise app market here in Europe to receive commercial support to allow them to compete on a global stage,” Octopus principal Luke Hakes said in a statement.

  • Metadata-centric enterprise content management firm M-Files scores €6M in funding

    The Finnish enterprise content management firm M-Files, which places an unusual amount of emphasis on metadata, has closed a €6 million ($7.85 million) Series A round that was led by DFJ Esprit and also took in cash from the Finnish government.

    M-Files does away with traditional folder structures, relying entirely on metadata to help people find documents (think iPhone rather than Windows), and it works across cloud, on-premise and hybrid installations. The service has been around for a good 8 years and has picked up some very serious customers indeed, ranging from AstraZeneca and Pandora to Northrop Grumman and the UN Environment Program.

    “Everyone says that metadata is important, but quite often it’s something users have to add when they are storing documents,” CEO Miika Mäkitalo told me on Tuesday.

    “Our M-Files system is a metadata architecture, which gives excellent benefits for end users… You can [organize] documents by customer name, project, proposal date and so on, and if you choose to find a document you will always find the latest version – they might be different paths but they all lead to the same document.”
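
    To make that metadata-first idea concrete – this is a toy sketch in Python, not M-Files’s actual data model – imagine documents kept in a flat pool and retrieved purely by filtering on metadata, with the newest matching version returned regardless of which “path” you take:

        from dataclasses import dataclass, field

        # Toy sketch of metadata-driven retrieval: no folder tree, just a
        # flat store filtered by metadata fields (customer, project, etc.).
        @dataclass
        class Document:
            content: str
            version: int
            meta: dict = field(default_factory=dict)

        store = [
            Document("Draft proposal...", 1, {"customer": "Acme", "project": "Rollout"}),
            Document("Final proposal...", 2, {"customer": "Acme", "project": "Rollout"}),
            Document("Kick-off notes...", 1, {"customer": "Acme", "project": "Audit"}),
        ]

        def find_latest(**criteria):
            """Any combination of metadata 'paths' leads to the same document."""
            matches = [d for d in store
                       if all(d.meta.get(k) == v for k, v in criteria.items())]
            return max(matches, key=lambda d: d.version) if matches else None

        # Filtering by customer and project surfaces the latest version.
        print(find_latest(customer="Acme", project="Rollout").content)  # Final proposal...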

    As for the fresh cash infusion, this will be used to push M-Files further into the U.S. in particular, Mäkitalo said: “We want to continue growing aggressively. We want our channel sales function to skyrocket in the coming years.” He added that, although the U.S. and Finland account for most of M-Files’s existing customers (of which there are thousands), there are many elsewhere in Europe and in Asia, too.

  • Google uses Finnish data center as springboard for startup outreach

    It seems Google has a habit of using its European data centers as a way into local startup communities. Having already used its Belgian data center as a springboard for local jobs events and cultural tie-ins in that country, the firm is now doing much the same in south-eastern Finland.

    Google bought an old paper mill in Hamina several years ago, and converted it into a data center that is, interestingly, cooled by seawater. The company is now in the process of extending the facility at a cost of €150 million ($196 million) — design nerds should note that this is being done by converting a machine hall originally designed by the architect Alvar Aalto. And now Google has also struck a deal with Aalto University and regional development agency Cursor.

    According to a Google blog post, Google’s backing will allow Aalto University to better support local startup accelerators, and also help “improve the use of the internet” by small businesses in the region. The university is already a backer of the Startup Sauna program and various other entrepreneurial initiatives, so we can now expect to see more in this vein.

    Google’s push is supposed to “show the way from our industrial past to our digital future,” according to the post, and indeed both the Belgian and Finnish data centers are sited in areas left somewhat depressed after the death of older industries – mining in the case of the St Ghislain facility and paper milling in the case of Hamina.

    It’s good PR for Google, of course, but there is validity to the conceit — and it’s also quite a clever way to keep an eye on the ideas that local developers and engineers are coming up with.

  • Nokia to invest in array camera outfit Pelican Imaging, report says

    Nokia’s investment arm is set to back Pelican Imaging, a Californian company that takes an unusual approach to smartphone cameras, according to a report from Bloomberg.

    We’ve covered Pelican before. The startup has designed an “array camera” that is essentially composed of multiple smaller cameras, all organized in an array. Why do this? Because when you get to the tiny form factor required of handset cameras, the sensor is so small that you’re highly limited in terms of the number of megapixels you can squeeze out of it before image noise becomes unbearable.

    Of course, if using many tiny cameras was that easy, we’d see it done in smartphones already. The secret sauce lies in the software used to bring the multiple resulting images together, and it’s that part of Pelican’s work that seems to have attracted the interest of Nokia Growth Partners.
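
    Pelican’s fusion software is the proprietary part, but the statistical intuition behind combining many small captures is easy to demonstrate. Here’s a toy sketch – plain averaging of perfectly aligned frames, which ignores the parallax correction that makes the real problem hard:

        import numpy as np

        # Toy sketch (not Pelican's algorithm): averaging N noisy captures
        # of the same scene cuts sensor noise by roughly sqrt(N).
        rng = np.random.default_rng(42)
        scene = rng.uniform(0.0, 1.0, size=(64, 64))      # "true" scene luminance

        def capture(sigma=0.1):
            """Simulate one tiny, noisy sub-camera exposure."""
            return scene + rng.normal(0.0, sigma, size=scene.shape)

        n_cameras = 16                                    # say, a 4x4 array
        stack = np.stack([capture() for _ in range(n_cameras)])
        fused = stack.mean(axis=0)                        # naive fusion by averaging

        print("single capture noise:", np.std(stack[0] - scene))  # ~0.10
        print("fused image noise:   ", np.std(fused - scene))     # ~0.025

    In a real array camera, each sub-camera sees the scene from a slightly different position, so the frames must be registered and depth-compensated before any such fusion – that alignment is the genuinely hard algorithmic part.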

    “It’s very complicated to do this algorithmically and Pelican is one of the companies that has mastered this technology,” partner Bo Ilsoe was quoted in the piece as saying.

    Imaging is central to Nokia’s current handset strategy, from its top-end Lumia Windows Phone smartphones down to its cheap Asha phones — all these devices have clever photo-taking features of one kind or another. The standout model there, so far, is the Symbian-toting 808 PureView, which packs a 41-megapixel sensor and uses a technique called oversampling to condense its output into sharper, less noisy photos (the same technology is apparently coming to the Lumia line, too).

  • BT pushes its Cloud Compute IaaS platform into China, India and beyond

    BT may be (perhaps surprisingly) the third-biggest infrastructure-as-a-service provider in the world, but it clearly isn’t satisfied. On Monday, the British telecoms giant made a push into major new territories by announcing the upcoming launch of BT Cloud Compute in China, India, Germany, Mexico and Argentina.

    The corporate-focused service has already been up and running for a while in the U.S., U.K., Spain, Brazil, Colombia, France, Italy, Singapore, Hong Kong, Belgium, the Netherlands and Luxembourg, which is how it got to that number-three spot.

    BT Cloud Compute has been built in-house at the company’s Adastral Park R&D center, in collaboration with the likes of Cisco and Citrix. The service runs out of 45 data centers around the world, which is handy when dealing with various countries’ compliance requirements.

    BT is particularly keen on talking up its resiliency and 99.95 percent “expected” service level, as well as the fact that customers can run their services across both BT’s public cloud facilities and their own private clouds and in-house infrastructure.

  • Nokia and SAP team up on TwoGo ride-sharing platform

    SAP has launched a cloud-based corporate ride-sharing platform called TwoGo, with Nokia providing the location component.

    The service is for companies that want to quickly roll out a ride-sharing scheme for their employees – as is generally the case with such schemes, the advantages range from greater environmental friendliness to lower petrol costs and a reduced need for parking places. As Peter Graf, SAP’s sustainability chief, put it in a statement:

    “We’ve combined our mobile and cloud technologies into a carpooling solution to help provide immediate economic, environmental and social benefits to companies and their employees. As such, we expect TwoGo to not only help people and businesses save money and greenhouse gas emissions, but to also connect people more closely with each other and with the company they work for.”

    TwoGo works on the web and on mobile devices. Employees can enter their travel preferences, after which Nokia’s Here platform kicks in to display likely matches. Here is (in this writer’s opinion) Nokia’s big hedge against a post-hardware future, and this deal is significant for taking the location-based services platform into the enterprise. “We believe that location will be the new frontier of technology across industries,” Nokia mapping chief Christof Hellmis said in the statement.

    Although it is particularly well-suited to large enterprises — the travel giant Thomas Cook is the first announced customer, having taken part in the beta program – SAP is also pitching TwoGo at smaller companies, as employees of neighbouring businesses can share rides too.

    Handily, TwoGo also works with the likes of Microsoft Outlook and Google Apps (anything iCal-compatible will do) so that ride schedules can be integrated with corporate calendars. SAP has been using TwoGo internally for almost two years, and claims to have “generated more than $5 million in value for the company” through fuel and maintenance savings, lower travel expense reimbursements, cutting down on emissions and, of course, getting more employees talking to one another as they travel to work and back.

    According to a separate blog post from Nokia, TwoGo is currently available for licensing by companies in the U.S. and Germany, with other countries coming online soon.

  • Chat apps have overtaken SMS by message volume, but how big a disaster is that for carriers?

    There’s a reason why mobile carriers are scared of third-party messaging apps such as WhatsApp, and here it is: people are now sending more messages over these services than they are text messages.

    We now know this for a fact, courtesy of analysts at Informa. Even Europe’s digital chief, Neelie Kroes, was quick to greet the news on Monday morning.

    Informa says 2012 saw nearly 19 billion messages sent over these apps each day around the world, versus 17.6 billion SMS messages. The analyst house reckons the contrast will be even starker in 2014, with 21 billion text messages projected, against almost 50 billion app-based messages.

    As you will note, this suggests that SMS volumes will continue to increase, at least in the short term. Nonetheless, it is clear that the big growth is to be found in app-based messaging – spurred along by the likes of Nokia, which is now selling phones with dedicated WhatsApp keys.

    However, things may not be as bleak for the mobile operators as they seem.

    Hazy picture

    First off, while the volume of non-SMS messages has overtaken that of traditional texts, the user numbers remain significantly lower – although how much lower is a bit unclear.

    According to Informa analyst Pamela Clark-Dickson, there were 3.5 billion SMS users in 2012. Regarding the chat apps, Clark-Dickson took only six into account, namely WhatsApp, BlackBerry Messenger, Viber, Nimbuzz, Apple’s iMessage and KakaoTalk. At the end of 2012, she said, there were 586.3 million users of these platforms, but that’s not taking into account other giants such as Facebook Messenger for Android (somewhere between 100 million and 500 million installations) and China’s Tencent (around 300 million users).

    Even if there were, let’s say, a billion chat app users, the disparity between message volume and user numbers shows that people who use these “over-the-top” (OTT) apps use them more frequently than those who use SMS – specifically, the average OTT app user sends 32.6 messages a day, while the average SMS user sends just five texts. This stands to reason, because OTT apps are generally free to use. We should therefore be wary of assuming that every OTT message represents a “lost” SMS from a revenue perspective, in much the same way as it’s illogical to claim that a free “pirated” song download represents a lost sale.
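
    Those per-user rates square neatly with Informa’s headline volumes, as a quick back-of-the-envelope check shows:

        # Sanity check: per-user rates times user counts should roughly
        # reproduce Informa's headline daily volumes for 2012.
        ott_users = 586.3e6     # users across the six chat apps Informa counted
        ott_rate = 32.6         # messages per OTT user per day
        sms_users = 3.5e9       # SMS users in 2012
        sms_rate = 5            # texts per SMS user per day

        print(f"OTT: {ott_users * ott_rate / 1e9:.1f}B messages/day")  # ~19.1B
        print(f"SMS: {sms_users * sms_rate / 1e9:.1f}B messages/day")  # ~17.5B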

    Those chat app users are probably also SMS users, because – for example – WhatsApp is of little use when you’re trying to message someone on a different platform (or someone with a basic dumbphone). There, SMS is and remains the great leveller: any mobile phone can use it. This is particularly important for some enterprises.

    Whither Joyn?

    And then we have a big unanswered question: even when SMS tails off, how big a chunk of the IP-based messaging market will the carriers themselves own?

    Thing is, Informa’s analysis of the market does not include projections for Joyn, the industry-wide drive to create a common, interoperable messaging and file-sharing platform that works on all (or at least most) operators’ devices — Joyn has only just kicked off, so there are no real takeup figures from which to extrapolate. Precedent suggests that the mobile industry is incapable of acting in concert, but that doesn’t mean it can’t buck the trend when its back is against the wall.

    “Mobile operators do have the opportunity to provide their own IP-based messaging applications,” Clark-Dickson noted.

    And then we have services such as Telefonica’s Tu Go and Rogers’s One Number that extend traditional handset functionality onto the desktop. These services heavily blur the line between SMS and IP-based messaging – if the carriers can pull off this sort of thing while monetizing it in some way, what does it matter whether the medium used is technically SMS or something else?

    Also don’t forget that carriers can build offerings around these third-party apps. For example, WhatsApp has partnered with 3 Hong Kong and RCom, which sell flat-rate bundles specifically for WhatsApp use while at home or roaming. It may break the principle of net neutrality, but it’s a tactic some carriers are employing.

    Either way, though, what’s clear is the speed at which all this is happening. SMS is 20 years old, while chat apps have only been around for about five years. Although we should take care when predicting the results, the trend of IP-based messaging replacing SMS certainly appears unstoppable.

  • UK prepares for white space broadband rollout in 2014

    The UK is about to get a serious pilot of white space radio. Yes, there’s already been an industry-led pilot in Cambridge, but that was really about the technology itself – the pilot coming up this autumn is being led by the telecoms regulator Ofcom, and the idea here is to test out the processes around using white spaces across the country.

    In other words, the UK is now gearing up for a proper rollout next year (if everything goes well in the trial), with potential uses including rural broadband and the internet of things. Here’s what Ofcom chief executive Ed Richards said in a statement on Friday:

    “Ofcom is preparing for a future where consumers’ demand for data services will experience huge growth. This will be fuelled by smartphones, tablets and other new wireless applications.

    “White space technology is one creative way that this demand can be met. We are aiming to facilitate this important innovation by working closely with industry.”

    The term “white spaces” refers to the gaps between heavily used radio frequency bands. These are buffer zones that were deliberately left empty in order to stop the services using these various bands – generally TV broadcast services — from interfering with one another.

    However, a few years ago people started playing around with the idea of using white spaces for digital communications. They tend to be low-frequency, which makes them ideal for sending data over long distances, and their exploitation now seems quite viable, depending on which bands are already in use in a particular geographical area.

    This is why databases of frequencies and coverage are absolutely crucial to white space usage – unlike with Wi-Fi, which can be used anywhere without a license, Ofcom wants to make sure that devices using white space frequencies only do so when they can avoid interfering with surrounding bands. The devices, which will generally use cognitive radio technology in order to hop between frequencies as needed, will therefore need to get clearance from an Ofcom-approved database before they can start transmitting.
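
    As a rough illustration of that clearance handshake – the URL, field names and response shape below are entirely hypothetical, not Ofcom’s or any real database’s interface – a device query might look something like this:

        import json
        from urllib import request

        # Hypothetical sketch of the clearance flow: before transmitting,
        # a white space device asks a geolocation database which channels
        # are safe to use at its location. Everything below is
        # illustrative only, not a real white space database's API.
        DB_URL = "https://wsdb.example.com/v1/available-spectrum"

        def request_clearance(lat, lon, device_id):
            query = json.dumps({
                "deviceId": device_id,
                "location": {"latitude": lat, "longitude": lon},
            }).encode()
            req = request.Request(DB_URL, data=query,
                                  headers={"Content-Type": "application/json"})
            with request.urlopen(req) as resp:
                # e.g. [{"mhz": 614, "maxDbm": 16}, ...]
                return json.load(resp)["channels"]

        # A cognitive radio would tune only to the returned channels, and
        # re-query whenever it moves or its clearance expires.
        channels = request_clearance(52.2053, 0.1218, "wsd-0001")  # Cambridge, roughly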

    Those of you who have been following white space technology will know that Google is compiling its own database of frequency usage, and this is the sort of database that Ofcom might theoretically approve for official use in the UK.

    Google’s database is currently being tested in the U.S. by the FCC, the American counterpart to Ofcom, and is also being deployed in a white space trial in Cape Town, South Africa. Meanwhile, Microsoft has already conducted tests in the UK (the Cambridge pilot) and Singapore, and is now active in Kenya. Other, similar initiatives are underway in Finland, Ireland and France.

  • Wikipedia is now drawing facts from the Wikidata repository, and so can you

    Wikidata, a centralized structured data repository for facts and Wikimedia’s first big new project in the last 7 years, is now feeding the foundation’s main project, Wikipedia.

    The Wikidata project was kicked off around a year ago by the German chapter of Wikimedia, which is still steering its gradual development. For Wikipedia, the advantage is simple and powerful — if there’s a central, machine-readable source for facts, such as the population of a city, then any update to that data can be instantly reflected across all the articles in which the facts are included.

    To posit a morbid example: a singer may have dozens or even hundreds of language versions of her Wikipedia entry and, if she were to die, the addition of a date of death to the Wikidata database would immediately propagate across all those versions, with no need to manually update each one (yes, I can also see how this might go horribly wrong).

    Indeed, Wikidata is now being used as a common data source for all 286 Wikipedia language versions. Here’s the under-development “item” page for Russia, if you want to see what Wikidata looks like in practice.

    [Screenshot: Wikidata’s item page for Russia]

    But the really interesting thing with Wikidata is that it’s not just for Wikipedia – although it’s worth remembering that its API is still under development, the database can be used by anyone as it is published under a Creative Commons 0 public domain dedication. Here’s how Wikidata project director Denny Vrandečić put it in a statement:

    “It is the goal of Wikidata to collect the world’s complex knowledge in a structured manner so that anybody can benefit from it, whether that’s readers of Wikipedia who are able to be up to date about certain facts or engineers who can use this data to create new products that improve the way we access knowledge.”
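
    Even at this early stage, anyone can pull an item’s structured data over the standard MediaWiki API – here’s a minimal sketch using the wbgetentities action and the Russia item (Q159) mentioned above:

        import json
        from urllib import request

        # Minimal sketch: fetch Russia's Wikidata item (Q159) as JSON via
        # the MediaWiki API's wbgetentities action.
        url = ("https://www.wikidata.org/w/api.php"
               "?action=wbgetentities&ids=Q159&languages=en&format=json")

        with request.urlopen(url) as resp:
            entity = json.load(resp)["entities"]["Q159"]

        print(entity["labels"]["en"]["value"])   # "Russia"
        # Claims are keyed by property ID (P1082 is "population"), which is
        # what lets one central edit propagate to every consuming article.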

    There are already some pretty cool (if bare-bones) examples of what people can do with Wikidata. One is GeneaWiki, which is trying to map the family relationships between famous people (the first and so far only example is that of the Bach family), while a Tree of Life project is trying to put together a viable, Wikidata-based “taxonomy of all life”.

    It’s worth noting that the initial funding for Wikidata’s development has come from Google, the Gordon and Betty Moore Foundation, and the Allen Institute for Artificial Intelligence. Ultimately, Wikidata is precisely the sort of venture that is needed to feed the nascent semantic web and AI movement.

    It’s far from the only venture in this space – I’d also recommend keeping a close eye on Google’s Knowledge Graph, which powers Google Now, and Wolfram|Alpha, which partly powers Siri – but all these (often intertwined) projects are essentially trying to do the same thing: to turn facts into something that machines can understand.

    And that, in conjunction with advances in natural language processing and machine learning, will ultimately help us converse with machines. These are the building blocks of artificial intelligence and the future of search, and Wikidata’s very permissive license should act as an open invitation to anyone dabbling in this space.

  • Tasks for teams: Wunderlist Pro is out for Apple devices and the web

    Wunderlist Pro is finally here, adding functionality on top of the task management app to better suit team use. The consumer-focused free app — which is inching ever closer to being an Evernote rival — has also been spruced up.

    The features of Wunderlist Pro will come as no surprise, as 6Wunderkinder accidentally revealed them last month, but here’s the gist anyway: tasks can be assigned among friends or colleagues, and subtasks can now be created. This should make Wunderlist Pro an effective replacement for the axed Wunderkit, which was a project management counterpart to Wunderlist’s task manager.

    The Pro version costs $4.99 a month or $49.99 annually, and is available now for iOS devices, OS X and the web. 6Wunderkinder tells me the Android and Windows versions will follow in a week’s time.

    “Wunderlist Pro allows you to easily delegate to-dos and effectively track the progress of each task, yet this is just the beginning. There is still a whole lot more to come,” 6Wunderkinder CEO Christian Reber said in a statement.

    The first installment of that “whole lot more” will be the ability to attach files to tasks, which can then be shared for collaborative work. Meanwhile, the sharing functionality of the original Wunderlist has also received a boost through the addition of an “action bar” that lets users email and share lists with one click.

  • Google’s search concessions to the EU are now out and up for comment

    The European Commission has formally announced the measures that Google has offered to take in order to settle a major antitrust investigation into its practices. It now wants “interested parties” to have their say on the proposals over the next month, after which it will decide whether to make them legally binding on Google.

    The case followed complaints by Microsoft and others over Google’s treatment of rivals’ web services in its search results. These companies argue that Google favors its own services, which are not clearly marked as such, and also that it unfairly locks advertisers onto its platform and scrapes content from third-party search and comparison sites without consent.

    A recent leak outlined the terms of the proposed settlement deal, but here’s the official version:

    To address these concerns, Google offers for a period of 5 years to:

    (i) – label promoted links to its own specialised search services so that users can distinguish them from natural web search results,
    – clearly separate these promoted links from other web search results by clear graphical features (such as a frame), and
    – display links to three rival specialised search services close to its own services, in a place that is clearly visible to users,

    (ii) – offer all websites the option to opt-out from the use of all their content in Google’s specialised search services, while ensuring that any opt-out does not unduly affect the ranking of those web sites in Google’s general web search results,
    – offer all specialised search web sites that focus on product search or local search the option to mark certain categories of information in such a way that such information is not indexed or used by Google,
    – provide newspaper publishers with a mechanism allowing them to control on a web page per web page basis the display of their content in Google News,

    (iii) no longer include in its agreements with publishers any written or unwritten obligations that would require them to source online search advertisements exclusively from Google, and

    (iv) no longer impose obligations that would prevent advertisers from managing search advertising campaigns across competing advertising platforms.

    Authorities in the U.S. more or less cleared Google over similar complaints, but it’s important to note that Google’s share of the search market there is around 67 percent, whereas in the EU it’s around 90 percent. This gives it stronger market power in Europe, and forces the regulators’ hand somewhat (as do local laws).

    A Q&A document, which outlines the Commission’s concerns in detail, points out that “it does not seem likely that another web search service will replace [Google] as European users’ web search service of choice.”

    “In this context, it is important for the Commission to intervene in order to ensure that Google’s prominent market position in web search does not affect the possibility for other competitors to innovate in neighbouring markets, including in the long-term,” the document states.

  • Heroku comes to Europe, but data protection issues remain

    Heroku has opened up a European region to complement its existing U.S. region, in order to cut down on the latency experienced by customers whose apps on the platform serve European users. However, that doesn’t make Heroku entirely compliant with European data protection law – yet.

    In a blog post, Heroku’s Zeke Sikelianos said the platform-as-a-service outfit had been seeing great demand from the non-U.S. world, and its second region was now live as a public beta, following a private beta with customers such as Swedish television network TV4.

    “Deploying our app closer to our users in Heroku’s Europe region gave us a 150ms improvement in web performance. Based on this win for our users, we’re moving all of our apps to the Europe region,” the post quoted TV4 CTO Per Åström as saying.

    The European region, which runs out of Amazon’s Irish data center, comes with all the same features as the U.S. region. Over 60 add-ons are already available for the region, such as Heroku Postgres and ClearDB, and others are on their way. The company has introduced heroku fork to its command-line interface in order to ease the migration of apps from the U.S. region, by copying relevant data and configuration variables.

    Data location

    European data protection laws are more stringent than those in the U.S., so the EU and U.S. have set up a Safe Harbor program for American companies whose services involve the handling of EU citizens’ personal data. Heroku still isn’t part of that program, so technically it’s still not kosher to run services for EU citizens on the platform, even though it’s now using an EU data center.

    “Heroku is not yet a registered participant in the Safe Harbor program,” the post read. “We’ve laid the groundwork for becoming Safe Harbor certified and expect to have it soon.

    “The Europe region public beta is designed to let you build high-performance apps for European users. It does not currently address data residency or jurisdiction concerns. You should assume that some portions of your app and its data will be in, or pass through, data centers located in the U.S.”

  • Deutsche Telekom’s ‘anti-net-neutrality’ plans alarm German government

    Users of Deutsche Telekom’s mobile services are used to the concept of data caps, but its fixed-line customers? Not so much. This is part of the reason why the German government is reportedly upset about the telco’s plans to drop flat-rate pricing for its DSL services – the most alarming part, however, is that Telekom apparently wants to exempt its own services from the cap.

    We’re into classic net neutrality territory here. As the company announced a few days ago, Telekom’s customers will be able to stream films from the carrier’s own T-Entertain service without any problem, but streaming a film from a rival would count towards the cap – effectively meaning Telekom’s caps will discriminate in favor of its own products. And all services, activists argue, should be treated equally on the open internet.

    Concerned citizens have already set up a Change.org petition that has garnered around 30,000 signatures at the time of writing, but now the German government itself has weighed in. This isn’t just a regulatory thing – the government is Telekom’s biggest shareholder, too.

    Der Spiegel claims to have seen a letter from Philipp Rösler, the federal economics and technology minister, to Deutsche Telekom chief René Obermann, in which Rösler warns that the government and competition regulators will “very carefully follow ongoing developments with regard to a possible differential treatment of [Telekom’s] own and rival services under the aspect of net neutrality.”

    In a statement, Telekom claimed that “net neutrality is partly confused in the debate with a free internet culture” and that “T-Entertain is not a regular internet service, but a television service for which the customers pay separately.”

    “Regular internet services are not subject to discrimination,” Telekom added, while noting that the alternative to introducing the caps would have been to raise the flat-rate tariffs for all customers.

    Discriminatory caps

    Telekom’s proposed changes work like this: customers on the slowest DSL lines (up to 16Mbps) will get capped at 75GB a month; those on up-to-50Mbps plans will face a 200GB cap; an up-to-100Mbps plan will max out at 300GB; and an up-to-200Mbps plan at 400GB. After that, speeds will be throttled to 384Kbps, although customers could also pay extra for more usage at normal speeds. The carrier claims its customers typically use 15-20GB a month.
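
    Spelled out as a quick sketch (the speed tiers and gigabyte figures are as reported; how Telekom would meter and bill around them is not):

        # The reported cap schedule: plan speed (Mbps) -> monthly cap (GB).
        CAPS_GB = {16: 75, 50: 200, 100: 300, 200: 400}

        def connection_state(plan_mbps, used_gb):
            """Past the cap, speed drops to 384Kbps unless the user pays extra."""
            return "full speed" if used_gb <= CAPS_GB[plan_mbps] else "throttled to 384Kbps"

        print(connection_state(16, 20))   # typical 15-20GB user: full speed
        print(connection_state(16, 80))   # heavy user on the slowest DSL tier: throttled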

    On the face of it, these caps do appear reasonable, given the data volumes consumed by the average user, and they are supposedly aimed at stopping people from consuming extremely high data volumes at the standard rate — Telekom says only 3 percent of its customers will be affected. However, as those in the telecoms industry know all too well, data usage is only going one way: up, up, up.

    Ultimately, it’s the principle of the thing that seems to be the problem here. Once you establish a precedent that certain services can be freely used while others cannot, you potentially raise the barriers to entry for new players. After all, with Telekom being Germany’s biggest ISP, would you set up a competitor to T-Entertain once the discriminatory caps are in place?

    Yes, Germans are already used to data caps on mobile, and indeed Telekom itself has a cellular-centric agreement with Spotify that exempts that service’s traffic from counting towards caps for customers on certain tariffs. The principle is already broken there. However, the way out for a Telekom mobile user who favors a rival to Spotify is to offload as much traffic as they can onto their home Wi-Fi connection. If they’re also with Telekom for fixed-line services, as many are, they will now face caps there too.

    So, with traffic volumes set to keep on growing on all fronts, it’s not hard to see why many of Telekom’s critics are spoiling for a fight.
