Category: News

  • ServicePower lets firms manage their workforces in the cloud [Q&A]

    ServicePower — a mobile workforce management software provider — is seeing more and more companies turn to a workforce model that mixes full-time employees, third-party contractors, and independent technicians, all brought together and managed seamlessly in one place using the power of the cloud.

    I chatted with Mark Duffin, CEO and president of ServicePower, about the changes he’s seen recently, the data his firm collects, and why cloud deployment has become so important to his company and its clients.

    BN: For the benefit of any readers not familiar with ServicePower, can you tell me about the platform and what it does?

    MD: ServicePower Technologies provides a mobile field management software platform that enables clients to mix labor resources — utilizing full-time employees, contractors, and third-party resources while easily controlling, monitoring and analyzing scheduling and operations.

    Our clients use the platform to manage customers, inventory and finances. It enables them to schedule, optimize and dispatch jobs to field resources, while collecting real time job status information, and asset location data. Our platform manages time cards for employees, payments to contractors, and also provides the end-to-end business intelligence to manage the whole process with an eye towards cost control, increased profit margin, increased productivity and utilization, cycle time control and enhanced customer satisfaction.

    BN: If you’re dealing with scheduling and delivery, is this a tool that could be used by Amazon, for example?

    MD: Any organization that needs field resources to perform service, installation, delivery, inspections, etc., can use our field management platform to ensure that they have resources with the correct skills, in the right geography to execute against its customer commitments.

    Our platform enables retailers and e-tailers, for instance, to focus on selling product, rather than manually determining the best route for a series of deliveries. Our software platform will determine the best field resource from among the employee, third-party contractor and on-demand technician labor pools. It will communicate the schedule of jobs as well as the real-time status of each event, via our mobile application, which can be used from any handheld device capable of browsing the web. It also will pay contractors for work performed, and generate business intelligence that can then feed the field management process.

    So, companies like Walmart, or Amazon, for instance, can use the ServicePower platform as an extension of product sales, enabling them to drive incremental services sales, as well as build customer ‘stickiness’ since they can offer the product, as well as delivery, initial installation and service in the future.

    BN: ServicePower started life as a scheduling tool, but now it also manages operations, inventory, digital assets and tracks data in all of them. Was it always the plan to support so many services and to enter the world of big data?

    MD: ServicePower’s vision evolved over the last five years to a complete end-to-end solution, rather than independent products that address organizations with a single labor resource pool.

    We’ve learned that in order for our clients to become best-in-class field service organizations, they must be able to seamlessly tap into multiple labor resource pools, enabling them to achieve metrics around cost and margin control, demand seasonality, productivity and utilization, cycle time, and customer satisfaction.

    Our goal has always been to provide the best field management tools. The completed platform now enables clients to get all functionality they need from a single source vendor.

    All field service organizations are operating in a big data environment, due to the increase in the collection of data, through sources like mobile devices, software logs, and wireless networks.

    Our field management platform gives them the tools they need to use that data to plan, execute and analyze.

    BN: What type of data do you collect and how do you use it?

    MD: Our platform collects:

    • Field resource skills
    • Field resource geography, including the level of knowledge of an area for employed or dedicated field resources
    • Field resource efficiency (how well they can execute what they’re scheduled to do related to job length)
    • Travel times from location to location
    • Mileage costs, fuel utilization
    • Driving behavior
    • Cost of overtime
    • Cost of a particular routing of a set of jobs, versus the cost of a differently routed job set
    • Total job cost
    • Shift adherence
    • SLA adherence
    • Schedule adherence
    • Centralized dispatch management throughput improvements
    • Decreased rework/truck rolls
    • Productivity
    • Utilization
    • Capacity
    • Labor rates
    • Parts usage
    • Quality scores
    • Part failure rates
    • Cycle time
    • Job status
    • GPS location

    The data drives not only the scheduling behavior of the platform, in terms of who is scheduled for which jobs, but also planning, forecasting and continuous execution improvements.

    BN: You’ve recorded job completion times across multiple industries? Any trends?

    MD: The trend is that job times differ greatly across industries, but travel times by and large do not. Nor should job times differ by training for an employee. Interestingly, we have found that in urban areas, travel times often fall when organizations use our solution. The real importance of job times isn’t that each industry is different, but that we’ve seen common best practices which improve job times and ultimately customer satisfaction. Those standards are to status jobs accurately, report accurate travel times, compare the performance of the workforce based on seniority and training, and optimize frequently on more than just travel time. By doing so, an organization can increase capacity, plan for emergency work and meet customer SLAs (service-level agreements).

    BN: What makes ServicePower worth paying for as an IT solution?

    MD: ServicePower embraced cloud technologies long before the rest of the market did. We recognized that cloud based deployment drove aggregation of jobs within the third-party contractor markets, while increasing adoption of technology due to low cost of entry for independent, often cash-strapped contractors.

    By offering our entire platform as a cloud solution, we’ve also driven cost out of organizations that would historically have paid for the infrastructure to support on-premises deployments. We’re managing the hardware and software, within the cloud, with the appropriate redundancy and failover safeguards, to provide our clients with the best tools at the lowest costs.

    ServicePower provides clients with the ability to forecast, execute and analyze their operations, while mixing field labor resources to achieve their most desirable field service metrics, yielding a best-in-class field service operation.

    BN: Why is cloud-based technology so important to ServicePower?

    MD: As stated above, cloud technology is very important to today’s field service organization. Cloud deployment, if implemented with robust redundancy and failover capabilities, enables organizations to utilize cutting-edge technology at the lowest possible cost, while still realizing improvements in productivity and utilization, margin and customer satisfaction.

    Cloud deployment drives aggregation and adoption in the contractor markets, while also providing them with cost savings. The addition of our new HTML5 mobile application and S2 Suite business management software, both cloud-based technologies, further enhances our ability to provide our clients with the best possible, least costly mixture of field labor resources to achieve their business metrics.

    BN: When choosing a hybrid/HTML5 mobile solution, what factors did you take into account?

    MD: ServicePower has been in the mobile application space for many years. With the proliferation of devices and operating systems, it became apparent that our clients, as well as, and perhaps most importantly, our third-party contractor networks, required a solution that supports evolving mobile strategies.

    So many clients are moving away from outfitting their own technicians with a single device, instead implementing a BYOD (Bring-Your-Own-Device) strategy. That trend, plus the need to secure the same level of real-time job status information from third-party contractors, required technology that would work on just about any device a technician may already have in hand.

    Working in conjunction with the largest device manufacturers in the world, particularly Apple, ServicePower’s team of developers created a browser-based product, ServiceMobility, that takes advantage of elements found in native apps to ensure a consistent, OS-centric user experience. Apple users interact with what looks and works like an iTunes app; Android users interact with an app that looks and feels like a Google Play app.

    The ServiceMobility product suite supports key enterprise requirements such as disconnected data, signature capture and barcode scanning, across the product set, and is structured to support our wide range of client needs.

    BN: What is one industry that is not currently using your platform, but could benefit greatly from it?

    MD: ServicePower has just scratched the surface of the telecom industry. The supply chain is so extensive, with companies ranging from VARs (value-added resellers) to telecom carriers, yet each link in the chain utilizes field resources to execute.

    But, ServicePower is uniquely positioned to provide a multi-labor channel solution that is relevant and necessary for telecom supply chain members to execute, whether they provide equipment installation or repairs, infrastructure or consumer based voice and data services.

    Our solution enables the telecom industry not only to optimize its own employed field resources, but also to utilize third-party contractors to address geography, skill, capacity or catastrophe events.

    Photo Credit: everything possible/Shutterstock

  • Syros Pharmaceuticals Inks $30M Series A

    Syros Pharmaceuticals, which is focused on treatments for cancer and other diseases, has inked $30 million in Series A financing. Company co-founders ARCH Venture Partners and Flagship Ventures led the round. The money will go toward development of novel gene control medicines.

    PRESS RELEASE

    Syros Pharmaceuticals, a newly launched company harnessing breakthroughs in gene control to revolutionize the treatment of cancer and other diseases, today announced that it has completed a $30 million Series A financing led by company Co-founders ARCH Venture Partners and Flagship Ventures. Syros will use the capital to accelerate the discovery and development of novel gene control medicines. The journal Cell today published two manuscripts from Syros Scientific Co-founders based upon the discovery of gene control regulators called Super-Enhancers.

    Syros Pharmaceuticals was co-founded by ARCH Venture Partners and Flagship’s VentureLabs unit working with three world leaders within the field of gene regulation and translational medicine: Richard A. Young, Ph.D., Whitehead Institute, Massachusetts Institute of Technology; James “Jay” E. Bradner, M.D., Harvard Medical School, Dana Farber Cancer Institute, Broad Institute; and Nathanael Gray, Ph.D., Harvard Medical School, Dana Farber Cancer Institute. Other investors include the WuXi PharmaTech Corporate Venture Fund, and undisclosed private investors.

    “It is increasingly clear that much of human disease lies in the switches that control genes rather than the genes themselves,” said Dr. Young. “We now can map the regulatory circuits in all human cells, including the critical switches in cancer and other diseases. This offers a promising new way to treat disease.”

    Industry veteran Nancy Simonian, M.D. will lead the company as Chief Executive Officer. She has a proven track record of value creation in biotechnology, most recently as Chief Medical Officer at Millennium Pharmaceuticals and, previously, as Vice President of Clinical Development at Biogen. Nancy led efforts that revolutionized treatments for patients with multiple sclerosis, multiple myeloma and mantle cell lymphoma. At Millennium and Biogen, Nancy oversaw all phases of drug development, including more than 15 INDs, NDAs/BLAs for Velcade and Avonex, and medical affairs. Syros management also includes Scott Rakestraw, Ph.D., Chief Business Officer; Gregg Beloff, J.D., Chief Financial Officer; Christian Fritz, Ph.D., Vice President, Biology; and, Kevin Sprott, Ph.D., Senior Director, Chemistry.

    “Discovery of the switches for genes critical in disease opens a completely new approach to helping people with serious illnesses such as cancer,” said Dr. Simonian. “We are thrilled to have Drs. Young, Bradner and Gray as our outstanding scientific founders and together will translate these breakthroughs in gene control into transformative treatments for patients.”

    Syros Signs Exclusive Technology Licensing Agreements

    Syros has executed exclusive, worldwide licensing agreements with the Whitehead Institute and the Dana Farber Cancer Institute. Licenses cover Syros’ lead scientific platform components, including Super-Enhancer technologies and methods.

    Syros Names Board of Directors, Scientific Advisory Board

    Drs. Simonian, Young and Bradner are joined on the Syros Board of Directors by Nobel Prize Winner Phillip Sharp, Ph.D., a world leader of research in molecular biology and biochemistry and Institute Professor at the Massachusetts Institute of Technology, Douglas Cole, M.D., General Partner of Flagship Ventures, and Robert Nelsen, Co-founder and Managing Director of ARCH Venture Partners. The company is also supported by a Scientific Advisory Board (SAB) comprising world-renowned experts within the fields of oncology, gene control, chemistry, and drug discovery. In addition to the three Scientific Founders of the company, SAB Members include: Bradley Bernstein, M.D., Ph.D., Massachusetts General Hospital, Harvard Stem Cell Institute, Harvard Medical School, Howard Hughes Medical Institute, Broad Institute; Scott Biller, Ph.D., Agios Pharmaceuticals; Gerard Evan, Ph.D., FRS, FMedSci, University of Cambridge, European Molecular Biology, UK Academy of Medical Sciences, Royal Society; William Kaelin, M.D., Dana Farber Cancer Institute and Brigham and Women’s Hospital, Harvard Medical School and Harvard Cancer Center, Howard Hughes Medical Institute; Stefan Knapp, Ph.D., Target Discovery Institute, Structural Genomics Consortium, University of Oxford; Mark Murcko, Ph.D., Former CTO, Vertex Pharmaceuticals, Northeastern Lecturer, Massachusetts Institute of Technology; Aviv Regev, Ph.D., Massachusetts Institute of Technology, Broad Institute, Howard Hughes Medical Institute, Klarman Cell Observatory; Phillip Sharp, Ph.D., Koch Institute for Integrative Cancer Research, Massachusetts Institute of Technology; Roger Tung, Ph.D., Concert Pharmaceuticals; Chris Vakoc, M.D., Ph.D., Cold Spring Harbor Lab; and Bob Weinberg, Ph.D., Whitehead Institute, Massachusetts Institute of Technology, MIT Ludwig Center for Molecular Oncology.

    Syros’s founding investors have been involved in establishing and building leading biopharmaceutical companies, such as Agios Pharmaceuticals, Receptos, Quanterix, ModeRNA, Aveo, Tetraphase, Avedro, Selecta Bioscience, Illumina, Aviron, Alnylam, Ikaria, Adolor, Caliper Life Sciences, Kythera, Xenoport, Array Biosciences, decode Genetics, and others.

    “Occasionally, truly breakthrough biology has the potential for near-term and direct impact on therapeutics,” said Dr. Cole. “Syros Pharmaceuticals has a rare opportunity to establish entirely new strategies to treat cancer and other serious diseases based on the revolutionary insights emerging from the scientific founders’ labs. It has been a privilege to work with our co-founders to launch this exciting venture.”

    “Syros brings together an exceptionally experienced management team, once-in-a-decade breakthrough science, some of the best scientific and drug discovery minds in the business, and the resources to move fast,” said Mr. Nelsen.

    About Syros Pharmaceuticals

    Syros Pharmaceuticals is a life sciences company harnessing breakthroughs in gene control to revolutionize the treatment of cancer and other diseases. Syros’ proprietary platform identifies the master switches for disease genes, opening a whole new approach to novel therapeutics. Syros’ initial focus is in cancer, but the company platform will also be applicable to other therapeutic areas. The Company’s founders are pioneers in gene control research and translation. Co-founded and backed by Flagship Ventures and ARCH Venture Partners, Syros Pharmaceuticals is located in Watertown, MA. For more information, visit www.syros.com.

    About ARCH Venture Partners

    ARCH Venture Partners is a premier provider of seed- and early-stage capital for technology firms, with a special competence in co-founding and building technology firms from startup. ARCH invests primarily in companies co-founded with leading scientists and entrepreneurs, concentrating in innovations in life sciences, physical sciences, and information technology. ARCH enjoys special recognition as a leader in the successful commercialization of technologies developed at academic research institutions and national laboratories. The company manages seven funds totaling over $1.5 billion and has invested in the earliest venture capital rounds for more than 150 companies over 27 years. Portfolio companies where ARCH was a co-founding or early investor include Illumina, Aviron, Agios, Impinj, Xenoport, Alnylam, Ikaria, Kythera, Sapphire Energy, Microoptical Devices, New Era of Networks, Netbot, Trubion Pharmaceuticals, Adolor, Nanosys, Receptos, Caliper Life Sciences, Ahura, Xtera, Array Biopharma, Everyday Learning Corporation, Nanophase Technologies, R2 Technologies, and deCode Genetics, among others. More information is available at www.archventure.com.

    About Flagship Ventures

    Realizing entrepreneurial innovation is the mission of Flagship Ventures. The firm operates through two synergistic units: VentureLabs™, which invents and launches transformative companies, and Venture Capital, which finances and realizes innovative early-stage companies. Founded in 2000 and based in Cambridge, Massachusetts, Flagship Ventures manages over $900 million in capital. The Flagship team innovates and invests in three principal business sectors: therapeutics, health technologies and sustainability/clean technology. Past successful Flagship portfolio companies include: Accuri Cytometers (acquired by Becton, Dickinson and Company), Adnexus (acquired by Bristol-Myers Squibb), Hypnion (acquired by Eli Lilly), AVEO (NASDAQ: AVEO), BG Medicine (NASDAQ: BGMD), Tetraphase (NASDAQ:TTPH), and Morphotek (acquired by Eisai). Additional notable portfolio companies include: AeroDesigns, Affinnova, Agios, BIND Therapeutics, Joule Unlimited, Receptos, Quanterix, and Moderna Therapeutics.

  • Reuters – Ridgemont Equity Raises $735M Fund

    A private equity firm spun off from Bank of America Corp in 2010 has raised its first fund, allowing it to continue making investments in mid-sized companies as the No. 2 U.S. bank pulls back from the business, Reuters reported. Ridgemont Equity Partners executives told Reuters that the firm received total commitments of $735 million from institutional investors in the United States, Europe and Asia. They include AlpInvest Partners Inc and the State of Wisconsin Investment Board, but not Bank of America. Charlotte, North Carolina-based Ridgemont raised the capital at a time when dollars are scarce for first-time funds.

    (Reuters) – A private equity firm spun off from Bank of America Corp in 2010 has raised its first fund, allowing it to continue making investments in mid-sized companies as the No. 2 U.S. bank pulls back from the business.

    Ridgemont Equity Partners executives told Reuters that the firm received total commitments of $735 million from institutional investors in the United States, Europe and Asia. They include AlpInvest Partners Inc and the State of Wisconsin Investment Board, but not Bank of America.

    Charlotte, North Carolina-based Ridgemont raised the capital at a time when dollars are scarce for first-time funds. Only 28 reached a final close in the first quarter, the lowest number in any quarter from 2008 to 2013, according to Preqin, which tracks private equity investments.

    The new funds accounted for just 6 percent of the $67 billion raised by all funds during the period, compared with 20 percent at the peak.

    “It’s almost always challenging for first-time funds, and then you add in a tough fund-raising market in general,” said Travis Hain, who is on Ridgemont’s executive committee with Walker Poole and Trey Sheridan. “But if you have the right story, you’ll find support.”

    Ridgemont is an unusual case because the firm’s principals have been making investments together since 1993, injecting more than $3 billion into 115 companies. Before the spinoff, the firm was known as Banc of America Capital Investors and received its seed money from the bank.

    Bank of America has been winding down its private equity business as Chief Executive Brian Moynihan looks to streamline the company and follow new rules. The U.S. Dodd-Frank financial reform law limits how much capital banks can invest in private equity funds.

    Bank of America sold a $1.9 billion portfolio to insurer AXA SA’s private equity arm in 2010. The bank had $1 billion in private equity investments at the end of the fourth quarter, down from $5.7 billion at the beginning of 2010.

    Bank of America’s goal is to sell its investments over time as the company focuses on its core businesses, spokesman Jerry Dubrowski said.

    Ridgemont still manages undisclosed private equity investments for the bank, but they are in “run-off” mode, Hain said.

    Being independent has advantages for Ridgemont, Hain said. The firm is not part of a larger bureaucracy and does not face the same regulatory restrictions as a bank. It also has a more focused investment strategy, sticking to mid-sized companies in four sectors ranging from energy to telecommunications, he said.

    So far, the firm, which has 27 employees, has committed about half of its new fund to investments in nine companies. This month, the firm closed an investment in a software and marketing company called Simpleview Inc. Ridgemont’s investments typically range from $25 million to $75 million.

    AlpInvest partner Chris Perriello said the investor started building a relationship with the Ridgemont executives before they left the bank and as they were taking steps to become independent. “We felt they had a high-quality team, a strong record,” he said.

    Officials at the Wisconsin investment board were not immediately available for comment.

    Not every bank is backing away from private equity investments. Wells Fargo & Co has said it can still make investments through two funds that are subsidiaries. Goldman Sachs Group Inc, meanwhile, is trying to do private equity deals alongside investors that keep their funds in separately managed accounts.

  • Google Play, Apple’s App Store Might Face “Legal Undertakings” In OFT’s Investigation Of Freemium Games For Kids

    The freemium kids’ app party that has seen some parents left with hefty bills because of their kids’ use of games could be heading for a sticky end — at least in the U.K. The Office of Fair Trading has announced a six-month investigation into whether children are being “unfairly pressured or encouraged to pay for additional content in ‘free’ web and app-based games”.

    The OFT says in a press release that it cannot identify the companies that are subject to investigation, but a spokesman confirmed to TechCrunch that it is contacting Apple and Google as part of this process, since they are the proprietors of the two largest app stores: the iTunes App Store and Google Play.

    Once the investigation has concluded — and if the OFT is unhappy with what it learns and the discussions it’s had — the spokesman said it “can seek legal undertakings from court”. Companies subsequently ignoring any court directions could face “an unlimited fine”, he added.

    The OFT is concerned that developers are designing children’s content to deliberately encourage kids to make payments after the initial free download/access. It’s not citing any examples or naming any problematic apps at this point but it’s not hard to find instances that are likely to have triggered the investigation — such as the five-year-old British boy who accidentally made in-app purchases totalling £1,700 in 15 minutes playing  Zombies vs Ninja. Or the British six-year-old girl who amassed a £900 bill in half an hour on the My Little Pony app.

    The OFT points out that “direct exhortations” (ie strong encouragement) to children to make purchases themselves, or ask another adult to do something that results in a purchase, are unlawful under the Consumer Protection (from Unfair Trading) Regulations 2008. The sort of in-app purchases that might fall foul of the regulation could include membership, virtual currency/rewards, additional levels, faster gameplay and additional game features, it added.

    The OFT said it has written to companies that are offering free web or app-based games asking for information on in-game marketing to children. It is also asking for parents and consumer groups to contact it with information about “potentially misleading or commercially aggressive practices they are aware of in relation to these games”.

    The spokesman said the aim of the investigation is to get more “clarity” about the digital market for kids’ games, and the sorts of behaviours/mechanics apps are utilising, by talking to games developers, app stores, parents and consumer groups.

    The investigation will also specifically consider whether the full cost of games aimed at children is being made clear when they are downloaded/accessed. “The information [gathered during the investigation] will be used to understand business practices used in this sector, to establish whether consumer protection regulations are being breached and if so what the consumer harm is,” the OFT said today, adding that it “expects to publish its next steps by October 2013”.

    Commenting in a statement, Cavendish Elithorn, OFT Senior Director for Goods and Consumer, added: “The OFT is not seeking to ban in-game purchases, but the games industry must ensure it is complying with the relevant regulations so that children are protected. We are speaking to the industry and will take enforcement action if necessary.”

    The spokesman stressed that the OFT hopes to be able to solve any issues uncovered through “conversations” with the various companies involved — including Apple and Google — rather than taking the court route. “We hope this is going to be resolved by talking to the big companies,” he added.

    Google declined to comment on the investigation when contacted by TechCrunch.

    At the time of writing Apple had not responded to a request for comment.

    Both Google’s and Apple’s app stores require developers to sign developer agreements in order to successfully submit apps, and both have been known to remove content that violates these developer guidelines — so app stores are already in the app policing business.

    Google’s Play Store developer guidelines include the following (vague) stipulation, for instance, that could potentially be used to boot freemium kids’ apps that are misleading about the potential costs:

    Developers must not mislead users about the applications they are selling nor about any in-app services, goods, content or functionality they are selling.

    Apple does more policing of its store than Google, with iOS developers required to submit apps for approval prior to publication on the store. “We review all apps to ensure they are reliable, perform as expected, and are free of offensive material”, Apple notes on its developer site,  warning app makers to: “Before submitting your new or updated apps for review, check out the latest App Store Review Guidelines and Mac App Store Review Guidelines.”

    There are  also signs that Cupertino has been looking more closely at some of the problems posed by having kids interact with apps. Earlier this month it relocated age ratings from the bottom of app listings on its store, to the top near the title where they are easier for parents to spot.

    This change is likely to have been triggered by concerns about apps powered by user-generated content that can contain adult material appearing in the app store where children could find them — such as Twitter’s Vine video app — rather than specifically helping parents prevent kids making in-app purchases.

    Here’s the OFT’s summary of the investigation:

    Many children’s web- and app-based games are free to sign up to or download.  Some of those games give players the opportunity to ‘upgrade’ their free accounts through paid-for membership, providing access to parts of the game not available to non-paying players. Others encourage in-game purchases to speed up gameplay or to give access to extra game features.

    The OFT will look into whether those children’s games are in line with the Consumer Protection (from Unfair Trading) Regulations 2008 to ensure that any commercial practices they include are not misleading or aggressive. In particular, the OFT will consider whether children’s web- and app-based games directly encourage children to buy something or to pester their parents or other adults to buy something for them. [see note 1]

    The OFT will gather information on this issue for the next six months and is interested to hear from businesses operating in the market and mobile app platform operators. The OFT will also consult with relevant UK and international regulators.

    The OFT is also keen to hear about potentially misleading or commercially aggressive practices experienced by parents whose children play these games, and also from consumer groups with an interest in this area.

    note 1: The Regulations, under Annex Practice 28, prohibit advertisements from including direct exhortations to children to buy something or to ask their parents or other adults to buy something for them.

  • Bing searches throw up more malware sites than Google

    We all know that search engine results can sometimes serve up malware, but if you’re using Bing you’re five times more likely to get malicious links than if you’re using Google.

    In an 18-month study, independent German lab AV-Test  discovered that all search engines sometimes serve up Trojans and other malware amongst their results despite the search providers’ best efforts to prevent it.

    AV-Test found 5,000 malware links across 40 million websites, so toxic search results are perhaps rarer than you’d think. However, it seems that malware distributors are putting their efforts into SEO techniques so that their malicious results appear higher up the rankings, where users are most vulnerable to clicking without thinking.

    Google and Bing proved to be the safest search engines in the study, but of the two it’s Bing that’s more likely to give you a nasty surprise when clicking on a link, delivering 1,285 malicious results to Google’s 272. If it’s any consolation you can feel sorry for the Russians as their search giant Yandex delivered more than ten times as many infected sites as Google.

    Most of the infected sites exploit existing vulnerabilities so you can keep yourself safe by ensuring that your browser and security software are always up to date.

    Photo Credits: maraga/Shutterstock

  • Twitter expands Trends in 160 more locations

    If something really matters on Twitter then it’s on Trends. The little card displayed to the left of the tweets feed in the browser (or inside a tab in the mobile app) shows ten topics that have managed to come out on top as most relevant to worldwide users.

    In order to get even more spooky, Twitter even offers something called “tailored Trends” which delivers a more personal list “based on who you follow and your location”. But if you don’t want to use the feature, you can get trends only for a specific location, an option which Twitter has expanded to include 160 more places worldwide.

    Twitter users can now also choose Belgium, Greece, Norway, Poland, Portugal and Ukraine, as well as more than 130 new cities in countries already supported by Trends, to get their dose of trending topics for a particular location.

    Just head over to the Trends card, click on “Change” and choose your preferred location to get started. If you don’t like it, you can always click on “Get tailored Trends” to go back to the standard way of viewing the Trends.

    Photo Credit: Julien Tromeur/Shutterstock

  • Le Dimmer helps you avoid distractions and stay focused on your PC work

    If you’re working on some important PC task and want to avoid distractions, then maximizing your program window is usually a good place to start. But if you need to monitor several programs — or the window just can’t be maximized — then Le Dimmer may offer a more interesting approach.

    The program is tiny, portable, and has just a single task: after launching, it dims everything on your desktop apart from the current window.

    The effect can be quite restful on the eyes, depending on your applications, with all the usual brightness reduced by a considerable degree. (Although if you run a program with a white background then the contrast becomes greater, so it won’t work for everyone.)

    It all runs entirely automatically, too. There’s nothing to do, no complex interface to navigate, the program just works. All you have to remember is to right-click the system tray and select “Quit” if you decide you don’t need the effect any more, and want to resume normal operations.

    If you do need more control, though, there is a useful command line switch available. Pass the program a number between 0 and 255 (“LeDimmer.exe 150”) and you’re setting the “Dim factor”. 150 is the default; increasing this will darken the dimmed part of the screen even further, while reducing the value will brighten it. (Check the program’s ReadMe.txt for details.)
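
    If you’d rather script it, here is a minimal, purely illustrative Python sketch that launches the program with a chosen dim factor. It assumes LeDimmer.exe sits in the current folder; the 0-255 range and the default of 150 come from the description above.

      # Illustrative sketch only: launch Le Dimmer with a custom "Dim factor".
      # Assumes LeDimmer.exe is in the current directory (see the program's ReadMe.txt).
      import subprocess

      def launch_le_dimmer(dim_factor: int = 150) -> subprocess.Popen:
          """Start Le Dimmer with a dim factor clamped to the documented 0-255 range."""
          dim_factor = max(0, min(255, dim_factor))   # 150 is the default; higher means darker
          return subprocess.Popen(["LeDimmer.exe", str(dim_factor)])

      if __name__ == "__main__":
          launch_le_dimmer(200)  # noticeably darker than the default dimming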

    Le Dimmer isn’t quite bullet-proof. We changed our screen resolution while running the program, and not only did it not notice, but after a few clicks it also managed to crash Explorer. Oops.

    Still, no data was lost, even with this extreme test. The rest of the time, Le Dimmer worked just fine. And if you’re looking for a way to focus your attention on one task, while not entirely losing track of some others, then the program could prove very useful.

    Photo Credit: kentoh/Shutterstock

  • O2 Refresh uncouples phone and airtime tariffs in the UK for easier upgrades

    When you get a new mobile phone and set up a new contract, you’re tied into it for a set period of time. If you want to upgrade to a new phone partway through your contract you’ll need to pay off at least some — if not all — of the remaining fees, which can prove very costly.

    O2 has come up with a new mobile phone price plan designed to appeal to people who like to always have the latest smartphone.

    Available from O2 stores from Tuesday 16 April, O2 Refresh is actually two separate contracts — one covering airtime (your talktime, texts, data and so on) and another covering the cost of the phone itself.

    Customers firstly choose the airtime plan they want. There is a choice of three price points on offer — £12 a month (600 minutes, unlimited texts and 750MB of data), £17 a month (unlimited minutes, unlimited texts and 1GB of data) and £22 a month (unlimited minutes, unlimited texts and 2GB of data). Buyers then add on the plan for the phone (the HTC One, for example, costs £49.99 up front and £20 per month for two years). When a new model comes out, they only have to pay off whatever’s remaining on the phone plan. The airtime plan can be ended at the same time with no termination fee.
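
    As a purely illustrative aside (not an O2 calculator, just arithmetic on the figures quoted above), here is a tiny Python sketch of what the HTC One example adds up to over the two-year phone plan on each airtime tier.

      # Illustrative only: two-year cost of the HTC One example under O2 Refresh,
      # using the airtime tiers and phone-plan figures quoted in the article.
      AIRTIME_TIERS = {"600 min / 750MB": 12, "unlimited / 1GB": 17, "unlimited / 2GB": 22}  # GBP per month
      PHONE_UPFRONT = 49.99   # HTC One, up front
      PHONE_MONTHLY = 20.00   # HTC One, per month for two years
      TERM_MONTHS = 24

      for tier, airtime in AIRTIME_TIERS.items():
          total = PHONE_UPFRONT + TERM_MONTHS * (PHONE_MONTHLY + airtime)
          print(f"{tier}: about £{total:,.2f} over two years")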

    “Mobile phone technology continues to advance at a rapid pace, yet the way phones are sold has remained largely static,” Feilim Mackle, Sales and Service Director at Telefónica UK explains. “Increasingly our customers are telling us that they don’t want to be tied to the same phone for two years and, with 4G coming to O2 this summer, we want to make it easier for our customers to benefit from the latest technology. For the first time in the UK, O2 Refresh will make it possible to get a new phone part way through a pay monthly contract, at any time — quickly, easily and cost-effectively”.

    The list of phones available on the O2 Refresh plan includes the HTC One, Sony Xperia Z, BlackBerry Z10, Samsung Galaxy S3 and Apple iPhone 5. The BlackBerry Q10 and Samsung Galaxy S4 will be added when they become available.

  • Report: Many urban tap water systems loaded with SSRI antidepressant drugs

    As you may recall, the Associated Press (AP) released the results of a groundbreaking investigation it conducted back in 2008 concerning the presence of pharmaceutical drugs in the water supply. In this report, it was revealed that at least 41 million Americans are exposed…
  • Bitcoin price craters as panic selloff claims 75% loss from bubble high

    The bitcoin selloff that began less than 24 hours after I predicted a “disastrous bitcoin crash” has now plummeted nearly 75% from Wednesday’s bitcoin high of $266, wiping out over $1.5 billion in valuation for the crypto currency. As bitcoin skyrocketed in value…
  • Monsanto wins patent battle with DuPont corporation to seize full rights of GM seed patents

    Apparently discontent with its more than $13.5 billion-plus in annual sales, genetic modification kingpin Monsanto has been trying for the past four-or-so years to extract billions more dollars from rival DuPont for alleged patent infringements involving its genetically…
  • Palm oil health craze may push animals to extinction while destroying the environment

    Red palm oil has burst onto the health scene as a miracle food, helping to heal everything from cardiovascular disease to Alzheimer’s to cancer. However, as it becomes more popular worldwide, a dark secret has come to light. Due to its lucrative value, rainforests in…
  • Xbox SmartGlass Comes to Android Tablets

    Microsoft has finally released a tablet version of its ever-popular Xbox SmartGlass Android app. The app is compatible with tablets with screen sizes of 7″ and larger, and is available for immediate download on the Google Play Store. Hit the break below for the download link.

    As is the case with the Xbox SmartGlass smartphone app, tablet users will be able to:

    • Navigate their Xbox 360 with swipe and tap
    • Use a phone’s keyboard to type to a user’s Xbox 360
    • Browse the Internet on an Xbox 360 with full keyboard and zooming
    • Play, pause, fast forward, rewind, and stop videos and music on an Xbox 360
    • Search the full Xbox catalog of music, video, and games
    • Enjoy rich, interactive experiences from select game and entertainment content creators
    • Track and compare your achievements with your Xbox friends
    • Change up users’ 3D avatars
    • Message Xbox friends
    • Edit Xbox profiles

    The new download is a hefty 18MB, which is well worth it for all of the “numerous design and usability improvements” Microsoft promises in this version. One user noted that the tablet version will enable the app to stay awake during extended periods of inactivity, eliminating the need to wait for the app to reestablish a connection when brought out of standby mode. A welcome bug fix, especially for those of us who want our information at the ready at a moment’s notice.

    Play Store Download Link

  • Accidental Empires, Part 20 — Counter-Reformation (Chapter 14)

    Twentieth in a series. “Market research firms tend to serve the same function for the PC industry that a lamppost does for a drunk”, writes Robert X. Cringely in this installment of the 1991 classic Accidental Empires. The context is the then-universal forecast that OS/2 would overtake MS-DOS. Analysts were wrong then, much as they are today making predictions about smartphones, tablets and PCs. The insightful chapter also explains vaporware and the product-leak tactics IBM pioneered, Microsoft refined and Apple later adopted.

    In Prudhoe Bay, in the oilfields of Alaska’s North Slope, the sun goes down sometime in late November and doesn’t appear again until January, and even then the days are so short that you can celebrate sunrise, high noon, and sunset all with the same cup of coffee. The whole day looks like that sliver of white at the base of your thumbnail.

    It’s cold in Prudhoe Bay in the wintertime, colder than I can say or you would believe — so cold that the folks who work for the oil companies start their cars around October and leave them running twenty-four hours a day clear through to April just so they won’t freeze up.

    Idling in the seemingly endless dark is not good for a car. Spark plugs foul and carburetors gum up. Gas mileage goes completely to hell, but that’s okay; they’ve got the oil. Keeping those cars and trucks running night and pseudoday means that there are a lot of crummy, gas-guzzling, smoke-spewing vehicles in Prudhoe Bay in the winter, but at least they work.

    Nobody ever lost his job for leaving a car running overnight during a winter in Prudhoe Bay.

    And it used to be that nobody ever lost his job for buying computers from IBM.

    But springtime eventually comes to Alaska. The tundra begins to melt, the days get longer than you can keep your eyes open, and the mosquitoes are suddenly thick as grass. It’s time for an oil change and to give that car a rest. When the danger’s gone — when the environment has improved to a point where any car can be counted on to make it through the night, when any tool could do the job — then efficiency and economy suddenly do become factors. At the end of June in Prudhoe Bay, you just might get in trouble for leaving a car running overnight, if there was a night, which there isn’t.

    IBM built its mainframe computer business on reliable service, not on computing performance or low prices. Whether it was in Prudhoe Bay or Houston, when the System 370/168 in accounting went down, IBM people were there right now to fix it and get the company back up and running. IBM customer hand holding built the most profitable corporation in the world. But when we’re talking about a personal computer rather than a mainframe, and it’s just one computer out of a dozen, or a hundred, or a thousand in the building, then having that guy in the white IBM coveralls standing by eventually stops being worth 30 percent or 50 percent more.

    That’s when it’s springtime for IBM.

    IBM’s success in the personal computer business was a fluke. A company that was physically unable to invent anything in less than three years somehow produced a personal computer system and matching operating system in one year. Eighteen months later, IBM introduced the PC-XT, a marginally improved machine with a marginally improved operating system. Eighteen months after that, IBM introduced its real second-generation product, the PC-AT, with five times the performance of the XT.

    From 1981 to 1984, IBM set the standard for personal computing and gave corporate America permission to take PCs seriously, literally creating the industry we know today. But after 1984, IBM lost control of the business.

    Reality caught up with IBM’s Entry Systems Division with the development of the PC-AT. From the AT on, it took IBM three years or better to produce each new line of computers. By mainframe standards, three years wasn’t bad, but remember that mainframes are computers, while PCs are just piles of integrated circuits. PCs follow the price/performance curve for semiconductors, which says that performance has to double every eighteen months. IBM couldn’t do that anymore. It should have been ready with a new line of industry-leading machines by 1986, but it wasn’t. It was another company’s turn.

    Compaq Computer cloned the 8088-based IBM PC in a year and cloned the 80286-based PC-AT in six months. By 1986, IBM should have been introducing its 80386-based machine, but it didn’t have one. Compaq couldn’t wait for Big Blue and so went ahead and introduced its DeskPro 386. The 386s that soon followed from other clone makers were clones of the Compaq machine, not clones of IBM. Big Blue had fallen behind the performance curve and would never catch up. Let me say that a little louder: IBM will never catch up.

    IBM had defined MS-DOS as the operating system of choice. It set a 16-bit bus standard for the PC-AT that determined how circuit cards from many vendors could be used in the same machine. These were benevolent standards from a market leader that needed the help of other hardware and software companies to increase its market penetration. That was all it took. Once IBM could no longer stay ahead of the performance curve, the IBM standards still acted as guidelines, so clone makers could take the lead from there, and they did. IBM saw its market share slowly start to fall.

    But IBM was still the biggest player in the PC business, still had the greatest potential for wreaking technical havoc, and knew better than any other company how to slow the game down to a more comfortable pace. Here are some market control techniques refined by Big Blue over the years.

    Technique No. 1. Announce a direction, not a product. This is my favorite IBM technique because it is the most efficient one from Big Blue’s perspective. Say the whole computer industry is waiting for IBM to come out with its next-generation machines, but instead the company makes a surprise announcement: “Sorry, no new computers this year, but that’s because we are committing the company to move toward a family of computers based on gallium arsenide technology [or Josephson junctions, or optical computing, or even vegetable computing — it doesn’t really matter]. Look for these powerful new computers in two years.”

    “Damn, I knew they were working on something big,” say all of IBM’s competitors as they scrap the computers they had been planning to compete with the derivative machines expected from IBM.

    Whether IBM’s rutabaga-based PC ever appears or not, all IBM competitors have to change their research and development focus, looking into broccoli and parsnip computing, just in case IBM is actually onto something. By stating a bold change of direction, IBM looks as if it’s grasping the technical lead, when in fact all it’s really doing is throwing competitors for a loop, burning up their R&D budgets, and ultimately making them wait up to two years for a new line of computers that may or may not ever appear. (IBM has been known, after all, to say later, “Oops, that just didn’t work out,” as they did with Josephson junction research.) And even when the direction is for real, the sheer market presence of IBM makes most other companies wait for Big Blue’s machines to appear to see how they can make their own product lines fit with IBM’s.

    Whenever IBM makes one of these statements of direction, it’s like the yellow flag coming out during an auto race. Everyone continues to drive, but nobody is allowed to pass.

    IBM’s Systems Application Architecture (SAA) announcement of 1987, which was supposed to bring a unified programming environment, user interface, and applications to most of its mainframe, minicomputer, and personal computer lines by 1989, was an example of such a statement of direction. SAA was for real, but major parts of it were still not ready in 1991.

    Technique No. 2. Announce a real product, but do so long before you actually expect to deliver, disrupting the market for competitive products that are already shipping.

    This is a twist on Technique No. 1 though aimed at computer buyers rather than computer builders. Because performance is always going up and prices are always going down, PC buyers love to delay purchases, waiting for something better. A major player like IBM can take advantage of this trend, using it to compete even when IBM doesn’t yet have a product of its own to offer.

    In the 1983-1985 time period, for example, Apple had the Lisa and the Macintosh, VisiCorp had VisiOn, its graphical computing environment for IBM PCs, Microsoft had shipped the first version of Windows, Digital Research produced GEM, and a little company in Santa Monica called Quarterdeck Office Systems came out with a product called DesQ. All of these products — even Windows, which came from Microsoft, IBM’s PC software partner — were perceived as threats by IBM, which had no equivalent graphical product. To compete with these graphical environments that were already available, IBM announced its own software that would put pop-up windows on a PC screen and offer easy switching from application to application and data transfer from one program to another. The announcement came in the summer of 1984 at the same time the PC-AT was introduced. They called the new software TopView and said it would be available in about a year.

    DesQ had been the hit of Comdex, the computer dealers’ convention held in Atlanta in the spring of 1984. Just after the show, Quarterdeck raised $5.5 million in second-round venture funding, moved into new quarters just a block from the beach, and was happily shipping 2,000 copies of DesQ per month. DesQ had the advantage over most of the other windowing systems that it worked with existing MS-DOS applications. DesQ could run more than one application at a time, too — something none of the other systems (except Apple’s Lisa) offered. Then IBM announced TopView. DesQ sales dropped to practically nothing, and the venture capitalists asked Quarterdeck for their money back.

    All the potential DesQ buyers in the world decided in a single moment to wait for the truly incredible software IBM promised. They forgot, of course, that IBM was not particularly noted for incredible software — in fact, IBM had never developed PC software entirely on its own before. TopView was true Blue — written with no help from Microsoft.

    The idea of TopView hurt all the other windowing systems and contributed to the death of VisiOn and DesQ. Quarterdeck dropped from fifty employees down to thirteen. Terry Myers, co-founder of Quarterdeck and one of the few women to run a PC software company, borrowed $20,000 from her mother to keep the company afloat while her programmers madly rewrote DesQ to be compatible with the yet-to-be-delivered TopView. They called the new program DesqView.

    When TopView finally appeared in 1985, it was a failure. The product was slow and awkward to use, and it lived up to none of the promises IBM made. You can still buy TopView from IBM, but nobody does; it remains on the IBM product list strictly because removing it would require writing off all development expenses, which would hurt IBM’s bottom line.

    Technique No. 3. Don’t announce a product, but do leak a few strategic hints, even if they aren’t true.

    IBM should have introduced a follow-on to the PC-AT in 1986 but it didn’t. There were lots of rumors, sure, about a system generally referred to as the PC-2, but IBM staunchly refused to comment. Still, the PC-2 rumors continued, accompanied by sparse technical details of a machine that all the clone makers expected would include an Intel 80386 processor. And maybe, the rumors continued, the PC-2 would have a 32-bit bus, which would mean yet another technical standard for add-in circuit cards.

    It would have been suicide for a clone maker to come out with a 386 machine with its own 32-bit bus in early 1986 if IBM was going to announce a similar product a month or three later, so the clone makers didn’t introduce their new machines. They waited and waited for IBM to announce a new family of computers that never came. And during the time that Compaq and Dell, and AST, and the others were waiting for IBM to make its move, millions of PC-ATs were flowing into Fortune 1000 corporations, still bringing in the big bucks at a time when they shouldn’t have still been viewed as top-of-the-line machines.

    When Compaq Computer finally got tired of waiting and introduced its own DeskPro 386, it was careful to make its new machine use the 16-bit circuit cards intended for the PC-AT. Not even Compaq thought it could push a proprietary 32-bit bus standard in competition with IBM. The only 32-bit connections in the Compaq machine were between the processor and main memory; in every other respect, it was just like a 286.

    Technique No. 4. Don’t support anybody else’s standards; make your own.

    The original IBM Personal Computer used the PC-DOS operating system at a time when most other microcomputers used in business ran CP/M. The original IBM PC had a completely new bus standard, while nearly all of those CP/M machines used something called the S-100 bus. Pushing a new operating system and a new bus should have put IBM at a disadvantage, since there were thousands of CP/M applications and hundreds of S-100 circuit cards, and hardly any PC-DOS applications and less than half a dozen PC circuit cards available in 1981. But this was not just any computer start-up; this was IBM, and so what would normally have been a disadvantage became IBM’s advantage. The IBM PC killed CP/M and the S-100 bus and gave Big Blue a full year with no PC-compatible competitors.

    When the rest of the world did its computer networking with Ethernet, IBM invented another technology, called Token Ring. When the rest of the world thought that a multitasking workstation operating system meant Unix, IBM insisted on OS/2, counting on its influence and broad shoulders either to make the IBM standard a de facto standard or at least to interrupt the momentum of competitors.

    Technique No. 5. Announce a product; then say you don’t really mean it.

    IBM has always had a problem with the idea of linking its personal computers together. PCs were cheaper than 3270 terminals, so IBM didn’t want to make it too easy to connect PCs to its mainframes and risk hurting its computer terminal business. And linked PCs could, by sharing data, eventually compete with minicomputer or mainframe time-sharing systems, which were IBM’s traditional bread and butter. Proposing an IBM standard for networking PCs or embracing someone else’s networking standard was viewed in Armonk as a risky proposition. By the mid-1980s, though, other companies were already moving forward with plans to network IBM PCs, and Big Blue just couldn’t stand the idea of all that money going into another company’s pocket.

    In 1985, then, IBM announced its first networking hardware and software for personal computers. The software was called the PC Network (later the PC LAN Program). The hardware was a circuit card that fit in each PC and linked them together over a coaxial cable, transferring data at up to 2 million bits per second. IBM sold $200 million worth of these circuit cards over the next couple of years. But that wasn’t good enough (or bad enough) for IBM, which announced that the network cards, while they were a product, weren’t part of an IBM direction. IBM’s true networking direction was toward another hardware technology called Token Ring, which would be available, as I’m sure you can predict by now, in a couple of years.

    Customers couldn’t decide whether to buy the hardware that IBM was already selling or to wait for Token Ring, which would have higher performance. Customers who waited for Token Ring were punished for their loyalty, since IBM, which had the most advanced semiconductor plants in the world, somehow couldn’t make enough Token Ring adapters to meet demand until well into 1990. The result was that IBM lost control of the PC networking business.

    The company that absolutely controls the PC networking business is headquartered at the foot of a mountain range in Provo, Utah, just down the street from Brigham Young University. Novell Inc. runs the networking business today as completely as IBM ran the PC business in 1983. A lot of Novell’s success has to do with the technical skills of those programmers who come to work straight out of BYU and who have no idea how much money they could be making in Silicon Valley. And a certain amount of its success can be traced directly to the company’s darkest moment, when it was lucky enough to nearly go out of business in 1981.

    Novell Data Systems, as it was called then, was a struggling maker of not very good CP/M computers. The failing company threw the last of its money behind a scheme to link its computers together so they could share a single hard disk drive. Hard disks were expensive then, and a California company, Corvus Systems, had already made a fortune linking Apple IIs together in a similar fashion. Novell hoped to do for CP/M computers what Corvus had done for the Apple II.

    In September 1981, Novell hired three contract programmers to devise the new network hardware and software. Drew Major, Dale Neibaur, and Kyle Powell were techies who liked to work together and hired out as a unit under the name Superset. Superset — three guys who weren’t even Novell employees — invented Novell’s networking technology and still direct its development today. They still aren’t Novell employees.

    Companies like Ashton-Tate and Lotus Development ran into serious difficulties when they lost their architects. Novell and Microsoft, which have retained their technical leaders for over a decade, have avoided such problems.

    In 1981, networking meant sharing a hard disk drive but not sharing data between microcomputers. Sure, your Apple II and my Apple II could be linked to the same Corvus 10-megabyte hard drive, but your data would be invisible to my computer. This was a safety feature, because the microcomputer operating systems of the time couldn’t handle the concept of shared data.

    Let’s say I am reading the text file that contains your gothic romance just when you decide to add a juicy new scene to chapter 24. I am reading the file, adding occasional rude comments, when you grab the file and start to add text. Later, we both store the file, but which version gets stored: the one with my comments, or the one where Captain Phillips finally does the nasty with Lady Margaret? Who knows?

    What CP/M lacked was a facility for directory locking, which would allow only one user at a time to change a file. I could read your romance, but if you were already adding text to it, directory locking would keep me from adding any comments. Directory locking could be used to make some data read only, and could make some data readable only by certain users. These were already important features in multiuser or networked systems but not needed in CP/M, which was written strictly for a single user.
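
    To make the mechanism concrete for modern readers, here is a minimal, hypothetical sketch of the same locking discipline, written in Python for a Unix-like system (illustrative only, and obviously not the Superset CP/M code): a writer takes an exclusive lock before changing the file, readers take shared locks, and so nobody reads a half-written chapter or has their changes silently stomped by another writer.

    ```python
    # Illustrative sketch of advisory file locking, not Novell's actual mechanism.
    # fcntl is in the Python standard library on Unix-like systems; both readers
    # and writers must use the locks for them to mean anything.
    import fcntl

    def add_scene(path, new_text):
        """Writer: take an exclusive lock so only one user at a time can change the file."""
        with open(path, "a") as f:
            fcntl.flock(f, fcntl.LOCK_EX)   # blocks until no other reader or writer holds a lock
            try:
                f.write(new_text)
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)

    def read_manuscript(path):
        """Reader: take a shared lock; many readers may hold it at once, but no writer can."""
        with open(path, "r") as f:
            fcntl.flock(f, fcntl.LOCK_SH)   # blocks while a writer holds the exclusive lock
            try:
                return f.read()
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)
    ```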

    The guys from Superset added directory locking to CP/M, improved CP/M’s mechanism for searching the disk directory, and moved all of these functions from the networked microcomputer up to a specialized processor located at the hard disk drive. By November 1981, they’d turned what was supposed to have been a disk server like Corvus’s into a file server where users could share data. Novell’s Data Management Computer could support twelve simultaneous users at the same performance level as a single-user CP/M system.

    Superset, not Novell, decided to network the new IBM PC. The three hackers bought one of the first PCs in Utah and built the first PC network card. They did it all on their own and against the wishes of Novell, which just then finally ran out of money.

    The venture capitalists whose money Novell had used up came to Utah looking for salvageable technology and found only Superset’s work worth continuing. While Novell was dismantled around them, the three contractors kept working and kept getting paid. They worked in isolation for two years, developing whole generations of product that were never sold to anyone.

    The early versions of most software are so bad that good programmers usually want to throw them away but can’t because ship dates have to be met. But Novell wasn’t shipping anything in 1982-1983, so early versions of its network software were thrown away and started over again. Novell was able to take the time needed to come up with the correct architecture, a rare luxury for a start-up, and subsequently the company’s greatest advantage. Going broke turned out to have been very good for Novell.

    Novell hardware was so bad that the company concentrated almost completely on software after it started back in business in 1983. All the other networking companies were trying to sell hardware. Corvus was trying to sell hard disks. Televideo was trying to sell CP/M boxes. 3Com was trying to sell Ethernet network adapter cards. None of these companies saw any advantage to selling its software to go with another company’s hard disk, computer, or adapter card. They saw all the value in the hardware, while Novell, which had lousy hardware and knew it, decided to concentrate on networking software that would work with every hard drive, every PC, and every network card.

    By this time Novell had a new leader in Ray Noorda, who’d bumped through a number of engineering, then later marketing and sales, jobs in the minicomputer business. Noorda saw that Novell’s value lay in its software. By making wiring a nonissue, with Novell’s software—now called Netware—able to run on any type of networking scheme, Noorda figured it would be possible to stimulate the next stage of growth. “Growing the market” became Noorda’s motto, and toward that end he got Novell back in the hardware business but sold workstations and network cards literally at cost just to make it cheaper and easier for companies to decide to network their offices. Ray Noorda was not a popular man in Silicon Valley.

    In 1983, when Noorda was taking charge of Novell, IBM asked Microsoft to write some PC networking software. Microsoft knew very little about networking in 1983, but Bill Gates was not about to send his major customer away, so Microsoft got into the networking business.

    “Our networking effort wasn’t serious until we hired Darryl Rubin, our network architect,” admitted Microsoft’s Steve Ballmer in 1991.

    Wait a minute, Steve, did anyone tell IBM back in 1983 that Microsoft wasn’t really serious about this networking stuff? Of course not.

    Like most of Microsoft’s other stabs at new technology, PC networking began as a preemptive strike rather than an actual product. The point of Gates’s agreeing to do IBM’s network software was to keep IBM as a customer, not to make a good product. In fact, Microsoft’s entry into most new technologies follows this same plan, with the first effort being a preemptive strike, the second effort being market research to see what customers really want in a product, and the third try being the real product. It happened that way with Microsoft’s efforts at networking, word processing, and Windows, and will continue in the company’s current efforts in multimedia and pen-based computing. It’s too bad, of course, that hundreds of thousands of customers spend millions and millions of dollars on those early efforts—the ones that aren’t real products. But heck, that’s their problem, right?

    Microsoft decided to build its network technology on top of DOS because that was the company franchise. All new technologies were conceived as extensions to DOS, keeping the old technology competitive—or at least looking so—in an increasingly complex market. But DOS wasn’t a very good system on which to build a network operating system. DOS was limited to 640K of memory. DOS had an awkward file structure that got slower and slower as the number of files increased, which could become a major problem on a server with thousands of files. In contrast, Novell’s Netware could use megabytes of memory and had a lightning-fast file system. After all, Netware was built from scratch to be a network operating system, while Microsoft’s product wasn’t.

    MS-Net appeared in 1985. It was licensed to more than thirty different hardware companies in the same way that MS-DOS was licensed to makers of PC clones. Only three versions of MS-Net actually appeared, including IBM’s PC LAN program, a dog.

    The final nail in Microsoft’s networking coffin was also driven in 1985 when Novell introduced Netware 2.0, which ran on the 80286 processor in IBM’s PC-AT. You could run MS-Net on an AT also but only in the mode that emulated an 8086 processor and was limited to addressing 640K. But Netware on an AT took full advantage of the 80286 and could address up to 16 megabytes of RAM, making Novell’s software vastly more powerful than Microsoft’s.

    This business of taking software written for the 8086 processor and porting it to the 80286 normally required completely rewriting the software by hand, often taking years of painstaking effort. It wasn’t just a matter of recompiling the software, of having a machine do the translation, because Microsoft staunchly maintained that there was no way to recompile 8086 code to run on an 80286. Bill Gates swore that such a recompile was impossible. But Drew Major of Superset didn’t know what Bill Gates knew, and so he figured out a way to recompile 8086 code to run on an 80286. What should have taken months or years of labor was finished in a week, and Novell had won the networking war. Six years and more than $100 million later, Microsoft finally admitted defeat.

    Meanwhile, back in Boca Raton, IBM was still struggling to produce a follow-on to the PC-AT. The reason that it began taking IBM so long to produce new PC products was the difference between strategy and tactics. Building the original IBM PC was a tactical exercise designed to test a potential new market by getting a product out as quickly as possible. But when the new market turned out to be ten times larger than anyone at IBM had realized and began to affect the sales of other divisions of the company, PCs suddenly became a strategic issue. And strategy takes time to develop, especially at IBM.

    Remember that there is nobody working at IBM today who recalls those sun-filled company picnics in Endicott, New York, back when the company was still small, the entire R&D department could participate in one three-legged race, and inertia was not yet a virtue. The folks who work at IBM today generally like the fact that it is big, slow moving, and safe. IBM has built an empire by moving deliberately and hiring third-wave people. Even Don Estridge, who led the tactical PC effort up through the PC-AT, wasn’t welcome in a strategic personal computer operation; Estridge was a second-wave guy at heart and so couldn’t be trusted. That’s why Estridge was promoted into obscurity, and Bill Lowe, who’d proved that he was a company man, a true third waver with only occasional second-wave leanings that could be, and were, beaten out of him over time, was brought back to run the PC operations.

    As an enormous corporation that had finally decided personal computers were part of its strategic plan, IBM laboriously reexamined the whole operation and started funding backup ventures to keep the company from being too dependent on any single PC product development effort. Several families of new computers were designed and considered, as were at least a couple of new operating systems. All of this development and deliberation takes time.

    Even the vital relationship with Bill Gates was reconsidered in 1985, when IBM thought of dropping Microsoft and DOS altogether in favor of a completely new operating system. The idea was to port operating system software from a California company called Metaphor Computer Systems to the Intel 286 processor. The Metaphor software was yet another outgrowth of work done at Xerox PARC and at the time ran strictly on IBM mainframes, offering an advanced office automation system with a graphical user interface. The big corporate users who were daring enough to try Metaphor loved it, and IBM dreamed that converting the software to run on PCs would draw personal computers seamlessly into the mainframe world in a way that wouldn’t be so directly competitive with its other product lines. Porting Metaphor software would also have brought IBM a major role in application software for its PCs—an area where the company had so far failed.

    Since Microsoft wasn’t even supposed to know that this Metaphor experiment was happening, IBM chose Lotus Development to port the software. The programmers at Lotus had never written an operating system, but they knew plenty about Intel processor architecture, since the high performance of Lotus 1-2-3 came mainly from writing directly to the processor, avoiding MS-DOS as much as possible.

    Nothing ever came of the Lotus/Metaphor operating system, which turned out to be an IBM fantasy. Technically, it was asking too much of the 80286 processor. The 80386 might have handled the job, but for other strategic reasons, IBM was reluctant to move up to the 386.

    IBM has had a lot of such fantasies and done a lot of negotiating and investigating whacko joint ventures with many different potential software partners. It’s a way of life at the largest computer company in the world, where keeping on top of the industry is accomplished through just this sort of diplomacy. Think of dogs sniffing each other.

    IBM couldn’t go forever without replacing the PC-AT, and eventually it introduced a whole new family of microcomputers in April 1987. These were the Personal System/2s and came in four flavors: Models 30, 50, 60, and 80. The Model 30 used an 8086 processor, the Models 50 and 60 used an 80286, and the Model 80 was IBM’s first attempt at an 80386-based PC. The 286 and 386 machines used a new bus standard called the Micro Channel, and all of the PS/2s had 3.5-inch floppy disk drives. By changing hardware designs, IBM was again trying to have the market all to itself.

    A new bus standard meant that circuit cards built for the IBM PC, XT, or AT models wouldn’t work in the PS/2s, but the new bus, which was 32 bits wide, was supposed to offer so much higher performance that a little more cost and inconvenience would be well worthwhile. The Micro Channel was designed by an iconoclastic (by IBM standards) engineer named Chet Heath and was reputed to beat the shit out of the old 16-bit AT bus. It was promoted as the next generation of personal computing, and IBM expected the world to switch to its Micro Channel in just the way it had switched to the AT bus in 1984.

    But when we tested the PS/2s at InfoWorld, the performance wasn’t there. The new machines weren’t even as fast as many AT clones. The problem wasn’t the Micro Channel; it was IBM. Trying to come up with a clever work-around for the problem of generating a new product line every eighteen months when your organization inherently takes three years to do the job, product planners in IBM’s Entry Systems Division simply decided that the first PS/2s would use only half of the features of the Micro Channel bus. The company deliberately shipped hobbled products so that, eighteen months later, it could discover all sorts of neat additional Micro Channel horsepower, which would be presented in a whole new family of machines using what would then be called Micro Channel 2.

    IBM screwed up in its approach to the Micro Channel. Had it introduced the whole product in 1987, doubling the performance of competitive hardware, buyers would have followed IBM to the new standard as they had before. It could have led the industry to a new 32-bit bus standard—one where IBM again would have had a technical advantage for a while. But instead, Big Blue held back features and then tried to scare away clone makers by threatening legal action and talking about granting licenses for the new bus only if licensees paid 5 percent royalties on both their new Micro Channel clones and every PC, XT, or AT clone they had ever built. The only result of this new hardball attitude was that an industry that had had little success defining a new bus standard by itself was suddenly solidified against IBM. Compaq Computer led a group of nine clone makers that defined their own 32-bit bus standard in competition with the Micro Channel. Compaq led the new group, but IBM made it happen.

    From IBM’s perspective, though, its approach to the Micro Channel and the PS/2s was perfectly correct since it acted to protect Big Blue’s core mainframe and minicomputer products. Until very recently, IBM concentrated more on the threat that PCs posed to its larger computers than on the opportunities to sell ever more millions of PCs. Into the late 1980s, IBM still saw itself primarily as a maker of large computers.

    Along with new PS/2 hardware, IBM announced in 1987 a new operating system called OS/2, which had been under development at Microsoft when IBM was talking with Metaphor and Lotus. The good part about OS/2 was that it was a true multitasking operating system that allowed several programs to run at the same time on one computer. The bad part about OS/2 was that it was designed by IBM.

    When Bill Lowe sent his lieutenants to Microsoft looking for an operating system for the IBM PC, they didn’t carry a list of specifications for the system software. They were looking for something that was ready—software they could just slap on the new machine and run. And that’s what Microsoft gave IBM in PC-DOS: an off-the-shelf operating system that would run on the new hardware. Microsoft, not IBM, decided what DOS would look like and act like. DOS was a Microsoft product, not an IBM product, and subsequent versions, though they appeared each time in the company of new IBM hardware, continued to be 100 percent Microsoft code.

    OS/2 was different. OS/2 was strategic, which meant that it was too important to be left to the design whims of Microsoft alone. OS/2 would be designed by IBM and just coded by Microsoft. Big mistake.

    OS/2 1.0 was designed to run on the 80286 processor. Bill Gates urged IBM to go straight for the 80386 processor as the target for OS/2, but IBM was afraid that the 386 would offer performance too close to that of its minicomputers. Why buy an AS/400 minicomputer for $200,000, when half a dozen networked PS/2 Model 80s running OS/2-386 could give twice the performance for one third the price? The only reason IBM even developed the 386-based Model 80, in fact, was that Compaq was already selling thousands of its DeskPro 386s. Over the objections of Microsoft, then, OS/2 was aimed at the 286, a chip that Gates correctly called “brain damaged.”

    OS/2 had both a large address space and virtual memory. It had more graphics options than either Windows or the Macintosh, as well as being multithreaded and multitasking. OS/2 looked terrific on paper. But what the paper didn’t show was what Gates called “poor code, poor design, poor process, and other overhead” thrust on Microsoft by IBM.

    While Microsoft retained the right to sell OS/2 to other computer makers, this time around IBM had its own special version of OS/2, Extended Edition, which included a database called the Data Manager, and an interface to IBM mainframes called the Communication Manager. These special extras were intended to tie OS/2 and the PS/2s into their true function as very smart mainframe terminals. IBM had much more than competing with Compaq in mind when it designed the PS/2s. IBM was aiming toward a true counterreformation in personal computing, leading millions of loyal corporate users back toward the holy mother church—the mainframe.

    IBM’s dream for the PS/2s, and for OS/2, was to play a role in leading American business away from the desktop and back to big expensive computers. This was the objective of SAA—IBM’s plan to integrate its personal computers and mainframes—and of what IBM hoped would be SAA’s compelling application, called OfficeVision.

    On May 16, 1989, I sat in an auditorium on the ground floor of the IBM building at 540 Madison Avenue. It was a rainy Tuesday morning in New York, and the room, which was filled with bright television lights as well as people, soon took on the distinctive smell of wet wool. At the front of the room stood a podium and a long table, behind which sat the usual IBM suspects—a dozen conservatively dressed, overweight, middle-aged white men.

    George Conrades, IBM’s head of U.S. marketing, appeared behind the podium. Conrades, 43, was on the fast career track at IBM. He was younger than nearly all the other men of IBM who sat at the long table behind him, waiting to play their supporting roles. Behind the television camera lens, 25,000 IBM employees, suppliers, and key customers spread across the world watched the presentation by satellite.

    The object of all this attention was a computer software product from IBM called OfficeVision, the result of 4,000 man-years of effort at a cost of more than a billion dollars.

    To hear Conrades and the others describe it through their carefully scripted performances, OfficeVision would revolutionize American business. Its “programmable terminals” (PCs) with their immense memory and processing power would gather data from mainframe computers across the building or across the planet, seeking out data without users even having to know where the data were stored and then compiling them into colorful and easy-to-understand displays. OfficeVision would bring top executives for the first time into intimate — even casual — contact with the vital data stored in their corporate computers. Beyond the executive suite, it would offer access to data, sophisticated communication tools, and intuitive ways of viewing and using information throughout the organization. OfficeVision would even make it easier for typists to type and for file clerks to file.

    In the glowing words of Conrades, OfficeVision would make American business more competitive and more profitable. If the experts were right that computing would determine the future success or failure of American business, then OfficeVision simply was that future. It would make that success.

    “And all for an average of $7,600 per desk,” Conrades said, “not including the IBM mainframe computers, of course.”

    The truth behind this exercise in worsted wool and public relations is that OfficeVision was not at all the future of computing but rather its past, spruced up, given a new coat of paint, and trotted out as an all-new model when, in fact, it was not new at all. In the eyes of IBM executives and their strategic partners, though, OfficeVision had the appearance of being new, which was even better. To IBM and the world of mainframe computers, danger lies in things that are truly new.

    With its PS/2s and OS/2 and OfficeVision, IBM was trying to get a jump on a new wave of computing that everyone knew was on its way. The first wave of computing was the mainframe. The second wave was the minicomputer. The third wave was the PC.

    Now the fourth wave — generally called network computing — seemed imminent, and IBM’s big-bucks commitment to SAA and to OfficeVision was its effort to make the fourth wave look as much as possible like the first three. Mainframes would do the work in big companies, minicomputers in medium-sized companies, and PCs would serve small business as well as acting as “programmable terminals” for the big boys with their OfficeVision setups.

    Sadly for IBM, by 1991, OfficeVision still hadn’t appeared, having tripped over mountains of bad code and missed delivery schedules, and having run up against the fact of life that corporate America is willing to invest less than 10 percent of each worker’s total compensation in computing resources for that worker. That’s why secretaries get $3,000 PCs and design engineers get $10,000 workstations. OfficeVision would have cost at least double that amount per desk, had it worked at all, so today IBM is talking about a new, slimmed-down OfficeVision 2.0, which will probably fail too.

    When OS/2 1.0 finally shipped months after the PS/2 introduction, every big shot in the PC industry asked his or her market research analysts when OS/2 unit sales would surpass sales of MS-DOS. The general consensus of analysts was that the crossover would take place in the early 1990s, perhaps as soon as 1991. It didn’t happen.

    Time to talk about the realities of market research in the PC industry. Market research firms make surveys of buyers and sellers, trying to predict the future. They gather and sift through millions of bytes of data and then apply their S-shaped demand curves, predicting what will and won’t be a hit. Most of what they do is voodoo. And like voodoo, whether their work is successful depends on the state of mind of their victim/customer.

    Market research customers are hardware and software companies paying thousands — sometimes hundreds of thousands — of dollars, primarily to have their own hunches confirmed. Remember that the question on everyone’s mind was when unit sales of OS/2 would exceed those of DOS. Forget that OS/2 1.0 was late. Forget that there was no compelling application for OS/2. Forget that the operating system, when it did finally appear, was buggy as hell and probably shouldn’t have been released at all. Forget all that, and think only of the question, which was: When will unit sales of OS/2 exceed those of DOS? The assumption (and the flaw) built into this exercise is that OS/2, because it was being pushed by IBM, was destined to overtake DOS, which it hasn’t. But given that the paying customers wanted OS/2 to succeed and that the research question itself suggested that OS/2 would succeed, market research companies like Dataquest, InfoCorp, and International Data Corporation dutifully crazy-glued their usual demand curves on a chart and predicted that OS/2 would be a big hit. There were no dissenting voices. Not a single market research report that I read or read about at that time predicted that OS/2 would be a failure.

    Market research firms tend to serve the same function for the PC industry that a lamppost does for a drunk.

    OS/2 1.0 was a dismal failure. Sales were pitiful. Performance was pitiful, too, at least in that first version. Users didn’t need OS/2, since they could already multitask their existing DOS applications using products like Quarterdeck’s DesqView. Independent software vendors, who were attracted to OS/2 by the lure of IBM, soon stopped their OS/2 development efforts as the operating system’s failure became obvious. But the failure of OS/2 wasn’t all IBM’s fault. Half of the blame has to go to the computer memory crisis of the late 1980s.

    OS/2 made it possible for PCs to access far more memory than the pitiful 640K available under MS-DOS. On a 286 machine, OS/2 could use up to 16 megabytes of memory and in fact seemed to require at least 4 megabytes to perform acceptably. Alas, this sudden need for six times the memory came at a time when American manufacturers had just abandoned the dynamic random-access memory (DRAM) business to the Japanese.

    In 1975, Japan’s Ministry of International Trade and Industry had organized Japan’s leading chip makers into two groups — NEC-Toshiba and Fujitsu-Hitachi-Mitsubishi — to challenge the United States for the 64K DRAM business. They won. By 1985, these two groups had 90 percent of the U.S. market for DRAMs. American companies like Intel, which had started out in the DRAM business, quit making the chips because they weren’t profitable, cutting world DRAM production capacity as they retired. Then, to make matters worse, the United States Department of Commerce accused the Asian DRAM makers of dumping — selling their memory chips in America at less than what it cost to produce them. The Japanese companies cut a deal with the United States government that restricted their DRAM distribution in America — at a time when we had no other reliable DRAM sources. Big mistake. Memory supplies dropped just as memory demand rose, and the classic supply-demand effect was an increase in DRAM prices, which more than doubled in a few months. Toshiba, which was nearly the only company making 1 megabit DRAM chips for a while, earned more than $1 billion in profits on its DRAM business in 1989, in large part because of the United States government.

    Doubled prices are a problem in any industry, but in an industry based on the idea of prices continually dropping, such an increase can lead to panic, as it did in the case of OS/2. The DRAM price bubble was just that—a bubble—but it looked for a while like the end of the world. Software developers who were already working on OS/2 projects began to wonder how many users would be willing to invest the $1,000 that it was suddenly costing to add enough memory to their systems to run OS/2. Just as raising prices killed demand for Apple’s Macintosh in the fall of 1988 (Apple’s primary reason for raising prices was the high cost of DRAM), rising memory prices killed both the supply and demand for OS/2 software.

    Then Bill Gates went into seclusion for a week and came out with the sudden understanding that DOS was good for Microsoft, while OS/2 was probably bad. Annual reading weeks, when Gates stays home and reads technical reports for seven days straight and then emerges to reposition the company, are a tradition at Microsoft. Nothing is allowed to get in the way of planned reading for Chairman Bill. During one business trip to South America, for example, the head of Microsoft’s Brazilian operation tried to impress the boss by taking Gates and several women yachting for the weekend. But this particular weekend had been scheduled for reading, so Bill, who is normally very much on the make, stayed below deck reading the whole time.

    Microsoft had loyally followed IBM in the direction of OS/2. But there must have been an idea nagging in the back of Bill Gates’s mind. By taking this quantum leap to OS/2, IBM was telling the world that DOS was dead. If Microsoft followed IBM too closely in this OS/2 campaign, it was risking the more than $100 million in profits generated each year by DOS — profits that mostly didn’t come from IBM. During one of his reading weeks, Gates began to think about what he called “DOS as an asset” and in the process set Microsoft on a collision course with IBM.

    Up to 1989, Microsoft followed IBM’s lead, dedicating itself publicly to OS/2 and promising versions of all its major applications that would run under the new operating system. On the surface, all was well between Microsoft and IBM. Under the surface, there were major problems with the relationship. A feisty (for IBM) band of graphics programmers at IBM’s lab in Hursley, England, first forced Microsoft to use an inferior and difficult-to-implement graphics imaging model in Presentation Manager and then later committed all the SAA operating systems, including OS/2, to using PostScript, from the hated house of Warnock, Adobe Systems.

    Although by early 1990, OS/2 was up to version 1.2, which included a new file system and other improvements, more than 200 copies of DOS were still being sold for every copy of OS/2. Gates again proposed to IBM that they abandon the 286-based OS/2 product entirely in favor of a 386-based version 2.0. Instead, IBM’s Austin, Texas, lab whipped up its own OS/2 version 1.3, generally referred to as OS/2 Lite. Outwardly, OS/2 1.3 tasted great and was less filling; it ran much faster than OS/2 1.2 and required only 2 megabytes of memory. But OS/2 1.3 sacrificed subsystem performance to improve the speed of its user interface, which meant that it was not really as good a product as it appeared to be. Thrilled finally to produce some software that was well received by reviewers, IBM started talking about basing all its OS/2 products on 1.3 — even its networking and database software, which didn’t even have user interfaces that needed optimizing. To Microsoft, which was well along on OS/2 2.0, the move seemed brain damaged, and this time they said so.

    Microsoft began moving away from OS/2 in 1989 when it became clear that DOS wasn’t going away, nor was it in Microsoft’s interest for it to go away. The best solution for Microsoft would be to put a new face on DOS, and that new face would be yet another version of Windows. Windows 3.0 would include all that Microsoft had learned about graphical user interfaces from seven years of working on Macintosh applications. Windows 3.0 would also be aimed at more powerful PCs using 386 processors — the PCs that Bill Gates expected to dominate business desktops for most of the 1990s. Windows would preserve DOS’s asset value for Microsoft and would give users 90 percent of the features of OS/2, which Gates began to see more and more as an operating system for network file servers, database servers, and other back-end network applications that were practically invisible to users.

    IBM wanted to take from Microsoft the job of defining to the world what a PC operating system was. Big Blue wanted to abandon DOS in favor of OS/2 1.3, which it thought could be tied more directly into IBM hardware and applications, cutting out the clone makers in the process. Gates thought this was a bad idea that was bound to fail. He recognized, even if IBM didn’t, that the market had grown to the point where no one company could define and defend an operating system standard by itself. Without Microsoft’s help, Gates thought IBM would fail. With IBM’s help, which Gates viewed more as meddling than assistance, Microsoft might fail. Time for a divorce.

    Microsoft programmers deliberately slowed their work on OS/2 and especially on Presentation Manager, its graphical user interface. “What incentive does Microsoft have to get [OS/2-PM] out the door before Windows 3?” Gates asked two marketers from Lotus over dinner following the 1990 Computer Bowl trivia match in April 1990. “Besides, six months after Windows 3 ships it will have greater market share than PM will ever have. OS/2 applications won’t have a chance.”

    Later that night over drinks, Gates speculated that IBM would “fold” in seven years, though it could last as long as ten or twelve years if it did everything right. Inevitably, though, IBM would die, and Bill Gates was determined that Microsoft would not go down too.

    The loyal Lotus marketers prepared a seven-page memo about their inebriated evening with Chairman Bill, giving copies of it to their top management. Somehow I got a copy of the memo, too. And a copy eventually landed on the desk of IBM’s Jim Cannavino, who had taken over Big Blue’s PC operations from Bill Lowe. The end was near for IBM’s special relationship with Microsoft.

    Over the course of several months in 1990, IBM and Microsoft negotiated an agreement leaving DOS and Windows with Microsoft and OS/2 1.3 and 2.0 with IBM. Microsoft’s only connection to OS/2 was the right to develop version 3.0, which would run on non-Intel processors and might not even share all the features of earlier versions of OS/2.

    The Presentation Manager programmers in Redmond, who had been having Nerfball fights with their Windows counterparts every night for months, suddenly found themselves melded into the Windows operation. A cross-licensing agreement between the two companies remained in force, allowing IBM to offer subsequent versions of DOS to its customers and Microsoft the right to sell versions of OS/2, but the emphasis in Redmond was clearly on DOS and Windows, not OS/2.

    “Our strategy for the 90’s is Windows — one evolving architecture, a couple of implementations,” Bill Gates wrote. “Everything we do should focus on making Windows more successful.”

    Windows 3.0 was introduced in May 1990 and sold more than 3 million copies in its first year. Like many other Microsoft products, this third try was finally the real thing. And since it had a head start over its competitors in developing applications that could take full advantage of Windows 3.0, Microsoft was more firmly entrenched than ever as the number one PC software company, while IBM struggled for a new identity. All those other software developers, the ones who had believed three years of Microsoft and IBM predictions that OS/2’s Presentation Manager was the way to go, quickly shifted their OS/2 programmers over to writing Windows applications.

    Reprinted with permission

    Photo Credit: HomeArt/Shutterstock

  • Germany’s Federal Patent Court Rules in Apple’s Favor and Invalidates Samsung Wireless Standard-Essential Patents (SEPs)

    Germany’s Federal Patent Court (GFPC) ruled in favor of Apple on Wednesday when it invalidated the German part of Samsung’s European Patent Specification, “turbo encoding/decoding device and method for processing frame data according to QoS” (EP1005726, including proposed amendments), which Samsung stated was essential for UMTS, the 3G wireless standard.

    As is the case in most of these rulings, Samsung has the opportunity to appeal the decision to the German Federal Court of Justice. Samsung has sought injunctions against Apple over this patent, as well as numerous other SEPs.

    In 2012, the European Commission issued a Statement of Objections (SO) stating that “the pursuit of injunction relief against Apple, a willing licensee, was abusive conduct”. This SO was enough to cause Samsung to withdraw all of its European SEP injunctions against Apple, but not enough to keep it from suing for damages and injunctions over non-SEPs.

    This ruling comes merely one week after the GFPC invalidated an Apple slide-to-unlock patent. Of course, Apple is not taking that ruling lying down and is attempting to appeal it, but it will more than likely not succeed if the GFPC’s reaction to past appeals is any indication.

    Source: Foss Patents


  • West Wing Week: 04/12/13 or “We Choose Love”

    This week, the President, Vice President and First Lady continued to call for action to reduce gun violence, while the President announced the Fiscal Year 2014 Budget, conferred the Medal of Honor, met with UN Secretary General Ban Ki-moon, and held an Easter Prayer Breakfast.

     


  • Why Lenovo has been the only OEM to weather the great PC collapse so far

    Lenovo PC Shipments
    The Wednesday IDC report on the disastrous state of the PC industry had one very interesting tidbit that many overlooked: Namely, that while companies such as HP (HPQ) and Acer (2353) saw their shipments collapse by more than 20% year-over-year, Lenovo (LNVGY) actually held steady and experienced no decline in shipments. According to Businessweek, there are a couple of reasons for this: First, Lenovo has been targeting its sales toward emerging markets such as Brazil and its native China, where demand for new PCs is higher than in the United States, Europe, Japan and Korea. Businessweek also says that Lenovo “makes almost one-third of its products in house, which helps it innovate and get those innovations to the market more quickly” while also allowing it “to rely less on factories that are also making computers for its competitors.” Having completely flat growth may not be ideal for most businesses, of course, but in the 2013 PC market, holding your ground is something of a miraculous triumph.

  • Biggest MetroPCS shareholder changes course, now supports T-Mobile merger

    MetroPCS T-Mobile Merger
    Deutsche Telekom’s latest effort to sweeten its offer to MetroPCS (PCS) shareholders has apparently done the trick as Bloomberg reports that MetroPCS’s biggest shareholder has now tentatively come out in favor of its merger with T-Mobile. Paulson & Co., the hedge fund founded by famous investor John Paulson, said on Thursday that it “intends to vote for the merger as restructured” now that Deutsche Telekom has upped its offer, although the firm said it still needs “to review the revised proxy statement before making a final decision.” With Paulson likely to drop opposition to the acquisition, though, it seems that the final hurdle to MetroPCS and T-Mobile merging is about to be cleared.

  • Dennis Crowley and the cycle of second-guessing

    It was a lovely spring day in San Francisco, which is why it made sense for me to meet up with Dennis Crowley, co-founder of Foursquare, to catch up on some sun and talk about the 6.0 version of his software. We sat in an atrium, watched the world go by and talked about the new release.

    Crowley, who like all founders is running a million miles a minute, took a moment to bask in the glow of positive reviews for the new version of the software. Obviously, not everyone likes it, but as a long-time member of Foursquare, I like the simpler interface that marries discovery, search and check-ins for a glance-able and quick interaction.

    There are doubters — actually, there are many who are convinced about the inevitable failure of Foursquare. I am not one of them. I actually like using the service. I am a believer, and I’m not afraid to say it. Because it is indeed the way of the future. Sure, Dennis gets spanked publicly for not doing a good job, but that doesn’t mean he is wrong about the marriage of digital and physical.

    While some people may have been surprised by this new Foursquare, Crowley and his cohorts have been fairly consistent about their vision of the world and what Foursquare has to do. He and I talked about this three years ago, and it has taken them a long time to get there. There is a ways to go before Dennis can get to his “Harry Potter’s Map” dream.

    The positive reviews and the buzz of the new release are going to last a few days, and then it will be back to the grind for him. The grind that consumes all founders completely. The grind that means managing a big company. The grind that means parting ways with your co-founder. The grind that means dealing with constant naysaying, haters and giants who exist to copy your ideas, poach your people and generally make you miserable.

    Those of you who have started a company know what I am talking about — the constant, daily upheaval of emotions. There are days when you don’t want to get out of bed, when you whimper without tears and then shake it all off because deep down you know you would rather be doing this than something else. Founders live to capture lightning in a bottle: sometimes it works, sometimes it doesn’t, but we still keep trying. And that is the part the non-builders don’t get.

    Building things that are different, inventing the future and creating a real business is a long and often very lonely slog. But you don’t hear about that. Instead what you get is a lot of babble about startups from so-called mentors, advisors and startup gurus. Peel away their sharkskin and you find they have never started a company, and they continue to live in the reflected glory of the company that once employed them. Others are the creation of social media, having struck a pose. And some are born consultants. They find willing listeners among a growing army of entrepreneurs who like entrepreneurship as a lifestyle. Sorry guys, entrepreneurship isn’t a lifestyle, it is life.

    This spectacle of technology has attracted fake messiahs, and every day I see this mockery of entrepreneurship. I overhear it in coffee shops. I am forced to confront it on social media. And I have to remind myself of Pandora founder Tim Westergren, who sacrificed it all to see his bet finally pay off after more than a decade of struggle. I like to think of Aaron Levie, who returns my email at 3:52 a.m. — a minute after I’ve pinged him. And I think of my friend Paul Evans, who has gambled it all on his company, Shareband.

    Ask Dennis what it is all about, and he will tell you: seeing someone check into a location, finding a tip and then acting on it.

    That moment is what gets you ready for tomorrow — when all hell breaks loose and the second guessing starts all over again.
