Blog

  • Oracle to Acquire Siris’ Tekelec Global

    Oracle Corporation is to acquire Tekelec Global, a portfolio company of Siris Capital Group. Terms of the transaction were not disclosed and closing is subject to certain closing conditions and regulatory approvals. Tekelec is a provider of network signaling, policy control, and subscriber data management solutions to both wireless and wireline service providers.

    PRESS RELEASE

    On March 25, 2013, Tekelec Global, Inc. (“Tekelec” or the “Company”), a portfolio company of Siris Capital Group, LLC (“Siris”), announced that it has signed a definitive agreement pursuant to which Oracle Corporation (“Oracle”) will acquire the Company. Terms of the transaction were not disclosed and closing is subject to certain closing conditions and regulatory approvals.

    Tekelec is a leading provider of network signaling, policy control, and subscriber data management solutions to both wireless and wireline service providers. The Company’s intelligent mobile broadband solutions enable service providers to manage and monetize mobile data and voice services in LTE, IMS and 3G networks.

    The original “take-private” acquisition of Tekelec in January 2012, by a consortium led by Siris, including affiliates of Comvest Partners, funds and accounts managed by GSO Capital Partners LP, Sankaty Advisors LLC, ZelnickMedia and other Siris limited partners and affiliates, was the result of a targeted search led by the Siris principals and Executive Partners – Merle Gilmore, Roderick Randall and Richard Mace.

    Siris’ investment thesis was focused on the explosive growth in mobile data traffic and on identifying telecommunications and technology businesses that were making the transition from providing technology for the existing 2G/3G networks to the higher-growth 4G LTE networks. During Siris’ ownership, the Company successfully implemented operational and structural improvements, including disposing of non-core assets, refocusing the Company’s R&D and strategic roadmap, and increasing the Company’s visibility and industry profile in the dynamic and expanding 4G LTE Diameter signaling network ecosystem.

    Goldman, Sachs & Co. and Evercore Partners are acting as financial advisors and Simpson Thacher & Bartlett LLP and Smith Anderson are acting as legal counsel to Tekelec.

    The post Oracle to Acquire Siris’ Tekelec Global appeared first on peHUB.

  • Weekly Radar – “Slow panic” feared on Cyprus, as central banks meet and US reports jobless

    US MARCH JOBS REPORT/THREE OF G4 CENTRAL BANKS THURS/NEW QUARTER BEGINS/FINAL MARCH PMIS/KENYA SUPREME COURT RULING/SPAIN-FRANCE BOND AUCTIONS

    Given the sound and fury of the past fortnight, it’s hard not to conclude that the messiness of the eventual Cyprus bailout is another inflection point in the whole euro crisis. For most observers, including Mr Dijsselbloem it seems, it ups the ante again on several fronts – 1) possible bank contagion via nervy senior creditors and depositors fearful of bail-ins at the region’s weakest institutions; 2) an unwelcome rise in the cost of borrowing for European banks who remain far more levered than US peers and are already grinding down balance sheets to the detriment of the hobbled European economy; and 3) likely heavy economic and social pressures in Cyprus going forward that, like Greece, increase euro exit risk to some degree. Add reasonable concerns about the credibility and coherence of euro policymaking during this latest episode and a side-order of German/Dutch ‘orthodoxy’ in sharp relief and it all looks a bit rum again.

    Yet the reaction of world markets has been relatively calm so far. Wall St is still stalking record highs through it all, for example, as signs of the ongoing US recovery mount. So what gives? Today’s price action was interesting in that it started to show investors discriminating against European assets per se – most visible in the inability of European stocks to follow Wall St higher and in the lunge lower in the euro/dollar exchange rate. European bank stocks and bonds have been knocked back relatively sharply this week post-Dijsselbloem too. If this decoupling pattern were to continue, it would remain a story of the size of the economic hit and relative underperformance. But that would change if concerns morphed into euro-exit and broader systemic fears – at which point global markets at large would feel the heat again too. We’re not back there yet, with the benefit of the doubt on OMTs and pressured policy reactions still largely conceded. But many of the underlying movements that might feed system-wide stresses – what some term a “slow panic”, such as deposit shifts – will be impossible for investors to monitor systematically for many weeks yet, and so nervy times lie ahead as we enter Q2 after the Easter break.

    Cyprus and European banks aside, next week will be about the US employment report and three of the Big Four central banks meeting Thursday. Will the ECB respond to the banking sector and consumer sentiment threats and ease rates or monetary conditions? It already has plenty of real-sector and inflation evidence that Q1 underwhelmed in the euro zone. The BoJ meeting will be just as important, with new governor Haruhiko Kuroda at the helm for the first time amid intense interest in how he will pursue the bank’s new aggressive reflation mandate.

    Next week’s big events and data points:

    Kenya Supreme Court rules on election outcome Sat

    US/China March final manufacturing PMI Mon

    Australia rate decision Tues

    European March final manufacturing PMI Tues

    EZ/Italy Feb jobless Tues

    UK Feb mortgage and credit data Tues

    German March CPI Tues

    Thailand rate decision Weds

    US ADP jobs/March final services PMIs Weds

    European March final services PMIs Thurs

    Spain/France government bond auction Thurs

    ECB/BOJ/BOE decisions/pressers Thurs

    EZ Feb retail sales Fri

    US March employment report Fri


  • Reuters – Nordic Capital Sells Permobil to Investor AB

    Nordic Capital has sold Permobil to Investor AB. The transaction values the company at SEK5,500 million ($842 million).

    Reuters – Nordic Capital divests global market leader Permobil to Investor AB. The transaction values the company at an enterprise value of SEK5,500 million (EUR 655 million).
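
    As a quick sanity check on the two quoted conversions of the SEK5,500 million enterprise value, the implied exchange rates can be backed out of the figures above. The rates themselves are derived here for illustration only; they are not quoted by Reuters or Nordic Capital:

```python
# Back out the exchange rates implied by the deal figures quoted above:
# SEK 5,500 million, USD 842 million, EUR 655 million.
enterprise_value_sek = 5_500  # millions
value_usd = 842               # millions
value_eur = 655               # millions

sek_per_usd = round(enterprise_value_sek / value_usd, 2)
sek_per_eur = round(enterprise_value_sek / value_eur, 2)

print(sek_per_usd)  # ~6.53 SEK per USD
print(sek_per_eur)  # ~8.4 SEK per EUR
```

    Both implied rates are roughly in line with spot rates in early 2013, so the dollar and euro figures are mutually consistent.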

    The post Reuters – Nordic Capital Sells Permobil to Investor AB appeared first on peHUB.

  • GTCR’s Aligned Appoints Director of Institutional Investments

    Aligned Asset Managers, a portfolio company of private equity firm GTCR, has appointed R. Barney Walker the director of institutional investments. Walker will drive Aligned’s distribution and client service activities for its member firms. He has extensive industry contacts in the institutional marketplace and will also assist Aligned in sourcing new potential investments.

    PRESS RELEASE

    Aligned Asset Managers, LLC (“Aligned”), a portfolio company of leading private equity firm GTCR, announced today that it named Mr. R. Barney Walker the Director of Institutional Investments. Mr. Walker will drive Aligned’s distribution and client service activities for its member firms. He is focused on implementing Aligned’s strategic marketing plan across all institutional and retail distribution channels. Aligned expects to add additional resources to the distribution activity as it increases its member firms and products. Mr. Walker has extensive industry contacts in the institutional marketplace and will also assist Aligned in sourcing new potential investments.

    Mr. Walker was previously a Senior Vice President and Director of Institutional Investments at Artio Global Management LLC, the former US institutional asset management arm of Julius Baer Holdings, which he joined in 2002 to help build an institutional asset management business. Artio Global Investors Inc. managed over $70 billion at its peak on behalf of its clients. Prior to Artio, Mr. Walker was a Senior New Business/Client Service Manager at Ark Asset Management LLC, where he started his career in 1996. He has a BS in Business Administration from Boston College.

    About Aligned Asset Managers

    Aligned Asset Managers is building a leading multi-strategy asset management platform through substantial equity investments in firms across alternative and traditional asset classes. Aligned provides managers with many options for realizing liquidity, re-equitizing their business and enhancing distribution capabilities while maintaining a stake in the upside growth of the firm. Aligned completed its first investment in The Townsend Group in late 2011.

    About The Townsend Group

    The Townsend Group (“Townsend”) is a global real asset investment advisory firm providing discretionary and non-discretionary investment solutions to more than 90 clients and $115 billion of assets. Townsend’s investment solutions range from strategic advisory services to fully customized discretionary segregated mandates, including primary funds, co-investments and secondaries. With offices in Cleveland, San Francisco, London and Hong Kong, Townsend is able to provide its clients global perspective and locally sourced execution in the real estate, infrastructure, timber and agriculture asset classes.

    About GTCR

    Founded in 1980, GTCR is a leading private equity firm focused on investing in growth companies in the Financial Services & Technology, Healthcare and Information Services & Technology industries. The Chicago-based firm pioneered The Leaders Strategy(TM) – finding and partnering with management leaders in core domains to identify, acquire and build market-leading companies through transformational acquisitions and organic growth. Since its inception, GTCR has invested more than $10 billion in over 200 companies. For more information, please visit www.gtcr.com.

    For more information about Aligned or Townsend, please contact Barney Walker at 203-504-3204 or [email protected].

    SOURCE: Aligned Asset Managers, LLC

    The post GTCR’s Aligned Appoints Director of Institutional Investments appeared first on peHUB.

  • GMT’s MeetingZone Completes Third Bolt-on Acquisition

    GMT Communications Partners’ portfolio company, MeetingZone, has completed its third bolt-on acquisition in eighteen months. MeetingZone, a global audio/web conferencing and collaboration services provider, has acquired Atia Communications, a UK-based provider of Unified Communications (UC) services and a Microsoft Lync 2010 accredited specialist.

    PRESS RELEASE

    GMT is pleased to announce that its portfolio company, MeetingZone, has completed its third bolt-on acquisition in eighteen months.
    MeetingZone, the independent global audio/web conferencing and collaboration services provider, has acquired Atia Communications, the UK based market-leading provider of Unified Communications (UC) services and a Microsoft Lync 2010 accredited specialist.
    The acquisition enables MeetingZone to build on its portfolio of products and services to address the rapidly growing UC market with Atia’s Microsoft Lync solutions, in addition to its current range of Cisco WebEx services.
    UC intelligently combines voice, video, instant messaging, mobile voice and data and other multimedia services in a bespoke way, tailored to individual business needs. Combined with MeetingZone’s existing services, these exciting technologies provide greater flexibility in an efficient and cost-effective way and can revolutionise the way businesses operate.
    Initially Atia will retain its own identity and brand, but both MeetingZone and Atia will actively promote each other’s complementary services.
    GMT acquired MeetingZone in July 2011. In late 2011, MeetingZone completed its first bolt-on acquisition, acquiring Unified Communications Sweden AB, a Stockholm based conferencing provider. This was followed by a second bolt-on acquisition, Confy AB, in June 2012, allowing MeetingZone to expand its geographical footprint and diversify into the Scandinavian conferencing market.
    The MeetingZone group operates in the UK, Germany, Scandinavia, the US and Canada.
    GMT Communications Partners is a European independent private equity group focused exclusively on the media, information, entertainment and telecommunications industries, having actively invested in the European mid-market for the past 20 years.
    As industry practitioners, GMT focuses heavily on developing new strategic directions for established businesses that are able to benefit from new communications technologies. Since inception in 1993, GMT has raised and invested €775 million in 31 companies across 19 countries, exclusively in the European TMT industry.

    The post GMT’s MeetingZone Completes Third Bolt-on Acquisition appeared first on peHUB.

  • Choose Energy Closes Series A

    Choose Energy has received a Series A fundraising round from Kleiner Perkins Caufield & Byers and Stephens Capital Partners. Founded in 2008, Choose Energy educates consumers on their options for residential electricity supply.

    PRESS RELEASE

    Choose Energy, Inc announced today that it has received a Series A fundraising round from Kleiner Perkins Caufield & Byers (KPCB) and Stephens Capital Partners. The capital will be used to accelerate growth, supplement the world-class technology and marketing team, expand the scope of services for retail energy suppliers, and strengthen its market position as the most visited online energy marketplace.

    Founded in 2008, Choose Energy educates consumers on their options for residential electricity supply. It provides a simple, intuitive interface to allow consumers to compare retail electricity plans, filter plans based on term, price, and type, select a plan that they like, and seamlessly enroll online. Choose Energy is also building tools to help retail energy providers compete more effectively, leveraging lessons from telecommunications, travel, media and other web-enabled industries.

    Choose Energy is defining the new norm for choosing energy online and has helped more than 100,000 consumers find and enroll with a new electricity supplier. Currently active in Texas, New York, Ohio, Pennsylvania and Illinois, the Company plans to enter all 19 deregulated energy states and 22 deregulated natural gas states, which combined represent $250 billion of annual spend in the U.S. While a range of plans is offered on the site, more than 40% of Choose’s customers have selected 100% green power plans.

    “Ten years after deregulation, we’re still experiencing a huge communications gap between the consumers of energy and retail energy providers,” said Jerry Dyess, CEO of Choose. “Our goal is to bring transparency to the complex decisions consumers face when choosing a new energy plan in deregulated states. By providing a decision-making destination for consumers that’s akin to choosing a flight online, we drive new sales, improve satisfaction and dramatically cut acquisition costs. The addition of Kleiner Perkins and Stephens supports our vision of a nationwide platform that serves this vibrant energy marketplace.”

    In addition to helping consumers find the energy plan that’s right for them, Choose Energy is developing solutions for Retail Energy Providers to support customer acquisition, compelling offer creation, customer enrollment and ongoing relationship management.

    “Traditional utilities were not built to compete for customers, or to maintain an ongoing dialogue with them,” said David Mount, partner at Kleiner Perkins Caufield & Byers. “Choose is building the solutions that enable retail energy providers to attract and develop lasting relationships with their customers.”

    “Stephens is delighted to be involved in this financing round, and we believe Choose Energy has the potential to be a disruptive player in the deregulated energy marketplace, reducing friction for both consumers and retail energy providers. With this unique combination of Silicon Valley technology and Texas energy, we see a huge opportunity for transformational growth,” said Justin Courtney, Senior Vice President of Stephens, Inc.

    About Choose Energy
    Since its inception in 2008, ChooseEnergy.com has helped over 100,000 consumers and business owners shop for and switch energy suppliers and plans, with over 1 billion kWh of energy selection occurring through the Choose Energy platform. ChooseEnergy.com is currently available in Texas, New York, Ohio, and now Pennsylvania.

    About Kleiner Perkins Caufield & Byers (KPCB)
    Kleiner Perkins Caufield & Byers (KPCB) has backed entrepreneurs in more than 500 ventures leading to 150 IPOs, 350,000 jobs and a deep strategic network. The firm has helped build pioneering companies like Align, Amazon, Electronic Arts, Genentech, Genomic Health, Google, Intuit, Juniper Networks, Netscape, Symantec, VeriSign and WebMD. KPCB partners serve on the boards of Amazon, Apple, Bloom Energy, Flipboard, Foundation Medicine, Google, Hewlett-Packard, Nest, Square, Tesaro and Zynga, among others. KPCB accelerates the success of entrepreneurs with a team of partners delivering company-building services including strategy, operational scaling, recruiting, business development, product delivery and marketing communications. The firm invests in all stages from seed and incubation to growth companies. KPCB operates from offices in Menlo Park, San Francisco, Shanghai and Beijing.

    The post Choose Energy Closes Series A appeared first on peHUB.

  • Morning Advantage: A Supply Chain Solution to an Age-Old Problem

    Time after time, three issues continue to bedevil global flood relief efforts — lack of advance preparation, lack of attention to floods too small to grab the international spotlight, and lack of economic recovery efforts after the waters abate. Writing in the Guardian, two business professors offer a practical solution to all three problems using developing nations’ supply chains.

    The consumer-goods supply chain typically moves from large Western or indigenous manufacturers to distributors to wholesalers to family-owned shops and finally to micro-retailers selling from carts or by the roadside. The professors suggest that social enterprises can short-circuit the process if, before floods occur, they work with local governments to earmark spaces — parks, playgrounds, etc. — for temporary warehouse sites. When disaster strikes, the NGO would set up pop-up warehouses in those locations to channel relief directly from the manufacturers to the myriad micro sites. More goods get where they need to go, and micro sellers remain in business, ready to resume normal operations when the flood is over and the NGOs fold up their tents.

    HAVE I GOT A DEAL FOR YOU

    Should You Buy That Used Start-Up? (BCG)

    Once a private equity firm is ready to cash out of a start-up, it’s been thought, not much more could be done to increase its value before going public. But that’s not what HHL Leipzig management school and BCG found when they looked at 225 start-ups that PE firms bought from one another between 2006 and 2012. While on average the initial PE firm increased a start-up’s value by 20%, the second firm found plenty of room for improvement, increasing the average value another 24%. Both the primary and secondary buyers added value in the same way — increasing EBITDA through operational improvements (an average of 14% in the first round, 13% in the next) and growing sales at a compound annual rate (both by 10%).
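
    Because the two owners’ gains apply sequentially, they compound rather than add. A quick worked check of the figures above (illustrative arithmetic only):

```python
# Value gains from the two successive PE owners compound multiplicatively.
primary_gain = 0.20    # first PE owner: +20% value on average
secondary_gain = 0.24  # second PE owner: +24% value on average

total_multiple = (1 + primary_gain) * (1 + secondary_gain)
print(round(total_multiple, 3))  # 1.488 -> roughly a 49% total uplift, not 44%

# The same compounding applies to the EBITDA improvements (14%, then 13%):
ebitda_multiple = 1.14 * 1.13
print(round(ebitda_multiple, 3))  # 1.288
```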

    THE $86 BAKED BEAN

    Heinz Goes Upscale (Dezeen)

    Heinz is partnering with designers Bompas & Parr to create a “flavour experience” for its new lines of baked beans. Each of the five new varieties — cheddar cheese, curry, barbecue, fiery chili, and garlic and herbs — gets its own handmade bowl plus a spoon embedded with an MP3 player to provide a “flavour-enhancing” soundtrack as you chew. (Think Punjabi bhangra for the curry, Latin samba for the chili.) The limited edition boxed sets will sell for £57 at Fortnum & Mason. As HBR blogger Bill Taylor has noted, there’s value in appealing to all five of your customers’ senses. But I’ll admit I did check the date here to see if this was an April Fool’s joke. — Alison Beard

    BONUS BITS:

    Tech Talk

    Ahoy, Hello, and How Digital Devices Change the Rules of Etiquette (Smithsonian)

    Here’s Where They Make China’s Cheap Android Smartphones (Technology Review)

    Why Don’t We Have Food Replacement Pills? (Popular Mechanics)

  • Egyptian coastguard arrests divers over major broadband cable cut

    Most times a submarine internet cable gets cut, it’s the result of someone dropping anchor in the wrong place. In the case of the cut off the Egyptian coast, on which my colleague Om Malik reported yesterday, it seems that more deliberate action may have been involved.

    According to the Associated Press, on Wednesday the Egyptian coastguard detained three scuba divers in a dinghy near Alexandria, who were “cutting the undersea cable” of local telco Telecom Egypt. Egyptian news agency MENA identified the affected cable as SMW4: the same one whose cutting caused an internet slowdown in parts of Africa, the Middle East and Asia.

    MENA quoted officials as saying services would be “back 100 percent on Thursday morning” via the use of “alternative feeds”. Telecom Egypt will apparently bear the cost of the repairs, both of this disruption and a separate cable cut last Friday.

    Incidentally, the SMW4 cable (more properly known as South East Asia–Middle East–Western Europe 4 or SEA-ME-WE 4) was also involved in a very serious outage five years ago, which cut the capacity of the main Europe-Middle East connection by 75 percent. This one appears to have been less drastic.

  • UCLA study finds heart failure medications highly cost-effective

    A UCLA study shows that heart failure medications recommended by national guidelines are highly cost-effective in saving lives and may also provide savings to the health care system.
     
    Heart failure, a chronic, progressive disease, affects millions of individuals and results in considerable morbidity, the use of extensive health care resources and substantial costs.
     
    Currently online, the study will be published in the April 2 print issue of the Journal of the American College of Cardiology. Researchers studied the incremental health and cost benefits of three common heart failure medications that are recommended by national guidelines developed by organizations like the American College of Cardiology and the American Heart Association.
     
    This is one of the first studies analyzing the incremental cost-effectiveness of heart failure medications and taking into account the very latest information, including the lower costs of generic versions of the medications. Researchers found that the combination of these medical therapies demonstrated the greatest gains in quality-adjusted life years for heart failure patients.
     
    “We found that use of one or more of these key medications in combination was associated with significant health gains while at the same time being cost-effective or providing a cost savings,” said the study’s senior author, Dr. Gregg Fonarow, UCLA’s Eliot Corday Professor of Cardiovascular Medicine and Science and director of the Ahmanson–UCLA Cardiomyopathy Center at the David Geffen School of Medicine at UCLA. “Our findings demonstrate the importance of prescribing these national guideline–directed medical therapies to patients with heart failure.”
     
    The study focused on mild to moderate chronic heart-failure patients who had weakening function in the heart’s left ventricle and symptoms of heart failure, which occurs when the ventricle can no longer pump enough blood to the body’s other organs. With the heart’s diminishing function, fluid can build up in the lungs, so most patients take a diuretic.
     
    The research team used an advanced statistical model to assess the specific incremental and cumulative health- and cost-benefit contributions of three medications, compared with diuretics alone, in the treatment of heart failure patients. The medications studied included angiotensin-converting enzyme inhibitors, aldosterone antagonists and beta blockers.
     
    Researchers found that treatment with one or a combination of these medications was associated with lower costs and higher quality of life when compared to just receiving a diuretic alone. The greatest gain in quality-adjusted life years for patients was achieved when all three guideline-directed medications were provided.
     
    The team calculated different scenarios and found that the incremental cost-effectiveness ratio of adding each medication was less than $1,500 per each quality-adjusted life year for patients. In some scenarios, the medications were actually cost-saving, where heart failure patients’ lives were prolonged at lower costs to the health care system.
     
    The study found that up to $14,000 could be spent over a lifetime on a heart failure disease-management program to improve medication adherence and still be highly cost-effective. 
     
    For the study, cost-effective interventions were defined as those providing good value with a cost of less than $50,000 per quality-adjusted life year, which is the general standard, Fonarow said. Cost-saving interventions are those that not only extend life but also actually save money to the health care system. Such interventions are not only more effective but are less costly.
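
    The cost-effectiveness logic described above reduces to a simple ratio: the incremental cost-effectiveness ratio (ICER) is the extra cost divided by the extra quality-adjusted life years gained, judged against the $50,000-per-QALY standard. A minimal sketch of that calculation, using hypothetical numbers rather than figures from the study:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY gained."""
    if delta_qaly <= 0:
        raise ValueError("ICER is only meaningful when the intervention adds QALYs")
    return delta_cost / delta_qaly

def classify(delta_cost, delta_qaly, threshold=50_000):
    """Judge an intervention against a willingness-to-pay threshold in $/QALY."""
    if delta_cost < 0:
        return "cost-saving"  # more health for less money
    return "cost-effective" if icer(delta_cost, delta_qaly) <= threshold else "not cost-effective"

# Hypothetical numbers (not from the study): combined therapy adds 1.5 QALYs
# at $2,000 extra lifetime cost versus a diuretic alone.
print(round(icer(2_000, 1.5)))   # 1333 $/QALY, well under the $50,000 standard
print(classify(2_000, 1.5))      # cost-effective
print(classify(-500, 1.5))       # cost-saving
```

    A negative incremental cost with a health gain is the “cost-saving” case the study describes: the intervention dominates, extending life while costing the system less.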
     
    Fonarow noted that the costs of not effectively taking these key medications would be higher, due to increased hospitalizations and the need for other interventions.
     
    “Given the high health care value provided by these medical therapies for heart failure, reducing patient costs for these medications or even providing a financial incentive to promote adherence is likely to be advantageous to patients as well as the health care system,” Fonarow said. “Further resources should be allocated to ensure full adherence to guideline-directed medical therapies for heart failure patients to improve outcomes, provide high-value care and minimize health care costs.”
     
    The researchers used previous clinical trials and government statistics to help calculate mortality, hospitalization rates and health care costs used in the model.
     
    Fonarow noted that the study offers broad insight into the cost-effectiveness of these medications and that a real-world model would provide an additional perspective.
     
    The costs used in this study were estimates of true costs, and the actual costs in different health care delivery systems may vary.
     
    No outside funding was used for the study. Disclosures are included in the manuscript.
     
    Other study authors included Gaurav Banka from the Ahmanson–UCLA Cardiomyopathy Center and Paul A. Heidenreich from the VA Palo Alto Health Care System in Palo Alto, Calif.
     

  • Genie Timeline 2013 — three editions, one solid choice for backups

    Genie9 has released Genie Timeline 2013, the latest edition of its easy-to-use backup tool. As before, the program is available in three editions — Genie Timeline Free 2013, Genie Timeline Home 2013 ($39.95) and Genie Timeline Professional 2013 ($59.95) — and all of them gain plenty of features in the new release.

    This starts with the new protection level, for instance, which will immediately highlight any problems — lots of file changes that haven’t been backed up, say, or a destination drive that has run out of free space — giving you a quick and easy view of your backup status.

    All three editions also offer you more restore features. You can search old backups for a particular file name, for instance, or filter the restore view to show new, modified or deleted files.

    Genie Timeline Home 2013 also gains an enhanced disaster recovery engine, as well as the ability to strip deleted files out of your backup.

    As you’d expect, Genie Timeline Professional 2013 has even more. The program now has include and exclude filters, for instance (with regular expression support). You can turn turbo mode on manually if you’re in a hurry. The backup folder name can now include a user and computer name for easier identification, while new low-level options mean you can choose a delayed start for the Genie Timeline service.
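
    Genie9 doesn’t document its filter internals here, but the general behaviour of regex-based include and exclude filters is easy to sketch: a file is kept if it matches at least one include pattern (or no include patterns are set) and matches no exclude pattern. A minimal illustration in Python, not Genie Timeline’s actual implementation:

```python
import re

def select_files(paths, include=None, exclude=None):
    """Keep paths matching any include pattern, then drop any matching an exclude.
    An empty include list means 'include everything'."""
    inc = [re.compile(p) for p in (include or [])]
    exc = [re.compile(p) for p in (exclude or [])]
    kept = []
    for path in paths:
        if inc and not any(r.search(path) for r in inc):
            continue  # no include pattern matched
        if any(r.search(path) for r in exc):
            continue  # an exclude pattern matched
        kept.append(path)
    return kept

files = ["docs/report.docx", "docs/~$report.docx", "src/main.py", "build/out.log"]
print(select_files(files, include=[r"^docs/", r"\.py$"], exclude=[r"~\$"]))
# ['docs/report.docx', 'src/main.py']
```

    Applying excludes after includes, as here, is the common convention: it lets a broad include (everything under docs/) be trimmed by narrow exceptions (Office temp files).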

    And all three versions have a revamped interface, which is now looking more Windows 8-like than ever.

    The end result is a solid, if not revolutionary, upgrade for Genie Timeline Free 2013 and Genie Timeline Home 2013. The restore view filtering is welcome, the include and exclude filters are a great addition to Genie Timeline Professional 2013, and on balance Timeline remains a capable and effective backup tool.

    Photo Credit: Raimundas/Shutterstock

  • Accidental Empires, Part 15 — Clones (Chapter 9)

    Fifteenth in a series. The next chapter in Robert X. Cringely’s 1991 classic, Accidental Empires, looks at the real rise of Microsoft. IBM established the standard hardware, which Compaq successfully “cloned” and for which developers created software. Cringely explains how standards evolve, using vinyl records as a metaphor.

    It was in the clay room, a closet filled with plastic bags of gray muck at the back of Mr. Ziska’s art room, where I made my move. For the first time ever, I found myself standing alone with Nancy Wilkins, the love of my life, the girl of my dreams. She was a vision in her green and black plaid skirt and white blouse, with little flecks of clay dusted across her glasses. Her blonde hair was in a ponytail, her teeth were in braces, and I was sure — well, pretty sure — that she was wearing a bra.

    “Run away with me, Nancy,” I said, wrapping my arms around her from behind. Forget for a moment, as I obviously did, that we were both 13 years old, trapped in the eighth grade, and had nowhere to run away to.

    “Why would I want to run away?” Nancy responded, gently twisting free. “Let’s stay here and have fun with everyone else.”

    It wasn’t a rejection, really. There had been no screams, no slaps, no frenzied pounding on the door by Earl Ziska, eager to throw his 120 pounds of fighting fury against me for making a pass at one of his art students. And she’d used the word let’s, so maybe I had a chance. Still, Nancy’s was a call to mediocrity, to being just like all the other kids.

    Running away still sounded better to me.

    What I really had in mind was not running away but running toward something, toward a future where I was older (16 would do it, I reckoned) and taller and had lots of money and could live out my fantasies with impunity, Nancy Wilkins at my side. But I couldn’t say that. It wouldn’t have been cool to say, “Come with me to a place where I am taller.”

    We never ran anywhere together, Nancy and I. It was clear from that moment in the clay room that she was content to live her life in formation with everyone else’s and to limit her goals to within one standard deviation on the upside of average. Like nearly everyone else in school and in the world, she wanted more than anything else to be just like her best friends. Only prettier, of course.

    Fitting in is the root of culture. Staying here and having fun with everyone else is what allows societies to function, but it’s not a source of progress. Progress comes from discord — from doing new things in new ways, from running away to something new, even when it means giving up that chance to have fun with the old gang. To engineers — really good ones, interested in making progress — the best of all possible worlds would be one in which technologies competed continuously and only the best technologies survived. Whether the good stuff came from an established company, a start-up, or even from Earl Ziska wouldn’t matter. But it usually does matter because the real world, the one we live in, is a world of dollars, not sense. It’s a world where commercial interests are entrenched and consumers typically pay closer attention to what everyone else is buying than to whether what they are buying is any good. In this real world, then, the most successful products become standards against which all other products are measured, not for their performance or cleverness but for the extent to which they are like that standard.

    In the standards game, as in penmanship, the best grades often go to the least interesting people.

    In 1948, CBS introduced the long-playing record album — the LP. The new records spun at 33⅓ revolutions per minute rather than the 78 RPM that had been the standard for forty years. This slower speed, combined with the fact that the smaller needle allowed the grooves to be closer together than on the old 78s, made it possible to put more music than ever before on each side of a record. The sound quality of the LPs was better, too. They called it “stereo high fidelity.”

    The smaller needle used to play an LP and its light tracking weight meant that records wouldn’t wear out as quickly as they had with the old steel needles. And the light needles meant that LPs could be made out of unbreakable vinyl rather than the thick, brittle plastic that had been used before.

    LPs were better in every way than the old 78s they replaced. Sure, listeners would have to buy new record players, and LPs might cost more to buy, but those were minor penalties for the glories of true high fidelity.

    Also in 1948, at about the same time that CBS was introducing the LP, RCA was across town bringing out the first 45 RPM single. The 45 had a better sound than the old 78s, too, though not as good as the LP and not in stereo. But where the LPs put twenty minutes of music on one record side, the 45s opted for a minimalist solution — one song per side — which made 45s cheaper than the 78s they replaced, and lots cheaper than LPs. Forty-fives worked well in jukeboxes, too, because their large center holes made life easier for robot fingers.

    The 45s were pretty terrific, though you still had to buy a new record player.

    So here it was 1948. One war was over, and the next one was not even imagined. America and American tastes ruled the world, and the record industry had just offered up its two best ideas for how music should be sold for the next forty years. What happened?

    The recording industry immediately entered a four-year slump as Americans, who couldn’t decide what type of record to buy, decided not to buy any records at all.

    What happened to the record industry in 1948 was the result of two major players’ deciding to promote new technical standards at exactly the same time.

    “You’ll sell millions of 45s,” the RCA salesmen told record store owners.

    “Just listen to the music,” said the CBS salesman.

    “Who’s going to pay six bucks for one record?” asked RCA.

    “Think profit margins,” ordered CBS.

    “Think sales volume!”

    Who could think? So they didn’t, and the industry fumbled along until an act of God or Elvis Presley decided which standard would dominate what parts of the business. Forty-fives eventually gained the youth vote, while LPs took the high end of the market. In time, machines were built that could play both types of records, and the two technical standards were eventually marketed in a manner that made them complementary. But that wasn’t the original intention of their inventors, each of whom wanted to have it all.

    Markets hate equality. That was the problem with this battle between LPs and 45s: both were better than the old standard, and each had advantages over the other. In the world of music, circa 1948, it just wasn’t immediately clear which standard would be dominant, so the third parties in the industry did not know how to align themselves. If either CBS or RCA had been a couple of years later, the market would have had a chance to adopt the first new standard and then consider the second. Everybody would have been listening to more music.

    In any major market, there are always two standards, and generally only two, because people are different enough that they won’t all be satisfied with the same thing, yet consumers naturally align themselves into either the “us” or “them” camp. No triangles. Even the Big Three U.S. automakers don’t constitute a triangle because they have all chosen to support the same standard — the passenger automobile. For all the high school bickering I remember about whether a Ford was better than a Chevy, the alternative standard to a Mustang is not a Camaro; it’s a pickup truck.

    Just as there are always two standards, one of those standards is always dominant. Eighty-five percent of the folks who go shopping for a passenger vehicle come home with a car, while 15 percent come home with a truck. Eighty-five percent of the home videocassette recorders in America are VHS, while 15 percent are Betamax. Those numbers, 85 and 15, keep coming back again and again. Maybe that’s the natural relationship between primary and secondary standards, somehow determined by the gods of consumer products.

    In the personal computer business today, about 85 percent of the machines sold are IBM compatible, and 15 percent are Apple Macintoshes. Sure, there are other brands — Commodore Amigas, Atari STs, and weird boxes built in England that function in ways that make sense only to English minds — and even the makers of these machines complain that somehow they have trouble getting noticed by anything but the hobbyist market. The mainstream American market — the business market — just doesn’t see these machines as computers, even though some of them offer superior features. It’s not that they aren’t good; it’s that they are third.

    When IBM introduced its Personal Computer, the world was ready for a change. The 8-bit computers of the time were doing their best to imitate the battle between LPs and 45s. There just wasn’t much of a qualitative difference between the Apple IIs, TRS-80s, and CP/M boxes of the time, so no one standard had broken out, taking the overall market to new heights with it. The market needed differentiation, and that was provided by the entry of IBM, raising its 16-bit standard.

    Eight-bit partisans looked down their noses at the new PC, said that it was overpriced and underpowered, and asked who would ever need that much memory, anyway. With 3,000 Apple II applications and 5,000 CP/M applications on the market, sheer volume of software would keep IBM and PC-DOS from succeeding, they argued. Their letters of protest in InfoWorld had a note of shrillness, though, as if the writers were suddenly and for the first time aware of their own mortality. That’s the way it is with soon-to-be passing standards. Collectors of 78s sounded that way too until they vanished.

    In the world of standards, ubiquity is the last step before invisibility.

    The new standard was going to be 16-bit computing, that was clear, but what wasn’t immediately clear was that the new standard would be 16-bit computing using IBM hardware and the PC-DOS operating system. Many companies saw as much opportunity to build the new 16-bit standard around their own hardware and operating systems as around IBM’s.

    There were lots of IBM competitors. There was the Victor 9000, sold by Kidde, an old-line office machine company. The Victor had more power, more storage, more memory, and better graphics than the IBM PC, and for less money. There was the Zenith Z-100, which had two processors, so it could run 8-bit or 16-bit software, and it too was a little cheaper than the IBM PC. There was the Hewlett-Packard HP-150, which had more power, more storage, more memory than the IBM PC, and a nifty touchscreen that let users make choices by pointing at the screen.

    There was the DEC Rainbow 100, which had more power, more storage, and the DEC name. There was a Xerox computer, a Wang computer, and a Honeywell computer. There were suddenly lots of 16-bit computers hoping to snatch the mantle of de facto industry standard away from IBM, through either superior technology or pricing.

    One reason that all these players were trying to take on IBM was that Microsoft encouraged them to. Bill Gates, too, was uncertain that IBM’s PC-DOS would become the new standard, so he urged all the other companies doing 16-bit computers with Intel processors to implement their own versions of DOS. And it was good business, too, since Microsoft programmers were doing the work of making MS-DOS work on each new platform. No matter which company set the standard, Microsoft was determined that it would involve a version of their operating system.

    But there was another reason for Microsoft to encourage IBM’s competitors to commission their own versions of DOS. Charles Simonyi and friends had been working up a suite of MS-DOS applications with these varied platforms specifically in mind. Multiplan, the spreadsheet; Multiword, later called just Word; and all the other early Microsoft applications were designed to be quickly ported to strange operating systems and new hardware.

    The idea was that Bill Gates would convince, say, Zenith, to commission a custom version of MS-DOS. Once that project was underway, it was time to remind Zenith that this new DOS version might not work with all (or any) of the other DOS applications on the market, most of which were customized for the IBM PC.

    Panic time at Zenith headquarters in Illinois, where it became imperative to find some applications quickly that would work with its new version of DOS. Son-of-a-gun, Microsoft just happened to have a few portable applications lying around, written in a pseudocode that could be quickly adapted to almost any computer. They weren’t very good applications, but they sure were portable. And so Zenith, having been encouraged by Microsoft to do hardware incompatible with IBM’s, then suckered into commissioning a custom version of MS-DOS, finally ended up having to pay Microsoft to adapt its applications, too. With all his costs covered, Bill Gates could start to make money before the first copy of Multiplan or Word for Zenith was even sold.

    This squeeze play happened for every new platform and every new version of MS-DOS and was just the first of many instances when Microsoft deliberately coordinated its operating system and application strategies, something the company continues to claim it never did.

    As for the Victor 9000, the Z-100, the HP-150, the DEC Rainbow 100, and all the other early MS-DOS machines, those computers are gone now, dead and mainly forgotten. We can come up with all sorts of individual reasons why each machine failed, but at bottom they all failed because they were not IBM PC compatible. When the IBM PC, for all its faults, instantly became the number one selling personal computer, it became the de facto industry standard, because de facto standards are set by market share and nothing else. When Lotus 1-2-3 appeared, running on the IBM, and only on the IBM, the PC’s role as the technical standard setter was guaranteed not just for this particular generation of hardware but for several generations of hardware.

    The IBM PC defined what it meant to be a personal computer, and all these other computers that were sorta like the IBM PC, kinda like it, were doomed to eventual failure. They didn’t even qualify as the requisite second standard — the pickup truck rather than the car — because although they were all different from the IBM PC, they weren’t different enough to qualify for the number two spot.

    Even the Grid Compass, the first laptop computer, was a failure because of a lack of IBM compatibility. Brilliant technology but different graphics and storage standards meant that Grid needed a version of 1-2-3 different from the one that worked on the IBM PC. When Grid supplied its own applications with the computer, including a spreadsheet, it still wasn’t enough to attract buyers who wanted their 1-2-3. It was back to the drawing board to develop a second-generation laptop that was IBM compatible.

    Entrepreneurs often lack the discipline to keep their new products tightly within a technical standard, which was why the idea of 100 percent IBM compatibility took so long to be accepted. “Why be compatible when you could be better?” the smart guys asked on their way to bankruptcy court.

    IBM compatibility quickly became the key, and the level to which a computer was IBM compatible determined its success. Some long-established microcomputer makers learned this lesson slowly and expensively. Hewlett-Packard actually paid Lotus to adapt 1-2-3 to the HP-150, but the computer was still doomed by its lack of hardware compatibility (you couldn’t put an IBM circuit card in an HP-150 computer). The other problem with the HP-150 was what was supposed to have been its major selling point — the touchscreen, which was a clever idea nobody really wanted. Not only was it hard to get software companies to make their products work with HP’s touchscreen technology, but users didn’t like it either. Secretaries, who apparently measure their self-worth by typing speed, didn’t want to take their fingers off the keys. Even middle managers, who were the intended users of the system, didn’t like the touchscreen. The technology was clever, but the fact that HP’s own engineers chose not to use the systems should have been a tip-off. You could walk through the cavernlike open offices at HP headquarters in those days without seeing a single user pointing at his or her touchscreen.

    The best and most powerful computers come from designers who actually use their technologies — whose own tastes model those of intended users. Ivory towers, no matter how high, don’t produce effective products for the real world.

    Down at Tandy Corp. headquarters in Fort Worth, where ivory towers are unknown, Radio Shack’s answer to the IBM PC was the Model 2000, another workalike, which appeared in the fall of 1983. The Model 2000 was intended to beat the IBM PC with twice the speed, more storage, and higher-resolution graphics. The trick was a more powerful processor, the Intel 80186, which could run rings around IBM’s old 8088.

    Because Tandy had its own distribution through 5,000 Radio Shack stores and through a chain of Tandy Computer Centers, the company thought for a long time that it was somehow immune to the influence of the IBM standard. They thought of their trusty Radio Shack customers as Albanians who would loyally shop at the Albanian Computer Store, no matter what was happening in the rest of the world. But Radio Shack’s white-collar customer list turned out to include very few Albanians.

    Bill Gates was a strong believer in the Model 2000 because it was the only personal computer powerful enough to run new software from Microsoft called Windows without being embarrassingly slow. Windows was an attempt to bring a Xerox Alto-style graphical user interface to personal computers. But Windows took a lot of power to run and was a real dog on the IBM PC and the other computers using 8088 processors. For Windows to succeed, Bill Gates needed a computer like the Model 2000. So young Bill, who handled the Tandy account himself, predicted that the computer would be a grand success — something the boys and girls in Fort Worth wanted badly to hear. And Gates made a public endorsement of the Model 2000, hoping to sway customers and promote Windows as well.

    Still, the Model 2000 failed miserably. Nobody gave a damn about Windows, which didn’t appear until 1985, and even then didn’t work well. The computer wasn’t hardware compatible with IBM. It wasn’t very software compatible with IBM either, and the most popular IBM PC programs — the ones that talked directly to the PC’s memory and so worked lots faster than those that allowed the operating system to do the talking for them — wouldn’t work at all. Even the signals from the keyboard were different from IBM’s, which drove software developers crazy and was one of the reasons that only a handful of software houses produced 2000-specific versions of their products. Oh, and the Intel 80186 processor had bugs, too, which took months to fix.

    Today the Model 2000 is considered the magnum opus of Radio Shack marketing failures. Worse, a Radio Shack computer buyer in his last days with the company for some reason ordered 20,000 more of the systems built even when it was apparent they weren’t selling. Tandy eventually sold over 5,000 of those systems to itself, placing one in each Radio Shack store to track inventory. Some leftover Model 2000s were still in the warehouse in early 1990, almost seven years later.

    Still, the Model 2000’s failure was Bill Gates’s gain. Windows was a failure, but the head of Radio Shack’s computer division, Jon Shirley, the very guy who’d been duped by Bill Gates into doing the Model 2000 in the first place, sensed that his position in Fort Worth was in danger and joined Microsoft as president in 1983.

    Big Blue’s share of the personal computer market peaked above 40 percent in the early 1980s. In 1983, IBM sold 538,000 personal computers. In 1984, it sold 1,375,000.

    IBM wasn’t afraid of others’ copying the design of the PC, although nearly the entire system was built of off-the-shelf parts from other companies. Conventional wisdom in Boca Raton said that competitors would always pay more than IBM did for the parts needed to build a PC clone. To compete with IBM, another company would have to sell its PC clone at such a low price that there would be no profit. That was the theory.

    In one sense, nothing could have been easier than building a PC clone, since IBM was so generous in supplying technical information about its systems. Everything a good engineer would need to know in order to design an IBM PC copy was readily available. While it seems like this would encourage copying, it was intended to do just the opposite because a trap lay in IBM’s technical documentation. That trap was the complete code listing for the IBM PC’s ROM-BIOS.

    Remember, the ROM-BIOS was Gary Kildall’s invention that allowed the same version of CP/M to operate on many different types of computers. The basic input/output system (BIOS) was special computer code that linked the generic operating system to specific hardware. The BIOS was stored in a read-only memory chip — a ROM — installed on the main computer circuit board, called the motherboard. To be completely compatible with the IBM PC, a clone machine either would have to use IBM’s ROM-BIOS chip, which wasn’t for sale, or devise another chip just like IBM’s. But IBM’s ROM-BIOS was copyrighted. The lines of code burned into the read-only memory were protected by law, so while it would be an easy matter to take IBM’s published ROM-BIOS code and use it to prepare an exact copy of the chip, doing so would violate IBM’s copyright and incur the legendary wrath of Armonk.

    The key to making a copy of the IBM PC was copying the ROM-BIOS, and the key to copying the ROM-BIOS was to do so without reading IBM’s published BIOS code.

    Huh?

    As we saw with Dan Bricklin’s copyright on VisiCalc, a copyright protects only the specific lines of computer code, not the functions that those lines of code make the computer perform. The IBM copyright did not protect the company from others who might write their own completely independent code that just happened to perform the same BIOS function. By publishing its copyrighted BIOS code, IBM was making it very hard for others to claim that they had written their own BIOS without being exposed to or influenced by IBM’s.

    IBM was wrong. Welcome to the world of reverse engineering.

    Reverse engineering is the science of copying a technical function without copying the legally protected manner in which that function is accomplished in a competitor’s machine. Would-be PC clone makers had to come up with a chip that would replace IBM’s ROM-BIOS but do so without copying any IBM code. The way this is done is by looking at IBM’s ROM-BIOS as a black box — a mystery machine that does funny things to inputs and outputs. By knowing what data go into the black box — the ROM — and what data come out, programmers can make intelligent guesses about what happens to the data when they are inside the ROM. Reverse engineering is a matter of putting many of these guesses together and testing them until the cloned ROM-BIOS acts exactly like the target ROM-BIOS. It’s a tedious and expensive process and one that can be accomplished only by virgins — programmers who can prove that they have never been exposed to IBM’s ROM-BIOS code — and good virgins are hard to find.
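The clean-room idea can be sketched in miniature. The snippet below is only an illustration of the method, with made-up toy functions standing in for a real ROM-BIOS, which is vastly more complex: one team probes the black box and records its behavior, a second "virgin" team writes fresh code from those observations alone, and the clone is tested until its behavior matches exactly.

```python
# Toy illustration of black-box (clean-room) reverse engineering.
# All functions here are hypothetical stand-ins, not real BIOS code.

def reference_rom(value: int) -> int:
    """Stands in for IBM's ROM-BIOS; its source code is off-limits."""
    return (value * 2 + 1) % 256

# Step 1: one team probes the black box and records input/output pairs.
observations = {v: reference_rom(v) for v in range(256)}

# Step 2: a "virgin" team, seeing only the recorded observations (never
# the original source), writes its own code that reproduces the behavior.
def clean_room_rom(value: int) -> int:
    return (value << 1 | 1) & 0xFF  # independently derived, same behavior

# Step 3: test the clone against the recorded behavior until it matches.
assert all(clean_room_rom(v) == out for v, out in observations.items())
```

The legal point is carried by step 2: the clone team can show it copied only observed behavior, never the copyrighted lines of code.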

    Reverse engineering the IBM PC’s ROM-BIOS took the efforts of fifteen senior programmers over several months and cost $1 million for the company that finally did it: Compaq Computer.

    Compaq is the computer company with good penmanship. There was so little ego evident around the table when Rod Canion, Jim Harris, and Bill Murto were planning their start-up in the summer of 1981 that the three couldn’t decide at first whether to open a Mexican restaurant, build hard disk drives for personal computers, or manufacture a gizmo that would beep on command to help find lost car keys. Oh, and they also considered starting a computer company. The computer company idea eventually won out, and the concept of the Compaq was first sketched out on a placemat at a House of Pies restaurant in Houston.

    All three founders were experienced managers from Texas Instruments. TI was the company that many computer professionals throughout the late 1970s and early 1980s expected eventually to dominate the microcomputer business with its superior technology and management, only that never happened. Despite having the best chips, the brightest engineers, and Texas-sized ambition, the best TI could do was a disastrous entry into the home computer business that eventually lost the company hundreds of millions of dollars. Later there was also an incompatible MS-DOS computer that came and went, suffering the same problem of attracting software as all the other rogue machines. Eventually TI produced a modest line of PC clones.

    Unlike most of the other would-be IBM competitors, the three Compaq founders realized that software, and not hardware, was what really mattered. In order for their computer to be successful, it would have to have a large library of available software right from the start, which meant building a computer that was compatible with some other system. The only 16-bit standard available that qualified under these rules was IBM’s, so that was the decision — to make an IBM-compatible PC — and to make it damn compatible — 100 percent. Any program that would run on an IBM PC would run on a Compaq. Any circuit card that would operate in an IBM PC would operate in a Compaq. The key to their success would be leveraging the market’s considerable investment in IBM.

    Crunching the numbers harder than IBM had, the Compaq founders discovered that a smaller company with less overhead than IBM’s could, in fact, bring out a lower-priced product and still make an acceptable profit. This didn’t mean undercutting IBM by a lot, but by a significant amount — about $800 on the first Compaq model compared to an IBM PC with equivalent features.

    Compaq, like any other company pushing a new product, still had to ride the edges of an existing market, offering additional reasons for customers to choose its computer over IBM’s. Just to be different, the first Compaq models were 28-pound portables — luggables, they came to be called. People didn’t really drag these sewing machine-sized units around that much, but since IBM didn’t make a luggable version of the PC, making theirs portable gave Compaq a niche to sell in right next to IBM.

    Compaq appealed to computer dealers, even those who already sold IBM. Especially those who already sold IBM. For one thing, the Compaq portables were available, while IBM PCs were sometimes in short supply. Compaq pricing allowed dealers a 36 percent markup compared to IBM’s 33 percent. And unlike IBM, Compaq had no direct sales force that competed with dealers. A third of IBM’s personal computers were sold direct to major corporations, and each of those sales rankled some local dealer who felt cheated by Big Blue.

    Just like IBM, Compaq first appeared in Sears Business Centers and ComputerLand stores, though a year later, at the end of 1982. With the Compaq’s portability, compatibility, availability, and higher profit margins, signing up both chains was not difficult. Bill Murto made the ComputerLand sale by demonstrating the computer propped on the toilet seat in his hotel bathroom, the only place he could find a three-pronged electrical outlet.

    Just as at IBM, Compaq’s dealer network was built by Sparky Sparks, who was hired away from Big Blue to do a repeat performance, selling similar systems to a virtually identical dealer network, though this time from Houston rather than Boca Raton.

    By riding IBM’s tail while being even better than IBM, Compaq sold 47,000 computers worth $111 million in its first year—a start-up record.

    With the overnight success of Compaq, the idea of doing 100-percent IBM-compatible clones suddenly became very popular (“We’d intended to do it this way all along,” the clone makers said), and the IBM workalikes quickly faded away. The most difficult and expensive part of Compaq’s success had been developing the ROM-BIOS, a problem not faced by the many Compaq impersonators that suddenly appeared. What Compaq had done, companies like Phoenix Technologies could do too, and did. But Phoenix, a start-up from Boston, made its money not by building PC clones but by selling IBM-compatible BIOS chips to clone makers. Buying Phoenix’s ROM-BIOS for $25 per chip, a couple of guys in a warehouse in New Jersey could put together systems that looked and ran just like IBM PCs, but cost 30 percent less to buy.

    For months, IBM was shielded from the impact of the clone makers, first by Big Blue’s own shortage of machines and later by a scam perpetrated by dealers.

    When IBM’s factories began churning out millions and millions of PCs, the computer giant set in place a plan that offered volume discounts to dealers. The more computers a dealer ordered, the less each computer cost. To make their cost of goods as low as possible, many dealers ordered as many computers as IBM would sell them, even if that was more computers than they could store at one time or even pay for. Having got the volume price, these dealers would sell excess computers out the back door to unauthorized dealers, at cost. Just when the planners in Boca Raton thought dealers were selling at retail everything they could make, these gray market PCs were being flogged by mail order or off the back of a truck in a parking lot, generally for 15 percent under list price.

    Typical of these gray marketeers was Michael Dell, an 18-year-old student at the University of Texas with a taste for the finer things in life, who was soon clearing $30,000 per month selling gray market PCs from his Austin dorm room. Today Dell is a PC clone-maker, selling $400 million worth of IBM compatible computers a year.

    Seeing this gray market scam as insatiable demand, IBM just kept increasing production, increasing at the same time the downward pressure on gray market prices until some dealers were finally selling machines out of the back door for less than cost. That’s when Big Blue finally noticed the clones.

    For companies like IBM, the eventual problem with a hardware standard like the IBM PC is that it becomes a commodity. Companies you’ve never heard of in exotic places like Taiwan and Bayonne suddenly see that there is a big demand for specific PC power supplies, or cases, or floppy disk drives, or motherboards, and whump! the skies open and out fall millions of Acme power supplies, and Acme deluxe computer cases, and Acme floppy disk drives, and Acme Jr. motherboards, all built exactly like the ones used by IBM, just as good, and at one-third the price. It always happens. And if you, like IBM, are the caretaker of the hardware standard, or at least think that you still are, because sometimes such duties just drift away without their holder knowing it, the only way to fight back is by changing the rules. You’ve got to start selling a whole new PC that can’t use Acme power supplies, or Acme floppy disk drives, or Acme Jr. motherboards, and just hope that the buyers will follow you to that new standard so the commoditization process can start all over again.

    Commoditization is great for customers because it drives prices down and forces standard setters to innovate. In the absence of such competition, IBM would have done nothing. The company would still be building the original PC from 1981 if it could make enough profit doing so.

    But IBM couldn’t keep making a profit on its old hardware, which explains why Big Blue, in 1984, cut prices on its existing PC line and then introduced the PC-AT, a completely new computer that offered significantly higher performance and a certain amount of software compatibility with the old PC while conveniently having no parts in common with the earlier machine.

    The AT was a speed demon. It ran two to three times faster than the old PCs and XTs. It had an Intel 80286 microprocessor, completely bypassing the flawed 80186 used in the Radio Shack Model 2000. Instead of a 360K floppy disk drive, the AT used a special 1.2-megabyte floppy, and every machine came with at least a 20-megabyte hard disk.

    At around $4,000, the AT was also expensive. It wasn’t able to run many popular PC-DOS applications, and sometimes it didn’t run at all because the Computer Memories Inc. (CMI) hard disk used in early units had a tendency to die, taking the first ten chapters of your great American novel with it. IBM was so eager to swat Compaq and the lesser clone makers that it brought out the AT without adequate testing of the CMI drive’s controller card built by Western Digital. There was no alternative controller to replace the faulty units, which led to months of angry customers and delayed production. Some customers who ordered the PC-AT at its introduction did not receive their machines for nine months.

    The 80286 processor had been designed by Intel to operate in multi-user computers running a version of AT&T’s Unix operating system called Xenix and sold by Microsoft. The chip was never intended to go in a PC. And in order to run Xenix efficiently, the 286 had two modes of operation—real mode and protected mode. In real mode, the 286 operated just like a very fast 8086 or 8088, and this was the way it could run some, but not all, MS-DOS applications. But protected mode was where the 286 showed its strength. In protected mode, the 286 could emulate several 8086s at once and could access vast amounts of memory. If real mode was impulse power, protected mode was warp speed. The only problem was that you couldn’t get there from here.

    The 286 chip powered up in real mode and then could be shifted into protected mode. This was the way Intel had envisioned it working in Xenix computers, which would operate strictly in protected mode. But the 286 was a chip that couldn’t downshift; it could switch from real to protected mode but not from protected mode back to real mode. The only way to get back to real mode was to turn the computer off, which was fine for a Xenix system at the end of the workday but pretty stupid for a PC that wanted to switch between a protected mode application and a real mode application. Until most applications ran in protected mode, then, the PC-AT would not reach its full potential.
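The 286’s one-way gearbox can be caricatured in a few lines of code. This is purely a toy model of the constraint, with invented names — real mode switching happened in hardware and assembly, not anything like Python — but it captures the shape of the problem: upshifting is allowed, and the only road back down is a reset.

```python
class Toy286:
    """Toy model of the 286's one-way mode switch (hypothetical, for
    illustration only; not real hardware behavior)."""

    def __init__(self):
        self.mode = "real"  # the chip always powers up in real mode

    def enter_protected_mode(self):
        self.mode = "protected"  # upshifting is allowed...

    def enter_real_mode(self):
        if self.mode == "protected":
            # ...but there is no downshift: the only way back is a reset.
            raise RuntimeError("must reboot to return to real mode")

cpu = Toy286()
cpu.enter_protected_mode()
try:
    cpu.enter_real_mode()  # a PC-AT user hitting the one-way gate
except RuntimeError as e:
    print(e)
```

A DOS application lived in real mode, so once a protected-mode program had shifted the chip up, there was no graceful way to hand control back, which is exactly why mixing the two kinds of applications on a PC-AT was so awkward.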

    And not only was the AT flawed, it was also late. The plan had been to introduce the new machine in early 1983, eighteen months after the original IBM PC and right in line with the trend of starting a new microcomputer generation every year and a half. But IBM’s PC business unit was no longer able to bring a product to market in only eighteen months. They’d done the original PC in a year, but that had been in the time of gods, not men, before reality and the way that things have to be done in enormous companies had sunk in. Three years was how long it took IBM to invent a new computer, and the marketing staff in Boca Raton would just have to accept that and figure clever ways to keep the clones at bay for twice as long as they had been expected to before.

    Still, the one-two punch of lowering PC prices and then introducing the AT took a toll on the clone makers, who had their already slim profit margins hurt by IBM’s lower prices while simultaneously having to invest in cloning the AT.

    The market loyally followed IBM to the AT standard, but life was never again as rosy for IBM as it had been in those earlier days of the original PC. Compaq, in a major effort, cloned the AT in only six months and shipped 10,000 of its Deskpro 286 models before IBM had solved the CMI drive problem and resumed its own AT shipments. But in the long term, Compaq was a small problem for IBM, compared to the one presented by Gordie Campbell.

    Gordon Campbell was once the head of marketing at Intel. Like everyone else of importance at the monster chip company, he was an engineer. And as only an engineer could, one day Gordie fell in love with a new technology, the electrically erasable programmable read-only memory, or EEPROM, which doesn’t mean beans to you or me but to computer engineers was a dramatic new type of memory chip that would make possible whole new categories of small-scale computer products. But where Gordie Campbell saw opportunity, the rest of Intel saw only a major technical headache because nobody had yet figured out how to manufacture EEPROMs in volume. Following a long Silicon Valley tradition, Campbell walked away from Intel, gathered up $30 million in venture capital, and started his EEPROM company — SEEQ Technologies. Who knows where they get these names?

    With his $30 million, Campbell built SEEQ into a profitable company over the next four years, led the company through a successful public stock offering, and paid back the VCs their original investment, all without selling any EEPROMs, which were always three months away from being a viable technology. Still, SEEQ had its state-of-the-art chip-making facility and was able to make enough chips of other types to be profitable while continuing to tweak the EEPROM, which Campbell was sure would be ready Real Soon Now (a computer industry expression that means “in this lifetime, maybe”).

    Then one day Campbell came in to work at SEEQ’s stylish headquarters only to find himself out of a job, fired by the company’s lead venture capital firm, Kleiner Perkins Caufield & Byers. Kleiner Perkins had the votes and Gordie, who held less than three percent of SEEQ stock, didn’t, so he was out on the street, looking for his next start-up.

    What happened to Campbell was that he came up against the fundamental conflict between venture capitalists and entrepreneurs. Like all other high-tech company founders, Campbell mistakenly assumed that Kleiner Perkins was investing in his dream, when, in fact, Kleiner Perkins was investing in Kleiner Perkins’s dream, which just happened to involve Gordie Campbell. Sure, SEEQ was already profitable and the VCs’ original investment had been repaid, but to an aggressive venture capitalist, that’s just when real money starts to be made. And to Kleiner Perkins, it looked as if Gordie Campbell, for all his previous success, was making some bad decisions. Bye-bye, Gordie.

    Campbell walked with $2 million in SEEQ stock, licked his wounds for a few months, and thought about his next venture. It had to be another chip company, he knew, but the question was whether to start a company to make general-purpose or custom semiconductors. General-purpose semiconductor companies like Intel, National Semiconductor, and Advanced Micro Devices took two to three years to develop chips, which were then sold in the millions for use in all sorts of electronic equipment. Custom chip companies developed their products in only a few months through the use of expensive computer design tools, with the result being high-performance chips that were sold in very small volumes, mainly to defense contractors at astronomical prices.

    Campbell decided to follow an edge of the market. He would apply to general-purpose chip development the computer-intensive design tools of the custom semiconductor makers. Just as Compaq could produce a new computer in six months, Campbell wanted to start a semiconductor company that could develop new chips in that amount of time and then sell millions of them to the personal computer industry.

    The investment world was doubtful. Becoming increasingly convinced that he had been blackballed by Kleiner Perkins, Campbell traveled the world looking for venture capital. His pitch was rejected sixty times. The new company, Chips & Technologies, finally got moving on $1.5 million from Campbell and a friend who was a real estate developer. Nearly all the money went into leasing giant IBM 3090 and Amdahl 470 mainframes used to design the new chips. When that money was gone, Campbell depleted his savings and then borrowed from his chief financial officer to make payroll. Broke again, and with still no chip designs completed, he finally went to the Far East to look for money, financing the trip on his credit cards. On his last day abroad, Campbell met with Kay Nishi, who then represented Microsoft in Japan. Nishi put together a group of Japanese investors who came up with another $1.5 million in exchange for 15 percent of the company. This was all the money Chips & Technologies ever raised — $3 million total.

    At SEEQ, most of the $30 million in venture capital had been spent building a semiconductor factory. That’s the way it was with chip companies, where everyone thought that they could do a better job than the other guys at making chips. But Chips & Technologies couldn’t afford to build a factory. Then Campbell discovered that all the chip makers with edifice complexes had produced a glut of semiconductor production capacity. He could farm out his chip production cheaper than doing it in-house.

    As always, the real value lay in the design—in software— not in hardware. There was nothing sacred about a factory.

    The first C&T product was a set of five chips that hit the market in the fall of 1985. These five chips, which sold then for $72.40, replaced sixty-three smaller chips on an IBM PC-AT motherboard. Using the C&T chip set, clone makers could build a 100 percent IBM-compatible AT clone with 256K of memory using only twenty-four chips. They could buy 100 percent IBM compatibility. Their personal computers could suddenly be smaller, easier to build, more reliable, even faster than a real IBM AT. And because they weren’t having to buy all the same individual parts as IBM, the clone makers could put together AT clones for less than it cost IBM, even with Big Blue’s massive buying power, to build the real thing.

    Chips & Technologies was an overnight success, getting the world back on the traditional track of computers doubling in power and halving in price every eighteen months. Venture capital firms — the same ones that rejected Campbell sixty times in a row — immediately funded half a dozen companies just like Chips.

    The commoditization of the PC-AT was complete, and though it didn’t know it at the time, IBM had lost forever its control of the personal computer business.
