Blog

  • Protect your Windows 8 PC with Panda Cloud Antivirus

    Spanish security company Panda Security Ltd has released Panda Cloud Antivirus Free 2.1.1, a minor update to its free cloud-based security tool for Windows. Version 2.1.1 is the first release to secure a Windows 8 compatible logo after passing the eligibility requirements laid down by Microsoft.

    Panda Cloud Antivirus is also available in a Pro version with a built-in firewall. Version 2.1.1 is primarily a maintenance release, building on the major improvements found in version 2.1, which included real-time protection for Windows 8 Store apps and anti-exploit technology.

    Panda Cloud Antivirus 2.1.1 includes five specific bug fixes. The first resolves an issue with file-blocking while waiting for scan results to be returned from the cloud. Another squashes a bug that prevented the GUI console from opening in certain circumstances.

    A number of linguistic errors have also been fixed, and the promotional banner is now removed in all situations after users upgrade to Panda Cloud Antivirus Pro from within the program. The final fix ensures the correct remote IP address is shown in the Pro version’s firewall popups.

    The update follows on from the release of Panda Cloud Antivirus 2.1, which added anti-exploit technology to both free and Pro versions, providing users with protection against malware that exploits so-called zero-day vulnerabilities in unpatched software such as Java, Adobe applications and Microsoft Office. As with traditional signature-based detection, suspicious files are sent to the cloud for analysis, but in this case their behavior is tracked to determine whether or not they pose a potential risk.

    Version 2.1 also added real-time protection for Windows 8 Store apps, plus came with performance improvements, a tweaked user interface and a number of unspecified bug fixes.

    Panda Cloud Antivirus Free 2.1.1 is available now as a freeware download for PCs running Windows XP or later. Users can upgrade to the Pro version for $29.95 per year, which adds a built-in firewall, extra protection when using public Wi-Fi networks and automatic USB vaccination.

  • Winners of the UK’s 4G auction announced

    UK telecoms regulator Ofcom has announced the winners of the auctions for 4G spectrum on the 800MHz and 2.6GHz bands, and it’s a list with no surprises. After more than 50 rounds of bidding, the winners are EE, Hutchison 3G UK, Niche Spectrum Ventures (a subsidiary of BT), Telefónica, and Vodafone.

    The Office for Budget Responsibility had expected the auction to raise £3.5bn, but in the end it raised considerably less — £2.34bn, a fraction of the £22bn the 3G spectrum auction brought in for the Treasury in 2000.

    The two bands, which cover 250MHz of spectrum in total, or the equivalent of two-thirds of the radio frequencies currently used by wireless devices, will give 4G networks widespread coverage and enough capacity to deal with the demand in cities.

    The lower-frequency 800 MHz band, which is ideal for widespread mobile coverage, comes from the so-called digital dividend that was made available when analogue terrestrial TV was finally switched off on 24 October 2012. The higher-frequency 2.6 GHz band delivers the capacity needed for faster speeds.

    Telefónica has a contractual obligation attached to one of the 800 MHz lots, and as a result must deliver a mobile broadband service for indoor reception to at least 98 percent of the UK population by the end of 2017.

    Making the announcement, Ed Richards, Ofcom Chief Executive, said:

    This is a positive outcome for competition in the UK, which will lead to faster and more widespread mobile broadband, and substantial benefits for consumers and businesses across the country. We are confident that the UK will be among the most competitive markets in the world for 4G services.

    4G coverage will extend far beyond that of existing 3G services, covering 98 percent of the UK population indoors — and even more when outdoors — which is good news for parts of the country currently underserved by mobile broadband.

    We also want consumers to be well informed about 4G, so we will be conducting research at the end of this year to show who is deploying services, in which areas and at what speeds. This will help consumers and businesses to choose their most suitable provider.

    EE, formerly Everything Everywhere, which was created by the merger of the T-Mobile and Orange businesses in 2010, already offers a 4G service (the UK’s first) on another band. At the end of January it announced it would be expanding its coverage to reach around 45 percent of the UK population.

    Photo credit: cartoon11/Shutterstock

  • The results are in: UK 4G spectrum auction has five winners, raising $3.62B

    The UK’s 4G spectrum auction has raised £2.34 billion ($3.62 billion), with BT and the country’s four main mobile carriers winning new spectrum that will allow them to roll out LTE services.

    The auction covered 250MHz of spectrum across the 2.6GHz band, which is high-bandwidth and good for urban deployments, and the 800MHz band, which is lower-bandwidth but longer-range and better for rural deployments. EE (which already runs 4G on reused 2G spectrum) and Vodafone both won spectrum in both bands, while Three and O2 (Telefonica) each won spectrum in the 800MHz band. Niche Spectrum Ventures (a BT subsidiary) only won 2.6GHz spectrum.

    The reserve price for the auction was £1.3 billion, although the government had budgeted for it to bring in £3.5 billion.

    According to the regulator Ofcom, new services should roll out in about six months’ time, and the whole of the UK will be able to receive 4G services “by the end of 2017 at the latest”. This will partly be helped by an obligation placed on Telefonica to ensure coverage for at least 98 percent of the UK population through its own network alone.

    Ofcom chief executive Ed Richards said this would be good news for parts of the country where mobile broadband is currently scarce:

    “This is a positive outcome for competition in the UK, which will lead to faster and more widespread mobile broadband, and substantial benefits for consumers and businesses across the country. We are confident that the UK will be among the most competitive markets in the world for 4G services.”

    Here’s who won what:

    4G auction winners

    It’s worth remembering that this represents BT’s return to mobile network operator status, after it spun out BT Cellnet (now O2, owned by Telefonica) in 2002.

    More soon…


  • Twenty percent of dollars spent on consumer technology goes to Apple

    No wonder Google wants to open retail stores and sell more gadgets. Apple’s reach is enormous, which is as important for the brand as making moolah — and there is a whole lot of that. According to NPD, the fruit-logo company accounted for 19.9 percent of all US retail consumer technology sales last year, up from 17.3 percent in 2011.

    Samsung ranked second, accounting for 9.3 percent of sales, up 2 points year over year. HP (8.2 percent), Sony (4.4 percent) and Dell (3 percent) rounded out the top five.

    Total retail consumer technology sales fell 2 percent year over year to $143 billion. The top-five brands accounted for 45 percent of sales, up three points from 2011. Apple and Samsung sales together increased by $6.5 billion, while the rest of the market declined by $9.5 billion.
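
    As a quick sanity check, the reported figures hang together arithmetically; the 2011 total below is derived from the 2 percent decline rather than quoted by NPD, and rounding accounts for the small gap.

    ```python
    # Sanity-checking the NPD figures quoted above (values in billions of US dollars).
    total_2012 = 143.0
    total_2011 = total_2012 / (1 - 0.02)   # implied by the 2 percent decline: ~145.9

    apple_samsung_gain = 6.5
    rest_of_market_decline = 9.5
    net_change = apple_samsung_gain - rest_of_market_decline   # -3.0

    print(round(total_2011 - total_2012, 1))  # ~2.9, consistent with the ~$3B net drop
    ```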

    Top retailers: Best Buy, Walmart, Apple, Amazon, and Staples.

    Apple got big lifts from smartphones and tablets, huge growth categories — up 25 percent and 42 percent, respectively, year over year. iPad and iPhone accounted for 76 percent of all Apple revenue during the calendar fourth quarter. The company is tapped into the two fastest-growing consumer tech categories.

    “Tablets and smartphones have been able to stimulate demand for additional devices, but unfortunately it hasn’t been enough, yet, to sustain positive growth trends”, Stephen Baker, NPD’s vice president of industry analysis, says.

    Sales of the other three top categories — desktop PCs, notebooks and flat-panel TVs — fell 11 percent, 9 percent and 7 percent, respectively. The five categories accounted for 53 percent of consumer technology sales in 2012, up from 49 percent a year earlier.

    “Most market segments have high penetration rates and the demand for additional devices is slowing, or declining”, Baker says. Tablets contributed to the decline in PC sales, an ongoing trend.

    Photo Credit: Redstarstudio/Shutterstock

  • NTT expands its IaaS geographies and touts its use of SDN

    NTT Communications is expanding its private enterprise cloud outside Asia with the addition of more data centers. The news is more than just a geographic expansion — it represents a full-on production use case for software-defined networking.

    A subsidiary of the NTT Group, NTT Communications announced its enterprise cloud by way of data centers in Hong Kong and Japan in June 2012. It was billed as “the world’s first cloud service to incorporate OpenFlow,” according to a news release. OpenFlow is a protocol that separates packet forwarding from routing decisions, moving those decisions from the switch to a separate software controller. Such separation has the potential to lower the cost of equipment and create interoperable gear that would allow buyers to program their network infrastructure without resorting to proprietary and complex programming options created by networking gear vendors.
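
    To make that split concrete, here is a minimal sketch of what an OpenFlow controller application can look like, written against the open-source Ryu framework. It is purely illustrative: NTT has not said it uses Ryu, and the flood-everything rule below is the simplest possible policy rather than anything resembling a production configuration.

    ```python
    # Illustrative only: a minimal OpenFlow 1.3 controller app using the Ryu framework.
    # The controller, not the switch, decides how packets are forwarded.
    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3


    class FloodSwitch(app_manager.RyuApp):
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
        def packet_in_handler(self, ev):
            # The switch had no matching flow entry, so it asked the controller.
            datapath = ev.msg.datapath
            ofproto = datapath.ofproto
            parser = datapath.ofproto_parser

            # The forwarding decision is made here in software: install a flow
            # entry so the switch floods future packets arriving on this port by
            # itself (the triggering packet is not re-sent in this stripped-down sketch).
            match = parser.OFPMatch(in_port=ev.msg.match['in_port'])
            actions = [parser.OFPActionOutput(ofproto.OFPP_FLOOD)]
            inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS, actions)]
            datapath.send_msg(parser.OFPFlowMod(datapath=datapath, priority=1,
                                                match=match, instructions=inst))
    ```

    Ryu is only one of several OpenFlow controllers; the point is simply that the forwarding rule lives in ordinary application code that can be changed without touching the switches.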

    Since last June, data centers in California, Virginia and Singapore have joined the NTT Communications lineup, and facilities in Australia, Malaysia and Thailand will come online in March, according to a news release.

    The new data centers will also use software-defined networking to give NTT and its clients more agility and lower costs. Implementing network virtualization in the data centers enables more flexible and automated configuration changes to the network connecting a customer’s servers, even across multiple data centers, according to a presentation NTT Communications executive Yukio Ito gave at last year’s Open Networking Summit.

    NTT isn’t completely new to SDN. Last year it was named as a customer of Nicira’s Network Virtualization Platform, as my colleague Stacey Higginbotham reported. The company was using Nicira controllers to move data sets from data center to data center following the earthquake off the Japanese coast that triggered a tsunami and led to subsequent nuclear accidents.

    While NTT is making a statement with its expansion of SDN-enabled data centers, other companies that run colocation or cloud facilities for enterprises, such as Rackspace and AT&T, could follow suit with similar offerings soon. After all, both of those companies are also Nicira customers, and hosting companies are popular targets for SDN deployments.

    In any case, the rush to deploy software-defined networking in production environments will continue, especially now that such a large vendor has gone public with a production deployment. Stacey predicted last month that 2013 would be the year big companies will see that their efforts to prevent network-hardware commoditization are doomed to fail.

    Feature image courtesy of Flickr user bandarji.


  • Google working with Visa, Mastercard, PayPal to cut off funding for alleged piracy sites

    Google Anti-Piracy Plan
    Google (GOOG) has decided to take the next step in its anti-piracy efforts by working with Visa, Mastercard and PayPal to cut off funding for websites accused of making money off of pirated content, the Telegraph reports. According to the Telegraph, Google is “considering the radical measure so that it can get rid of the root cause instead of having to change its own search results” and would like to “block funding to websites that do not respond to legal challenges.” However, the Telegraph also reports that Google is wary of embracing such a strategy because it “may have unintended consequences, for instance companies using it to stamp out” competitors by accusing them of piracy to cut off their funds.

  • Setting the Record Straight About the Sequester

    In less than two weeks, dangerous across-the-board budget cuts are slated to take effect, potentially threatening hundreds of thousands of jobs, our national security and our economic recovery. The President has laid out a specific plan with detailed cuts to avoid the sequester and reduce the deficit in a balanced way by cutting spending, reforming entitlements and closing tax loopholes for the wealthiest and big corporations — loopholes not available to the middle class — and Congressional Democrats have put forward a balanced approach as well.

    The only party unwilling to compromise to avoid these devastating cuts is Congressional Republicans, who would rather see our recovery and middle class economic security put at risk than close one tax loophole for big corporations and the wealthiest.

    Tonight, in an effort to distract from this reality, the Leader of the Republican party took to the opinion pages of the Wall Street Journal to engage in an amazing act of revisionist history. Instead of communicating with the American people – who support a balanced approach to reduce the deficit – about finding a compromise, the Republican Leadership once again launched a series of false attacks rather than putting forward ways to resolve this issue in a bipartisan way.

    So let's set the record straight.

    1. Speaker Boehner asked “What spending are you willing to cut to replace it?” Here they are: The fact is, the President has a detailed, balanced plan with spending cuts. He is willing to make tough choices. Now it’s time for the Speaker to do the same. The Speaker has yet to name one tax loophole he’s willing to close. Not one.
    2. The Speaker said the sequester is "an ugly and dangerous way" to cut spending. We agree. But in the past he’s led Congressional Republicans to threaten the sequester as a political tool. In the Wall Street Journal on January 6, 2013: “Mr. Boehner says he has significant Republican support, including GOP defense hawks, on his side for letting the sequester do its work. ‘I got that in my back pocket,’ the speaker says.”
    3. In that same article in the Wall Street Journal Speaker Boehner boasts about using the sequester as leverage. “Republican willingness to support the sequester, Mr. Boehner says, is ‘as much leverage as we're going to get.’ That leverage, he reasons, is what will force Democrats to the table on entitlements.”
    4. It’s time for Speaker Boehner to explain to the American people what he actually meant. The Speaker claims the sequester was a last minute agreement to resolve the debt limit increase the President wanted. Simply not true. In fact, it was the Speaker who praised the sequester at the time. Following the deal, he said “When you look at this final agreement that we came to with the White House, I got 98 percent of what I wanted. I'm pretty happy.” In fact, the final vote count was 269-161 – with 174 Republicans in favor. Speaker Boehner, Rep. Cantor and Rep. Ryan all voted yes.
    5. Speaker Boehner argues the President has "put forth no detailed plan that can pass Congress". Here’s the plan.  It’s balanced and it includes spending cuts. The President is willing to make tough choices. It’s time for Speaker Boehner and Congressional Republicans to do the same.
    6. Speaker Boehner claims we haven't been serious about entitlement reform. The opposite is true: Here’s the plan. It’s on the table. Now it’s time for Congressional Republicans to come to the table and take a balanced approach to avoid these devastating cuts.
    7. Where is the Republican plan? The GOP bill expired. If they’re confident the draconian cuts will win support in Congress and more importantly — with the American people — they should bring it up for a vote.

  • Nokia launches Music+ service, offers unlimited downloads for $3.99 a month

    Nokia Music+ Announced
    Nokia (NOK) on Tuesday began ramping up its efforts to compete with both Spotify and iTunes by announcing a new Nokia Music+ service that Lumia smartphone owners can subscribe to for $3.99 per month. Music+, which is an enhanced version of the free Nokia Music service the company launched last year, removes the free service’s limits on how many specialized mixes users can download and effectively lets Lumia users have as much offline music on their smartphones as they want. The new subscription service also significantly boosts the sound quality of music and Nokia claims that it “allows you to download music at six times the existing quality” and lets you “set rules to only download high quality when you’re on WiFi.” Nokia says the new service will roll out over the next few weeks.

  • If Apple can’t protect itself from malware, how can you trust it to protect you?

    Apple may be perceived as a bastion of security, and users generally feel safe from the plagues that we Windows users suffer, but market share plays a large part in that perception. The bigger target gets more attention. Well, the party may be over, folks, because the fruit-logo company has a problem, and it is one that is incredibly familiar to Windows users — Java. The Oracle platform may be the most exploited piece of software on personal computers.

    Today Reuters reports that Apple, a company largely known for never admitting error — think “You’re holding it wrong” — released a statement describing “the widest known attacks targeting Apple computers used by corporations”. The same exploit had been used to attack social networking giant Facebook.

    When Apple workers visited a specific website used by software developers, malicious software infected their computers. But not just Apple employees. Mac users in other locations also were vulnerable.

    The problem is this: Malware writers have used Java in the past to attack Macs. Remember last year’s Flashback Trojan, which pulled Macs together into a massive botnet? The newest version of OS X, Mountain Lion, doesn’t include Java by default for a reason. The upgrade goes so far as to remove the Java installed by the previous version, Lion. So why are computers Apple manages running unpatched Java — or running it at all?

    If Apple can’t protect itself, how can it protect you? Does the arrogant attitude that Macs are invulnerable to viruses (they are not) run so deep inside Apple? As someone responsible for managing IT infrastructure, I ask these questions from experience. Apple should be the model of Mac security. Clearly it is not.

    I question whether Apple should provide Java at all. Java as a means of attack is too common a story.

    Still, to its credit, Apple responded rather quickly, issuing an update which it claims will deliver better security. The update “uninstalls the Apple-provided Java applet plug-in from all web browsers. To use applets on a webpage, click on the region labeled ‘Missing plug-in’ to go download the latest version of the Java applet plug-in from Oracle”.

    There really is no telling, at this point, the extent of the damage beyond Apple. Surely we’ll know more in the days ahead. Meantime, if you own a Mac, now is a good time to patch up, purge Java and install antimalware.

    Photo Credit: Jirsak/Shutterstock

  • Samsung debuts more affordable Galaxy Camera

    Samsung Galaxy Camera Wi-Fi Only Model
    Samsung (005930) on Tuesday announced a Wi-Fi only version of its Android-powered Galaxy Camera. The device is equipped with a 4.8-inch 720p HD display, a 1.4GHz quad-core Exynos processor and a 16.3-megapixel sensor with a 23-millimeter wide angle lens that is capable of 21x optical zoom. It also includes 1GB of RAM, 8GB of internal storage, a microSD slot and Android 4.1 Jelly Bean. While the company did not announce pricing or availability, the Wi-Fi only model is expected to cost significantly less than comparable models with 3G/4G connectivity that start at around $499. Samsung’s press release follows below.

    Continue reading…

  • T-Mobile goes off-brand with new prepaid service GoSmart

    T-Mobile USA has launched a new prepaid service for the highly budget-minded, and in the process it’s taken a page from Sprint’s book. Instead of making the new plans part of its regular magenta-tinged T-Mobile portfolio, the carrier has created a whole new brand: GoSmart Mobile.

    GoSmart plans start at $30 a month, including unlimited talk and text. For $5 more a month, you can get unlimited Web access, but only on T-Mobile’s 2G GPRS and EDGE networks. For $45 a month, you can upgrade to the 3G HSPA network and get 5 GB of data, after which you’ll be throttled down to 2G speeds. It’s important to note T-Mo is specifically saying 3G here even though it markets its HSPA+ network as 4G. The implication is that even GoSmart users on the $45 plan won’t get access to the full bandwidth available over its data networks.

    GoSmart is offering only two devices without subsidies, a $50 Alcatel feature phone and a $100 ZTE Android 2.3 device, which is a pretty good bargain for a smartphone. GoSmart’s main go-to-market package appears to be a SIM kit that allows a customer to activate any GSM device they bring to the network. That strategy seems to replicate the SIM services that have proven so successful for Tracfone and other mobile virtual network operators.

    But why is T-Mobile creating an entirely new brand for prepaid when it already sells a lot of contract-free plans under the official T-Mobile brand? The carrier is probably trying to create two distinct classes of prepaid service so it can tailor its marketing for each. T-Mobile’s branded prepaid emphasizes its HSPA+ data network, bigger and faster data plans and higher-end devices.

    Meanwhile, GoSmart is clearly targeted at the more voice-centric budget user. Sprint follows the same approach, though it has segmented its prepaid services into numerous brands, including Virgin Mobile, Boost Mobile and Assurance Wireless.


  • NVIDIA unveils new Tegra 4i processor with built-in LTE

    NVIDIA Tegra 4i
    NVIDIA (NVDA) on Tuesday announced its first Tegra processor with an integrated LTE chip. The 2.3GHz quad-core Tegra 4i, which brings the company into closer competition with Qualcomm (QCOM) and its line of Snapdragon CPUs, is equipped with 60 custom GPU cores, a fifth processing core for battery conservation and an integrated NVIDIA i500 LTE modem. It also includes NVIDIA’s Chimera camera technology, which is capable of capturing HDR panorama shots without requiring a single-direction sweep. The company calls the new processor the most efficient, highest-performance CPU core on the market, noting that it will provide “amazing computing power, world-class phone capabilities, and exceptionally long battery life.” NVIDIA’s press release follows below.

    Continue reading…

  • All eyes on Tesla as it inches toward profitability in 2013

    The dustup over the New York Times’ negative review of Tesla’s Model S electric car was all anyone could talk about last week. But the business story of how the pioneering auto maker, led by Elon Musk, will become profitable will begin to be revealed on Wednesday afternoon, when Tesla holds its fourth quarter and full-year 2012 earnings call.

    The end of 2012 was a pivotal time for Tesla. It transitioned from small-scale manufacturing of its Model S electric car — which won Motor Trend’s car of the year award — to its goal of volume manufacturing of 400 cars per week. That rate puts the company on schedule to make 20,000 cars per year, and that increase in production was just a few weeks behind schedule last year.

    2013 is an even more crucial year for Tesla. The company is supposed to hit profitability at some point this year, based on its revenues from sales of the Model S. Analysts expect the company could reach profitability in June.

    The Model S Betas being buffed between test rides.

    Tesla’s earnings on Wednesday could shed more light on how close it is to becoming profitable. In the beginning of December, Musk tweeted that Tesla had narrowly become cash flow positive. I’d assume that meant the company had more cash coming in than going out for the quarter (though Musk didn’t specify the time period), so we can watch to see whether cash flow for Q4 is spelled out on the call.

    We’ll also be watching to see if that production ramp rate was officially hit: 400 cars a week, and 20,000 cars per year. Finally, listen for Tesla’s gross margins, which will be an indicator of how fast it will become profitable (12.5 percent for Q4 2012, and a goal of 25 percent for 2013).

    While Tesla has a lot of skeptics, it has already survived, almost miraculously, the hardest part of building an independent electric car company. It held a successful IPO, as well as follow-on public financings, and it hit its production targets (mostly) on time. In contrast, the vast majority of electric car startups that have tried similar things have struggled or bowed out in recent years, including Fisker, Think, Aptera, Bright Automotive, Better Place, and Coda.

    If you’re still riled up about the New York Times’ test drive, the New York Times’ Public Editor came out with her final assessment on the situation, finding the review had “Problems With Precision and Judgment, but Not Integrity.” Musk appears to be happy with that call. Tesla’s stock was up 6 percent on Tuesday following the finding, and in anticipation of the earnings on Wednesday.


  • Accidental Empires, Part 9 — Why They Don’t Call It Computer Valley (Chapter 3)

    Ninth in a series. Robert X. Cringely’s brilliant look at the rise of the personal computing industry continues, explaining why PCs aren’t mini-mainframes and share little direct lineage with them.

    Published in 1991, Accidental Empires is an excellent lens for viewing not just the past but the future of computing.

    ACCIDENTAL EMPIRES — CHAPTER THREE

    WHY THEY DON’T CALL IT COMPUTER VALLEY

    Reminders of just how long I’ve been around this youth-driven business keep hitting me in the face. Not long ago I was poking around a store called the Weird Stuff Warehouse, a sort of Silicon Valley thrift shop where you can buy used computers and other neat junk. It’s right across the street from Fry’s Electronics, the legendary computer store that fulfills every need of its techie customers by offering rows of junk food, soft drinks, girlie magazines, and Maalox, in addition to an enormous selection of new computers and software. You can’t miss Fry’s; the building is painted to look like a block-long computer chip. The front doors are labeled Enter and Escape, just like keys on a computer keyboard.

    Weird Stuff, on the other side of the street, isn’t painted to look like anything in particular. It’s just a big storefront filled with tables and bins holding the technological history of Silicon Valley. Men poke through the ever-changing inventory of junk while women wait near the door, rolling their eyes and telling each other stories about what stupid chunk of hardware was dragged home the week before.

    Next to me, a gray-haired member of the short-sleeved sport shirt and Hush Puppies school of 1960s computer engineering was struggling to drag an old printer out from under a table so he could show his 8-year-old grandson the connector he’d designed a lifetime ago. Imagine having as your contribution to history the fact that pin 11 is connected to a red wire, pin 18 to a blue wire, and pin 24 to a black wire.

    On my own search for connectedness with the universe, I came across a shelf of Apple III computers for sale for $100 each. Back in 1979, when the Apple III was still six months away from being introduced as a $3,000 office computer, I remember sitting in a movie theater in Palo Alto with one of the Apple III designers, pumping him for information about it.

    There were only 90,000 Apple III computers ever made, which sounds like a lot but isn’t. The Apple III had many problems, including the fact that the automated machinery that inserted dozens of computer chips on the main circuit board didn’t push them into their sockets firmly enough. Apple’s answer was to tell 90,000 customers to pick up their Apple III carefully, hold it twelve to eighteen inches above a level surface, and then drop it, hoping that the resulting crash would reseat all the chips.

    Back at the movies, long before the Apple III’s problems, or even its potential, were known publicly, I was just trying to get my friend to give me a basic description of the computer and its software. The film was Barbarella, and all I can remember now about the movie or what was said about the computer is this image of Jane Fonda floating across the screen in simulated weightlessness, wearing a costume with a clear plastic midriff. But then the rest of the world doesn’t remember the Apple III at all.

    It’s this relentless throwing away of old technology, like the nearly forgotten Apple III, that characterizes the personal computer business and differentiates it from the business of building big computers, called mainframes, and minicomputers. Mainframe technology lasts typically twenty years; PC technology dies and is reborn every eighteen months.

    There were computers in the world long before we called any of them “personal”. In fact, the computers that touched our lives before the mid-1970s were as impersonal as hell. They sat in big air-conditioned rooms at insurance companies, phone companies, and the IRS, and their main function was to screw up our lives by getting us confused with some other guy named Cringely, who was a deadbeat, had a criminal record, and didn’t much like to pay parking tickets. Computers were instruments of government and big business, and except for the punched cards that came in the mail with the gas bill, which we were supposed to return obediently with the money but without any folds, spindling, or mutilation, they had no physical presence in our lives.

    How did we get from big computers that lived in the basement of office buildings to the little computers that live on our desks today? We didn’t. Personal computers have almost nothing to do with big computers. They never have.

    A personal computer is an electronic gizmo that is built in a factory and then sold by a dealer to an individual or a business. If everything goes as planned, the customer will be happy with the purchase, and the company that makes the personal computer, say Apple or Compaq, won’t hear from that customer again until he or she buys another computer. Contrast that with the mainframe computer business, where big computers are built in a factory, sold directly to a business or government, installed by the computer maker, serviced by the computer maker (for a monthly fee), financed by the computer maker, and often running software written by the computer maker (and licensed, not sold, for another monthly fee). The big computer company makes as much money from servicing, financing, and programming the computer as it does from selling it. It not only wants to continue to know the customer, it wants to be in the customer’s dreams.

    The only common element in these two scenarios is the factory. Everything else is different. The model for selling personal computers is based on the idea that there are millions of little customers out there; the model for selling big computers has always been based on the idea that there are only a few large customers.

    When IBM engineers designed the System 650 mainframe in the early 1950s, their expectation was to build fifty in all, and the cost structure that was built in from the start allowed the company to make a profit on only fifty machines. Of course, when computers became an important part of corporate life, IBM found itself selling far more than fifty — 1,500, in fact — with distinct advantages of scale that brought gross profit margins up to the 60 to 70 percent range, a range that computer companies eventually came to expect. So why bother with personal computers?

    Big computers and little computers are completely different beasts created by radically different groups of people. It’s logical, I know, to assume that the personal computer came from shrinking a mainframe, but that’s not the way it happened. The PC business actually grew up from the semiconductor industry. Instead of being a little mainframe, the PC is, in fact, more like an incredibly big chip. Remember, they don’t call it Computer Valley. They call it Silicon Valley, and it’s a place that was invented one afternoon in 1957 when Bob Noyce and seven other engineers quit en masse from Shockley Semiconductor.

    William Shockley was a local boy and amateur magician who had gone on to invent the transistor at Bell Labs in the late 1940s and by the mid-1950s was on his own building transistors in what had been apricot drying sheds in Mountain View, California.

    Shockley was a good scientist but a bad manager. He posted a list of salaries on the bulletin board, pissing off those who were being paid less for the same work. When the work wasn’t going well, he blamed sabotage and demanded lie detector tests. That did it. Just weeks after they’d toasted Shockley’s winning the Nobel Prize in physics by drinking champagne over breakfast at Dinah’s Shack, a red clapboard restaurant on El Camino Real, the “Traitorous Eight”, as Dr. S. came to call them, hit the road.

    For Shockley, it was pretty much downhill from there; today he’s remembered more for his theories of racial superiority and for starting a sperm bank for geniuses in the 1970s than for the breakthrough semiconductor research he conducted in the 1940s and 1950s. (Of course, with several fluid ounces of Shockley semen still sitting on ice, we may not have heard the last of the doctor yet.)

    Noyce and the others started Fairchild Semiconductor, the archetype for every Silicon Valley start-up that has followed. They got the money to start Fairchild from a young investment banker named Arthur Rock, who found venture capital for the firm. This is the pattern that has been followed ever since as groups of technical types split from their old companies, pick up venture capital to support their new idea, and move on to the next start-up. More than fifty new semiconductor companies eventually split off in this way from Fairchild alone.

    At the heart of every start-up is an argument. A splinter group inside a successful company wants to abandon the current product line and bet the company on some radical new technology. The boss, usually the guy who invented the current technology, thinks this idea is crazy and says so, wishing the splinter group well on their new adventure. If he’s smart, the old boss even helps his employees to leave by making a minority investment in their new company, just in case they are among the 5 percent of start-ups that are successful.

    The appeal of the start-up has always been that it’s a small operation, usually led by the smartest guy in the room but with the assistance of all players. The goals of the company are those of its people, who are all very technically oriented. The character of the company matches that of its founders, who were inevitably engineers—regular guys. Noyce was just a preacher’s kid from Iowa, and his social sensibilities reflected that background.

    There was no social hierarchy at Fairchild — no reserved parking spaces or executive dining rooms — and that remained true even later when the company employed thousands of workers and Noyce was long gone. There was no dress code. There were hardly any doors; Noyce had an office cubicle, built from shoulder-high partitions, just like everybody else. Thirty years later, he still had only a cubicle, along with limitless wealth.

    They use cubicles, too, at Hewlett-Packard, which at one point in the late 1970s had more than 50,000 employees, but only three private offices. One office belonged to Bill Hewlett, one to David Packard, and the third to a guy named Paul Ely, who annoyed so many coworkers with his bellowing on the telephone that the company finally extended his cubicle walls clear to the ceiling. It looked like a freestanding elevator shaft in the middle of a vast open office.

    The Valley is filled with stories of Bob Noyce as an Everyman with deep pockets. There was the time he stood in a long line at his branch bank and then asked the teller for a cashier’s check for $1.3 million from his personal savings, confiding gleefully that he was going to buy a Learjet that afternoon. Then, after his divorce and remarriage, Noyce tried to join the snobbish Los Altos Country Club, only to be rejected because the club did not approve of his new wife, so he wrote another check and simply duplicated the country club facilities on his own property, within sight of the Los Altos clubhouse. “To hell with them,” he said.

    As a leader, Noyce was half high school science teacher and half athletic team captain. Young engineers were encouraged to speak their minds, and they were given authority to buy whatever they needed to pursue their research. No idea was too crazy to be at least considered, because Noyce realized that great discoveries lay in crazy ideas and that rejecting out of hand the ideas of young engineers would just hasten that inevitable day when they would take off for their own start-up.

    While Noyce’s ideas about technical management sound all too enlightened to be part of anything called big business, they worked well at Fairchild and then at Noyce’s next creation, Intel. Intel was started, in fact, because Noyce couldn’t get Fairchild’s eastern owners to accept the idea that stock options should be a part of compensation for all employees, not just for management. He wanted to tie everyone, from janitors to bosses, into the overall success of the company, and spreading the wealth around seemed the way to go.

    This management style still sets the standard for every computer, software, and semiconductor company in the Valley today, where office doors are a rarity and secretaries hold shares in their company’s stock. Some companies follow the model well, and some do it poorly, but every CEO still wants to think that the place is being run the way Bob Noyce would have run it.

    The semiconductor business is different from the business of building big computers. It costs a lot to develop a new semiconductor part but not very much to manufacture it once the design is proved. This makes semiconductors a volume business, where the most profitable product lines are those manufactured in the greatest volume rather than those that can be sold in smaller quantities with higher profit margins. Volume is everything.

    To build volume, Noyce cut all Fairchild components to a uniform price of one dollar, which was in some cases not much more than the cost of manufacturing them. Some of Noyce’s partners thought he was crazy, but volume grew quickly, followed by profits, as Fairchild expanded production again and again to meet demand, continually cutting its cost of goods at the same time. The concept of continually dropping electronic component prices was born at Fairchild. The cost per transistor dropped by a factor of 10,000 over the next thirty years.

    To avoid building a factory that was 10,000 times as big, Noyce came up with a way to give customers more for their money while keeping the product price point at about the same level as before. While the cost of semiconductors was ever falling, the cost of electronic subassemblies continued to increase with the inevitably rising price of labor. Noyce figured that even this trend could be defeated if several components could be built together on a single piece of silicon, eliminating much of the labor from electronic assembly. It was 1959, and Noyce called his idea an integrated circuit. “I was lazy,” he said. “It just didn’t make sense to have people soldering together these individual components when they could be built as a single part.”

    Jack Kilby at Texas Instruments had already built several discrete components on the same slice of germanium, including the first germanium resistors and capacitors, but Kilby’s parts were connected together on the chip by tiny gold wires that had to be installed by hand. TI’s integrated circuit could not be manufactured in volume.

    The twist that Noyce added was to deposit a layer of insulating silicon oxide on the top surface of the chip—this was called the “planar process” that had been invented earlier at Fairchild—and then use a photographic process to print thin metal lines on top of the oxide, connecting the components together on the chip. These metal traces carried current in the same way that Jack Kilby’s gold wires did, but they could be printed on in a single step rather than being installed one at a time by hand.

    Using their new photolithography method, Noyce and his boys put first two or three components on a single chip, then ten, then a hundred, then thousands. Today the same area of silicon that once held a single transistor can be populated with more than a million components, all too small to be seen.

    Tracking the trend toward ever more complex circuits, Gordon Moore, who cofounded Intel with Noyce, came up with Moore’s Law: the number of transistors that can be built on the same size piece of silicon will double every eighteen months. Moore’s Law still holds true. Intel’s memory chips from 1968 held 1,024 bits of data; the most common memory chips today hold a thousand times as much — 1,024,000 bits — and cost about the same.
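
    As a rough check on those figures, the doubling rule can be written out directly; the 15-year window below is chosen only to show where a factor of roughly a thousand comes from.

    ```python
    # Moore's Law as stated above: capacity doubles every 18 months.
    def moores_law_factor(years, doubling_period_months=18):
        doublings = years * 12 / doubling_period_months
        return 2 ** doublings

    # Ten doublings take about 15 years and multiply capacity by roughly 1,000,
    # which is the jump from 1,024-bit memory chips to chips holding a thousand
    # times as much data.
    print(moores_law_factor(15))  # 1024.0
    ```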

    The integrated circuit — the IC — also led to a trend in the other direction — toward higher price points, made possible by ever more complex semiconductors that came to do the work of many discrete components. In 1971, Ted Hoff at Intel took this trend to its ultimate conclusion, inventing the microprocessor, a single chip that contained most of the logic elements used to make a computer. Here, for the first time, was a programmable device to which a clever engineer could add a few memory chips and a support chip or two and turn it into a real computer you could hold in your hands. There was no software for this new computer, of course — nothing that could actually be done with it — but the computer could be held in your hands or even sold over the counter, and that fact alone was enough to force a paradigm shift on Silicon Valley.

    It was with the invention of the microprocessor that the rest of the world finally disappointed Silicon Valley. Until that point, the kids at Fairchild, Intel, and the hundred other chipmakers that now occupied the southern end of the San Francisco peninsula had been farmers, growing chips that were like wheat from which the military electronics contractors and the computer companies could bake their rolls, bagels, and loaves of bread — their computers and weapon control systems. But with their invention of the microprocessor, the Valley’s growers were suddenly harvesting something that looked almost edible by itself. It was as though they had been supplying for years these expensive bakeries, only to undercut them all by inventing the Twinkie.

    But the computer makers didn’t want Intel’s Twinkies. Microprocessors were the most expensive semiconductor devices ever made, but they were still too cheap to be used by the IBMs, the Digital Equipment Corporations, and the Control Data Corporations. These companies had made fortunes by convincing their customers that computers were complex, incredibly expensive devices built out of discrete components; building computers around microprocessors would destroy this carefully crafted concept. Microprocessor-based computers would be too cheap to build and would have to sell for too little money. Worse, their lower part counts would increase reliability, hurting the service income that was an important part of every computer company’s bottom line in those days.

    And the big computer companies just didn’t have the vision needed to invent the personal computer. Here’s a scene that happened in the early 1960s at IBM headquarters in Armonk, New York. IBM chairman Tom Watson, Jr., and president Al Williams were being briefed on the concept of computing with video display terminals and time-sharing, rather than with batches of punch cards. They didn’t understand the idea. These were intelligent men, but they had a firmly fixed concept of what computing was supposed to be, and it didn’t include video display terminals. The briefing started over a second time, and finally a light bulb went off in Al Williams’s head. “So what you are talking about is data processing but not in the same room!” he exclaimed.

    IBM played for a short time with a concept it called teleprocessing, which put a simple computer terminal on an executive’s desk, connected by telephone line to a mainframe computer to look into the bowels of the company and know instantly how many widgets were being produced in the Muncie plant. That was the idea, but what IBM discovered from this mid-1960s exercise was that American business executives didn’t know how to type and didn’t want to learn. They had secretaries to type for them. No data were gathered on what middle managers would do with such a terminal because it wasn’t aimed at them. Nobody even guessed that there would be millions of M.B.A.s hitting the streets over the following twenty years, armed with the ability to type and with the quantitative skills to use such a computing tool and to do some real damage with it. But that was yet to come, so exit teleprocessing, because IBM marketers chose to believe that this test indicated that American business executives would never be interested.

    In order to invent a particular type of computer, you have to want first to use it, and the leaders of America’s computer companies did not want a computer on their desks. Watson and Williams sold computers but they didn’t use them. Williams’s specialty was finance; it was through his efforts that IBM had turned computer leasing into a goldmine. Watson was the son of God — Tom Watson Sr. — and had been bred to lead the blue-suited men of IBM, not to design or use computers. Watson and Williams didn’t have computer terminals at their desks. They didn’t even work for a company that believed in terminals. Their concept was of data processing, which at IBM meant piles of paper cards punched with hundreds of rectangular, not round, holes. Round holes belonged to Univac.

    The computer companies for the most part rejected the microprocessor, calling it too simple to perform their complex mainframe voodoo. It was an error on their part, and not lost on the next group of semiconductor engineers who were getting ready to explode from their current companies into a whole new generation of start-ups. This time they built more than just chips and ICs; they built entire computers, still following the rules for success in the semiconductor business: continual product development; a new family of products every year or two; ever increasing functionality; ever decreasing price for the same level of function; standardization; and volume, volume, volume.

    It takes society thirty years, more or less, to absorb a new information technology into daily life. It took about that long to turn movable type into books in the fifteenth century. Telephones were invented in the 1870s but did not change our lives until the 1900s. Motion pictures were born in the 1890s but became an important industry in the 1920s. Television, invented in the mid-1920s, took until the mid-1950s to bind us to our sofas.

    We can date the birth of the personal computer somewhere between the invention of the microprocessor in 1971 and the introduction of the Altair hobbyist computer in 1975. Either date puts us today about halfway down the road to personal computers’ being a part of most people’s everyday lives, which should be consoling to those who can’t understand what all the hullabaloo is about PCs. Don’t worry; you’ll understand it in a few years, by which time they’ll no longer be called PCs.

    By the time that understanding is reached, and personal computers have wormed into all our lives to an extent far greater than they are today, the whole concept of personal computing will probably have changed. That’s the way it is with information technologies. It takes us quite a while to decide what to do with them.

    Radio was invented with the original idea that it would replace telephones and give us wireless communication. That implies two-way communication, yet how many of us own radio transmitters? In fact, the popularization of radio came as a broadcast medium, with powerful transmitters sending the same message — entertainment — to thousands or millions of inexpensive radio receivers. Television was the same way, envisioned at first as a two-way visual communication medium. Early phonographs could record as well as play and were supposed to make recordings that would be sent through the mail, replacing written letters. The magnetic tape cassette was invented by Philips for dictation machines, but we use it to hear music on Sony Walkmans. Telephones went the other direction, since Alexander Graham Bell first envisioned his invention being used to pipe music to remote groups of people.

    The point is that all these technologies found their greatest success being used in ways other than were originally expected. That’s what will happen with personal computers too. Fifteen years from now, we won’t be able to function without some sort of machine with a microprocessor and memory inside. Though we probably won’t call it a personal computer, that’s what it will be.

    It takes new ideas a long time to catch on — time that is mainly devoted to evolving the idea into something useful. This fact alone dumps most of the responsibility for early technical innovation in the laps of amateurs, who can afford to take the time. Only those who aren’t trying to make money can afford to advance a technology that doesn’t pay.

    This explains why the personal computer was invented by hobbyists and supported by semiconductor companies, eager to find markets for their microprocessors, by disaffected mainframe programmers, who longed to leave their corporate/mainframe world and get closer to the machine they loved, and by a new class of counterculture entrepreneurs, who were looking for a way to enter the business world after years of fighting against it.

    The microcomputer pioneers were driven primarily to create machines and programs for their own use or so they could demonstrate them to their friends. Since there wasn’t a personal computer business as such, they had little expectation that their programming and design efforts would lead to making a lot of money. With a single strategic exception — Bill Gates of Microsoft — the idea of making money became popular only later.

    These folks were pursuing adventure, not business. They were the computer equivalents of the barnstorming pilots who flew around America during the 1920s, putting on air shows and selling rides. Like the barnstormers had, the microcomputer pioneers finally discovered a way to live as they liked. Both the barnstormers and microcomputer enthusiasts were competitive and were always looking for something against which they could match themselves. They wanted independence and total control, and through the mastery of their respective machines, they found it.

    Barnstorming was made possible by a supply of cheap surplus aircraft after World War I. Microcomputers were made possible by the invention of solid state memory and the microprocessor. Both barnstorming and microcomputing would not have happened without previous art. The barnstormers needed a war to train them and to leave behind a supply of aircraft, while microcomputers would not have appeared without mainframe computers to create a class of computer professionals and programming languages.

    Like early pilots and motorists, the first personal computer drivers actually enjoyed the hazards of their primitive computing environments. Just getting from one place to another in an early automobile was a challenge, and so was getting a program to run on the first microcomputers. Breakdowns were frequent, even welcome, since they gave the enthusiast something to brag about to friends. The idea of doing real work with a microcomputer wasn’t even considered.

    Planes that were easy to fly, cars that were easy to drive, computers that were easy to program and use weren’t nearly as interesting as those that were cantankerous. The test of the pioneer was how well he did despite his technology. In the computing arena, this meant that the best people were those who could most completely adapt to the idiosyncrasies of their computers. This explains the rise of arcane computer jargon and the disdain with which “real programmers” still often view computers and software that are easy to use. They interpret “ease of use” as “lack of challenge”. The truth is that easy-to-use computers and programs take much more skill to produce than did the hairy-chested, primitive products of the mid-1970s.

    Since there really wasn’t much that could be done with microcomputers back then, the great challenge was found in overcoming the adversity involved in doing anything. Those who were able to get their computers and programs running at all went on to become the first developers of applications.

    With few exceptions, early microcomputer software came from the need of some user to have software that did not yet exist. He needed it, so he invented it. And son of a gun, bragging about the program at his local computing club often dragged from the membership others who needed that software, too, wanted to buy it, and an industry was born.

    Reprinted with permission

  • Ubuntu announced for tablets, developer preview available for Nexus devices on Thursday [video]

    Ubuntu Mobile Operating System
    Canonical on Tuesday announced a new version of its Ubuntu Linux operating system for tablet devices. The operating system supports multiple user accounts from the home screen, including a “guest mode” option, and uses gestures to navigate around the user interface. It also features a notification center, sharing to all major social outlets and extensive multitasking support, and it is capable of displaying applications designed for both tablets and smartphones simultaneously, similar to the Snap View feature in Windows 8. Canonical plans to offer a developer preview of Ubuntu on February 21st for Google’s (GOOG) Nexus 7 and Nexus 10 tablets, along with the Nexus 4 and Galaxy Nexus smartphones. A video demonstration of Ubuntu follows below.

    Continue reading…

  • Who’s the best cloud storage provider? Microsoft, says Nasuni; but it still likes Amazon

    So here’s an interesting tidbit. Nasuni, which manages cloud storage for small and midsized businesses, ran a set of exhaustive tests to assess the performance, availability and scalability of five major cloud storage providers. And the winner? Microsoft Windows Azure. Yup. Not Amazon S3, but Azure Blob storage.

    “We ran uptime tests and other tests and the long and short of it is that all the vendors got better, but Azure just leapfrogged. It was the fastest, had the best availability and uptime and was the only provider to never register an error,” Connor Fee, VP of marketing for Natick, Mass.-based Nasuni, said in an interview.

    This according to Nasuni’s new State of Cloud Storage 2013 Industry Report, which also evaluated Google, Hewlett-Packard, and Rackspace storage. Nasuni is a pretty good judge of cloud storage provider performance since it assesses the best of the services to use for its customers’ data. It views the various cloud storage players much as EMC or NetApp views hard drives — a piece of its overall service.

    Azure’s No. 1, but Nasuni still keeps Amazon S3 as its primary backend

    Now before we get all wrapped around our axles about this glowing Azure endorsement, it’s important to note that Nasuni still counts on Amazon as its primary storage supplier and will continue to use Azure as a secondary supplier in some cases.

    So if Azure is so great, why stick with Amazon? “One major thing we evaluate is maturity and experience in the market and Amazon still clearly has the most experience and is the most mature player in this space,” Fee said.

    So how to explain Microsoft’s vast improvement? Fee points to this Microsoft blog post, which outlines a major upgrade of Azure’s storage layer, as a possible reason. Basically, Microsoft moved its storage layer from a 1-gigabit to a 10-gigabit network and from a hierarchical to a flat network topology. That means it’s faster at handling myriad small files.

    Microsoft honed performance on handling lots of itty-bitty files

    Think of it this way: Every time you want to store something to the cloud, you have to alert the cloud that you’re about to write to it; then you write to it; then it acknowledges receipt of what you’ve written. “There’s a lot of back-and-forth there,” said Fee. “With very big files, if you have a very fast network connection that’s usually enough. But with small files, all of that chatter matters, so whatever Azure did, they got really, really good at handling small I/O.”
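    That per-request chatter is easy to see for yourself: uploading a given amount of data as many small objects takes far longer than uploading it as one large object, because every request pays the round-trip overhead. Here is a rough sketch of that comparison using Amazon’s boto3 client as an example; the bucket name is hypothetical and credentials are assumed to be configured in the environment.

        # Compare many small writes with one large write of the same total size.
        # The bucket name is hypothetical; AWS credentials are assumed to be
        # configured in the environment. This illustrates per-request overhead,
        # not any particular provider's internals.
        import os
        import time
        import boto3

        s3 = boto3.client("s3")
        BUCKET = "io-overhead-test"  # hypothetical bucket
        payload = os.urandom(4 * 1024)  # a 4 KB object

        # 1,000 small objects (~4 MB total), one request each
        start = time.monotonic()
        for i in range(1000):
            s3.put_object(Bucket=BUCKET, Key=f"small/{i}", Body=payload)
        small_elapsed = time.monotonic() - start

        # The same ~4 MB as a single object, one request
        start = time.monotonic()
        s3.put_object(Bucket=BUCKET, Key="large/blob", Body=payload * 1000)
        large_elapsed = time.monotonic() - start

        print(f"1,000 x 4 KB objects: {small_elapsed:.1f} s")
        print(f"1 x 4 MB object:      {large_elapsed:.1f} s")

    The gap between those two timings is the “chatter” Fee is describing; a faster, flatter network inside the data center shrinks the per-request cost and closes much of that gap.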

    It’s also important to note that this year’s report differs from Nasuni’s 2012 testing, so year-over-year comparisons aren’t all that useful, although Nasuni was impressed with Azure even then.


    Feature photo courtesy of Flickr user Carlos Gutiérrez G.


  • Just as companies and even armies are becoming media entities, so are governments

    We’ve written about how social media and the “democratization of distribution” that the web allows has turned companies like Tesla into media entities in their own right, and has done the same thing for armies during conflicts like Israel’s recent attacks on the Gaza Strip. In the same vein, social tools allow governments to become media entities as well — and according to a piece at Politico, the Obama administration has adopted those tools with a vengeance. But is that a net benefit for democracy, or an attempt by the government to control the press?

    President Obama, who was celebrated by some as the “first internet president” after taking office (thanks to his use of a BlackBerry, as well as various web-based open government initiatives), has shown even more flair for the web and social media in the past year or so, in part by hosting events such as the Twitter town hall in 2011 and a Reddit “Ask Me Anything” session during the election campaign last fall, in which he took questions from users of the online community.


    More open, or more controlling of the media?

    While all of these tools and strategies make the president seem more approachable and human to some, members of the traditional press see them as part of an attempt by the Obama government to do an end-run around the media and get its message out directly, without fear of being challenged (although the traditional media seem to have had no problem covering up the existence of a drone base in Saudi Arabia when asked to do so). According to Politico:

    “President Barack Obama is a master at limiting, shaping and manipulating media coverage of himself and his White House [and] the mastery mostly flows from a White House that has taken old tricks for shaping coverage (staged leaks, friendly interviews) and put them on steroids using new ones (social media, content creation, precision targeting).”

    As the Politico piece notes, governments have always tried to engineer their own messages, whether through hokey PR stunts like Calvin Coolidge’s radio addresses or government-produced propaganda shown in movie theatres, or through friendly reporters who are willing to write uncritically — as many have accused writer Judith Miller of doing at the New York Times when she covered the build-up to the Iraq War. But social media provides so many more tools (and real-time ones) for governments to use.


    Just as Elon Musk of Tesla used his blog — and all the user data that his electric car produced during a New York Times review — to argue his case against the newspaper, the Obama government has almost as many tools (if not more) at its disposal as any of the media entities it used to rely on for coverage. It can produce and distribute news stories, audio interviews and video clips just as well as anyone, and media companies that have cut costs are always looking for free content. One photographer said the White House has “built its own content distribution network.”

    The balance of power has shifted

    As Politico describes it, the “balance of power between the White House and the press has tipped unmistakeably towards the government.” The Obama administration is said to be eschewing unscripted scrums in favor of orchestrated media campaigns like the Reddit AMA, which was widely criticized for not being as hard-hitting as a traditional interview (although I took issue with that description). Former Clinton-era press secretary Mike McCurry told Politico:

    “The balance of power used to be much more in favor of the mainstream press [but now] the White House gets away with stuff I would never have dreamed of doing. When I talk to White House reporters now, they say it’s really tough to do business with people who don’t see the need to be cooperative.”

    That word “cooperative” sums it up in a nutshell: in the past, governments had to cooperate with the media because they needed it to get their message out, just as companies like Tesla used to. But that’s not the case any more — or at least not as much as it used to be. The playing field has been levelled. Is that a good thing or a bad thing for democracy, or for society in general? Is more information better, even if it comes directly from the government?

    Images courtesy of Shutterstock / Picsfive and Flickr user jphilipg


  • PE-Backed Hillman Closes Buy of H. Paulin & Co.

    The Hillman Companies Inc. has closed its previously announced acquisition of all of the issued and outstanding shares of H. Paulin & Co. Ltd., a Toronto-based manufacturer and distributor of fasteners, fluid system products, automotive parts and retail hardware components. The cash purchase price of C$27.60 per share brings the total value of the transaction to approximately C$103 million. Cincinnati, Ohio-based Hillman has been a portfolio company of US private equity firm Oak Hill Capital Partners since 2010.

    PRESS RELEASE

    Hillman Closes H. Paulin & Co., Limited Acquisition

    CINCINNATI AND TORONTO, Feb. 19, 2013 /CNW/ – The Hillman Companies, Inc. (Amex: HLM.Pr) (“Hillman”) and H. Paulin & Co., Limited (TSX: PAP.A) (“Paulin”) are pleased to announce the successful completion of the previously announced arrangement pursuant to which Hillman acquired all of the issued and outstanding Class A common shares (the “Shares”) of Paulin for a cash purchase price of C$27.60 per Share.

    With the completion of the arrangement, the Shares are expected to cease to be listed for trading on the Toronto Stock Exchange on or about the close of business on February 25, 2013. Paulin intends to apply to the relevant securities regulatory authorities to cease to be a reporting issuer in the applicable jurisdictions in Canada.

    Details of the transaction are contained in Paulin’s management information circular dated January 7, 2013, which can be found at www.sedar.com.

    Advisors and Legal Counsel

    Barclays Bank PLC acted as financial advisor to Hillman in connection with the transaction and provided committed debt financing. Stikeman Elliott LLP and Paul, Weiss, Rifkind, Wharton & Garrison LLP acted as legal counsel to Hillman. Goodmans LLP acted as legal counsel to Paulin. Ernst & Young LLP acted as financial advisor and McCarthy Tétrault LLP acted as legal counsel to the special committee of the board of directors of Paulin.

    About Paulin

    Headquartered in Toronto, Canada, Paulin was founded in 1920 and is a leading Canadian distributor and manufacturer of fasteners, fluid system products, automotive parts, and retail hardware components. Paulin’s distribution facilities are located across Canada in Vancouver, Edmonton, Winnipeg, Toronto, Montreal, and Moncton, as well as in Flint, Michigan and Cleveland, Ohio. Paulin’s four manufacturing facilities are located in Ontario, Canada. The Company’s customers include retail hardware, industrial, and automotive (both Original Equipment Manufacturers and aftermarket). Annual revenues of Paulin for 2011 were approximately C$139 million.

    About Hillman

    Founded in 1964 and headquartered in Cincinnati, Ohio, Hillman is a leading value-added distributor of approximately 80,000 SKUs, consisting of fasteners, key duplication systems, engraved tags, and related hardware items to over 20,000 retail customers in the U.S., Canada, Mexico, South America, and Australia, including home improvement centers, mass merchants, national and regional hardware stores, pet supply stores, and other retailers. Hillman provides a comprehensive solution to its retail customers for managing SKU intensive, complex home improvement categories. Hillman also offers its customers additional services, such as inventory management and in-store merchandising services.

    In May 2010, Oak Hill Capital Partners acquired Hillman. Members of Hillman’s management team also invested in Hillman. Oak Hill Capital Partners is a private equity firm managing funds with more than $8 billion of committed capital from leading entrepreneurs, endowments, foundations, corporations, pension funds, and global financial institutions. For more information about Oak Hill Capital Partners, visit www.oakhillcapital.com.

    This press release has been issued pursuant to the early warning press release requirements under Canadian securities laws, which also require a report to be filed with the B.C., Alberta, Manitoba and Ontario Securities Commissions containing additional information with respect to the foregoing matters. Hillman has acquired ownership of, and control over, 3,288,000 Shares, being 100% of the issued and outstanding Shares, in exchange for cash consideration of $27.60 per Share. The acquisition was entered into for the purpose of integrating Paulin into Hillman’s North American operations.

    The Hillman Companies Inc. | 10590 Hamilton Avenue | Cincinnati, Ohio 45231

    SOURCE: H. Paulin & Co. Limited

    For more information on Paulin visit www.hpaulin.com or call Investor Relations at (416) 694-3360, ext. 135.

    For more information on Hillman, please visit www.hillmangroup.com or call Investor Relations at (513) 851-4900, ext. 2084.

    Photo courtesy of Shutterstock.


  • Skip Skype, work out with friends online with Wello

    The next time you want to catch up with far-flung friends, you can do more than just Skype with them: you can do yoga, pilates or kickboxing with them via the web, thanks to Wello.

    The Kleiner Perkins-backed startup, which is part of health tech accelerator Rock Health’s current class, on Tuesday announced that it had added group lessons to its online training options.

    In July, the startup launched live online personal training sessions, in which trainers provide instruction with a webcam, laptop and internet connection for an average of $40. The new group lessons not only let people take online exercise classes at their convenience (with friends or strangers), they also give people a lower-priced option. Because the group (of three to five people) shares the cost of the trainer, prices start at $7.50, with most costing around $15.
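    Those group prices follow from simple cost sharing: the trainer’s fee for a session is split across everyone who joins. As a toy illustration (the $37.50 session rate below is an assumed figure, not Wello’s actual pricing), splitting one fee across three to five participants lands in the quoted range:

        # Toy cost-sharing calculation. The session rate is an assumed figure,
        # not Wello's actual pricing; it just shows how group size drives the
        # per-person price.
        SESSION_RATE = 37.50

        for group_size in (3, 4, 5):
            per_person = SESSION_RATE / group_size
            print(f"{group_size} participants: ${per_person:.2f} each")
        # 3 participants: $12.50 each
        # 4 participants: $9.38 each
        # 5 participants: $7.50 each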

    On Wello’s site, customers can search by group workouts or one-on-one sessions and then browse a range of exercise types (from yoga and pilates to martial arts and aerobics), indicating when they want to take a class and what their goals are. When it’s time for the class, the instructor is most prominently displayed on half of the screen, with the client and the rest of the class splitting the other half.

    In the past few years, fitness apps, online video exercise classes and even streaming classes have been on the upswing. But co-founder Ann Scott Plante said she and her co-founder Lindsey Silverglide wanted to create a fitness platform that combined the convenience of working out from home with the accountability and motivation that come with a real-life trainer. To test the idea of a live online fitness class, the founders did a workout with a trainer via Skype. After a 30-minute workout, Plante said, they realized they were on to something.

    “We believe that trainers are the secret sauce,” she said, adding that the live instruction and commitment to a person mean that customers actually take the classes they sign up for and learn the various techniques.

    On the consumer side, the online platform provides convenience and accountability (particularly with the new group option). And it gives trainers the opportunity to book classes during the middle of the day and other traditionally quieter times.

    Plante said that since launching the startup two years ago, more than 1,000 trainers have applied to its marketplace. The 20 percent who get accepted go through a training program to learn how to be most effective over video. While the company declined to share its number of users, it said two-thirds of those who take one class online go on to take another.

    To date, the company has raised $1 million from Kleiner Perkins Caufield & Byers, Mohr Davidow Ventures, Aberdare and others.


  • Nexus 7 sales soared in 2012, but still fell short of iPad mini and Kindle Fire

    Google (GOOG) doesn’t reveal sales figures for its Nexus smartphones and tablets, but that hasn’t stopped people from speculating and trying to figure out just how many devices the company has sold. Mobile industry analyst Benedict Evans crunched numbers from ASUS (2357), the manufacturer of the Nexus 7, and found that Google likely sold between 4.5 million and 4.6 million units of its flagship 7-inch tablet. The estimates were derived using overall tablet sales reported by ASUS and various statements from the company’s CEO. While sales of nearly 5 million units are respectable, that figure falls significantly short of the competition. Evans estimates that despite the iPad mini’s November release, Apple (AAPL) may have sold around 10 million of the tablets, and that Amazon’s (AMZN) new Kindle Fire likely “outsold the Nexus 7 as well.”