Blog

  • Brookfield Completes Property Spin-Off

    Brookfield Asset Management has completed the spin-off of Brookfield Property Partners, a newly created company that owns substantially all of Brookfield’s commercial property assets. Brookfield Asset Management is a global alternative asset manager with over $175 billion in assets under management. The company has over a 100-year history of owning and operating assets with a focus on property, renewable power, infrastructure and private equity.

    PRESS RELEASE

    Brookfield Asset Management Inc. (“Brookfield”) (TSX:BAM.A)(NYSE:BAM)(EURONEXT:BAMA) today announced the completion of the spin-off of Brookfield Property Partners L.P. (“BPY”) (TSX:BPY.UN)(NYSE:BPY), a newly created company that owns substantially all of Brookfield’s commercial property assets.
    The spin-off was effected by way of a special dividend of units of BPY to holders of Brookfield’s Class A and B limited voting shares (the “Shares”) as of the record date, March 26, 2013. Each holder of Shares received one BPY unit for approximately every 17.42 Shares (that is, approximately 0.0574 BPY units for each Share). Shareholders of Brookfield now own 35,839,414 BPY units, or 7.56% of BPY, and Brookfield owns the remaining 92.44% of BPY (assuming the exchange of all of Brookfield’s redeemable partnership units, which it holds in an affiliate of BPY, for BPY units). The BPY units commenced regular-way trading on the Toronto Stock Exchange and the New York Stock Exchange this morning under the symbols “BPY.UN” and “BPY” respectively.
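    As a quick sanity check on the figures above, here is a minimal Python sketch; the implied share count is back-calculated from the release’s own numbers and is illustrative only:
    ```python
    # Back-of-the-envelope check of the spin-off ratios quoted above.
    # The implied share count is derived from the release's own figures,
    # not an official number.
    units_distributed = 35_839_414        # BPY units paid out to shareholders
    shares_per_unit = 17.42               # one BPY unit per ~17.42 Shares

    units_per_share = 1 / shares_per_unit
    print(f"Units per Share: {units_per_share:.4f}")                # ~0.0574

    implied_shares = units_distributed * shares_per_unit
    print(f"Implied Shares outstanding: {implied_shares:,.0f}")     # ~624 million

    implied_total_units = units_distributed / 0.0756                # 7.56% public float
    print(f"Implied total BPY units: {implied_total_units:,.0f}")   # ~474 million
    ```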
    “Brookfield Property Partners’ public listing opens an exciting new chapter in the growth of a leading global commercial property company, with the scale and expertise needed to deliver superior long-term performance,” said Ric Clark, Chief Executive Officer of Brookfield Property Partners and Senior Managing Partner and head of the global property group at Brookfield Asset Management.
    “This final step in the launch of Brookfield Property Partners significantly furthers our asset management strategy, providing investors with access to our real asset platforms through three flagship listed entities which deliver income, growth and a portfolio of strongly performing private equity funds,” commented Bruce Flatt, Chief Executive Officer of Brookfield. “BPY joins our two other high dividend yield and growth entities, Brookfield Infrastructure Partners and Brookfield Renewable Energy Partners, which since their inceptions, have delivered annual compound returns in excess of 15%.”
    Brookfield shareholders will receive a cash payment in lieu of any fractional interests in the BPY units. Brookfield will use the volume-weighted average of the regular-way trading price of the BPY units for the five trading days immediately following the spin-off to determine the value of the BPY units for the purpose of calculating the cash payable in lieu of any fractional interests. Payment of this cash amount will be made by check and mailed on or about April 24, 2013.
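    For readers curious about the mechanics, the cash-in-lieu amount is simply the holder’s fractional entitlement multiplied by the five-day volume-weighted average price. A minimal sketch follows, using made-up prices and volumes rather than actual BPY trading data:
    ```python
    # Sketch of the cash-in-lieu calculation described above.
    # Prices and volumes are hypothetical placeholders, not BPY data.
    five_days = [          # (closing price, volume) for five trading days
        (21.50, 1_200_000),
        (21.80,   950_000),
        (22.10, 1_400_000),
        (21.95, 1_100_000),
        (22.05, 1_050_000),
    ]

    # Volume-weighted average price: total dollars traded / total volume.
    vwap = sum(p * v for p, v in five_days) / sum(v for _, v in five_days)

    fractional_units = 0.42    # e.g. a holder entitled to 12.42 units keeps 12
    print(f"VWAP: ${vwap:.2f}, cash in lieu: ${fractional_units * vwap:.2f}")
    ```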
    Prior to completion of the spin-off, BPY acquired from Brookfield substantially all of its commercial property operations, including its office, retail, multi-family and industrial assets, as well as approximately $157 million worth of ownership interests that Brookfield acquired on April 12, 2012 from fellow investors in the consortium that holds underlying common shares and warrants of General Growth Properties, Inc. and common shares of Rouse Properties, Inc. As consideration for these interests, the investors received approximately $110 million in cash and a note for approximately $47 million that was issued by one of BPY’s holding entities and matures on October 12, 2013. This transaction resulted in an increase in the value of the special dividend of BPY units to Brookfield shareholders from the $1.45 value per Share that was estimated upon declaration of the dividend to $1.47 per Share upon payment, or approximately $920 million in the aggregate, based on International Financial Reporting Standards carrying values.
    In order to satisfy Canadian withholding tax and U.S. “backup” withholding tax obligations on the special dividend, a portion of the BPY units otherwise distributable to non-Canadian investors will be withheld from registered shareholders. For non-Canadian beneficial owners of Brookfield shares registered in the name of a broker or other intermediary, these withholding tax obligations will be satisfied in the ordinary course through arrangements with the broker or intermediary. Beneficial owners should consult their brokers to determine how the withholding tax obligations will be satisfied for their units and on any questions they may have regarding fractional units.
    As contemplated in BPY’s Form 20-F filed with the U.S. Securities and Exchange Commission and its Canadian Prospectus and U.S. Information Statement filed with the Ontario Securities Commission, on April 14, 2013 the existing board of directors of BPY’s general partner was replaced in its entirety and expanded to seven members, a majority of whom are independent of BPY and Brookfield. The seven members of the board of directors are Gordon E. Arnell, Omar Carneiro da Cunha, Stephen DeNardo, J. Bruce Flatt, Louis Joseph Maroun, Lars Rodert and José Ramón Valente Vías. For biographical information about BPY’s directors please refer to the section entitled “Governance” beginning on page 121 of the Form 20-F and page 123 of the Canadian Prospectus and U.S. Information Statement.
    Further details regarding the operations of Brookfield Property Partners are set forth in regulatory filings. A copy of the filings may be obtained through the website of the SEC at www.sec.gov and on BPY’s SEDAR profile at www.sedar.com.
    Brookfield Asset Management Inc. is a global alternative asset manager with over $175 billion in assets under management. The company has over a 100-year history of owning and operating assets with a focus on property, renewable power, infrastructure and private equity. Brookfield has a range of public and private investment products and services, which leverage its expertise and experience and provide it with a competitive advantage in the markets where it operates.
    Brookfield Property Partners is a commercial real estate owner, operator and investor operating globally. Its diversified portfolio includes interests in over 300 office and retail properties encompassing more than 250 million square feet. In addition, the company has interests in approximately 15,600 multi-family units, 29 million square feet of industrial space and an 18 million square foot office development pipeline. Brookfield Property Partners’ goal is to be the leading global investor in best-in-class commercial property assets.

    Note: This news release contains forward-looking information within the meaning of Canadian provincial securities laws and “forward-looking statements” within the meaning of Section 27A of the U.S. Securities Act of 1933, as amended, Section 21E of the U.S. Securities Exchange Act of 1934, as amended, “safe harbour” provisions of the United States Private Securities Litigation Reform Act of 1995 and in any applicable Canadian securities regulations. The words “continue,” “expect,” “intend,” “believe,” derivations thereof and other expressions, including conditional verbs such as “may,” “will,” “could,” “would,” and “should,” are predictions of or indicate future events, trends or prospects or identify forward-looking statements. Forward-looking statements in this news release include statements with respect to: our expectations for Brookfield Property Partners L.P.; the anticipated benefits of the spin-off; and other statements with respect to our beliefs, outlooks, plans, expectations and intentions. Although Brookfield Asset Management and Brookfield Property Partners believe that BPY’s anticipated future results, performance or achievements expressed or implied by the forward-looking statements and information are based upon reasonable assumptions and expectations, the reader should not place undue reliance on forward-looking statements and information as such statements and information involve known and unknown risks, uncertainties and other factors which may cause the actual results, performance or achievements of the company to differ materially from anticipated future results, performance or achievement expressed or implied by such forward-looking statements and information.
    Factors that could cause actual results to differ materially from those contemplated or implied by forward-looking statements include: economic and financial conditions in the countries in which we do business; the behaviour of financial markets, including fluctuations in interest and exchange rates; availability of equity and debt financing; strategic actions including dispositions; the ability to complete and effectively integrate acquisitions into existing operations and the ability to attain expected benefits; regulatory and political factors within the countries in which the company operates; availability of new tenants to fill property vacancies; tenant bankruptcies; acts of God, such as earthquakes and hurricanes; the possible impact of international conflicts and other developments including terrorist acts; changes in accounting policies to be adopted under IFRS; and other risks and factors detailed from time to time in BPY’s Form 20-F filed with the Securities and Exchange Commission as well as other documents filed by BPY with the securities regulators in Canada and the United States.
    We caution that the foregoing factors that may affect future results are not exhaustive. When relying on our forward-looking statements to make decisions with respect to Brookfield Asset Management or Brookfield Property Partners, investors and others should carefully consider the foregoing factors and other uncertainties and potential events. Except as required by law, the companies undertake no obligation to publicly update or revise any forward-looking statements or information, whether written or oral, as a result of new information, future events or otherwise.
    Contact Information
    Media: Brookfield Asset Management Inc.
    Andrew Willis
    SVP, Communications & Media
    (416) 369-8236
    (416) 363-2856 (FAX)
    [email protected]

    Investors: Brookfield Asset Management Inc.
    Katherine Vyse
    SVP, Investor Relations
    (416) 369-8246
    (416) 363-2856 (FAX)
    [email protected]

    Brookfield Property Partners
    Melissa Coley
    Vice President, Investor Relations & Communications
    212-417-7215
    [email protected]


  • Tauriga Sciences Appoints COO

    Biotechnology company Tauriga Sciences has appointed Stella M. Sung to the position of chief operating officer effective immediately. Sung is currently business development officer of Avita Medical, a public regenerative medicine company, and managing director of Pearl Street Venture Fund, a life science venture fund.

    PRESS RELEASE

    Tauriga Sciences, Inc. (OTCQB: TAUG) (“the Company” or “Tauriga”) today announced the appointment of Stella M. Sung, Ph.D. (“Dr. Sung”) to the position of Chief Operating Officer (“COO”), effective immediately. In assuming this position, Dr. Sung will oversee the evaluation and structuring of potential biotech transactions, including due diligence, valuations, DCF modeling, and capital requirements.
    Tauriga CEO Seth M. Shaw commented, “The addition of Dr. Stella Sung to the Company’s core management team as Chief Operating Officer is a major achievement for the Company. Her outstanding Rolodex of institutional investors and access to intriguing opportunities in the life sciences space are of great importance to the Company moving forward.”
    Newly appointed Tauriga Chief Operating Officer, Dr. Stella Sung stated, “I am enthusiastic about building Tauriga’s portfolio of diversified assets in the health care space. Serving as the Company’s COO enables me to source and structure transactions with the goal of maximizing shareholder value, and Tauriga is already creating a pipeline of potential deals.”
    Please see below Bio for Dr. Stella M. Sung, Chief Operating Officer — Tauriga Sciences, Inc.:
    Dr. Stella M. Sung brings almost 20 years of leadership experience in the healthcare sector as both a senior operating executive and an early stage life science venture capitalist. Dr. Sung is currently Business Development Officer of Avita Medical, a public regenerative medicine company, and Managing Director of Pearl Street Venture Fund, a life science venture fund. She previously held the position of Chief Business Officer of Cylene Pharmaceuticals, a venture-backed oncology company. Dr. Sung has served as a Managing Director or General Partner for several life science venture firms, including Coastview Capital (founded by former Amgen CEO Gordon Binder) and Oxford Bioscience Partners. She has led venture rounds of financing for seven transactions, co-founded two biotechnology companies, served on seven Boards of Directors and served as Chairman of the Board for four biotechnology companies. Previously, she focused on life science and health care investments at Advent International, a global private equity firm that has raised over $6 billion in cumulative capital to date. Dr. Sung received her B.S. in chemistry from The Ohio State University and her Ph.D. in chemistry from Harvard University, where she was a National Science Foundation Pre-Doctoral Fellow. She earned her Harvard Ph.D. under the guidance of Professor Dudley Herschbach, the 1986 Nobel Laureate in Chemistry. (http://www.psvf.com/stella-m-sung.asp)
    About Tauriga Sciences, Inc.:
    Tauriga Sciences, Inc. (“the Company”) is a holding company that operates in the biotechnology space, which includes medical devices and development of proprietary drug compounds. The mission of the Company is to acquire a diversified portfolio of medical technologies with the aim of providing financial and human capital resources, to unlock significant value for the shareholders. The Company’s business model entails the acquisition of licenses, equity stakes, rights on both an exclusive and non-exclusive basis, and entire businesses. Management is firmly committed to building lasting shareholder value in the short, intermediate, and long terms. The Company’s new corporate website can be found at URL address (www.taurigasciences.com).
    DISCLAIMER:
    Forward-Looking Statements: Except for statements of historical fact, this news release contains certain “forward-looking statements” as defined by the Private Securities Litigation Reform Act of 1995, including, without limitation, expectations, beliefs, plans and objectives regarding the development, use and marketability of products. Such forward-looking statements are based on present circumstances and on Tauriga’s predictions with respect to events that have not occurred, that may not occur, or that may occur with different consequences and timing than those now assumed or anticipated. Such forward-looking statements involve known and unknown risks, uncertainties and other factors, and are not guarantees of future performance or results and involve risks and uncertainties that could cause actual events or results to differ materially from the events or results expressed or implied by such forward-looking statements. Such factors include general economic and business conditions, the ability to successfully develop and market products, consumer and business consumption habits, the ability to fund operations and other factors over which Tauriga has little or no control. Such forward-looking statements are made only as of the date of this release, and Tauriga assumes no obligation to update forward-looking statements to reflect subsequent events or circumstances. Readers should not place undue reliance on these forward-looking statements. Risks, uncertainties and other factors are discussed in documents filed from time to time by Tauriga with the Securities and Exchange Commission.
    Contact Information
    Contact:

    For more information please contact:

    Mr. Seth M. Shaw
    Chairman & Chief Executive Officer
    Tauriga Sciences, Inc.
    New York: +1-917-796-9926
    Montreal: +1-514-840-3697


  • Woodbridge Opens India Office

    Middle-market mergers and acquisitions firm Woodbridge International is opening its Pune, India office. The new office will serve Woodbridge clients throughout India and will be led by senior M&A advisors Darshan Rathod and Vaibhav Sundecha, along with Woodbridge vice president Jitender Chopra, who is based in New York.

    PRESS RELEASE

    Woodbridge International, a leading middle-market mergers and acquisitions firm, is pleased to announce the opening of its Pune, India office. The new office will serve Woodbridge clients throughout India and will be led by Senior M&A advisors Darshan Rathod and Vaibhav Sundecha, along with Woodbridge Vice President Jitender Chopra, who is based in New York.
    Woodbridge International is an M&A firm focused on selling privately held, middle-market companies with transaction values ranging from 25 crore to over 500 crore rupees ($5 million to $100 million).
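    For context, one crore is ten million rupees, so the quoted dollar range follows from an exchange rate of roughly 50 rupees per dollar; the rate here is an assumption typical of the 2013 era, not a figure from the release. A quick Python check:
    ```python
    # Rough check of the crore-to-dollar range quoted above; the exchange
    # rate is an assumption (about 50 rupees per US dollar, circa 2013).
    CRORE_IN_RUPEES = 10_000_000
    INR_PER_USD = 50.0

    def crore_to_usd(crore):
        return crore * CRORE_IN_RUPEES / INR_PER_USD

    print(f"25 crore  ~ ${crore_to_usd(25):,.0f}")    # ~$5 million
    print(f"500 crore ~ ${crore_to_usd(500):,.0f}")   # ~$100 million
    ```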
    Woodbridge recently closed several cross-border deals in a variety of industries. In March of this year, Labor Import, a Brazilian distributor of medical supplies, was sold to Bunzl, a London-based distributor traded on the London Stock Exchange.
    In 2012 Woodbridge completed four cross-border transactions: U.S.-based A&A Manufacturing Company, Inc. was acquired by Halltech GmbH, a leading German specialty OEM; Panama-based Hidrotenencias S.A., a hydroelectric power company, entered into a capital transaction with ACON, a private equity investment fund based in the U.S. and Latin America; U.K.-based private equity group Permira, through its portfolio company Genesis, located in the U.S. and Brazil, acquired Woodbridge’s Brazilian client, LM Sistemas; and Woodbridge’s Japanese client, Yamada, entered into a joint venture with LOM, a Scandinavian company with a division in Manaus, Brazil.
    Woodbridge’s innovative process for marketing companies globally to strategic and financial buyers is unique — and includes the production of a dynamic 2-minute company video buyers can watch on their screens, wherever they are in the world.
    Vaibhav specializes in transaction advisory for midsize companies and has worked for Ernst & Young and PwC. Darshan specializes in lead advisory and previously worked for RREEF (Deutsche Bank Real Estate Private Equity), Ernst & Young and PwC.
    Vaibhav and Darshan will work closely with Jitender Chopra and the entire Woodbridge team in serving Woodbridge’s clients in India. Prior to joining Woodbridge, Jitender worked in the Credit Risk division of J.P. Morgan’s Investment Banking line of business. He also previously worked at Fidelity Investments and AXA Equitable.
    Woodbridge welcomes Vaibhav and Darshan to its team and looks forward to bringing India-based sellers to buyers around the globe.
    Woodbridge International, founded in 1993, is an innovative M&A firm headquartered in New Haven, CT. The firm serves clients from its 10 North American offices and locations in the Netherlands, Mexico, Brazil and Honduras.


  • Pine Tree Equity Raises Fund III

    Pine Tree Equity has closed its third private investment fund on $100 million. The firm’s capital raise was executed in less than 90 days. This brings the firm’s total committed capital base to nearly $200 million.

    PRESS RELEASE

    Pine Tree Equity III, LP (“Pine Tree Equity”), a private equity firm based in Miami, FL, is pleased to announce that it closed its third private investment fund at $100 million in April 2013. The firm’s capital raise was executed in less than 90 days, and it brings the firm’s total committed capital base to nearly $200 million. Due to its differentiated approach in the small capitalization space and strong returns despite the challenging market, Pine Tree Equity III was materially oversubscribed beyond its $100 million cap.

    “We are extremely thankful to have received great support from existing investors and significant demand from new investors,” said Jeff Settembrino, Pine Tree’s Managing Partner. “Although we were oversubscribed, we wanted to keep our third fund at $100 million in order to remain dedicated to the small capitalization space where our committed capital and proven experience has helped transform entrepreneurial success stories into institutional platforms positioned for continued growth.”

    Pine Tree Equity III will continue to execute its successful strategy of investing in and expanding small capitalization companies – with revenue of $10 million to $50 million or EBITDA of $2 million to $6 million – in partnership with founding management. Since its founding in January 2007, Pine Tree Equity has closed 21 acquisitions with founding entrepreneurs in a variety of industries, including business, consumer and financial services.

    Pine Tree Equity

    Pine Tree Equity, based in Miami, FL, is a private equity firm with committed equity capital focused on the investment in and expansion of small capitalization companies – with revenue of $10.0 million to $50.0 million – in partnership with management.


  • Adobe releases Photoshop Lightroom 5 Beta

    Adobe has announced the first public beta of Photoshop Lightroom 5. And while a first look suggests this isn’t the most major of upgrades, there are still some worthwhile improvements to be found.

    A new one-click Upright tool can analyse your images and detect tilted lines, for instance. You can choose a correction method, but otherwise the program will straighten images all on its own.

    Lightroom 5 gains some more Photoshop-like technology with its enhanced healing brush, which can heal or clone with brush strokes.

    A new radial filter allows you to apply your preferred Lightroom image adjustments to a circular mask, which can then be resized or feathered to produce a more natural effect.

    A Smart Preview option helps you work away from your original images. Lightroom 5 can create smaller versions of these files, called Smart Previews; if you’re disconnected from the source images you can work on the Smart Previews instead, and when you reconnect, the program reapplies all your edits to the original pictures.
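    Conceptually this is ordinary non-destructive editing with proxies: adjustments are recorded as a list of operations and replayed against whichever file is at hand. A rough Python sketch of the idea follows; it illustrates the workflow only and is in no way Adobe’s implementation:
    ```python
    # Conceptual sketch of proxy-based non-destructive editing.
    # Illustrates the workflow described above; not Adobe's code.
    class EditList:
        def __init__(self):
            self.edits = []                    # operations recorded while editing

        def adjust(self, name, value):
            self.edits.append((name, value))   # e.g. ("exposure", +0.5)

        def render(self, path):
            # Replay every recorded edit against the given file: the small
            # Smart Preview while offline, the original once reconnected.
            print(f"rendering {path}")
            for name, value in self.edits:
                print(f"  apply {name} {value:+}")

    session = EditList()
    session.adjust("exposure", 0.5)      # edits made against the Smart Preview
    session.adjust("contrast", 10)
    session.render("smart_preview.dng")  # disconnected: applied to the proxy
    session.render("original.cr2")       # reconnected: same edits, full file
    ```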

    Elsewhere, more capable photo book creation includes book templates which you can customise to suit your needs.

    And it’s now possible to combine video clips, images and music to produce your very own HD video slideshow.

    While there’s no single killer feature here, the automatic perspective correction and enhanced healing brush are going to be very useful. And if you’d like to try them out, the beta of Photoshop Lightroom 5 is available now. (No need to worry if you have an earlier version; Adobe says it can be installed alongside an existing Lightroom installation without overwriting anything.)

  • Symantec highlights 58 percent increase in mobile attacks

    Security giant Symantec’s 18th annual Internet Security Threat Report is out today and reveals that cyber criminals are increasingly scouring the Web for personal details in order to target their attacks. Armed with your information they can exploit security gaps in social networks and other sites to infect your system or steal your details.

    It’s not just your PC that’s at risk either; the report shows an alarming 58 percent increase in attacks on mobile devices with just under a third of these aimed at stealing data without the user’s knowledge. Android is the most targeted mobile platform as its open source nature makes it easier to hide malware in apps. The securer-than-thou smugness of Apple users receives a blow too as the report notes more than 600,000 Mac systems were infected by a single attack last April.

    When it comes to types of threat, the growth of ransomware continues with infections becoming more aggressive and harder to undo. Another scary statistic is that 61 percent of malicious sites are actually legitimate websites that have been compromised so you may be at risk even if you think you’re practising safe surfing.

    Symantec also highlights some common myths about security and you can read these as a handy infographic and access the full ISTR report here.

    Norton’s security expert Richard Clooke said, “The report results have shown that it is still crucial for Norton to continue to educate consumers on how they can help protect themselves from acts of cybercrime. Ransomware, for example, a scam which disables victims’ computers until they pay a ransom, continues to be a key theme and is now becoming more sophisticated than ever…”

    Of course all of this is aimed at boosting sales of Symantec’s security products, but it does underline that the threat landscape is an ever changing one and that we all need to be careful out there.

    Photo Credits: Slavoljub Pantelic/Shutterstock

  • Better Bing blesses Windows 8

    In case you missed the update, because I nearly did, Microsoft brings better Bing apps to Windows 8. They dropped yesterday, but I’m just getting round to them today. I love ’em. The search engine is hugely underrated compared to Google, and the core services look so damn good and feel even better from a touchscreen.

    In a self-aggrandizing post, the Bing team describes core apps Finance, Maps, News, Sports, Travel and Weather as “immersive vertical experiences”. I so totally agree. Modern UI offers the most immersive experience on tablets, for fully-supporting apps. Microsoft claims that they “were designed from the ground up to embrace speed and touch providing you with a fast, fluid and consistent way to delve into your interests and get things done”.

    As a user, three of the updates are most useful — News, Maps and Weather. I don’t invest, have limited sports interest and don’t travel enough.

    News. Bloggers and some news sites obsess over Google News, using every trick imaginable to get placement, hoping for a surge that can deliver tens of thousands of pageviews in minutes. Hell, I never look at the thing. Google News is a drug habit. I won’t be addicted.

    But Bing News is appealing for its increasing utility. Just as Google dumps RSS and sends Reader to the executioner, Microsoft supports feeds and offline reading. Yeah, I’ll give Bing News a hard looksee. Oh yeah, there’s support for alerts, something else getting the big Google boot.

    Maps. Google is hard to beat in this area, but Bing is no slouch, either. Microsoft adds more real-time information, which makes getting around lots better — well, so I say from a first-blush look. I need to get out on the road to say for certain.

    Weather. Microsoft’s app is my favorite from anybody. Weather information is well-presented — and lots of it. The thing is so damn immersive, I get sucked in looking for stuff every time I open it. The new dynamically roaming weather maps are chock full of useful information.

  • Higher mercury levels increase risk of diabetes

    New research conducted by the Indiana University School of Public Health-Bloomington has found that young adults consuming higher levels of mercury face a 65 percent higher risk of type 2 diabetes later in life. Led by the university’s epidemiologist Ka He, the…
  • Once the media tells us who carried out the Boston marathon bombing, how can we believe them?

    At some point here in the next few days, the mainstream media is going to announce the culprit behind yesterday’s bombings at the Boston marathon race. The question, though, is how can we believe them? When it comes to reporting the truth about acts of terrorism,…
  • Nokia Lumia 820 review

    If you are in the market for a mid-range Windows Phone 8 device then the Nokia Lumia 820 should definitely make your shortlist. The smartphone is affordable, fast, responsive, looks nice and comes with the Finnish manufacturer’s exclusive collection of enticing apps. Users can even personalize the appearance of the Lumia 820 by switching between different back covers of attractive colors.

    In a number of ways, the Lumia 820 is closer to high-end than to mid-range Windows Phone 8 devices. The smartphone comes with the same processor as the Lumia 920 (which explains the speed part), features support for wireless charging through optional back plates and sports an AMOLED display where black is really black and not a shade of gray. But the Lumia 820 is not a scaled-down version of the bigger Lumia 920 or any other high-end Windows Phone 8 handset.

    The Specs

    The Lumia 820 comes with a 4.3-inch AMOLED display with ClearBlack technology, resolution of 480 by 800 and 217 pixels per inch density. A 1.5 GHz dual-core Qualcomm Snapdragon S4 processor, 1GB of RAM and a 1650 mAh battery power the smartphone.

    Other noteworthy specs include 8GB of internal storage; microSD card slot; 8.7 MP back-facing camera with 1080p video recording; 0.3 MP front-facing camera with 480p video recording; USB 2.0; Bluetooth 3.0; Wi-Fi 802.11 a/b/g/n; NFC; 4G LTE; magnetometer and A-GPS.

    The Lumia 820 comes in at 123.8 x 68.5 x 9.9 mm and 160 grams.

    Build Quality and Handling

    Similar to other Windows Phone 8 handsets in Nokia’s lineup, the Lumia 820 comes with an all-glass front panel and polycarbonate backside. As I previously mentioned, the back panel is interchangeable, and comes in seven different colors: cyan, purple, red, white and yellow in glossy finishes, and gray and black in matte.

    Needless to say, there’s a back panel for (almost) everyone’s taste. If you still aren’t happy enough, Nokia even provides a 3D printing development kit for the Lumia 820, which delivers complete customization albeit at a cost.

    The Lumia 820 that I received for this review comes with the glossy yellow cover. The good part is that the color and the finish mask any imperfections very well, and there is the possibility of replacing the back panel to get rid of all the wear should it occur. Also, the yellow cover really makes the Lumia 820 stand out in a crowd.

    The downside with having a glossy finish on the back, and this holds true for every handset that I have tested, is that the smartphone can get very slippery and is more prone to drops. A matte finish partially solves the problem, although Nokia’s matte coating is still somewhat slippery.

    The back panel on the Lumia 820 flexes under pressure, something that is not immediately noticeable unless a higher pressure is applied. That said, the Lumia 820 is well built and gives the impression that it can withstand a fair amount of abuse. And, being quite small, the Lumia 820 is also easy to handle and use with one hand.

    The weight is not as substantial as the Lumia 920’s, for instance, although after using Nokia’s flagship for a while nothing really feels heavy anymore. At 160 grams, though, the 820 is far from being the lightest smartphone around, which is something for some folks to consider.

    Generally speaking, the Lumia 820 does well in the design department. It’s a fairly standard approach with rounded corners, non-tapered edges and significant thickness. But the phone lags behind its larger brother, the Lumia 920, and the HTC Windows Phone 8X in this regard.

    The Display

    As I mentioned in the specs, the Lumia 820 comes with a 4.3-inch AMOLED display with a resolution of 480 by 800 and a 217 ppi density. There are two notable downsides here: the resolution and, therefore, the pixel density.
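    For the curious, the 217 ppi figure follows directly from the resolution and the screen size: pixel density is the diagonal in pixels divided by the diagonal in inches. A quick check in Python:
    ```python
    # Verifying the quoted pixel density: diagonal pixels / diagonal inches.
    from math import hypot

    width_px, height_px, diagonal_inches = 480, 800, 4.3
    ppi = hypot(width_px, height_px) / diagonal_inches
    print(f"{ppi:.0f} ppi")   # 217, matching the spec sheet
    ```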

    If you are used to looking at a display with a 720p resolution (720 by 1280) or higher, the panel on the Lumia 820 will look underwhelming by comparison. Text is not as sharp as it should be, which is easy to spot when looking through the app list or the Settings menu. A similar impression is given when viewing web pages in Internet Explorer.

    But there are good parts to the Lumia 820’s display. Colors pop and appear vibrant, something which is highly noticeable when holding the handset next to the Lumia 920. That is in part due to the AMOLED technology and the software calibration performed by Nokia. All in all, except for the low resolution, the display on the Lumia 820 is decent.

    Performance and Battery Life

    Due to the 1.5 GHz dual-core Qualcomm Snapdragon processor and the 1GB of RAM, the Lumia 820 comes with plenty of power. Performance is good across the board, although I have noticed a couple of hiccups here and there likely due to some software glitches.

    Windows Phone 8 is very responsive, fast and fluid, a trait that is common among similar devices. Browsing speeds are also good and double tap to zoom performs great with almost instant text refresh. If you care about such things, the Lumia 820 scored 910.9 ms in SunSpider version 0.9.1; SunSpider measures completion time, so lower is better, and that is one of the best results recorded on a smartphone today.

    What about battery life? During my testing, with brightness set on auto, two email accounts, Facebook, LinkedIn and Twitter syncing in the background, mobile data and Wi-Fi always enabled, some calls and texts and an overall medium use, the battery on the Lumia 820 got me through the day and even into the next one. With light usage I saw around 30 hours of battery life, though your experience will vary with a different usage pattern.

    Hardware Extras

    Just like the Lumia 920, the Lumia 820 features support for wireless charging and comes with a display that can be operated using gloves. While the former may require purchasing a new part (the back panel in this case), the latter is available out-of-the-box. You’ll still need gloves, obviously.

    I have doubts when it comes to the benefits of the ultra-sensitive display. Sure it’s nice to be able to use the smartphone with gloves, but you’d have to wear thinner and grippier gloves to operate the display well enough and, obviously, hold the smartphone. To me, that’s just a nice feature that I’ll likely never use, as it’s quite difficult to take the smartphone out of my pants pockets while wearing gloves.

    The wireless charging feature is useful more of the time. I have a Fatboy-branded Nokia wireless charging pillow, which I use to top up the battery on my Lumia 920. Sadly, I couldn’t test this feature on the Lumia 820, as its stock back panel doesn’t support wireless charging. That said, it should work just as it does on any other compatible smartphone: place the device on top and it charges.

    It is worth noting that the 8GB of internal storage runs out pretty quickly when shooting video, snapping pics or installing apps. A microSD card is supported and should be acquired. The Lumia 820 can house microSD cards up to 64GB in size, which should cover even the most demanding users’ needs.

    The Cameras

    The Lumia 820 comes with an 8.7 MP back-facing camera, similar in megapixel count to the one on the Lumia 920, but without the Nokia PureView branding. That means no tricked-out OIS (Optical Image Stabilization) or impressive low-light performance.

    Pictures are of decent quality, although they do not impress overall. The shots feature a decent level of detail, though not up to par with photos shot with the Lumia 920, for instance. Also, under low-light, the camera makes too much use of flash, which makes pictures look overexposed in the focus area, but underexposed otherwise.

    The video camera also captures decent videos, but, as previously mentioned, without OIS. That means videos appear slightly shaky when you move. By contrast the Lumia 920 fares better in this regard. The camera also adapts slower than it should to light changes (the Lumia 920 again fares better). Sound is decent overall, but is high-pitched. By contrast the Lumia 920 delivers a more muffled sound.

    For a mid-range device, generally speaking both the photos and videos produced with the Lumia 820 are decent. If you are not a photography enthusiast this smartphone’s back-facing camera should suffice most of the time. About the 0.3 MP front-facing camera, let’s just say that it’s there and leave it at that. The low quality of the latter is to be expected considering the megapixel count.

    It is worth noting that the Lumia 820 review unit that I have comes with Windows Phone version 8.0.10211.204 and firmware version 1232.5957.1308.0001, both the latest available at the time of writing. Nokia may tweak the camera software in future releases, so your experience can vary depending on the firmware.

    Software Features

    In the software department, the Lumia 820 offers pretty much the same apps and features as the 920. The only apparent difference, and this may boil down to this particular software version, is the more restrictive screen time-out interval — the Lumia 820 tops out at five minutes, while the 920’s display can be kept on until the battery runs out. Otherwise, you’re looking at the same level of software equipment.

    You can read my comparison between the Windows Phone 8X and the Lumia 920 as well as my Lumia 920 first-impressions review for the scoop on some of the most important Nokia-branded apps.

    Needless to say you will not be disappointed by the Nokia collection inside Windows Phone’s app store. There are plenty of useful apps covering maps, navigation, photo editing, games, weather, sports, shopping and social networking (Nokia’s Foursquare app, for instance).

    Price

    This one is a tough nut to crack.

    In Europe, the Lumia 820 is available at roughly the same price as the Windows Phone 8X, when purchased off-contract. The latter, however, is HTC’s Windows Phone flagship and features better hardware specifications, albeit lesser software prowess. You’ll have to choose which one really matters to you: hardware or software. If the latter is the case then the Lumia 820 is best, otherwise go for the Windows Phone 8X.

    In the United States, for instance, the Lumia 820 can be had with no upfront cost upon signing a two-year contract with AT&T. Verizon, which sells the Lumia 822 — a branded version of the Lumia 820 — also offers the smartphone for free. At this price point the Lumia 820 and its Lumia 822 sibling offer unbeatable value for the money in the Windows Phone realm.

    But, if you’re on AT&T and plan on staying there for another two years I advise you to pony up for the Lumia 920. It runs for $99.99 on a two-year contract and offers more bang for the buck: bigger and higher resolution screen, better cameras, better build quality and, dare I say, better looks (I know that is subjective).

    The Bottom Line

    Nokia really improves the Windows Phone 8 experience with the Lumia lineup. The added apps bring real value to the mix, something that other manufacturers should pay attention to even in the mid-range to low-end market. The Lumia 820 also never once felt underpowered or out of its element. I appreciate the extra dose of excitement brought by the yellow trim and the other colorful back plates, which almost makes me regret getting a Lumia 920 in boring black.

    The Lumia 820 is likely the best mid-range Windows Phone device currently available. The only things that really let it down are the average back-facing camera and the low-resolution display, the latter of which is noticeable far more often than the former. If you are willing to put up with these two shortcomings, the Lumia 820 can pretty much do everything that the Lumia 920 does, only at a lower price point.

    Photo Credits: Mihaita Bamburic

  • Accidental Empires, Part 21 — Future Computing (Chapter 15)

    Twenty-first in a series. The final chapter to the first edition, circa 1991, of Robert X. Cringely’s Accidental Empires concludes with some predictions prophetic and others, well…

    Remember Pogo? Pogo was Doonesbury in a swamp, the first political cartoon good enough to make it off the editorial page and into the high-rent district next to the horoscope. Pogo was a ‘possum who looked as if he was dressed for a Harvard class reunion and who acted as the moral conscience for the first generation of Americans who knew how to read but had decided not to.

    The Pogo strip remembered by everyone who knows what the heck I am even talking about is the one in which the little ‘possum says, “We have met the enemy and he is us.” But today’s sermon is based on the line that follows in the next panel of that strip — a line that hardly anyone remembers. He said, “We are surrounded by insurmountable opportunity.”

    We are surrounded by insurmountable opportunity.

    Fifteen years ago, a few clever young people invented a type of computer that was so small you could put it on a desk and so useful and cheap to own that America found places for more than 60 million of them. These same young people also invented games to play on those computers and business applications that were so powerful and so useful that we nearly all became computer literate, whether we wanted to or not.

    Remember computer literacy? We were all supposed to become computer literate, or something terrible was going to happen to America. Computer literacy meant knowing how to program a computer, but that was before we really had an idea what personal computers could be used for. Once people had a reason for using computers other than to learn how to use computers, we stopped worrying about computer literacy and got on with our spreadsheets.

    And that’s where we pretty much stopped.

    There is no real difference between an Apple II running VisiCalc and an IBM PS/2 Model 70 running Lotus 1-2-3 version 3.0. Sure, the IBM has 100 times the speed and 1,000 times the storage of the Apple, but they are both just spreadsheet machines. Put the same formulas in the same cells, and both machines will give the same answer.

    In 1984, marketing folks at Lotus tried to contact the people who bought the first ten copies of VisiCalc in 1979. Two users could not be reached, two were no longer using computers at all, three were using Lotus 1-2-3, and three were still using VisiCalc on their old Apple IIs. Those last three people were still having their needs met by a five-year-old product.

    Marketing is the stimulation of long-term demand by solving customer problems. In the personal computer business, we’ve been solving more or less the same problem for at least 10 years. Hardware is faster and software is more sophisticated, but the only real technical advances in software in the last ten years have been the Lisa’s multitasking operating system and graphical user interface, Adobe’s PostScript printing technology, and the ability to link users together in local area networks.

    Ken Okin, who was in charge of hardware engineering for the Lisa and now heads the group designing Sun Microsystems’ newest workstations, keeps a Lisa in his office at Sun just to help his people put their work in perspective. “We still have a multitasking operating system with a graphical user interface and bit-mapped screen, but back then we did it with half a mip [one mip equals one million computer instructions per second] in 1 megabyte of RAM,” he said. “Today on my desk I have basically the same system, but this time I have 16 mips and an editor that doesn’t seem to run in anything less than 20 megabytes of RAM. It runs faster, sure, but what will it do that is different from the Lisa? It can do round windows; that’s all I can find that’s new. Round windows, great!”

    There hasn’t been much progress in software for two reasons. The bigger reason is that companies like Microsoft and Lotus have been making plenty of money introducing more and more people to essentially the same old software, so they saw little reason to take risks on radical new technologies. The second reason is that radical new software technologies seem to require equally radical increases in hardware performance, something that is only now starting to take place as 80386- and 68030-based computers become the norm.

    Fortunately for users and unfortunately for many companies in the PC business, we are about to break out of the doldrums of personal computing. There is a major shift happening right now that is forcing change on the business. Four major trends are about to shift PC users into warp speed: standards-based computing, RISC processors, advanced semiconductors, and the death of the mainframe. Hold on!

    In the early days of railroading in America, there was no rule that said how far apart the rails were supposed to be, so at first every railroad set its rails a different distance apart, with the result that while a load of grain could be sent from one part of the country to another, the car it was loaded in couldn’t be. It took about thirty years for the railroad industry to standardize on just a couple of gauges of track. As happens in this business, one type of track, called standard gauge, took about 85 percent of the market.

    A standard gauge is coming to computing, because no one company — even IBM — is powerful enough to impose its way of doing things on all the other companies. From now on, successful computers and software will come from companies that build them from scratch with the idea of working with computers and software made by their competitors. This heretical idea was foisted on us all by a company called Sun Microsystems, which invented the whole concept of open systems computing and has grown into a $4 billion company literally by giving software away.

    Like nearly every other venture in this business, Sun got its start because of a Xerox mistake. The Defense Advanced Research Projects Agency wanted to buy Alto workstations, but the Special Programs Group at Xerox, seeing a chance to stick the feds for the entire Alto development budget, marked up the price too high even for DARPA. So DARPA went down the street to Stanford University, where they found a generic workstation based on the Motorola 68000 processor. Designed originally to run on the Stanford University Network, it was called the S.U.N. workstation.

    Andy Bechtolsheim, a Stanford graduate student from Germany, had designed the S.U.N. workstation, and since Stanford was not in the business of building computers for sale any more than Xerox was, he tried to interest established computer companies in filling the DARPA order. Bob Metcalfe at 3Com had a chance to build the S.U.N. workstation but turned it down. Bechtolsheim even approached IBM, borrowing a tuxedo from the Stanford drama department to wear for his presentation because his friends told him Big Blue was a very formal operation.

    He appeared at IBM wearing the tux, along with a tastefully contrasting pair of white tennis shoes. For some reason, IBM decided not to build the S.U.N. workstation either.

    Since all the real computer companies were uninterested in building S.U.N. workstations, Bechtolsheim started his own company, Sun Microsystems. His partners were Vinod Khosla and Scott McNealy, also Stanford grad students, and Bill Joy, who came from Berkeley. The Stanford contingent came up with the hardware design and a business plan, while Joy, who had played a major role in writing a version of the Unix operating system at Berkeley, was Mr. Software.

    Sun couldn’t afford to develop proprietary technology, so it didn’t develop any. The workstation design itself was so bland that Stanford University couldn’t find any basis for demanding royalties from the start-up. For networking they embraced Bob Metcalfe’s Ethernet, and for storage they used off-the-shelf hard disk drives built around the Small Computer System Interface (SCSI) specification. For software, they used Bill Joy’s Berkeley Unix. Berkeley Unix worked well on a VAX, so Bechtolsheim and friends just threw away the VAX and replaced it with cheaper hardware. The languages, operating system, networking, and windowing systems were all standard.

    Sun learned to establish de facto standards by giving source code away. It was a novel idea, born of the Berkeley Unix community, and rather in keeping with the idea that for some boys, a girl’s attractiveness is directly proportional to her availability. For example, Sun virtually gave away licenses for its Network File System (NFS) networking scheme, which had lots of bugs and some severe security problems, but it was free and so became a de facto standard virtually overnight. Even IBM licensed NFS. This giving away of source code allowed Sun to succeed, first by being the standard setter and then following up with the first hardware to support that standard.

    By 1985, Sun had defined a new category of computer, the engineering workstation, but competitors were starting to catch on and catch up to Sun. The way to remain ahead of the industry, they decided, was to increase performance steadily, which they could do by using a RISC processor — except that there weren’t any RISC processors for sale in 1985.

    RISC is an old IBM idea called Reduced Instruction Set Computing. RISC processors were incredibly fast devices that gained their speed from a simple internal architecture that implements only a few computer instructions. Where a Complex Instruction Set Computer (CISC) might have a special “walk across the room but don’t step on the dog” instruction, RISC processors can usually get faster performance by using several simpler instructions: walk-walk-step over-walk-walk.
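    To make the contrast concrete, here is a toy Python illustration of the same idea: one do-everything operation versus simple steps strung together by the caller. It is purely pedagogical; real instruction sets are hardware, not Python:
    ```python
    # Toy illustration of the CISC-versus-RISC contrast above; pedagogical only.
    def cisc_cross_room(pos, room_length, dog_at):
        # One complex "instruction" that handles the whole job internally.
        while pos < room_length:
            pos += 2 if pos == dog_at - 1 else 1   # step over the dog
        return pos

    def walk(pos):      return pos + 1   # simple RISC-style instruction
    def step_over(pos): return pos + 2   # simple RISC-style instruction

    def risc_cross_room(pos, room_length, dog_at):
        # The compiler strings simple instructions together:
        # walk-walk-step over-walk-walk.
        while pos < room_length:
            pos = step_over(pos) if pos == dog_at - 1 else walk(pos)
        return pos

    assert cisc_cross_room(0, 10, 5) == risc_cross_room(0, 10, 5) == 10
    ```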

    RISC processors are cheaper to build because they are smaller and more can be fit on one piece of silicon. And because they have fewer transistors (often under 100,000), yields are higher too. It’s easier to increase the clock speed of RISC chips, making them faster. It’s easier to move RISC designs from one semiconductor technology to a faster one. And because RISC forces both hardware and software designers to keep it simple, stupid, they tend to be more robust.

    Sun couldn’t interest Intel or Motorola in doing one. Neither company wanted to endanger its lucrative CISC processor business. So Bill Joy and Dave Patterson designed a processor of their own in 1985, called SPARC. By this time, both Intel and Motorola had stopped allowing other semiconductor companies to license their processor designs, thus keeping all the high-margin sales in Santa Clara and Schaumburg, Illinois. This, of course, pissed off the traditional second source manufacturers, so Sun signed up those companies to do SPARC.

    Since Sun designed the SPARC processor, it could buy the chips more cheaply than any other computer maker could. Sun engineers knew, too, when higher-performance versions of the SPARC were going to be introduced. These facts of life have allowed Sun to dominate the engineering workstation market and to make important inroads into other markets formerly dominated by IBM and DEC.

    Sun scares hardware and software competitors alike. The company practically gives away system software, which scares companies like Microsoft and Adobe that prefer to sell it. The industry is abuzz with software consortia set up with the intention of doing better standards-based software than Sun does but selling it, not giving it away.

    Sun also scares entrenched hardware competitors like DEC and IBM by actually encouraging cloning of its hardware architecture, relying on a balls-to-the-wall attitude that says Sun will stay in the high-margin leading edge of the product wave simply by bringing newer, more powerful SPARC systems to market sooner than any of its competitors can.

    DEC has tried, and so far failed, to compete with Sun, using a RISC processor built by MIPS Computer Systems. Figuring if you can’t beat them, join them, HP has actually allied with Sun to do software. IBM reacted to Sun by building a RISC processor of its own too. Big Blue spent more on developing its Sun killer, the RS/6000, than it would have cost to buy Sun Microsystems outright. The RS/6000, too, is a relative failure.

    Why did Bill Gates, in his fourth consecutive hour of sitting in a hotel bar in Boston, sinking ever deeper into his chair, tell the marketing kids from Lotus Development that IBM would be out of business in seven years? What does Bill Gates know that we don’t know?

    Bill Gates knows that the future of computing will unfold on desktops, not in mainframe computer rooms. He knows that IBM has not had a very good handle on the desktop software market. He thinks that without the assistance of Microsoft, IBM will eventually forfeit what advantage it currently has in personal computers.

    Bill Gates is a smart guy.

    But you and I can go even further. We can predict the date by which the old IBM — IBM the mainframe computing giant — will be dead. We can predict the very day that the mainframe computer era will end.

    Mainframe computing will die with the coming of the millennium. On December 31, 1999, right at midnight, when the big ball drops and people are kissing in New York’s Times Square, the era of mainframe computing will be over.

    Mainframe computing will end that night because a lot of people a long time ago made a simple mistake. Beginning in the 1950s, they wrote inventory programs and payroll programs for mainframe computers, programs that process income tax returns and send out welfare checks—programs that today run most of this country. In many ways those programs have become our country. And sometime during those thirty-odd years of being moved from one mainframe computer to another, larger mainframe computer, the original program listings, the source code for thousands of mainframe applications, were just thrown away. We have the object code—the part of the program that machines can read—which is enough to move the software from one type of computer to another. But the source code—the original program listing that people can read, that has details of how these programs actually work—is often long gone, fallen through a paper shredder back in 1967. There is mainframe software in this country that cost at least $50 billion to develop for which no source code exists today.

    This lack of commented source code would be no big deal if more of those original programmers had expected their programs to outlive them. But hardly any programmer in 1959 expected his payroll application to be still cutting checks in 1999, so nobody thought to teach many of these computer programs what to do when the calendar finally says it’s the year 2000. Any program that prints a date on a check or an invoice, and that doesn’t have an algorithm for dealing with a change from the twentieth to the twenty-first century, is going to stop working. I know this doesn’t sound like a big problem, but it is. It’s a very big problem.
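    The failure mode is easy to demonstrate. Any program that stores years as two digits will conclude that “00” comes before “99”; the sketch below is hypothetical Python rather than the era’s COBOL, but the logic error is the same:
    ```python
    # Minimal demonstration of the two-digit-year bug described above.
    def years_of_service(hired_yy, current_yy):
        # Era-typical logic: years stored as two digits to save space.
        return current_yy - hired_yy

    print(years_of_service(85, 99))   # 14  -- correct in 1999
    print(years_of_service(85, 0))    # -85 -- nonsense on January 1, 2000
    ```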

    Looking for a growth industry in which to invest? Between now and the end of the decade, every large company in America either will have to find a way to update its mainframe software or will have to write new software from scratch. New firms will appear dedicated to the digital archaeology needed to update old software. Smart corporations will trash their old software altogether and start over. Either solution is going to cost lots more than it did to write the software in the first place. And all this new mainframe software will have one thing in common: it won’t run on a mainframe. Mainframe computers are artifacts of the 1960s and 1970s. They are kept around mainly to run old software and to gladden the hearts of MIS directors who like to think of themselves as mainframe gods. Get rid of the old software, and there is no good reason to own a mainframe computer. The new software will run faster, more reliably, and at one-tenth the cost on a desktop workstation, which is why the old IBM is doomed.

    “But workstations will never run as reliably as mainframes,” argue the old-line corporate computer types, who don’t know what they are talking about. Workstations today can have as much computing power and as much data storage as mainframes. Ten years from now, they’ll have even more. And by storing copies of the same corporate data on duplicated machines in separate cities or countries and connecting them by high-speed networks, banks, airlines, and all the other big transaction processors that still think they’d die without their mainframe computers will find their data are safer than they are now, trapped inside one or several mainframes, sitting in the same refrigerated room in Tulsa, Oklahoma.

    Mainframes are old news, and the $40 billion that IBM brings in each year for selling, leasing, and servicing mainframes will be old news too by the end of the decade.

    There is going to be a new IBM, I suppose, but it probably won’t be the company we think of today. The new IBM should be a quarter the size of the current model, but I doubt that current management has the guts to make those cuts in time. The new IBM is already at a disadvantage, and it may not survive, with or without Bill Gates.

    So much for mainframes. What about personal computers? PCs, at least as we know them today, are doomed too. That’s because the chips are coming.

    While you and I were spending decades alternately destroying brain cells and then regretting their loss, Moore’s Law was enforcing itself up and down Silicon Valley, relentlessly demanding that the number of transistors on a piece of silicon double every eighteen months, while the price stayed the same. Thirty-five years of doubling and redoubling, thrown together with what the lady at the bank described to me as “the miracle of compound interest,” means that semiconductor performance gains are starting to take off. Get ready for yet another paradigm shift in computing.

    Intel’s current top-of-the-line 80486 processor has 1.2 million transistors, and the 80586, coming in 1992, will have 3 million transistors. Moore’s Law has never let us down, and my sources in the chip business can think of no technical reason why it should be repealed before the end of the decade, so that means we can expect to see processors with the equivalent of 96 million transistors by the year 2000. Alternatively, we’ll be able to buy a dowdy old 80486 processor for $11.
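
    A quick back-of-the-envelope check of that extrapolation, as a sketch using the figures above (3 million transistors in 1992, doubling every eighteen months):

    ```python
    transistors, year = 3_000_000, 1992.0
    while transistors < 96_000_000:
        transistors *= 2   # one Moore's Law doubling
        year += 1.5        # every eighteen months
    print(transistors, year)  # 96000000 1999.5 -- five doublings land at 96 million around 2000
    ```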

    No single processor that can be imagined today needs 96 million transistors. The reality of the millennium processor is that it will be a lot smaller than the processors of today, and smaller means faster, since electrical signals don’t have to travel as far inside the chip. In keeping with the semiconductor makers’ need to add value continually to keep the unit price constant, lots of extra circuits will be included in the millennium processor—circuits that have previously been on separate plug-in cards. Floppy disk controllers, hard disk controllers, Ethernet adapters, and video adapters are already leaving their separate circuit cards and moving as individual chips onto PC motherboards. Soon they will leave the motherboard and move directly into the microprocessor chip itself.

    Hard disk drives will be replaced by memory chips, and then those chips too will be incorporated in the processor. And there will still be space and transistors left over—space enough eventually to gang dozens of processors together on a single chip.

    Apple’s Macintosh, which used to have more than seventy separate computer chips, is now down to fewer than thirty. In two years, a Macintosh will have seven chips. Two years after that, the Mac will be two chips, and Apple won’t be a computer company anymore. By then Apple will be a software company that sells operating systems and applications for single-chip computers made by Motorola. The MacMotorola chips themselves may be installed in desktops, in notebooks, in television sets, in cars, in the wiring of houses, even in wristwatches. Getting the PC out of its box will fuel the next stage of growth in computing. Your 1998 Macintosh may be built by Nissan and parked in the driveway, or maybe it will be a Swatch.

    Forget about keyboards and mice and video displays, too, for the smallest computers, because they’ll talk to you. Real-time, speaker-independent voice recognition takes a processor that can perform 100 million computer instructions per second. That kind of performance, which was impossible at any cost in 1980, will be on your desktop in 1992 and on your wrist in 1999, when the hardware will cost $625. That’s for the Casio version; the Rolex will cost considerably more.

    That’s the good news. The bad news comes for companies that today build PC clones. When the chip literally becomes the computer, there will be no role left for computer manufacturers who by then would be slapping a chip or two inside a box with a battery and a couple of connectors. Today’s hardware companies will be squeezed out long before then, unable to compete with the economies of scale enjoyed by the semiconductor makers. Microcomputer companies will survive only by becoming resellers, which means accepting lower profit margins and lower expectations, or by going into the software business.

    On Thursday night, April 12, 1991, eight top technical people from IBM had a secret meeting in Cupertino, California, with John Sculley, chairman of Apple Computer. Sculley showed them an IBM PS/2 Model 70 computer running what appeared to be Apple’s System 7.0 software. What the computer was actually running was yet another Apple operating system code-named Pink, intended to be run on a number of different types of microprocessors. The eight techies were there to help decide whether to hitch IBM’s future to Apple’s software.

    Sculley explained to the IBMers that he had realized Apple could never succeed as a hardware company. Following the model of Novell, the network operating system company, Apple would have to live or die by its software. And living, to a software company, means getting as many hardware companies as possible to use your operating system. IBM is a very big hardware company.

    Pink wasn’t really finished yet, so the demo was crude, the software was slow, the graphics were especially bad, but it worked. The IBM experts reported back to Boca Raton that Apple was onto something.

    The talks with Apple resumed several weeks later, taking place sometimes on the East Coast and sometimes on the West. Even the Apple negotiators scooted around the country on IBM jets and registered in hotels under assumed names so the talks could remain completely secret.

    Pink turned out to be more than an operating system. It was also an object-oriented development environment that had been in the works at Apple for three years, staffed with a hundred programmers. Object orientation was a concept invented in Norway but perfected at Xerox PARC to allow large programs to be built as chunks of code called objects that could be mixed and matched to create many different types of applications. Pink would allow the same objects to be used on a PC or a mainframe, creating programs that could be scaled up or down as needed. Combining objects would take no time at all either, allowing applications to be written faster than ever. Writing Pink programs could be as easy as using a mouse to move object icons around on a video screen and then linking them together with lines and arrows.
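
    As a rough illustration of that idea (this is not Pink’s actual programming model, just a sketch of object reuse), the same small objects can be wired together into different applications:

    ```python
    # Illustrative only: self-contained "objects" mixed and matched into applications.
    class Spreadsheet:
        def total(self, numbers):
            return sum(numbers)

    class Chart:
        def render(self, value):
            return "#" * int(value)

    class Printer:
        def send(self, text):
            print(text)

    sheet, chart, printer = Spreadsheet(), Chart(), Printer()

    # One application combines the objects one way...
    printer.send(chart.render(sheet.total([3, 5, 4])))  # ############

    # ...another reuses the same objects in a different combination.
    printer.send(str(sheet.total([10, 20])))            # 30
    ```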

    IBM had already started its own project in partnership with Metaphor Computer Systems to create an object-oriented development environment called Patriot. Patriot, which was barely begun when Apple revealed the existence of Pink to IBM, was expected to take 500 man-years to write. What IBM would be buying in Pink, then, was a 300 man-year head start.

    In late June, the two sides reached an impasse, and talks broke down. Jim Cannavino, head of IBM’s PC operation, reported to IBM chairman John Akers that Apple was asking for too many concessions. “Get back in there, and do whatever it takes to make a deal,” Akers ordered, sounding unlike any previous chairman of IBM. Akers knew that the long-term survival of IBM was at stake.

    On July 3, the two companies signed a letter of intent to form a jointly owned software company that would continue development of Pink for computers of all sizes. To make the deal appear as if it went two ways, Apple also agreed to license the RISC processor from IBM’s RS/6000 workstation, which would be shrunk from five chips down to two by Motorola, Apple’s longtime supplier of microprocessors. Within three years, Apple and IBM would be building computers using the same processor and running the same software—software that would look like Apple’s Macintosh, without even a hint of IBM’s Common User Access interface or its Systems Application Architecture programming guidelines. Those sacred standards of IBM were effectively dead because Apple rightly refused to be bound by them. Even IBM had come to realize that market share makes standards; companies don’t. The only way to succeed in the future will be by working seamlessly with all types of computers, even if they are made by competitors.

    This deal with Apple wasn’t the first time that IBM had tried to make a quantum leap in system software. In 1988, Akers had met Steve Jobs at a birthday party for Katharine Graham, owner of Newsweek and the Washington Post. Jobs took a chance and offered Akers a demo of NeXTStep, the object-oriented interface development system used in his NeXT Computer System. Blown away by the demo, Akers cut the deal with NeXT himself and paid $10 million for a NeXTStep license.

    Nothing ever came of NeXTStep at IBM because it could produce only graphical user interfaces, not entire applications, and because the programmers at IBM couldn’t figure out how to fit it into their raison d’être—SAA. But even more important, the technical people of IBM were offended that Akers had imposed outside technology on them from above. They resented NeXTStep and made little effort to use it. Bill Gates, too, had argued against NeXTStep because it threatened Microsoft. (When InfoWorld’s Peggy Watt asked Gates if Microsoft would develop applications for the NeXT computer, he said, “Develop for it? I’ll piss on it.”)

    Alas, I’m not giving very good odds that Steve Jobs will be the leader of the next generation of personal computing.

    The Pink deal was different for IBM, though, in part because NeXTStep had failed and the technical people at IBM realized they’d thrown away a three-year head start. By 1991, too, IBM was a battered company, suffering from depressed earnings and looking at its first decline in sales since 1946. A string of homegrown software fiascos had IBM so unsure of what direction to move in that the company had sunk to licensing nearly every type of software and literally throwing it at customers, who could mix and match as they liked. “Want an imaging model? Well, we’ve got PostScript, GPI, and X-Windows—take your pick.” Microsoft and Bill Gates were out of the picture, too, and IBM was desperate for new software partnerships.

    IBM has 33,000 programmers on its payroll but is so far from leading the software business (and knows it) that it is betting the company on the work of 100 Apple programmers wearing T-shirts in Mountain View, California.

    Apple and IBM, caught between the end of the mainframe and the ultimate victory of the semiconductor makers, had little choice but to work together. Apple would become a software company, while IBM would become a software and high-performance semiconductor company. Neither company was willing to risk on its own the full cost of bringing to market the next-generation computing environment ($5 billion, according to Cringely’s Second Law). Besides, there weren’t any other available allies, since nearly every other computer company of note had already joined either the ACE or SPARC alliances that were Apple and IBM’s competitors for domination of future computing.

    ACE, the Advanced Computing Environment consortium, is Microsoft’s effort to control the future of computing and Compaq’s effort to have a future in computing. Like Apple-IBM, ACE is a hardware-software development project based on linking Microsoft’s NT (New Technology) operating system to a RISC processor, primarily the R-4000, from MIPS Computer Systems. In fact, ACE was invented as a response to IBM’s Patriot project before Apple became involved with IBM.

    ACE has the usual bunch of thirty to forty Microsoft licensees signed up, though only time will tell how many of these companies will actually offer products that work with the MIPS/Microsoft combination.

    But remember that there is only room for two standards; one of these efforts is bound to fail.

    In early 1970, my brother and I were reluctant participants in the first draft lottery. I was hitchhiking in Europe at the time and can remember checking nearly every day in the International Herald Tribune for word of whether I was going to Vietnam. I finally had to call home for the news. My brother and I are three years apart in age, but we were in the same lottery because it was the first one, meant to make Richard Nixon look like an okay guy. For that year only, every man from 18 to 26 years old had his birthday thrown in the same hopper. The next year, and every year after, only the 18-year-olds would have their numbers chosen. My number was 308. My brother’s number was 6.

    Something very similar to what happened to my brother and me with the draft also happened to nearly everyone in the personal computer business during the late 1970s. Then, there were thousands of engineers and programmers and would-be entrepreneurs who had just been waiting for something like the personal computer to come along. They quit their jobs, quit their schools, and started new hardware and software companies all over the place. Their exuberance, sheer numbers, and willingness to die in human wave technology attacks built the PC business, making it what it is today.

    But today, everyone who wants to be in the PC business is already in it. Except for a new batch of kids who appear out of school each year, the only new blood in this business is due to immigration. And the old blood is getting tired—tired of failing in some cases or just tired of working so hard and now ready to enjoy life. The business is slowing down, and this loss of energy is the greatest threat to our computing future as a nation. Forget about the Japanese; their threat is nothing compared to this loss of intellectual vigor.

    Look at Ken Okin. Ken Okin is a great hardware engineer. He worked at DEC for five years, at Apple for four years, and has been at Sun for the last five years. Ken Okin is the best-qualified computer hardware designer in the world, but Ken Okin is typical of his generation. Ken Okin is tired.

    “I can remember working fifteen years ago at DEC,” Okin said. “I was just out of school, it was 1:00 in the morning, and there we were, testing the hardware with all these logic analyzers and scopes, having a ball. ‘Can you believe they are paying for us to play?’ we asked each other. Now it’s different. If I were vested now, I don’t know if I would go or stay. But I’m not vested—that will take another four years—and I want my fuck you money.”

    Staying in this business for fuck you money is staying for the wrong reason.

    Soon, all that is going to remain of the American computer industry will be high-performance semiconductors and software, but I’ve just predicted that we won’t even have the energy to stay ahead in software. Bummer. I guess this means it’s finally my turn to add some value and come up with a way out of this impending mess.

    The answer is an increase in efficiency. The era of start-ups built this business, but we don’t have the excess manpower or brainpower anymore to allow nineteen out of twenty companies to fail. We have to find a new business model that will provide the same level of reward without the old level of risk, a model that can produce blockbuster new applications without having to create hundreds or thousands of tiny technical bureaucracies run by unhappy and clumsy administrators as we have now. We have to find a model that will allow entrepreneurs to cash out without having to take their companies public and pretend that they ever meant more than working hard for five years and then retiring. We started out, years ago, with Dan Fylstra’s adaptation of the author-publisher model, but that is not a flexible or rich enough model to support the complex software projects of the next decade. Fortunately, there is already a business model that has been perfected and fine-tuned over the past seventy years, a business model that will serve us just fine. Welcome to Hollywood.

    The world eats dinner to U.S. television. The world watches U.S. movies. It’s all just software, and what works in Hollywood will work in Silicon Valley too. Call it the software studio.

    Today’s major software companies are like movie studios of the 1930s. They finance, produce, and distribute their own products. Unfortunately, it’s hard to do all those things well, which is why Microsoft reminds me of Disney from around the time of The Love Bug. But the movie studio of the 1990s is different; it is just a place where directors, producers, and talent come and go—only the infrastructure stays. In the computer business, too, we’ve held to the idea that every product is going to live forever. We should be like the movies and only do sequels of hits. And you don’t have to keep the original team together to do a sequel. All you have to do is make sure that the new version can read all the old product files and that it feels familiar.

    The software studio acknowledges that these start-up guys don’t really want to have to create a large organization. What happens is that they reinvent the wheel and end up functioning in roles they think they are supposed to like, but most of them really don’t. And because they are performing these roles — pretending to be CEOs — they aren’t getting any programming done. Instead, let’s follow a movie studio model, where there is central finance, administration, manufacturing, and distribution, but nearly everything else is done under contract. Nearly everyone — the authors, the directors, the producers — works under contract. And most of them take a piece of the action and a small advance.

    There are many advantages to the software studio. Like a movie studio, there are established relationships with certain crafts. This makes it very easy to get a contract programmer, writer, marketer, etc. Not all smart people work at Apple or Sun or Microsoft. In fact, most smart people don’t work at any of those companies. The software studio would allow program managers to find the very best person for a particular job. A lot of the scrounging is eliminated. The programmers can program. The would-be moguls can either start a studio of their own or package ideas and talent together just like independent movie producers do today. They can become minimoguls and make a lot of money, but be responsible for at most a few dozen people. They can be Steven Spielberg or George Lucas to Microsoft’s MGM or Lotus’s Paramount.

    We’re facing a paradigm shift in computing, which can be viewed either as a catastrophe or an opportunity. Mainframes are due to die, and PCs and workstations are colliding. Processing power is about to go off the scale, though we don’t seem to know what to do with it. The hardware business is about to go to hell, and the people who made all this possible are fading in the stretch.

    What a wonderful time to make money!

    Here’s my prescription for future computing happiness. The United States is losing ground in nearly every area of computer technology except software and microprocessors. And guess what? About the only computer technologies that are likely to show substantial growth in the next decade are — software and microprocessors! The rest of the computer industry is destined to shrink.

    Japan has no advantage in software, and nothing short of a total change of national character on their part is going to change that significantly. One really remarkable thing about Japan is the achievement of its craftsmen, who are really artists, trying to produce perfect goods without concern for time or expense. This effect shows, too, in many large-scale Japanese computer programming projects, like their work on fifth-generation knowledge processing. The team becomes so involved in the grandeur of their concept that they never finish the program. That’s why Japanese companies buy American movie studios: they can’t build competitive operations of their own. And Americans sell their movie studios because the real wealth stays right here, with the creative people who invent the software.

    The hardware business is dying. Let it. The Japanese and Koreans are so eager to take over the PC hardware business that they are literally trying to buy the future. But they’re only buying the past.

    Reprinted with permission

    Photo Credit: Anneka/Shutterstock

  • With new partner, VC firm Aberdare goes all in on digital health

    Venture capital firm Aberdare is evolving with the times. In response to technology, policy and demographic trends that are reshaping the healthcare industry, the group on Tuesday said it is refining its investment thesis.

    To support its growing emphasis on “transformational health investing,” the firm also said it had hired a new partner, Mohit Kaushal, an MD and MBA, who previously led investments at West Health and served as director of connected health at the Federal Communications Commission.

    “The old mantra was very much around improved outcomes as a thesis and then premium pricing on top of that, whether it’s a drug or a device. Aberdare came to the conclusion, from a very bottoms up approach, that the world is changing and that this was not a viable thesis any more,” said Kaushal. “The efficiency angle is just way more important these days.”

    Given the rising cost of healthcare and the growing elderly population, as well as shifts in the technology and policy landscape, big changes in the way health care is delivered and regulated are on the horizon. While outcomes are still critical, Kaushal says it’s become increasingly important to encourage cost-effective health care systems.

    With an investment portfolio that includes wearable electronics startup MC10, online diabetes prevention program Omada Health and health startup accelerator Rock Health, Aberdare is already a leader in the emerging world of digital health. But Kaushal said the firm will focus even more closely on three areas in healthcare: personalized medicine (including diagnostics and genomics), smart sensors and health care IT.

    As various reports have shown, venture capital interest in life sciences companies is on the decline. Some firms like Aberdare and Venrock, which historically invested in more traditional pharmaceutical and device companies, were early supporters of startups that bring newer digital technologies to healthcare.  But others have been slower to shift their focus. Going forward, it will be interesting to see how the rest of the industry shakes out.


  • VC-Backed BTI Systems Names Tod Nielsen To Its Board

    Ottawa-based BTI Systems Inc. has named software industry veteran Tod Nielsen to the company’s board of directors. Nielsen recently helped establish the EMC/VMware spin-off Pivotal Initiative. Prior to that he was with VMware, Borland and Microsoft. BTI, a provider of intelligent networking software and systems, has been venture-backed since 2000. Its investors include Bain Capital Ventures, BDC Venture Capital, Covington Funds, Export Development Canada, GrowthWorks and Kodiak Venture Partners.

    PRESS RELEASE

    BTI Systems Adds Former VMware Executive Tod Nielsen to Board of Directors

    Internationally Respected Industry Leader and Former VMware Co-President of Applications Platform Group Brings Significant Software, Cloud Computing and Big Data Market Expertise

    OTTAWA and BOSTON, April 16, 2013 – Following closely on BTI Systems’ introduction of Intelligent Cloud Connect – an open, software-rich platform designed to significantly improve cloud networking performance by combining network intelligence and application awareness with significantly expanded capacity and scale – the company today announced that software industry visionary Tod Nielsen is joining its board of directors.

    Based in Silicon Valley and globally recognized for his combination of strategic vision and operational capabilities, Nielsen most recently helped establish the Pivotal Initiative, the highly anticipated and closely watched EMC/VMware spin-off focusing on cloud computing and Big Data. Prior to that he was with VMware, where he served as co-president of applications platform group and, earlier, as chief operating officer.

    With cloud network traffic expected to increase 6-fold by 2016, Nielsen brings world-class applications and software industry leadership to BTI’s board. In February, the company expanded its portfolio with a first-of-its-kind platform, Intelligent Cloud Connect, which empowers content and service providers to capitalize more fully on opportunities in cloud services delivery. Combining software intelligence with the flexibility of routing and the capacity and scale of optical, BTI’s new platform unlocks the bottleneck between and among data centers, peering points and users. Notably, it quadruples capacity and scale, reduces latency by half, and increases network applications performance by as much as 10 times. It does so while significantly improving operational efficiencies, network control and service innovation compared to legacy solutions.

    “The rapid adoption of cloud computing represents one of the most important and promising shifts for customers and vendors alike,” said Nielsen. “With this comes the demand for new and better architectures to greatly improve the performance of the underlying network and to fuel current and new service offerings. I’m excited to join the board at a time when BTI’s vision, strategy, technology and partnerships are enabling the company to establish its role as a key player in this new ecosystem.”

    Nielsen has held a series of high-profile positions during the past two decades. Prior to the Pivotal Initiative and VMware, he served as the CEO of Borland Corporation. Earlier, he held the positions of senior vice president of marketing and global sales support for Oracle and executive vice president and chief marketing officer for BEA Systems. He is a member of the board of directors at CyrusOne Inc, MyEdu and Club Holdings LLC.

    “BTI is making cloud computing a reality for our content and service provider customers – fueled by the vision, strategy and next-generation innovations required to fundamentally transform their businesses and the services they offer,” said BTI President and CEO Steve Waszak. “We’re thrilled to have a proven thought leader of Tod’s caliber and expertise join us at a time when BTI is entering our next phase of substantial expansion. It comes on the heels of announcing a new round of growth capital, an innovative SDN-enabled platform that represents a new market category for our industry, and expanding our established presence in Silicon Valley. Gaining Tod’s contribution to our vision and direction is a significant competitive advantage.”

    About BTI Systems
    BTI delivers solutions that transform the economics, performance and innovation of global networks through intelligent networking software and systems. Leading content, cloud and service providers choose BTI to drive improved operational efficiencies and profitably deliver high-value services to businesses and consumers around the globe. With more than 350 customers, BTI is headquartered in North America, and operates regional sales, marketing, and R&D centers of excellence throughout the world. For more information, visit www.btisystems.com.

    Photo courtesy of Shutterstock.

    The post VC-Backed BTI Systems Names Tod Nielsen To Its Board appeared first on peHUB.

  • Reducing Tobacco-Related Cancer Incidence and Mortality: Workshop Summary

    Final Book Now Available

    Tobacco use is the leading cause of preventable death in the United States, causing more than 440,000 deaths annually and resulting in $193 billion in health-related economic losses each year–$96 billion in direct medical costs and $97 billion in lost productivity. Since the first U.S. Surgeon General’s report on smoking in 1964, more than 29 Surgeon General’s reports, drawing on data from thousands of studies, have documented the overwhelming and conclusive biologic, epidemiologic, behavioral, and pharmacologic evidence that tobacco use is deadly. This evidence base links tobacco use to the development of multiple types of cancer and other life-threatening conditions, including cardiovascular and respiratory diseases. Smoking accounts for at least 30 percent of all cancer deaths, and 80 percent of lung cancer deaths. Despite the widespread agreement on the dangers of tobacco use and considerable success in reducing tobacco use prevalence from over 40 percent at the time of the 1964 Surgeon General’s report to less than 20 percent today, recent progress in reducing tobacco use has slowed. An estimated 18.9 percent of U.S. adults smoke cigarettes, nearly one in four high school seniors smoke, and 13 percent of high school males use smokeless tobacco products.

    In recognition that progress in combating cancer will not be fully achieved without addressing the tobacco problem, the National Cancer Policy Forum of the Institute of Medicine (IOM) convened a public workshop, Reducing Tobacco-Related Cancer Incidence and Mortality, June 11-12, 2012 in Washington, DC. In opening remarks to the workshop participants, planning committee chair Roy Herbst, professor of medicine and of pharmacology and chief of medical oncology at Yale Cancer Center and Smilow Cancer Hospital, described the goals of the workshop, which were to examine the current obstacles to tobacco control and to discuss potential policy, outreach, and treatment strategies that could overcome these obstacles and reduce tobacco-related cancer incidence and mortality. Experts explored a number of topics, including: the changing demographics of tobacco users and the changing patterns of tobacco product use; the influence of tobacco use on cancer incidence and cancer treatment outcomes; tobacco dependence and cessation programs; federal and state level laws and regulations to curtail tobacco use; tobacco control education, messaging, and advocacy; financial and legal challenges to tobacco control efforts; and research and infrastructure needs to support tobacco control strategies, reduce tobacco related cancer incidence, and improve cancer patient outcomes. Reducing Tobacco-Related Cancer Incidence and Mortality summarizes the workshop.

    [Read the full report]

    Topics: Health and Medicine

  • Google releases Mirror API to developers, complete with documentation and code examples

    If you’re waiting on your Google Glass unit to ship, you can get prepared for it by taking a look at a new API that Google has released for Glass developers. The Mirror API has been released with complete documentation and some code examples for developers to get their feet wet so they’ll already have something ready for when Google Glass does arrive. Getting a head start never hurts, and I’m sure Google knows that. Google has highlighted some major features in the API as well as given some examples and guidelines for ensuring the best user experience.
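
    For a flavor of what a first call looked like, here is a hedged sketch of inserting a simple timeline card, based on how the Mirror API was documented at the time. The endpoint, field names and the placeholder token are assumptions on my part, so check Google’s documentation for the authoritative details:

    ```python
    # Sketch only: push a plain text card to a Glass wearer's timeline via the Mirror API.
    # ACCESS_TOKEN is a hypothetical OAuth 2.0 token with the Glass timeline scope.
    import json
    import urllib.request

    ACCESS_TOKEN = "ya29.EXAMPLE"
    card = {"text": "Hello from my first Glassware!"}

    req = urllib.request.Request(
        "https://www.googleapis.com/mirror/v1/timeline",
        data=json.dumps(card).encode("utf-8"),
        headers={
            "Authorization": "Bearer " + ACCESS_TOKEN,
            "Content-Type": "application/json",
        },
        method="POST",
    )

    with urllib.request.urlopen(req) as resp:  # the API echoes back the created timeline item
        print(resp.status, resp.read().decode())
    ```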

    With all the rumors of smartwatches lately, it’s pretty obvious wearable technology is going to start making waves in consumer markets fairly soon. But, like we’ve seen with some mobile OSes, if there are no developers or applications, it’s tough to get the platform off the ground. Google made sure that wasn’t a problem for Android, and it looks like they’re taking steps to make sure Glass is a repeat experience.

    source: Google Developers

  • Gracenote co-founder on ‘iPod day’ and better music through data

    It was April 2000 when the team at Gracenote got a call from Apple that would change its business forever. Apple wouldn’t give Gracenote any specifics, but it did offer up some prescient advice: “You need to buy more servers.”

    A few years into Steve Jobs’s second stint as Apple’s CEO, the company hadn’t yet reinvented itself as one of the world’s most-important technology companies, but it was a big-enough distribution channel for the two-year-old Gracenote. At that point, Gracenote had built a respectable business collecting and providing metadata for the compact discs that people were ripping onto their computers, and it relied on software partners to get in front of the music consumers doing the uploading. One of those partners was a popular Mac jukebox application called SoundJam MP.

    Ty Roberts. Source: Gracenote

    So, Gracenote Co-founder and CTO Ty Roberts told me during a recent interview, his company heeded Apple’s warning and bought more servers. At some point around that time (details on the date of the acquisition are sketchy), Apple bought SoundJam MP. Then, at MacWorld in January 2001, Apple released the first version of iTunes (based on the SoundJam technology) and grew Gracenote’s footprint by putting it on more machines. In October 2001, Apple released the iPod and changed Gracenote’s life forever.

    The holiday season — particularly Christmas morning — provides a clear example of how stark the change was. “We used to call it iPod day,” Roberts explained, because the company’s servers would go crazy as people opened up their new iPods and immediately began ripping CDs onto their computers. The company’s chief scientist would stay up 20 hours a day for 5 days straight to make sure the database didn’t crash under the load.

    From that point on, Roberts explained, a graph showing the rate at which people were uploading music to Gracenote would go from a steady incline into a vertical line. At one point the company was getting metadata from — by Roberts’s estimate — literally every CD being ripped onto personal computers. There was so much database traffic — both writing and reading — because Apple didn’t release the first version of the iTunes Store until April 2003; if users wanted to use their iPods, they had to upload music first.

    Scaling like the big boys

    Today, of course, Gracenote (which Sony acquired for $260 million in 2008) is pretty much ubiquitous, at least when it comes to metadata. It has metadata for about 130 million songs — and growing — from all over the world and provides metadata to everything from iTunes to Path to your car’s entertainment console. Even if they’re not available for sale as MP3, if someone somewhere at some point ripped a CD and entered its information, Gracenote has data on those artists and songs.

    Its database now gets 15 billion queries a month, or 500 million a day (“We’re probably bigger than Bing,” Roberts joked), and the company’s infrastructure has scaled a few times to meet this demand. What began as a small web database running on a few servers grew into an Oracle environment that provided better performance. And when Oracle became cost-prohibitive because of Gracenote’s expanding scale, it shifted again into a highly optimized system that spans thousands of cores in four global data centers.

    Now, GM and VP of Automatic Content Recognition Michael Jeffrey noted, almost everything from the chip level up is optimized specifically for Gracenote.

    There’s no “world music” when you’re in the “world”

    And this setup lets Gracenote do a lot more than just recognize music listeners’ files and give them the album art. For one, Roberts explained, it lets Gracenote be a global company. “We want to have all the music in the world,” Roberts said, “… because our customers ship their products globally.” In fact, part of the reason it’s now part of Sony is that Sony was distributing Gracenote so widely as part of the music player in its Vaio line of laptops.

    In order to ensure that everyone has a natural experience wherever they’re accessing Gracenote, part of the job of the company’s 100-person editorial team is to categorize music hierarchically by locality. So, when a user in Japan uploads a CD and Gracenote returns the metadata, it’s categorized as “rock and roll,” for example, rather than a catch-all category like “world music” that a U.S. user might see.
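
    A toy sketch of that lookup (my own illustration, not Gracenote’s actual schema): each recording carries locale-specific genre labels, with a generic label only as a fallback:

    ```python
    # Illustrative only: locale-aware genre lookup with a catch-all fallback.
    GENRES = {
        "jp-rock-album-123": {
            "ja-JP": "Rock and Roll",   # what a listener in Japan sees
            "default": "World Music",   # catch-all another market might otherwise get
        },
    }

    def genre_for(album_id: str, locale: str) -> str:
        entry = GENRES.get(album_id, {})
        return entry.get(locale, entry.get("default", "Unknown"))

    print(genre_for("jp-rock-album-123", "ja-JP"))  # Rock and Roll
    print(genre_for("jp-rock-album-123", "en-US"))  # World Music
    ```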

    “We want music to feel like a person in your country actually organized it,” Roberts said, “not some dude from California.”

    Better music and television through data science

    All that data also makes Gracenote a natural fit for recommending new music, although right now the company prefers to let partners handle the algorithms because recommendations tend to be highly product-specific. For example, the iTunes Genius feature is a pretty run-of-the-mill recommendation engine, but, Roberts explained, Apple places a premium on accuracy because its recommendations cost users 99 cents (or more) a shot. With a subscription service like Spotify, though, trying new music is risk-free, so it can play a little faster and looser with its algorithms.

    Because Gracenote is present in so many cars — about 35 million — the company has put a lot of thought into how to optimally deliver services there, too. Until drivers can bring their interest graphs and music libraries with them to their cars, Roberts explained, any sort of in-car recommendation engine has to be pretty simple and non-distracting — perhaps a thumbs-up or thumbs-down button on the display that will eventually be able to recognize someone’s tastes.

    The company has even developed what Roberts calls “machine listening,” which is the ability of an algorithm to recognize the mood, tempo and other audio attributes of music. This is comparable to what Pandora offers, but Gracenote has data on pretty much any song someone could possibly have, which means it can make even your personal music library that much smarter. One idea the company is tinkering with is something Roberts describes as “audio coffee.” Depending on any variety of factors — time of day, location, driving conditions or behavior — the stereo system could pick music that either picks up a driver’s pulse or maybe relaxes him.

    For Gracenote’s next chapter, the company is banking on tablets to deliver a kick like the iPod did last decade. Gracenote is already working with television partners on real-time ad-swapping and intelligent content recommendations, and now it wants to dive deep into the second-screen world. Its new product called Entourage uses a tablet’s internal sensors to hear the television show or music playing in a room and then surface related content, perhaps from the web — like what Entourage user Zeebox provides — or perhaps produced, interactive material like the SyFy channel delivers via its Sync app.

    Two ads for two different viewers.

    Later this year, GM and VP Jeffrey said, Gracenote will be doing pilots with some large sports broadcasters around a “cheer and jeer” feature that measures how hard people in a room are cheering for or booing their favorite sports teams. If you’re elated, you might see an ad for season tickets. If you’re sad, maybe it’s an ad for beer.

    Even Roberts is impressed, especially considering that the company’s first use of audio recognition was to make sure users got the right data for their exact version of a song: “I never thought the recognition would break open these kind of new fields.”


  • Android malware infections found to have tripled in 2012

    Anyone developing malicious software for mobile devices has set his or her sights almost exclusively on Android at this point. Mobile security vendor NQ has found that Android devices infected with malware grew from 10.8 million in 2011 to 32.8 million in 2012, meaning that the total number of infected devices tripled year-over-year. NQ also found that almost 95% of malware detected in 2012 was designed specifically for Android devices, meaning Google’s mobile operating system is by far the No. 1 target for would-be cybercriminals.

  • Google Glass Explorer units now ready to begin shipping

    Google has recently sent out an email announcing that the Explorer Google Glass units are ready to begin shipping. So if you signed up as a developer for the project last year, it shouldn’t be too much longer before you can start making fashion statements in your town.

    This news matches up with what we’d heard previously about Glass shipping dates, so we can expect the early supporters to have their units by the time Google I/O rolls around. It should be pretty exciting to see what everyone does with these.

    source: Android Central

  • Twitter shows how the news is made, and it’s not pretty — but it’s better that we see it

    Not long after the Boston Marathon bombings occurred on Monday afternoon, several Twitter users noted that these kinds of real-time news events illustrate how incredible the service is as a source of breaking news, but at the same time how terrible it is. Sure enough, there were plenty of fake news reports to go around on Monday, from reports of suspicious vehicles to the arrest of alleged perpetrators — just as there were during Hurricane Sandy and the school shootings in Connecticut. But does that invalidate Twitter as a news source? And should the service try harder to filter out bad information and highlight verified news reports? I think the answer to both of these questions is the same: No.

    Erik Wemple of the Washington Post noted that in some cases Twitter can act as a “news ombudsman,” pointing out that there were a number of people advising caution in the tweeting and re-tweeting of details about the blasts, although Wemple may also have been following more members of the media than the average person (ironically, some criticized Wemple himself for being too quick to post his thoughts about Twitter use during the aftermath of the bombings).

    This in itself illustrates one of the problems with Twitter as a news-delivery vehicle, which is that no one can agree on the proper behavior during such events — or at least not enough people to make it worthwhile. When (if ever) is it too soon to speculate about the source of the attack or details like the number of wounded? Which sources are reliable and which aren’t when it comes to retweeting? Does everything have to be verified? Is it okay to retweet graphic videos and photos?

    Journalism in real time, with all its flaws

    These are all the same challenges that breaking-news outlets like CNN face, but they have teams of seasoned editors to make those decisions (and still often get them wrong — perhaps even as wrong as Twitter does). Twitter has nothing but a short attention span, a hair trigger and a couple of buttons that say “tweet” and “retweet,” and they are all too easy to push. Should more people think twice before they click them? Undoubtedly. Will they? Probably not.

    That said, however, there’s no question that Twitter is one of the best tools for breaking-news delivery since the telegraph. Unfortunately, it is also a great tool for distributing lies, speculation, innuendo, hoaxes and every other form of inaccurate information. I’ve argued before that this is just the way the news works now — the news wire and police scanner are no longer available only to journalists, but to anyone who cares to listen. And so is the ability to republish.

    Should Twitter do more to verify sources, or highlight accurate information, as some have suggested? It’s an appealing idea. The service could try to use geotagging to identify those who are close to the scene, or some other method to determine credibility — something third-party services like Sulia and Storyful also try to do through a variety of methods. But is that really Twitter’s place?

    Leave verification to the journalists

    Why don’t we get YouTube to verify the source of videos as well, like the ones that are posted from Syria or Egypt? Or get Google to sort the news it pulls in based on the likelihood of it being credible? The simplest answer is that this isn’t what those services are for — they are distribution engines, or pipes (a series of tubes, if you will). Asking them to become news entities is a little like asking AT&T to eavesdrop on phone calls in order to figure out who is a terrorist.

    Rather than relying on Twitter to do this, I think it’s far better to accept the somewhat chaotic nature of the medium, and rely on journalists — and not just the professional kind, but the amateur kind as well — to filter that information in real time, the way Andy Carvin did during the Arab Spring (by using Twitter as a crowdsourced newsroom) and others did during Hurricane Sandy and the Colorado shootings. Over time, I believe, Twitter becomes a kind of self-cleaning oven, as writer Sasha Frere-Jones put it.

    Sure, it’s messy and erratic, but that’s because it is made of human beings. Traditional media is like that too; we just rarely see it happening out in the open. But I believe that having it happen out in the open is ultimately better than keeping it behind closed doors.

    Post and thumbnail images courtesy of Flickr user Petteri Sulonen


    • MetroPCS board unanimously approves new T-Mobile merger terms

      The merger between T-Mobile and MetroPCS is very close to getting official now that Deutsche Telekom’s revised offer has led some major shareholders to drop their objections to the deal. MetroPCS announced on Monday that its board of directors had unanimously approved the new merger terms and said that the revised deal “significantly improves the value of the proposed combination for MetroPCS stockholders” while adding that the proposed merger “is in the best interest of all MetroPCS stockholders.” The MetroPCS merger is the linchpin of T-Mobile’s strategy to expand its operations in the United States since the prepaid wireless carrier already offers LTE services in several major metropolitan markets. MetroPCS shareholders are scheduled to vote to approve or reject the merger on April 24th.