Category: News

  • Setting up for TED2013: Backstage photos

Backstage setup starts on the TED2013 stage in Long Beach.

    A monumental amount of effort goes into setting up for TED. This year’s iteration in Long Beach is no different. Construction of our stage started on Wednesday, and it wasn’t long before the entire Long Beach Performing Arts Center was full of the buzz of technicians, workers and artists getting ready for next week.

    Setting up for TED 2013

    Artist and TED2013 speaker Jordy Fu (center, in green jacket) works on the installation of her cut-paper piece “Cloud” (2013) for the lobby of the Long Beach Performing Arts Center. Above her hangs Gabriel Dawe’s “Plexus C2,” installed for TED2012 and now part of the permanent collection at the LBPAC.

Hanging speaker signs in the lobby at the Long Beach Performing Arts Center.

Reflections of the TED2013 speakers in the windows of the Long Beach Performing Arts Center

    Photos: James Duncan Davidson (photos 1 and 3); Ryan Lash (photos 2 and 4)

  • Save big bucks and protect your PC! Hurry! Deals end Feb. 28

We’re now well into the new year, with the February software offers live through the Downloadcrew Software Store.

You may have picked up a brand new Windows 8 computer during the festive season. If so, the first thing you’ll want to do is grab a security suite to keep your system secure. The Downloadcrew Software Store is packed full of security offers from Bitdefender, AVG, Kaspersky, Avira and other brands.

Let’s start with Bitdefender. If you’re seeking a lightweight, low-cost security application for one PC, we’d like to draw your attention to Bitdefender Antivirus Plus 2013, a staggering 83 percent off MSRP. Bitdefender Windows 8 Security is the company’s brand new offering, and can be yours for $19.95, 67 percent off MSRP. There’s a saving of 63 percent on Bitdefender Total Security 2013 [3-PC], which includes a strong antivirus engine, intelligent two-way firewall and capable spam filter – all for just $29.95. If you want to keep your computer safe for even less, look no further than Bitdefender Internet Security 2013 [1-PC], which still packs a punch for just $14.99, saving you 70 percent off the MSRP.

    You’ll also find the latest versions of the most popular security tools including AVG Anti-Virus 2013. This industry standard antivirus tool is available for the massively discounted price of $6.95 – that’s a colossal 83 percent off the MSRP. For a more complete security solution, take a look at AVG Internet Security 2013 which includes an added firewall, spam filter and more – all for just $17.99 (or a 67 percent discount). The most complete security package comes in the form of AVG Internet Security 2013 Complete Bundle, which includes FIVE separate AVG applications, worth $307 – yours for only $59.95, a staggering 80 percent off MSRP!

We recently added the Norton 2013 products to the store. Highlights include Norton Internet Security 2013 [1-PC] for $23.99, or 60 percent off MSRP. The simpler anti-malware tool Norton AntiVirus 2013 [1-PC] is $19.99, or 50 percent off MSRP. Multi-user editions are also available. Other Symantec products include Norton Anti-Theft at $27.99, or 30 percent off MSRP.

The avast! products were also added to the store through January. With avast! 8 around the corner, buy avast! 7 products now and you’ll get a free upgrade to avast! 8 on release in March. avast! Internet Security 7 [3-PC] will safeguard three computers for $41.99, or 40 percent off MSRP. The more basic avast! Pro 7 [1-PC] is yours for $23.99, again 40 percent off MSRP.

Everyone needs a backup tool, whether to store important files in a secure location or to make a 1:1 copy of your hard drive to safeguard against system failure. There are few better backup tools than Acronis True Image 2013, available for only $26.99, which is currently 46 percent off MSRP. Through February, you can purchase Genie Timeline Professional 2012 for only $27.95, 53 percent off MSRP, plus you get a free upgrade to the forthcoming Genie Timeline Professional 2013, due for release this March. Finally, an alternative backup tool is Backup4All Professional 4.8, which you can buy for $24.95, or 50 percent off MSRP.

    System maintenance and PC optimization tools help keep your brand new system in check while you continue to enjoy using your PC. System maintenance tools enable you to remove unwanted startup items, erase clutter and temporary files, personalize your computer and much more.

Our first recommendation is TuneUp Utilities 2013. You can save 60 percent on TuneUp Utilities 2013 [3-PC], ideal for installing on three home computers. If you have a single PC, save even more with TuneUp Utilities 2013 [1-PC]. There is also TuneUp Utilities 2013 [UPGRADE] for existing users to upgrade at a significant discount. Another recommended maintenance tool is AVG PC TuneUp 2013, effectively a re-branded TuneUp Utilities, which helps you eke extra performance from your machine. This great utility costs a mere $12.99, a saving of 68 percent off the MSRP. Auslogics BoostSpeed 5 is also worth considering at $19.95, 60 percent off MSRP, as is IObit Advanced SystemCare Ultimate 6, another great all-round system maintenance suite, available for only $24.99, or 50 percent off MSRP.

Be sure to check the Downloadcrew Software Store for other software deals and brands. In addition to the above offers, there are further deals from MAGIX, CyberLink, Avira, PC Tools, BullGuard and many others. Hurry, though, as some of the current offers end February 28, 2013!

    Helder Almeida/Shutterstock

  • YC’s iCracked Is Blowing Up With A New “Uber” For iPhone Repairs Service


    Yes, you can fix that smashed iPhone on demand now. That means no visits to the Apple store, or intensive DIY efforts.

A YC alum called iCracked launched a real-time iPhone and iPad repair service a little over a month ago.

    Think of it like an “Exec” or an “Uber” for your broken iPhone that you can order straight to your door.

    With hardly any publicity at all, the service is blowing up: it boosted iCracked’s number of monthly customers by about 250 percent and the company tells me the business is eyeing “eight figures” in revenue for this year. The changes add iCracked to a growing class of startups like Exec, Uber, Zimride’s Lyft, Instacart and Postmates that are all trying to solve the logistical issues of delivering products and services in real-time in urban cities.

    “We want to be the ‘AAA’ for your device,” explains AJ Forsythe, the company’s CEO. “We’re doing on-demand repair and buyback for just about every major city in the U.S.”

    He shared some of the maps above and below with us, showing actual completed repairs in the last 30 days. Above is the San Francisco Bay Area, and just for good measure to show that this isn’t a Silicon Valley-only phenomenon, he showed us a map of South Florida (below).

    “We’re trying to get to a place where we can get someone to them in the shortest amount of time at the click of a button,” he said. He partnered with a 20-year-old from the U.K. named Martin Amps, who had built a dispatch system just months ago. Amps never implemented it because it was so specialized, but Forsythe found him on a Hacker News posting and thought the system could be of use to iCracked.

Up until then, iCracked’s three-prong business model worked similarly, but it didn’t operate in real time. Customers would have to mail in their devices or schedule appointments with iTechs.

    iCracked earns revenue in three ways: it does 1) repairs, 2) buybacks and 3) sells do-it-yourself kits (pictured right) for people who want to fix phones themselves.

    The company has more than 350 “iTechnicians,” who work as contractors and are trained to quickly fix broken iPhones and iPads. They earn decent salaries of between $70,000 and $100,000 a year. Forsythe says he’s selective and he only ends up hiring about 2 to 3 percent of iTech applicants.

While these “iTechs” aren’t full employees of the startup, iCracked earns revenue by selling them parts and connecting them with customers. Depending on whether it’s an iPhone or iPad and the kind of problem a customer has — whether that’s a screen or battery replacement or water damage — costs hover around $75 to $99. But an iPad LCD replacement can top $200 with the mail-in service. If you don’t spring for AppleCare, iCracked beats the cost of paying for an entirely new device or spending more than $200 on a replacement phone.

The “iTechs” make up about 50 percent of iCracked’s revenues, while 30 percent comes from the DIY kits and the remaining 20 percent comes from buybacks, where the company will pay to take old, unused iPhones or iPods off people’s hands.

    The new real-time dispatch service will also change the buyback program. Before, iPhone owners would have to mail in their devices, get an appraisal seven to 10 days later and then get a check in the mail after that.

    Eventually, iCracked will be able to send out an iTech immediately, who will estimate the value of the device, and then give the customer a prepaid debit card for that amount on the spot, which can be redeemed at any local ATM.

    This complex, real-time dispatch system is a far cry from where iCracked started. It’s one of those humble “dorm room” businesses that emerged out of Forsythe’s time as an undergrad at Cal Poly-SLO. He gained a reputation on campus as someone who could quickly fix iPhones on the cheap. He then turned it into a business, and started charging people at school $75 per fix.

    Eventually, he started scaling up iCracked by finding makers of inexpensive screens and then hiring and training other people to repair devices. After that, he joined Y Combinator’s winter class of startups last year.

    The business has some angel investment, but Forsythe says he’s shied away from doing a full Series A round. They’re starting to look for additional growth capital now, however.

    “We have this thing called — ‘hardware,’” he joked, poking fun at how venture investors seem to favor software startups.

  • Huawei Prepares To Unveil Ascend P2 Smartphone — Smaller Screen Sibling To The Ascend D2 Android Phablet?


    After unboxing a pair of phablets at CES, Chinese mobile maker Huawei looks to be lining up a new flagship smartphone in its Android-based Ascend P line ahead of the Mobile World Congress trade show kicking off in Barcelona Monday. The company, which pushed into third place in the global smartphone rankings for the first time in Q4, has sent out invites to a press conference taking place tomorrow afternoon (CET).

    It’s not confirmed what device or devices Huawei plans to unveil tomorrow — the invite includes the cryptic tagline “Discover possible” – but CNET‘s Stephen Shankland has snapped a photo of Huawei’s MWC booth, currently under construction before the crowds arrive on Monday, which includes a sign for an as yet unreleased device called the Ascend P2.

Judging by the name, the Ascend P2 is the sequel to the Ascend P1, which launched in Europe last summer. P stands for ‘Platinum’ in Huawei’s marketing speak — one rung down from its top-of-the-line D for Diamond devices, such as the 5-inch Ascend D2, which it outed at CES along with the 6.1-inch Ascend Mate (its Galaxy Note rival). Those quad-core whoppers leave room in Huawei’s portfolio for a powerhouse smartphone with a slightly less palm-stretching screen. So, enter stage left the Ascend P2. Either that or it has a typo in its booth signage.

Aside from an LTE variant, the Ascend P1 was a relatively mid-range affair — with a dual-core 1.5GHz chip, 4.3-inch display and 8 megapixel camera. The Ascend P2 is rumoured to add more beef the second time around, with various leaks hinting at a 1.8GHz quad-core chipset — which would give it more welly than either the Ascend Mate or the D2 — along with a 4.7-inch display, a 13 megapixel camera and Android 4.1. We’ll find out for sure tomorrow.

Huawei can’t claim to have the massive brand clout of Samsung and its Galaxy range, but its mobile profile is growing and it has carved out a savvy niche for itself in the Android space by offering relatively impressive specs for the device’s price point — which, in its Ascend G range, has helped to power up the functionality of budget Androids. The company is also taking a similar tack with Windows Phone — showing off an “entry level” Windows Phone 8 device at CES, the Ascend W1, and partnering with Microsoft to launch an “affordable” Windows Phone device for the African market.

  • The Next Range Rover, On and Off-Road Review


There are two types of Range Rover enthusiasts out there: those who like to be seen in them, and those who actually purchase them for their off-road abilities. Unfortunately, the vast majority of owners (more like 99 percent) dump big money on these wonderful machines for the sole purpose of being seen. Matt Farah of The Smoking Tire recently headed to Arizona on Range Rover’s dime to sample the new 2013 line of Rovers, so make sure to check it out after the jump.

    Source: TheSmokingTire.com

  • When all other Windows troubleshooting fails, try WMI Diagnostic Utility

    Windows Management Instrumentation (WMI) is an important Windows framework that is used by many system components, as well as plenty of third-party applications, so if it’s ever damaged then you could experience all kinds of odd system problems. There’s no single place that you can check to see whether WMI is working, either, as it’s just too complex, and so Microsoft has developed a script called the WMI Diagnostic Utility to provide some in-depth troubleshooting information.

    The tool is aimed at system administrators and other IT professionals, so if you’re a Windows novice then it’s probably best to stay away. If you’ve even just a moderate level of PC experience, though — you’ve no problems running the occasional tool at the command line, say – then it could be worth a look.

    To run it, extract the contents of the download to somewhere safe, launch an elevated command prompt (right click cmd.exe, select “Run as administrator”), change to your new folder, and enter cscript wmidiag.vbs.

    And then wait. WMI really is massive, so the script has a lot to do, and it’ll take some time to gather the necessary information (four to five minutes on our test PC). The command window will update occasionally with details of the current test, though. And when it’s done, Notepad (or whatever else is your default for plain text files) will open to display the finished report.

    As you’re scanning the details, you’ll find a lot of very low-level, technical information which isn’t going to mean very much at all. We were told that our test PC had 1848 “WMI static instances”, but no “WMI dynamic instances”, for example. Is this normal, good, bad…? We don’t have the faintest idea.

    Amongst all this, though, are plenty of nuggets which you may find useful. So we were told that our system had no WMI system or repository files missing, for instance: just knowing that may be helpful if you’re trying to diagnose some odd Windows problem.

The script also complained that a Registry setting wasn’t what it expected, telling us the precise key and what its value should be. Might that also be useful? We don’t know, but again it gives you a starting point. If something similar happened on your system, you could at least enter the key name at Google, see what it’s for, and whether the setting might relate to any issues you’re having.
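Since the report is a long plain-text file, a short script can skim it for the flagged lines rather than wading through all the low-level detail. This is a minimal sketch that assumes the report marks problems with "WARNING"/"ERROR" strings; the marker choice and the sample lines below are illustrative assumptions, not actual WMIDiag output.

```python
def extract_findings(report_text: str) -> list[str]:
    """Return only the lines flagged as warnings or errors from a
    plain-text diagnostic report, skipping the informational noise.
    The marker strings are an assumption about the log format."""
    markers = ("WARNING", "ERROR")
    return [
        line.strip()
        for line in report_text.splitlines()
        if any(marker in line.upper() for marker in markers)
    ]

# Invented sample lines, for illustration only:
sample = """\
10:00:01 - INFO: 1848 WMI static instance(s) found.
10:00:02 - WARNING: registry value differs from the expected default.
10:00:03 - ERROR: WMI repository file missing.
"""
for finding in extract_findings(sample):
    print(finding)
```

A filter like this won’t interpret the results for you, but it shrinks a multi-thousand-line report down to the handful of lines worth researching.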

    And the script, as well as the documentation which comes with it, also explains how you can fix some problems by running command line tools such as WMIDIAG.

    This still isn’t a tool you’ll need to run very often. If your PC is taking an age to boot, say, you’ll be better off following the usual troubleshooting tips first (check your startup programs, your Windows services, clean and defrag your hard drive, and so on).

But, if you’re suffering from major Windows problems, all the regular solutions have failed and you can’t find an answer (or, maybe, you’ve just heard somewhere that your type of problem might be WMI-related), then it may help to run the WMI Diagnostic Utility as a last resort. The tool checks a lot of Windows components, and there’s just a chance that it could uncover something useful.

    Photo Credit: Vladru/Shutterstock

  • Google has a problem with “long-tail” searches, and it needs Quora to help fix it

    The queries we type into Google can be broadly classified into two groups: head queries, or general keyword searches of less than three words; and long tail queries, or specific searches using a phrase or several words. The latter long tail queries account for a significant portion of the searches on Google (with many sources claiming as much as 70 percent).

Google’s search algorithms are excellent at surfacing relevant content for basic keyword-style head queries, but when we search for something specific using a long tail query, the answers aren’t consistently relevant. I would submit that this isn’t so much an issue with Google’s search algorithm as it is a content problem; that is, a large number of content sources that attempt to service long tail queries simply do a poor job of it. For Google to improve its search relevance for long tail queries – which it must, as those continue to become a huge chunk of its searches – it should integrate a high-quality Q&A service like Quora with its search.
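The head/long-tail split described above can be illustrated with a toy classifier. This sketch implements only the word-count heuristic (the sub-three-word threshold comes from the definition earlier in the piece); a real classifier would also weigh query frequency, and the function name is mine, not any Google API.

```python
def classify_query(query: str) -> str:
    """Label a search query 'head' or 'long tail' using the rough
    word-count heuristic from the article: head queries are general
    keyword searches of fewer than three words."""
    return "head" if len(query.split()) < 3 else "long tail"

for q in ["weather", "how to get rid of acne", "what do turtles eat as pets"]:
    print(f"{q!r} -> {classify_query(q)}")
```

Even this crude split makes the content problem visible: the first query is well served by established reference pages, while the latter two are exactly the phrasings that content farms target.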

    Google’s long tail problem

    To better understand the differences between the two types of search, and the dilemma Google now faces, do a quick search using any or all of the following, pretty straightforward long tail queries and check the quality of search results:

    “diet plan for diabetics and high blood pressure”
    “how to get rid of acne”
    “what do turtles eat as pets”
    “how to train your parrot to talk”
    “important things to consider before purchasing a house”

You will quickly discover that the results are mostly identical or slightly rehashed versions of other articles scraped from multiple sites across the web, many of them originating from content farms like Demand Media and Associated Content. Those sources are among many that specialize in trying to corner the market on servicing long tail queries. However, they all suffer from two major problems:

Poor quality: The army of low-paid freelancers who manufacture the “content” for these sites is paid essentially by volume. They are almost never experts in a given topic (or even passingly familiar with it, one could argue). They simply crank out a 500-word article as quickly as possible so that these networks can embed three AdSense ads in between, then move on to the next topic.

Bias toward popular keywords: Despite intending to service long tail queries, many of these services in fact tend to produce content around keywords that are popular enough to reliably generate advertising revenue.

    A source of reliable long tail query content

Clearly there is a demand for reliable long tail content. Now consider a practical query like “how to get a passport faster,” and how massively helpful it would be to get the answer from a person who has actually gone through the process, rather than from the person who designed it. Wouldn’t it be logical for Google to integrate a source of content that is produced by passionate, informed people – a source like Quora?

Unlike Wikipedia, which is best at answering head queries, Quora is all about the long tail. So integrating Quora with search would provide Google’s users more reliable and useful results for long tail queries. It would also contribute to a virtuous cycle by allowing users to help produce reliable content, too, as searches surface further contextual questions that need answering. This would help Google gather knowledge from contributors (such as those who write for Wikipedia) who do not own a website but have valuable knowledge.

Here’s a rough mockup of a Google search results page for the long tail query “diet plan for hypertension and diabetes,” but with Quora integrated (mockup: Google/Quora).

    As another example, for a more task-based query like “how to file taxes,” you might also end up with relevant contextual content in the right pane of the search results:

    • How do I calculate taxes?
    • What is the last date to file taxes?
    • What are the tax changes for 2013?
    • What are the important things I should know before I file my taxes?
    • What is the best software to file taxes?

    Integrating Quora will enable Google to serve far more relevant answers for a much broader range of queries even though a smaller percentage of people will be actively producing the content. And it’s worth noting that in the process, Google will be effectively replacing dollars other networks pay to content churners with upvotes and follows to passionate users instead (talk about virtuous cycles!).

    This is social search, where content will be produced in the context of social, but consumed in the context of search.

    Narendra Reddy is chief product officer for the educational software developer Wignite. Follow him on Twitter @naren.


  • It’s not you Facebook, it’s me — okay, it’s partly you: Why I unfriended almost everyone

There has been a rash of posts of late from people who have quit Facebook or decided to unfriend everyone they know on the network. I haven’t gone that far, but I recently went through what I like to call “The Great Unfriending,” in which I unfollowed or disconnected from almost 80 percent of the people in my Facebook social graph. Doing so has changed the way I use the network, and I think that change — and the reason why I felt compelled to do so — says a lot about some of the challenges Facebook is facing.

    Unlike Julia Angwin, who says she unfriended everyone she was connected to because Facebook “cannot provide me the level of privacy that I need,” I don’t really have any issues with privacy on Facebook. Angwin said that she was troubled by the fact that “when I share information with a certain group or friend on Facebook, I am often surprised by where the data ends up,” and I respect her decision. But that’s not what bothered me about using the social network.

    It’s not the privacy, it’s the overload

    For better or worse, I made a deliberate decision when I joined the service (and Twitter, and almost every other social network) to be as open as possible, and to share almost everything about myself, within reason. I would never say that everyone should do this, and there are plenty of reasons why people keep certain things off the web — information about their children, for example — but for the most part I agree with Jeff Jarvis that the benefits of “publicness” outweigh the disadvantages.


    So if privacy wasn’t the problem, what was it? In a nutshell, information overload. In the same way I’ve had to struggle with my addiction to real-time connectedness on a mobile device (something I wrote about recently that many readers disagreed with), I started to find that Facebook was a painful experience. And the more I thought about it, the more I thought that the problem was partly me — and the way I was using it — and partly the way Facebook was changing.

    I started to think about how some people I admire, including Union Square Ventures founder Fred Wilson, had pared back their use of Facebook by unfriending a lot of people. And such thoughts don’t seem to be unique: a recent survey by the Pew Center showed that two-thirds of users had taken an extended break, and close to 30 percent were planning to use Facebook less.

    Partly Facebook and partly me

    The part of this that I think was my fault stems from the way I set up my account when I first joined Facebook in 2006: in keeping with my desire to push the limits of openness, I accepted friend requests from almost everyone who sent them, even if they weren’t actually “friends.” And yes, I knew at the time that doing this carried some risk, but I didn’t fully appreciate what it would be like, or how it would eventually ruin the experience for me.

    What I wound up with was almost a thousand “friends,” many of whom were people I had met at conferences, or people who were connected to me through others, or some who were just fans of my writing (who can still use the “subscribe” feature). To these people — all of whom I have since unfriended — I would just like to say that you are all wonderful, but I couldn’t take it any more. My stream became a sea of information I had little or no interest in, with only a few scattered pieces of flotsam and jetsam from the people who I am actually close to.


    The part of this that I see as Facebook’s fault has to do with how cluttered my stream became, especially with all of the “sponsored stories” and “liked” pages that began to show up more and more — when a “friend” liked a page about Coca-Cola or Ford, for example. And yes, just like the notifications I complained about on the iPhone, I know that Facebook has knobs and dials that you can tweak so that you don’t see certain things. But who has the time to spend twiddling all those dials all the time? I certainly don’t.

    Facebook has just become less relevant

    So what happened after The Great Unfriending? Facebook became a whole lot more usable as a particular kind of network — the one that lets me see what actual friends and family are doing, including those who are far away (the kind of “ambient intimacy” that researcher Leisa Reichelt talks about). Except for my teenaged daughters, of course, who don’t even use Facebook any more, preferring to spend all their time on Tumblr and Twitter. That’s just one of the things that should worry Mark Zuckerberg, I think.

    What I am left with is a more useful network, but also one that I only use for very specific things, and don’t really spend much time on. If I want to connect with people related to work, I do it through LinkedIn; if I want to connect to people through photos, I do it on Instagram or Flickr (which is why Instagram was such a smart acquisition for Facebook to make); and if I want to connect to people I don’t really know, I use Twitter. If I could get more of my friends to use Path, I might use that for friends and family, in which case I wouldn’t need Facebook at all.

Facebook has a whole series of challenges as it tries to grow and justify its $65 billion market value. But its biggest problem — bigger than the shift to mobile or the need to generate ad revenue — is that it has to not only remain relevant in people’s lives, but offer them more and more things that will keep them engaged. For me at least, and it seems for others as well, it is losing that battle.

    Post and thumbnail images courtesy of Shutterstock / Stuart Jenner and Flickr user Pew Center


  • Microsoft claims expired SSL Certificate caused Azure outage

Microsoft’s cloud service, Windows Azure, along with Team Foundation Service, suffered a major outage yesterday that also affected non-enterprise users, as it resulted in problems with Xbox Live as well. However, according to the Xbox Status page, the Live system, along with Xbox Music and Video, which were also affected, is back up and running.

    Now, as of this morning we have some information on the root cause of the much-publicized problem. Brian Harry, Product Unit Manager for Team Foundation Server, blames the nine-hour outage on “an expired SSL certificate in Windows Azure storage”. Harry goes on to explain that the company stores “source code files, Git repos, work item attachments and more” there and that “the expired certificate prevented access to any of this information, making much of the TFService functionality unavailable”.

    Ironically, this is not the first time the company has been plagued by this problem. Several years ago Team Foundation Services was hit by an expired certificate which it blamed on an operational oversight. Harry promises that the company will be investigating what led to this most recent oversight and went on to state “I apologize to all of our affected customers and hope you’ll give us a chance to learn and continue to deliver you a great service”.
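The generic guard against this class of failure is automated expiry monitoring: track each certificate’s notAfter date and alert long before the countdown hits zero. Below is a minimal sketch using Python’s standard-library ssl module; the date string is hypothetical, a real monitor would pull notAfter from the live certificate, and this is in no way how Azure itself monitors certificates, just an illustration of the idea.

```python
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> float:
    """Given a certificate's notAfter field in OpenSSL's text format
    (e.g. 'Feb 22 12:00:00 2013 GMT'), return the days remaining.
    Negative means the certificate has already expired."""
    expiry_ts = ssl.cert_time_to_seconds(not_after)
    expiry = datetime.fromtimestamp(expiry_ts, tz=timezone.utc)
    return (expiry - datetime.now(timezone.utc)).total_seconds() / 86400

# Hypothetical expiry date, for illustration only:
remaining = days_until_expiry("Feb 22 12:00:00 2013 GMT")
if remaining < 30:
    print(f"certificate expires in {remaining:.0f} days - renew now")
```

Wiring a check like this into a scheduled job, with the threshold set weeks rather than days out, is the cheap insurance that would have flagged both of the oversights described above.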

    Photo Credit: Ralf Juergen Kraft/Shutterstock

  • Why are so many people SLAPPing each other? How to reduce frivolous defamation suits

One of the realities of business today is that, intentionally or not, companies have become de facto publishers. Whether on company websites, blogs, Twitter, or Facebook pages, the web and social media offer an ever-growing number of ways for business owners to communicate publicly. That also means they are vulnerable to unique new risks.
    Businesses or individuals that communicate regularly about their industries, local happenings, or public policy – or that take any sort of stand on any of these matters – can find themselves facing defamation lawsuits that are intended simply to intimidate and silence a voice. Such suits are common enough to have a moniker: “SLAPP” suits, or Strategic Lawsuits Against Public Participation.
Such suits are easy to file, and easier still to threaten, yet it’s difficult to get even the most frivolous case dismissed without incurring serious time, cost and stress. And as each side bears its own costs in most civil litigation in the U.S., a deep-pocketed opponent can use the threat of financial ruin to make a less-well-heeled opponent fold in the face of even a completely meritless defamation case. It’s time for the federal government to join the states in taking action to protect individuals and businesses from this unnecessary threat.

    A clear cut case

    Matthew Inman, who runs the popular humor site “The Oatmeal,” had just such an experience when he wrote a piece last year accusing the site “FunnyJunk” of infringing his copyright.  Suddenly he found himself served with a letter threatening a defamation suit and demanding a $20,000 payment as restitution.
    The thing about defamation is that the law requires having a certain thickness of skin.  Defamation is not just something written about you that you don’t like.  It’s got to be demonstrably false. It’s got to be damaging. And it can’t just be someone’s opinion.
    By any objective measure, Inman’s piece wasn’t remotely defamatory; it simply expressed the sort of strong opinion that is absolutely protected by the first amendment – and it happened to be completely true.
The problem is that establishing that something isn’t defamatory can cost far more than the fight is worth. And the threat of legal action chilling what people and businesses are willing to say? That’s bad for all of us, and for the free flow of ideas and information upon which our society depends.

    States slap back

Fortunately, a number of states have come up with an elegant solution to the problem of SLAPP suits: the anti-SLAPP law. Under such laws, the defendant in a SLAPP case can file an immediate motion to dismiss the complaint – without having to incur the time and expense of discovery. Unless the plaintiff can then show that the case has definite merit, it will be dismissed with prejudice. And typically under such laws, the plaintiff will also be required to reimburse the defendant’s attorney’s fees incurred in bringing the anti-SLAPP motion.
    While 37 states have anti-SLAPP laws on the books, most of these laws are limited to suits related to the political process, rather than the far broader category of expressive rights. However, in recent years, places as ideologically different as Texas and Washington, D.C., have enacted anti-SLAPP laws that apply to any exercise of First Amendment rights related to a matter of public concern – which pretty much covers anything a business owner would write about.

    A need for federal measures

    Back to Inman.  The creator of “The Oatmeal” was better situated than most.  He’s someone who buys digital ink by the barrel, and his public response excoriating the lawyer who sent the demand letter has become the stuff of internet legend. And, importantly, he lives in Washington state, which has strong anti-SLAPP protection. Inman could comfortably respond aggressively, knowing that he would not be exposed to crippling cost and personal anxiety in order to vindicate his free speech rights.
    Unfortunately for businesses that operate across multiple states, or in states without strong anti-SLAPP laws, the risk of being sued for exercising the right of free expression remains.  That’s why an effort has been underway over the last few years, led by the Public Participation Project (disclosure: I am on the board of directors), to enact national anti-SLAPP legislation.  Such legislation would take the broad protections and fee-shifting attributes of anti-SLAPP laws in California, Texas and Washington and apply them nationwide.
    It’s an effort long overdue. While every state law is a step in the right direction, it’s still too easy for plaintiffs to “shop” for a state without anti-SLAPP protection in which to bring a lawsuit. A federal anti-SLAPP law would level the playing field and make sure that everyone could express themselves without fear of intimidation-via-lawsuit.  Until then, business owners active in social media and blogging should get familiar with the status of anti-SLAPP in the states in which they operate – and support the effort to extend these protections nationwide.
    Josh King is vice president and general counsel of Avvo.com, a social media platform that provides answers to consumer legal questions and legal marketing resources for lawyers.

    Related research and analysis from GigaOM Pro:
    Subscriber content. Sign up for a free trial.

    • TED Weekends listens to outer space

      Honor Harger isn’t your typical artist. Or your typical astronomer. At the TEDSalon London Spring 2011, Harger shared how she brings these two seemingly unrelated disciplines together — the study of sound and the study of space — to record the songs of planets, moons and quasars. Her talk is called “A history of the universe in sound,” and it is simply a must-see.

      Today’s TED Weekends on the Huffington Post explores the soundtrack of our universe, featuring essays from Harger and others. Below, find excerpts from three for your reading pleasure.

      Honor Harger: Tuning Into the Universe

      Images of space are ubiquitous in our lives. We have been surrounded by stunning portrayals of our own solar system and beyond for generations. But in popular culture, we have no sense of what space sounds like. And indeed, most people associate space with silence.

      There are, of course, perfectly valid scientific reasons for assuming so. Space is a vacuum. But through radio, we can listen to the Sun’s fizzling solar flares, the roaring waves and spitting fire of Jupiter’s stormy interactions with its moon Io, pulsars’ metronomic beats, or the eerie melodic shimmer of a whistler in the magnetosphere. Read the full essay »
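      Harger’s recordings translate those radio emissions into audio. As a toy illustration of the “metronomic beats” idea — my own sketch, not Harger’s actual method, with every number chosen for illustration — the snippet below synthesizes a pulsar-like click train at a slow, roughly 1.4 Hz pulse rate:

```python
# Toy sonification sketch: synthesize a pulsar-like click train as raw
# audio samples. Illustrative only -- real sonification works from
# recorded radio telescope data, not a synthesized envelope.
import math

SAMPLE_RATE = 8000  # audio samples per second

def pulsar_clicks(pulse_hz, seconds):
    """Return samples containing a short decaying blip at each pulse."""
    n = int(SAMPLE_RATE * seconds)
    period = SAMPLE_RATE / pulse_hz  # samples between pulse onsets
    samples = []
    for i in range(n):
        t = (i % period) / SAMPLE_RATE  # seconds since the last pulse onset
        # each pulse: a 20 ms burst with a 5 ms exponential decay
        samples.append(math.exp(-t / 0.005) if t < 0.02 else 0.0)
    return samples

audio = pulsar_clicks(pulse_hz=1.4, seconds=2.0)
```

      Written out to a WAV file, this plays as a steady metronomic ticking — a crude stand-in for the beats Harger’s recordings capture.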

      Mario Livio: What Is the Color of the Universe?

      Honor Harger’s TED Talk is on radio astronomy, or, in some sense, the “sound” of the universe (even though radio waves are really electromagnetic radiation, just like light). Can we, however, say what the color of the universe is? To answer this question, we must first establish what we actually mean by the “color of the universe.” A reasonable definition would be to add up all the visible radiation emitted by a very large number of galaxies in a huge cosmic volume, and to determine how all of that light might be perceived by the human eye. This is precisely what astronomers Karl Glazebrook and Ivan Baldry attempted to do in 2002. Using a survey of more than 200,000 galaxies (the “2dF Galaxy Redshift Survey”) and reaching to distances of a few billion light-years, they constructed the distribution of the colors (the spectrum) the eye would see if all that light were to be separated into its components by passing it through a prism.

      Since our universe is expanding, light from distant galaxies is stretched to longer (redder) wavelengths (a phenomenon known as redshift). The farther away the galaxy, the greater the amount of stretching that occurs. Glazebrook and Baldry removed this effect before combining all the light to form a smoothed-out average color. Read the full essay »
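      The procedure Glazebrook and Baldry describe — undo the redshift stretch, then average the light — can be sketched in a few lines of code. This is a deliberately tiny toy with made-up numbers, not their survey pipeline:

```python
# Toy sketch of the de-redshifting-and-averaging idea behind the
# "color of the universe" measurement. Illustrative only -- not the
# actual 2dF Galaxy Redshift Survey analysis.

def deredshift(wavelengths_obs, z):
    """Shift observed wavelengths back to the rest frame: lam / (1 + z)."""
    return [lam / (1.0 + z) for lam in wavelengths_obs]

def interp(x, xs, ys):
    """Simple linear interpolation; xs must be sorted ascending."""
    if x <= xs[0]:
        return ys[0]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            frac = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + frac * (ys[i] - ys[i - 1])
    return ys[-1]

def average_spectrum(galaxies, grid):
    """Average each galaxy's rest-frame spectrum on a common grid."""
    avg = [0.0] * len(grid)
    for wl_obs, flux, z in galaxies:
        wl_rest = deredshift(wl_obs, z)
        for j, lam in enumerate(grid):
            avg[j] += interp(lam, wl_rest, flux) / len(galaxies)
    return avg

# Two hypothetical galaxies: (observed wavelengths in nm, flux, redshift).
# Each emits a flat spectrum over 400-800 nm, observed stretched by (1 + z).
galaxies = [
    ([lam * 1.1 for lam in (400, 500, 600, 700, 800)], [1.0] * 5, 0.1),
    ([lam * 1.2 for lam in (400, 500, 600, 700, 800)], [2.0] * 5, 0.2),
]
grid = [400, 500, 600, 700, 800]
print(average_spectrum(galaxies, grid))  # flat at 1.5 across the grid
```

      Because each toy galaxy emits a flat spectrum, the average comes out flat too; with real survey data, the averaged spectrum is what gets converted into a perceived color — the famously beige “cosmic latte.”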

      Seth Shostak: Celestial Sound Effects

      Remember the tag line for the 1979 sci-fi flick Alien? It was boldly emblazoned on the film’s advertising posters, and helpfully informed the public that “in space, no one can hear you scream.”

      Well, of course that’s true — at least if you’re floating around without your protective helmet and its built-in walkie-talkie. But then again, if you’re bare-headed in space, the fact that no one can hear the noises you’re making is scarcely your biggest problem.

      Nonetheless, there’s a widespread perception that space — which, after all, is mostly air-free — is as silent as the shadows. Read the full essay »

    • Why all that hacking news might not be so bad

      The list of companies that have reported being hacked just keeps growing, with Microsoft and Zendesk making headlines most recently. Although it’s caused plenty of anxiety for IT people and everyday users alike, there might just be an upside: The attacks have demonstrated the need for the kinds of information sharing the federal government wants to do to improve cybersecurity.

      Following the demise of one proposal, the Cyber Intelligence Sharing and Protection Act (CISPA), the Obama administration has taken new steps with an executive order and a policy strategy. The executive order draws a roadmap for sharing more of its information with the private sector, and the strategy shows the intent to do more on diplomatic and intelligence fronts.

      The Microsoft and Zendesk hacks follow others in recent weeks at Apple, Facebook, the New York Times, the Wall Street Journal and the Washington Post. Twitter said people had attempted to hack the site. And the security company Mandiant released a report providing details on a Shanghai-based division of the People’s Liberation Army of China that has stolen “hundreds of terabytes of data from at least 141 organizations,” almost all of which have headquarters in countries where English is the native language. Hackers even found a way to build a lure for a spear-phishing attack out of one version of the report.

      President Barack Obama, in his State of the Union address last week, acknowledged that American companies have been hacked and said the country must not “look back years from now and wonder why we did nothing in the face of real threats to our security and our economy.” Obama’s executive order on cybersecurity, released the same day the president gave the speech, directs the government to release more, and more timely, information on cybersecurity threats. It calls for a framework for reducing “cyber risks” to critical infrastructure in the United States, and the framework will have to help owners and operators of that infrastructure manage the risk. In doing so, the government cannot pick one product or service as a cure-all; it claims to value a competitive marketplace. The order also mandates that owners or operators of critical infrastructure that could cause catastrophes if hacked will be confidentially contacted and given a way to submit information to the federal government.

      A week after the executive order, the Obama administration released a policy paper laying out steps for advancing cybersecurity. It says businesses should share best practices, and it states that the FBI and the State Department will do more to try to stop hacks of trade secrets. Elsewhere, it promises that several other federal agencies will continue to do what they have been doing toward that end.

      Some people have argued that the executive order doesn’t do enough to improve cybersecurity. Then again, others like it much better than CISPA.

      Regardless of what people think about it, the federal government’s efforts to respond to the hacks could prompt more companies to protect their own assets. It takes advantage of the good parts of CISPA but not the bad, which my colleague Derrick Harris has previously identified. And with news of more and more attacks coming to the fore, more companies could be inclined to try sharing information with the federal government for the purpose of the greater good. How bad could that be?

      Oh, by the way, as a side effect of all of these attacks and the new federal policies, don’t be surprised to see more enterprises trying out security products that focus on infrastructure, such as Mandiant and Cylance, which I wrote about earlier this month. Look for more stealth-mode security startups jumping out of the shadows, too.

      Feature image courtesy of Shutterstock user Tatiana Popova.


    • Android this week: HTC One introduced; Ubuntu on Nexus; Galaxy S4 using Snapdragon?

      This week saw the introduction of HTC’s next flagship phone, named the HTC One just like its predecessor. The handset materials and design are a bit of a departure for HTC, as the new phone will use an all-aluminum enclosure and a pair of front-facing speakers. As a result, the audio experience ought to be a highlight for the One, but visually oriented readers will be happy with the display as well: HTC is packing 468 pixels per inch into the 4.7-inch, 1080p display.
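      For the curious, that pixel-density figure is just the diagonal pixel count divided by the diagonal screen size — a quick back-of-the-envelope check (my arithmetic, not HTC’s spec sheet):

```python
# Pixel density (ppi) = diagonal resolution in pixels / diagonal in inches.
import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

# HTC One: a 1920 x 1080 panel on a 4.7-inch diagonal
print(ppi(1920, 1080, 4.7))  # roughly 468.7, in line with HTC's figure
```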

      HTC’s One will include an LTE radio for fast mobile broadband and run on Google’s Android Jelly Bean software. The company is also including several of its own software features: BlinkFeed streams news, social networking updates and other information; Sense TV provides video content guides and uses an infrared sensor to turn the One into a remote control; and customized home screens are available, similar to prior versions of HTC’s Sense software.

      The flagship phone doesn’t yet have a price tag — that will come from carriers, likely next month — but it will be available in both 32 GB and 64 GB options. Other internal specs include a 1.7 GHz quad-core Snapdragon 600 chipset, 2 GB of memory, an NFC radio and an integrated 2,300 mAh battery.

      Speaking of Snapdragons, Qualcomm’s chip may power the Samsung Galaxy S4 phone. Samsung has yet to introduce the revised Galaxy, but online benchmarks and other evidence point to the company opting for a Snapdragon over its own Exynos chip. Reports indicate the same Snapdragon 600 found in the HTC One will be inside the Galaxy S4, reportedly because of heat issues encountered when testing Samsung’s eight-core Exynos chip.

      This wouldn’t be the first time Samsung chose a competitor’s chip to power its own smartphones, however. The US version of the Galaxy S III also used a Snapdragon chip, mainly because at the time of launch, Samsung hadn’t yet integrated LTE support into its Exynos silicon. In some sense, Samsung is lucky to have a secondary option for chipsets; otherwise its flagship phone could face delays. We’ll get the story for sure within the next few weeks, as Samsung is expected to hold a launch event for the new Galaxy smartphone on or around March 14.

      We don’t, however, have to wait to see Ubuntu on a smartphone: This week, Canonical released instructions on how to install a preview of the alternative platform on Google’s Nexus line of tablets and phones. You’ll end up wiping out your Android system if you do this, but Canonical provided handy links to Google’s own factory images for all Nexus devices, making it easy to reinstall Android.

      I haven’t taken the Ubuntu plunge on my Galaxy Nexus yet, but I expect to next week. From all accounts I’ve read so far, the Ubuntu interface is intuitive, but the software is still rough around the edges. Quite a few features and functions aren’t ready yet, although the Nexus phones will still be able to make calls and connect to both Wi-Fi and mobile broadband networks.


    • 1951 Ford Custom Club Coupe: HemmingsTV

      1951 Ford Business Coupe

      The 1951 Ford Custom Club Coupe is the quintessential old-school business coupe. It’s got great lines and a simple interior, and when maintained properly it will practically run forever. Owner Eric Ralle has been the caretaker of this all-original example for the past 27 years and is very proud of the fact that “it’s just the way Henry Ford built it back in 1950.”

      Yes Eric, it most certainly is.

      Source: HemmingsMedia.com

    • Weekly Address: Congress Must Act Now to Stop the Sequester

      President Obama urges Congress to stop the sequester — the harmful automatic cuts that threaten thousands of jobs and our national security — from taking effect on March 1.

      Transcript | Download mp4 | Download mp3

    • White House petition to legalize cell phone unlocking gets 100,000 signatures

      Cellphone Unlocking Petition
      In a surprising development, it seems that a lot of people don’t like being told they can’t unlock their cell phones. NPR reports that a petition posted on the White House website asking the Obama administration to “champion a bill that makes [cell phone] unlocking permanently legal” has garnered more than 100,000 signatures, which means that the White House by its own rules must now issue a formal response.

      Continue reading…

    • Mac malware invades Microsoft, too

      How’s this for a helluva endorsement of Windows security over OS X? Today, Microsoft acknowledged falling prey to a “similar security intrusion” as Apple and Facebook. They got nabbed by a Java exploit affecting Apple’s OS X.

      “We found a small number of computers, including some in our Mac business unit, that were infected by malicious software using techniques similar to those documented by other organizations,” says Microsoft security chief Matt Thomlinson.

      Apple made a similar admission on February 19 and Facebook a week ago. Apple issued an OS X fix removing Java, while Facebook disabled the tech. Microsoft disclosed no such action for its users. Party line: No data was taken.

      Facebook offers the most details on what happened: “After analyzing the compromised website where the attack originated, we found it was using a ‘zero-day’ (previously unseen) exploit to bypass the Java sandbox (built-in protections) to install the malware. We immediately reported the exploit to Oracle, and they confirmed our findings and provided a patch on February 1, 2013, that addresses this vulnerability”.

      My question: Who among the big companies discloses next? Surely these three aren’t the only ones running Macs and Java.

      Microsoft’s full statement:

      As reported by Facebook and Apple, Microsoft can confirm that we also recently experienced a similar security intrusion.

      Consistent with our security response practices, we chose not to make a statement during the initial information gathering process. During our investigation, we found a small number of computers, including some in our Mac business unit, that were infected by malicious software using techniques similar to those documented by other organizations. We have no evidence of customer data being affected and our investigation is ongoing.

      This type of cyberattack is no surprise to Microsoft and other companies that must grapple with determined and persistent adversaries (see our prior analysis of emerging threat trends). We continually re-evaluate our security posture and deploy additional people, processes, and technologies as necessary to help prevent future unauthorized access to our networks.

      Matt Thomlinson
      General Manager
      Trustworthy Computing Security

      Photo Credit: Jirsak/Shutterstock

    • Sprint still scrounging for more spectrum despite vast potential holdings

      Sprint Spectrum Acquisitions
      Even though Sprint (S) could soon have a commanding advantage over its rivals in terms of spectrum holdings, CEO Dan Hesse still isn’t satisfied. Bloomberg reports that Hesse and Sprint are still plotting ways to grab more spectrum even if the company succeeds in fully purchasing Clearwire and boosting its total mobile data spectrum portfolio to an industry-leading 184 MHz.

      Continue reading…

    • Expect more-sophisticated bank DDoS attacks this year

      What’s the end of February without some scare tactics? Gartner warns that one-quarter of distributed denial of service attacks this year will be against applications. Really? That low? I’m surprised the number isn’t higher. After all, as enterprises shore up the network perimeter, HTTP remains an opening wide enough to drive a freight train through.

      The attacks seek to overtax CPUs, disrupt applications and, ultimately, distract IT and security personnel. While they look over there, the bad boys are at work over here. Gartner sees DDoS attacks as part of a larger trend singling out financial institutions.

      “A new class of damaging DDoS attacks and devious criminal social-engineering ploys were launched against U.S. banks in the second half of 2012, and this will continue in 2013 as well-organized criminal activity takes advantage of weaknesses in people, processes and systems,” says Avivah Litan, Gartner vice president. She emphasizes there is a “new level of sophistication in organized attacks against enterprises” and that “they will grow in sophistication and effectiveness” this year.

      These attacks increase in intensity — blasting some financial institutions with up to 70 Gbps of “noisy network traffic” via ye olde Internet pipes; 5 Gbps is more typical.

      “To combat this risk, enterprises need to revisit their network configurations, and rearchitect them to minimize the damage that can be done,” Litan says. “Organizations that have a critical Web presence and cannot afford relatively lengthy disruptions in online service should employ a layered approach that combines multiple DoS defenses.”

      I guess unplugging the Internet isn’t the answer. How will we do online banking?

      Photo Credit: Seleznev Oleg/Shutterstock

    • PlayStation win shows AMD shedding its singular focus on x86

      For those of us who remember AMD as the alternative to Intel in our desktops, or as the also-ran to Intel in servers, it’s time to think of the new AMD. Like Beyoncé leaving Destiny’s Child, the chipmaker is ditching its sole reliance on x86 and embracing new architectures such as graphics processors and ARM-based cores. And scoring the processor inside the latest-generation PlayStation console is the perfect example of the new AMD.

      AMD has built a custom chip for the PS4 that combines a graphics processor with CPU cores, creating what AMD calls an APU, or accelerated processing unit. The company has been working toward a win in this area since it purchased GPU firm ATI back in 2006. And the PlayStation is quite a win: the console line has sold a few hundred million units over its history.

      The PS4 chip is also the first public design win out of a new group inside AMD, the Embedded and Custom Semi group, which AMD estimates will generate a fifth of its sales in 2013. That group will be responsible for building out custom chips for clients that will sell at massive volumes.

      In the case of the PS4, AMD combined its next-generation eight-core Jaguar CPU with its next-generation GPU. Another way to look at this is to realize that Sony’s PS4 isn’t limiting the graphics processor to graphics alone; that chip is likely handling some of the compute work as well.

      The PS4 chip is the first chip for the Embedded and Custom Semi group, but not the first custom effort for AMD. It also made custom versions of its graphics processors for the Wii U and Xbox consoles. But AMD hopes the business will continue to grow, especially as AMD looks beyond its traditional PC market. Not only has it put more focus on graphics and its APU strategy, but it also last year took a license for the ARM architecture and said it plans to use the upcoming 64-bit ARM architecture to build chips for servers.

      John Taylor, the VP of product marketing with AMD, said he can’t share the exact volumes that would entice AMD to design a custom chip, and upon further questioning it appears that the number of chips may not be the sole deciding factor.

      When asked about combining GPUs or even ARM cores in the server business, for example, he said, “Well you know that in the server market the chips generally have higher average selling prices than those in the consumer space, so it may not necessarily be that we will demand a 1 million unit run to build these chips. It will be a business decision.”

      Yet Taylor’s hypothetical example of a good customer for the custom semi business was a smart TV manufacturer, one that had already designed portions of a chip that it wanted to combine with computing and/or graphics processors from AMD. However, he acknowledged that AMD now has several architectural options and plans to build a business combining those options for customers outside of AMD’s traditional lines of business.

      Such a commitment isn’t for the faint of heart. The development of a core can take years of forethought, while the combination of cores onto a single system on chip, such as the one offered in the PS4, can take up to a year. As the web and application side of the technology world speeds up, chip firms are still stuck planning for a future that is years out and hoping they can get it right.
