Author: Karl Bode

  • Good Enough For A Pulitzer, But Not Good Enough For Apple

    Just as online content begins to get some recognition as Pulitzer-worthy, it looks like those content creators still have a major hurdle to overcome: namely, Apple’s incredibly screwed up application approval process. Cartoonist Mark Fiore made Internet and journalism history this week as the first online-only journalist to win a Pulitzer Prize, for his work over at the San Francisco Chronicle. Much more difficult? Getting his iPhone cartoon application past Apple’s application store guardians. Fiore says his application was rejected last December because, as an Apple letter phrased it, his satirical cartoons "ridicule public figures," a violation of Apple’s iPhone Developer Program License Agreement:

    "Applications may be rejected if they contain content or materials of any kind (text, graphics, images, photographs, sounds, etc.) that in Apple’s reasonable judgement may be found objectionable, for example, materials that may be considered obscene, pornographic, or defamatory. Examples of such content have been attached for your reference."

    Except the attached examples provided by Apple weren’t offensive in any way, and included such radical and supposedly offensive things as caricatures of the couple that recently crashed a White House dinner. Of course, this is only the latest in a long list of bizarre and seemingly arbitrary Apple decisions that have kept developers from getting their wares into the application store. Luckily for Fiore, his plight resulted in some negative press for Apple, and by the end of Thursday, an Apple representative had personally called him to say his application had miraculously and suddenly made the grade. As usual, Apple wouldn’t officially comment on how or why they had screwed up.

    You’d like to think this would be good news for other platforms, given that developers would eventually get tired of dealing with Apple’s bizarre inconsistencies and turn their efforts elsewhere. But that never really happens, given that Apple’s application store remains the best place to gain exposure and make money — and the inconsistent approval process means many developers are never impacted. It also seems likely that the walls surrounding newer application stores (like Verizon’s) could wind up being even worse. Still, the problem remains, and people are left to wonder whether Fiore’s rejection would ever have been reversed had he not been in the media spotlight for his Pulitzer win.

    Meanwhile, Dan Gillmor and outlets like the Columbia Journalism Review think it’s time for journalists to start "pushing back against Apple" and asking some hard questions. Gillmor’s general concern is whether news outlets risk having their applications rejected should they criticize Apple (though that would seemingly indicate consistency, something Apple’s apparently not good at) — and more specifically what happens when a paper like the New York Times enters such a tight iPad business arrangement with a company they cover frequently. Surely most people in the press will get right on asking Apple those kinds of hard questions — right after they stop collectively gushing and cooing over the iPad for hits.


  • Frontier Communications “Testing” To See How Users Respond To Being Ridiculously Overcharged For Bandwidth

    Last year Time Warner Cable took a pretty severe beating from the press and public for plans not only to impose monthly broadband usage caps as low as 5 GB, but also to charge users up to $2 per additional GB. Given this was a 1,500-2,000% markup above the provider’s actual bandwidth costs, most consumers realized that the already very profitable company was simply making a money grab — and preparing to better monetize and/or stifle Internet video’s impact on TV revenues. The media dust-up wasn’t helped by company executives, who issued missives proclaiming that overcharging customers for bandwidth during a recession was only "fair" and that it would "actually encourage more use of broadband overall."
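
    For perspective, the implied math works out something like this (the wholesale cost figure below is our illustrative assumption, backed into from the markup percentages above rather than quoted from any carrier):

        # Back-of-the-envelope markup check (illustrative figures only).
        retail_per_gb = 2.00     # Time Warner Cable's proposed overage fee, $/GB
        wholesale_per_gb = 0.10  # assumed bandwidth cost, $/GB (not an official figure)

        markup_pct = (retail_per_gb - wholesale_per_gb) / wholesale_per_gb * 100
        print(f"Markup over cost: {markup_pct:.0f}%")  # -> 1900%, squarely in the range above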

    Time Warner Cable eventually backed off the plan, but not before their brand (which they’re planning to change) took a lot of damage. One small reason they backed off was that one of the company’s few competitors, Frontier Communications, started advertising their DSL service as uncapped in order to gain a competitive advantage. Despite the fact that Frontier had previously planned to impose 5 GB monthly caps on all speed tiers, said ads lambasted the cable industry as greedy. Of course, now that Time Warner Cable has backed off, Frontier is testing an even more ridiculous overcharging system.

    According to a letter being sent to Frontier users in Minnesota, users who consume more than 100 GB a month are automatically having their bills bumped to $99 a month. Users who consume more than 250 GB a month are having their bills bumped to a staggering $250 a month. Users who don’t respond within fifteen days get their service disconnected (throwing away a potential customer is always a brilliant business model). Keep in mind that Frontier is one of many American telcos that — thanks to limited competition — haven’t kept pace with demand or upgraded their networks beyond last-generation DSL technology in most markets. As such, many Frontier users don’t see speeds above 3 Mbps to begin with, and that service can cost around $55 a month for a standalone (no voice landline) connection.
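
    Based on the letter’s description, the billing logic amounts to something like the following sketch (tier thresholds and prices as reported; the $55 base rate is the standalone DSL price mentioned above):

        def frontier_monthly_bill(usage_gb, base_rate=55.00):
            """Sketch of the tiered pricing described in Frontier's letter."""
            if usage_gb > 250:
                return 250.00  # heaviest users bumped to $250 a month
            if usage_gb > 100:
                return 99.00   # over 100 GB bumped to $99 a month
            return base_rate   # otherwise, the normal rate applies

        print(frontier_monthly_bill(80))   # 55.0
        print(frontier_monthly_bill(120))  # 99.0
        print(frontier_monthly_bill(300))  # 250.0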

    As in most of these efforts to overcharge broadband users, the letter being sent to consumers adds insult to injury — informing users that the changes are being applied to provide "the best possible internet experience." The letter also informs users that anyone who uses more than 5 GB of bandwidth a month is engaging in "unreasonable usage" according to Frontier’s terms of service. Of course, 5 GB is eaten up by a single high-definition film — and as multi-user households adopt an ever-increasing array of services, 100 GB is quickly becoming a low ceiling as well.
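
    The arithmetic behind that claim is straightforward (the 5 Mbps bitrate is our assumption of a fairly typical high-definition stream of the era):

        # How quickly one HD film exhausts a 5 GB "reasonable usage" allowance.
        bitrate_mbps = 5   # assumed HD stream bitrate, for illustration
        runtime_hours = 2

        gigabytes = bitrate_mbps * runtime_hours * 3600 / 8 / 1000
        print(f"{gigabytes:.1f} GB")  # -> 4.5 GB, nearly the entire 5 GB allowance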

    These more aggressive pricing models are only employed by carriers operating in uncompetitive markets (Time Warner’s caps never appeared in markets where they competed with Verizon FiOS). While the pricing changes are almost always portrayed as an issue of "fairness" targeting a carrier’s heaviest users, they eventually wind up hitting all of an ISP’s subscribers. Once the price hikes have been framed as some sort of altruism, carriers frequently trot out the argument that if they can’t overcharge you for bandwidth, the Internet will simply explode (aka the Exaflood) — something we’ve debunked countless times as the product of carrier lobbyists. Of course, all of this is going on while the cost of bandwidth and networking hardware keeps dropping.

    Frontier’s timing also isn’t particularly smart, given they’re exploring this overcharging scheme just as they’re trying to gain regulatory approval for their $8.5 billion plan to acquire millions of Verizon DSL and landline customers across fourteen states. Part of that deal involves a few thousand FiOS customers in Washington State, who’ll be thrilled to learn that their state-of-the-art fiber-to-the-home connection is about to get much more expensive and much less useful.


  • Library Of Congress To Store Your Inane Twitter Chatter For All Eternity

    The United States Library of Congress is getting plenty of attention for announcing that they’re planning to digitally archive every single tweet made since Twitter’s inception in March of 2006 (the first-ever tweet is here, if you’re interested). Twitter now processes something like 50 million tweets every day, or about 600 tweets per second. While tweets are relatively easy to store thanks to that 140-character limit (the LOC already stores roughly 167 terabytes of online content for your grandkids to peruse), the vast majority of them will be people talking about the Twilight films, shampoo choices or weekend plans. Obviously, offering easy access to the pertinent (to you) and historical bits of data is going to be important.
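
    The "relatively easy to store" part is simple back-of-the-envelope math (raw tweet text only, assuming one byte per character; the metadata attached to each tweet would add considerably more):

        # Rough storage math for the raw text of the Twitter archive.
        tweets_per_day = 50_000_000
        max_tweet_bytes = 140  # 140 characters, assuming one byte each

        per_second = tweets_per_day / 86_400
        daily_gb = tweets_per_day * max_tweet_bytes / 1e9

        print(f"{per_second:.0f} tweets/second")     # ~579 -- the ~600/sec cited above
        print(f"{daily_gb:.1f} GB/day, worst case")  # ~7.0 GB of raw text per day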

    The LOC blog post is utterly devoid of any real information on that subject, though an accompanying Twitter blog entry indicates that tweets will only qualify for inclusion in the Library after six months (so start deleting your offensive and incoherent tweets now), and that direct or private tweets won’t be archived. Google seems to be helping on the accessibility angle by announcing a replay system, allowing people to examine snapshots in Twitter time (for instance, take a look at this snapshot of the 2010 Winter Olympics). Given that history is often written with a heavy focus on the elite, the Library of Congress emphasizes that this everyday chatter about events could provide very useful context for historians:

    "Expect to see an emphasis on the scholarly and research implications of the acquisition. I’m no Ph.D., but it boggles my mind to think what we might be able to learn about ourselves and the world around us from this wealth of data. And I’m certain we’ll learn things that none of us now can even possibly conceive."

    All of this raises the question of whether permanence will impact the way people use Twitter. As it stands (whether they should or not), most people treat Twitter as an off-the-cuff conversation. And while most people who use the service for business act professionally, even many corporate representatives are a little more candid and conversational while using Twitter. Getting a glimpse of the real human beings behind the brand has helped many companies immeasurably in dealing with customer support and public perception. Does all of this change once the participants realize their customer promises, clever barbs and burrito recipes are going down on their permanent record?


  • IBM Helps Florida Predict Just How Delinquent Your Child’s Going To Be

    We’ve covered several different instances where the country has been taking baby steps toward the kind of precognitive crime prevention featured in the movie Minority Report — sans naked gibbering women floating in bathtubs. The most recent effort comes courtesy of the Homeland Security Department, which is busily developing a body-language-based prediction system dubbed "Future Attribute Screening Technologies" (FAST) — one that aims to detect "shifty" people who may be getting ready to commit a crime of some sort (or who just drank way too much coffee).

    More common approaches simply involve software that analyzes a database of offenders and cherry-picks the most likely future offenders (very popular in the UK), or that analyzes crime patterns to predict future criminal trends. Along those lines, it looks like the Florida Department of Juvenile Justice has decided to start using IBM predictive analytics software (via Gizmodo) to help them determine which of the 85,000 kids who enter their system each year poses the biggest future threat. IBM has this to say about the new system — which was an upgrade from Excel:

    "Predictive analytics gives government organizations worldwide a highly-sophisticated and intelligent source to create safer communities by identifying, predicting, responding to and preventing criminal activities. It gives the criminal justice system the ability to draw upon the wealth of data available to detect patterns, make reliable projections and then take the appropriate action in real time to combat crime and protect citizens."

    Of course, many of these patterns simply become evident when people bother to pay attention and use their intellect, and these tools are often just an extension of that. Prediction technology will only ever be as good as the people using it (in this case, to choose rehabilitation paths for kids). But you still have to wonder how accurate these kinds of systems are, and how independently verifiable the evidence will be. Can kids who feel they were unfairly, preemptively declared to be bad asses in 2014 see the "reliable" source code?
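
    For what it’s worth, "predictive analytics" of this sort usually boils down to training a statistical classifier on historical case data. Here’s a minimal sketch of the general technique (entirely illustrative: synthetic data, made-up features, and no relation to IBM’s actual models or Florida’s records):

        # Minimal illustration of risk-score-style predictive analytics.
        # Entirely synthetic -- no relation to IBM's models or Florida's data.
        import random

        from sklearn.linear_model import LogisticRegression

        random.seed(0)
        # Made-up features per case: [prior_offenses, age_at_first_contact]
        X = [[random.randint(0, 5), random.randint(10, 17)] for _ in range(200)]
        # Synthetic labels loosely tied to prior offenses -- the model can only
        # ever learn whatever patterns (or biases) the training data contains
        y = [1 if priors >= 3 and random.random() < 0.8 else 0 for priors, age in X]

        model = LogisticRegression().fit(X, y)
        # "Risk" probability for a hypothetical kid: 4 priors, first contact at 12
        print(model.predict_proba([[4, 12]])[0][1])

    The classifier will happily hand back a "risk" score for anyone; whether that score means anything depends entirely on the data that went in and the people interpreting what comes out.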


  • U.S. Leaders Should Heed Their Own Advice On Internet Filters

    It has been kind of entertaining (some would say frightening) watching the Australian government’s futile efforts to clean the Internet of its naughty bits. As part of their filtering plans, the government conducted trials with a handful of ISPs, many of whom have been very vocal in their belief that the filters won’t technically work. These ongoing trials had no quantifiable metric for success or failure, so obviously, Australian Communications Minister Stephen Conroy proudly announced that they proved the filters to be 100% effective. Political leaders in favor of the filters haven’t exactly been open to feedback on their dangers, and the country learned nothing early on, when a teenage kid hacked the original filter system in all of half an hour.

    Recently, U.S. politicians have been ramping up their criticism of Australia’s filtering efforts, with the State Department last month issuing a rather vague statement indicating "we have raised our concerns on this matter." This week, U.S. Ambassador to Australia Jeff Bleich was willing to get a little more specific, in a low-quality-poetry sort of way, insisting that the Internet "needs to be free" in much the same way "the polar caps have to be free" (whatever that means). Bleich went out of his way to state that there are other methods to deal with extremism and child pornography, like addressing them at the source:

    "We have been able to accomplish the goals that Australia has described, which is to capture and prosecute child pornographers and others who use the Internet for terrible purposes, without having to use Internet filters. We have other means and we are willing to share our efforts with them in order to allow them to at least look at a range of choices, as opposed to moving in one particular direction. It’s an ongoing conversation."

    While Bleich insists it’s a conversation, all indications are that Australia’s government isn’t listening. They’ve already spent a fortune on the idea, and have ignored critics every single step of the way. As is usually the case when talk of imposing filters fires up, the specter of child pornography and other societal menaces is used as the scary red herring. Given how susceptible U.S. citizens are to sales pitches involving "protecting the children," it seems like only a matter of time (and lobbyist effort) before the United States requires ISPs to impose copyright filters at the behest of the entertainment industry and Bono. We’ve already had a few close calls, like with ACTA, or with U.S. lawmakers trying to bury filtering plans in the broadband stimulus effort — so it sounds like Uncle Sam should heed his own advice.


  • FCC Slowly Realizing Science And Data Are Kind Of Important

    Once upon a time, FCC Commissioners were engineers, thinkers and experts across a variety of fields. These days the well-lobbied agency’s stable of Commissioners is populated exclusively with lawyers, politicians and revolving-door lobbyists, and as you might expect, its primary product (no matter which party is in control) is quite often partisan bickering and broken policy. The nation’s recently unveiled first-ever national broadband plan is only the latest example of the kind of product the agency now creates, paying lip service to a myriad of industry problems but doing very little about the state of competition in the sector. Granted, to some, the plan looks good — focusing on feel-good efforts like "digital education" — but there’s very little in the plan that really challenges the status quo.

    The majority of bad FCC policies are, unsurprisingly, driven by bad data. The agency has made huge broadband industry policy decisions over the last decade using completely useless data that overestimated the amount of competition in the market. The rosy picture painted by the FCC was in part thanks to the confidential, unverifiable data provided by carriers, who have a vested interest in data that glosses over limited coverage, slow speeds and high prices. The FCC is only just now getting around to actually collecting comprehensive broadband price data or mapping broadband availability, though in many states this latter job was simply doled out to friends of the phone companies.

    While the FCC is still pushing into territory that may be better suited to the FTC, there are at least a few signs the FCC is trying to fulfill its recent promises to be a more data-driven agency. In a post over at the FCC blog, the FCC’s Dave Vorhaus notes that the agency has picked UK speed-test firm SamKnows to help test the real-world speeds obtained by home users. SamKnows does similar testing for British regulator Ofcom, helping the regulator determine whether consumers are getting what they pay for. While normal speed tests will illustrate whether a user is getting full speed, SamKnows uses in-home residential routers with modified firmware to determine specifically why they aren’t. According to Vorhaus, the FCC is looking for volunteers to help collect data:


    "In a couple of weeks, we will be asking for consumers from across the country to voluntarily install hardware in their homes (on an opt-in basis) that is capable of measuring broadband performance. The measurements will give us results across a broad swath of providers, service tiers and geographic areas. More details on how to volunteer will follow in the coming weeks. We are tremendously excited about this announcement, the next step in the process of increasing transparency and competition in the broadband market and better informing consumers about their broadband service."
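
    For a rough sense of what such an in-home measurement agent does, here’s a minimal sketch of a periodic downstream throughput probe (our own illustration of the general approach, not SamKnows’ actual methodology; the test-file URL is a placeholder):

        # Minimal sketch of a periodic downstream throughput probe.
        # Illustrative only -- not SamKnows' actual methodology.
        import time
        import urllib.request

        TEST_FILE_URL = "http://example.com/100MB.bin"  # placeholder test file

        def measure_downstream_mbps(url=TEST_FILE_URL):
            start = time.monotonic()
            with urllib.request.urlopen(url) as response:
                nbytes = len(response.read())
            elapsed = time.monotonic() - start
            return nbytes * 8 / elapsed / 1e6  # megabits per second

        # Probing on a schedule (rather than once) is what lets a regulator see
        # peak-hour congestion, not just the best-case speed an ISP advertises.
        while True:
            print(f"{measure_downstream_mbps():.1f} Mbps")
            time.sleep(3600)  # repeat hourly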

    While the selection of a UK firm might raise the hackles of those who think that job should have been given to a U.S. company (a Wall Street Journal blog headline makes a jab about stimulating the British economy), SamKnows is among the best in the telecom sector at this particular job, and is also used by UK ISPs to assess their own network performance. Of course quality data won’t mean anything if the FCC doesn’t use it to make smart policy choices (like realizing that fixing competition helps fix things like network neutrality without additional regulation). You also have to wonder if the FCC’s going to have a lot of free time, given the recent Comcast ruling all but ensures the agency is going to spend the next two years bogged down in a bare-knuckled fight with carrier lobbyists.


  • Telcos Still Pretending Google Gets “Free Ride”

    Back in 2005, former AT&T CEO Ed Whitacre (now the head of GM) boldly proclaimed that Google was getting a "free ride" on his company’s "pipes," and that they should be charged an additional toll (you know, just because). As we’ve discussed several times now, Whitacre’s argument made absolutely no sense, given that Google not only pays plenty for bandwidth (as do AT&T’s customers), but also owns billions of dollars’ worth of international and oceanic fiber runs, data centers and network infrastructure. Despite making no sense, this idea that Google was some kind of free-riding parasite quickly became the cornerstone of the telco argument against network neutrality. In response, Techdirt has suggested that telco spokespeople pay Google’s bandwidth bill for a month if they think it’s so low — with no takers.

    Of course, lost under the circus of the network neutrality debate was Whitacre’s real goal: to get content providers to subsidize AT&T’s network upgrades, something many myopic investors don’t want to pay for. Whitacre was also afraid; he understood that Google posed an evolutionary threat the likes of which traditional phone companies like AT&T had never seen before. Incumbent phone companies had grown comfortable sucking down regulatory favors, subsidies and tax cuts while operating in non-competitive markets. Suddenly, increasingly-ubiquitous broadband allowed companies like Google to enter "their" telecom space, gobbling up ad dollars and offering disruptive products like Google Voice — which threaten sacred cash cows like SMS and voice minutes.

    Instead of competing with Google by out-innovating them, Whitacre’s first instinct was to impose an anti-competitive toll system like some kind of bridge troll — which should tell you plenty about pampered phone company thinking. Whitacre’s fuzzy logic was given a new coat of paint in pseudo-scientific studies paid for by phone carriers, and the argument has since floated overseas. In the UK, incumbent phone companies have taken a page from Whitacre, insisting that the BBC should pay them extra money — just because people were using the BBC iPlayer. Now Google’s non-existent free ride has popped up in Europe this week, with Telefonica, France Telecom and Deutsche Telekom all jointly insisting that Google should pay them a special toll for carrying Google traffic:

    Cesar Alierta, chairman of Telefonica, said Google should share some of its online advertising revenue with the telecoms groups, so as to compensate the network operators for carrying the technology company’s bandwidth-hungry content over their infrastructure. "These guys [Google] are using the networks and they don’t pay anybody," he said.

    Yes, Google doesn’t pay anybody — except for the billions they pay for bandwidth and extensive infrastructure. Were Google a telecommunications carrier, they’d be the world’s third biggest, according to Arbor Networks. It’s absolutely stunning that such a ridiculous argument remains in circulation (and that many press outlets don’t debunk the concept as painful nonsense). If electric companies went to AT&T or Telefonica to inform them that they wanted a cut of revenues on top of payment for electricity "just because" — they’d be laughed out of the building. Yet somehow we’re supposed to take phone companies seriously, when in reality they’re simply repeating total nonsense in the hopes that repetition will magically make it true.


  • GAO Concludes Piracy Stats Are Usually Junk, File Sharing Can Help Sales

    For many years we’ve explored how entertainment and software industry piracy statistics are very reliable — at least in terms of being consistently and notoriously wrong on an annual basis. Each year, companies (especially the BSA) like to throw out marginally coherent data "proving" the supposedly huge impact piracy has on the economy, national security or employment. The claims are quickly debunked as nonsense — yet the same claims return year after year, and are often cited by U.S. politicians as gospel.

    Carl was the first amongst many to direct our attention to a new study by the GAO on the effects of piracy (covering all sectors, even toys, clothing, automobile parts, and medicine). The GAO’s study unsurprisingly found that U.S. government and industry claims that piracy damages the economy to the tune of billions of dollars "cannot be substantiated due to the absence of underlying studies." The full GAO report is worth a read, and not only argues that claims of economic impact have not been based on substantive science, but also that file sharing can actually have a positive impact on sales:

    "Some experts we interviewed and literature we reviewed identified potential positive economic effects of counterfeiting and piracy," The GAO wrote. "Some consumers may knowingly purchase a counterfeit or pirated product because it is less expensive than the genuine good or because the genuine good is unavailable, and they may experience positive effects from such purchases. Consumers may use pirated goods to ‘sample’ music, movies, software, or electronic games before purchasing legitimate copies," the GAO continued. "(This) may lead to increased sales of legitimate goods."

    Study after study has supported the conclusion that file sharers purchase more media — though the idea never resonates the same way as claims of economic armageddon caused by file sharing. While the GAO’s report does highlight some of the negative impacts of counterfeiting, the GAO goes on to argue that any overarching conclusion about piracy’s impact on the broader economy may not even be possible. The GAO was instructed to study piracy’s impact as part of the Prioritizing Resources and Organization for Intellectual Property Act of 2008 (PRO-IP Act) — which delivered plenty of handouts to the entertainment industry. PRO-IP was, ironically, pushed through using unreliable studies to justify its creation. Of course, we’ll soon be swimming in new dubious data "proving" the GAO wrong — and around and around we go.


  • Nation’s First Major Broadband Over Powerline Deployment Shuts Down

    We’ve talked for many years about how broadband over powerline (BPL) technology has failed to live up to what little potential it possessed as a viable major third broadband pipe. Pretty much since the technology’s inception, the FCC praised BPL as the "great broadband hope," going out of its way to ignore the technology’s rather nasty potential to interfere with local emergency and ham radio signals. Engineers repeatedly criticized the unshielded delivery system as impractical, but the FCC (and pretty much everybody else) ignored them.

    To help gloss over FCC broadband policy failures that fortified monopoly or duopoly markets, the FCC told anyone who would listen that BPL was the magic elixir that would save us all from the lack of competition in the broadband sector. Think tanks with ties to BPL hardware vendors issued study after study informing everyone that this would be the year that BPL finally took off. The technology press by and large believed it, flooding the wires with a long line of stories about BPL’s immense, untapped potential. The problem was that in addition to the technology not really working, many utilities simply didn’t want to get into the broadband business, and speeds delivered via BPL were quickly overshadowed by wireless and faster cable technologies like DOCSIS 3.0.

    Manassas, Virginia, has long been the poster child for BPL’s supposed successes, and hosted the first real non-trial deployment of the technology in the United States. The city’s network, built by a company named COMTek, offered city residents speeds slower than 1 Mbps for $24.95 a month. By 2008, COMTek was starting to realize its fortunes would never be made in residential broadband using an inherently flawed technology, so it sold the network to the city — and the network has been sucking Manassas dry ever since. After pouring $1.6 million into the network and losing about $166,000 a year, the city this week finally voted to shut the network down. All 520 subscribers have been told to find a new ISP, and the remnants of the city’s "great broadband hope" are being sold off for scrap.

    COMTek, which often took an adversarial approach to dealing with local ham radio interference complaints, is now focused on selling intelligent power network monitoring systems to utilities. Former FCC boss Michael Powell has since moved on to a lucrative career working for broadband-carrier-funded think tanks, where he’s still busily hallucinating about competition in the broadband sector. As for the residents of Manassas, if it’s any comfort, most of them have the cozy local duopoly of Verizon and Comcast to fall back on.

    There are still a few scattered deployments of BPL left in the country, but they serve only a few thousand people and will likely be supplanted in time by faster next-generation wireless. There’s absolutely no doubt Manassas leaders should have done their homework before buying the network, though the FCC’s whitewashing of the technology’s problems didn’t exactly help. Maybe next time the FCC will listen to actual engineers instead of salesmen for companies trying to sell the public a broken product.


  • PS3 Owner Given Refund After Sony Makes PS3 Less Useful

    We recently discussed how Sony has decided to eliminate some rather useful functionality from their PlayStation 3 — specifically, the ability to run other operating systems like Linux. This annoyed a number of PS3 owners, given that, thanks to a Sony "update," the product they thought they purchased is not the product currently sitting in their living room. Sony is obviously interested in keeping their hardware locked down as part of an attempt to retain control in their fight against pirates (and apparently hobbyists). But if the console you bought suddenly does less, are you due a refund? One UK PS3 owner apparently thought so, and was able to use a law created in 2002 to get Amazon to refund about 20% of his original purchase price. The law in question specifically applies to retailers, not manufacturers, and requires that goods:

    • comply with the description given by the seller and possess the same qualities and characteristics as other similar goods.
    • be fit for the purpose for which the consumer requires them, where that purpose was made known to the seller at the time of purchase.

    Given that Sony "made it known to the seller at the time of purchase" that the PS3 would be able to run other operating systems, Amazon ponied up the refund — without the user having to return the unit. Of course, the refund will be kicked up to Sony, who isn’t going to want a significant chunk of the UK suddenly demanding their money back. So the question becomes whether Sony backs down and reinstates a feature many of their customers found useful, or just points to their user agreement. Said agreement claims Sony has the legal authority to do whatever the hell they’d like, provided the changes are applied in order to "prevent access to unauthorized or pirated content."

    Less expansive consumer protection laws in the States mean users here probably won’t see refunds, but you may see your obligatory dollar or two should Sony’s decision result in a class action lawsuit.


  • Netflix Agrees To Delay Fox And Universal New Releases, Annoy Avatar Fans

    Netflix recently decided it would be a good idea to strike a deal with Warner Brothers that involved delaying all new Warner Brothers releases by 28 days. Film industry executives somehow believe this strategy is going to help them sell more DVDs, though as we’ve been discussing, the deal as designed seems likely to confuse the hell out of consumers while trying (and failing) to prop up less innovative companies. Why would Netflix agree to such a deal? It was the only way they could get Hollywood to loosen their vise-like licensing grip on the number of titles Netflix is allowed to stream via broadband.

    Of course, the deal doesn’t apply to Blockbuster, who ponied up the cash to the studios so they can apparently mock Netflix and Redbox in advertisements instead of actually innovating. None of this, including the fact that Netflix is facing a class action lawsuit, has apparently fazed Netflix or the studios — as Netflix has now signed similar delayed-release deals with both Twentieth Century Fox and Universal Studios. As with the Warner Brothers arrangement, this will ramp up Netflix’s access to both studios’ libraries for streaming, though it looks like it won’t necessarily save Netflix any money:

    Netflix says its deal with Universal will give it the "benefits of reduced product costs;" it does not make a similar assertion about Fox. Both deals do however let Netflix build up its instant-streaming catalogue. Fox, for instance, says it will make all prior seasons of several hit TV series, including 24, Bones and King of the Hill, available to Netflix instant-streaming subscribers, while Universal says it is doing the same with some "premium domestic titles," like Gosford Park.

    Not too surprisingly, the press release announcing the deal tries to pretend that the deal is about "providing consumers with attractive options" when it does the exact opposite. Netflix goes on to insist that by restricting how consumers can consume studio content, they’re actually making film delivery more "flexible" and "convenient" and that the deal is just "a win all around."

    Granted, Netflix customers who really only use Netflix’s streaming service may not care about this, especially if they’re not all that interested in new releases. Still, that doesn’t make keeping your product out of customers’ hands any smarter of a business plan when you’re trying to compete with piracy. One of the first major titles to be impacted by the deal will be Avatar, which thanks to this "convenient" deal won’t be available on Netflix in any form until 28 days after its April 22 street release date. Customers annoyed by that delay might go buy the DVD, or hey, they might just download it via BitTorrent, where they aren’t forced to wait for no particularly good reason.
