Author: Stacey Higginbotham

  • Smartphones Are the New Stethoscope

    Smartphones could be the most important diagnostic tool of this century as part of a revolution in digital wireless medical devices, according to Dr. Eric Topol, a cardiologist working at The Scripps Research Institute, speaking at the TedMed conference last year. In a video released earlier this week, Topol shows off patches communicating with his smartphone to continuously monitor his vital signs.

    I spoke briefly about this intersection of medicine and technology earlier this week with Dr. Mohit Kaushal, director of health care for the FCC’s Broadband Strategy Initiative, who said the FCC and the FDA are working to establish regulations that would allow diagnostics to move to smartphones. And who knows, maybe if the smartphone becomes useful for medical monitoring, those costly data plans will be reimbursed by insurance providers.

  • Facebook Friends Austin, But It’s Complicated

    Facebook is coming to Austin with plans to create 200 jobs, according to Texas Governor Rick Perry’s office. Perry announced today that the state would offer $1.4 million in incentives through the state’s Texas Enterprise Fund if Facebook chooses Austin as its first big U.S. expansion location. That’s right — if.

    The Austin expansion and the Texas Enterprise Fund money are conditioned on a city incentive package worth $200,000. The city will vote in March on whether to approve the incentives, likely before South by Southwest, which starts on March 12. If all goes well with the city approval, a Facebook spokeswoman says the company could open an Austin office in May for its online operations team. Meanwhile, city documents and Facebook say that the company is still exploring other locales.

    Facebook, which employs 1,200 people and has 400 million registered users, is following in the tradition of several Silicon Valley companies by locating deep in the heart of Texas. Google actually opened an Austin office in 2008, but then backpedaled a few months later. Intel, Borland Software and AMD have all had public Austin expansion plans, sometimes followed by equally public contractions.

    However, I’m irritated by the use of incentives to draw Facebook to Austin, even as I realize that it could help our local tech community. Austin is a strong enough contender to stand among the top cities for Facebook’s expansion given our talent, lower cost of living and “hip” vibe, and I hate the arms race and sense of entitlement among corporations that the practice of offering incentives perpetuates.

    Anyhow, welcome to Austin, Facebook. As a tip, don’t try to build anything above the Edwards aquifer, and you’ll be fine.

    Thumbnail image courtesy of Flickr user Igor Bespamnatyov

  • Say What? AT&T Lauded for Protecting Privacy

    AT&T was named a “most trusted company in privacy” by a survey of 99,000 consumers, according to the Ponemon Institute, an information security research company. AT&T ranked No. 20 in a survey conducted during the fourth quarter of last year. I flew in late last night from San Francisco, so at first I thought I was seeing things. AT&T, the company called out in 2005 for illegal wiretapping on behalf of the U.S. government, was on a list of companies being honored for privacy?

    After a cup of coffee I realized that it was true, and AT&T attributes it all — not to federal immunity and short-term memory loss on behalf of those surveyed — but to improvements to its labyrinthine privacy policies. You see, last summer Ma Bell replaced 17 separate privacy policies with one and now they link to it on every single page of the web site. That’s worth a spot, right? No? Well AT&T even asked its users to comment on the policy before it went into effect. You know, kind of like Facebook did. It also has videos and cut the privacy policy down by 29,000 words.

    And the revised privacy policies aren’t terrible (the policy promises an opt-in prior to using deep packet inspection to monitor web surfing), although in most cases the policies adhere to existing federal and state privacy rules rather than go above and beyond them.

    However, this is a company that blatantly abused its power at the request of the U.S. government, even sending emails and web-surfing histories to federal officials without telling customers and without a court order. Is a fresh face on standard privacy policies enough to warrant commendations? Regardless, it looks like the dollars AT&T spent to found the Future of Privacy Forum think tank were money well spent.

    Related content from GigaOM Pro (sub req’d):

    How Facebook Should Fix Its Privacy Problem

  • Clearwire Races to Grab 4G Customers

    Clearwire today reported fourth-quarter and 2009 results showing how heavily the carrier is spending to gain customers. As part of its effort to triple its subscribers this year, Clearwire reiterated its plans to cover 120 million people by the close of the decade and said it will spend up to $3.2 billion. For Clearwire, 2010 is make-or-break.

    After this year, not only will other carriers start heavily promoting their own 4G service, but some $2 billion of Clearwire’s debts will start to come due, up from $586 million in payments it’s obligated to make in 2010. Plus, Wall Street needs fast proof that Clearwire and Sprint didn’t bet on the wrong technology. Everyone is waiting for them to fail.

    But despite our skepticism about WiMAX vs. the Long Term Evolution mobile broadband that carriers like Verizon and AT&T are deploying, Clearwire does have two opportunities to survive. The first is if it can sign up a lot of customers quickly on the network, through aggressive promotion and pricing.

    When I tested WiMAX in Austin, I was frustrated by the giant holes in coverage, but where there was coverage, it was a much better downloading experience than 3G. Given that Clearwire uses Sprint’s 3G network to blanket those holes, the service isn’t a bad deal. With its cable partners promoting it and most consumers unaware of the differences between WiMAX and LTE, there’s a market for fast, which Clearwire will certainly provide. T-Mobile is hoping to take advantage of this with its HSPA+ rollout as well.

    The other advantage Clearwire has is spectrum. For more on the importance of spectrum check out our report, Everybody Hertz: the Looming Spectrum Crisis on GigaOM Pro (sub req’d). The company has up to 120 MHz of spectrum available and is currently deploying its WiMAX service in large, 30 MHz chunks, which it can fill with customers’ bits. In contrast, those deploying LTE in the 700 MHz band will use 10 MHz chunks, which can’t offer the same capacity. For now, that means Clearwire doesn’t have a data cap and can offer a better user experience even as more customers join the network.
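    As a back-of-envelope illustration of why channel width matters: throughput scales roughly linearly with bandwidth if you hold spectral efficiency constant. That constant efficiency is an assumption of mine; real WiMAX and LTE deployments differ in efficiency, overhead and loading.

```python
# Rough capacity sketch: peak throughput ~ channel width x spectral efficiency.
# The efficiency figure below is a hypothetical mid-range value, not a spec.
def channel_capacity_mbps(bandwidth_mhz, spectral_efficiency_bps_per_hz):
    """Approximate peak throughput in Mbps for one channel."""
    return bandwidth_mhz * spectral_efficiency_bps_per_hz  # MHz * b/s/Hz = Mbps

EFFICIENCY = 1.5  # b/s/Hz, assumed equal for both technologies

wimax_channel = channel_capacity_mbps(30, EFFICIENCY)  # 45.0 Mbps
lte_channel = channel_capacity_mbps(10, EFFICIENCY)    # 15.0 Mbps
print(wimax_channel / lte_channel)  # 3.0: capacity tracks channel width
```

    Under that simplification, a 30 MHz chunk carries three times the bits of a 10 MHz one, which is the capacity edge described above.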

    So can Clearwire deploy quickly, get customers, and then keep them as its competitors roll out LTE? If so, finding success with WiMAX may not be an impossible dream.

    Image courtesy of Flickr user omniNate

  • Will U.S. TeleMedicine Be DOA?

    Dr. Mohit Kaushal

    Intel, GE and the Mayo Clinic today said they would conduct a year-long study to find out if remotely monitoring patients via gear made by the two companies and hooked up to a home broadband connection can keep them out of emergency rooms. The move ties in nicely with a chat I had yesterday with Dr. Mohit Kaushal, director of health care for the FCC’s Broadband Strategy Initiative, about encouraging broadband use in medical care.

    In many ways, the U.S. is still behind the curve when it comes to using technology in medical care, not because the technology industry isn’t interested in providing software and hardware, but because of a failure on the part of doctors to use the technology and on the part of Medicare and other government-sponsored health programs to pay for health outcomes rather than procedures, said Dr. Kaushal.

    Plus, in many areas, especially rural ones where physicians are scarce, the broadband network just isn’t up to speed. The Mayo Clinic/Intel study will involve video conferencing between patients and doctors to ascertain patient fitness — something that could benefit from better broadband and even HD quality. In a presentation before the FCC last week, Dr. Kaushal said that using e-care could save $700 billion over the next 15-20 years.

    In the meantime, the U.S. is starting down the path to e-care with electronic health records. Currently a patient’s medical records are text-based, but Dr. Kaushal says he sees a future where, in addition to large image files, a patient’s medical record would include video and audio notes as well. He declined to speculate on the size of such a file or on the bandwidth that would be needed to deliver such information over networks.

    However, in addition to the quality of the broadband network, Dr. Kaushal said that there are regulatory issues at play when considering broadband’s role in monitoring patient health. For example, using smartphones to help diagnose illnesses or even to monitor vital signs is a new area of innovation. Dr. Kaushal said that currently a lack of regulatory oversight on such apps may make investors afraid to put money behind such innovations.

    To counter that problem, he said the FCC is working with the Food and Drug Administration knowing that at some point in the future devices for home health monitoring will need both an FCC stamp of approval and one from the FDA. The meetings and work so far haven’t created a framework to write such regulations, but have merely acknowledged that such rules will soon be needed. Unfortunately, much of the health portion of the National Broadband Plan so far is like that.

    It’s all about recommendations and setting “frameworks,” with very little ability to actually enforce policy. For example, Dr. Kaushal said that part of the plan will include a taxonomy of terms such as e-health, telemedicine, telehealth and e-care. That’s wonderful — and certainly doctors and insurance companies will have to use those definitions in their care and for reimbursements — but at the same time it makes clear just how far off we are from driving health care into the 21st century. Indeed, we’re still defining our terms.

    However, Dr. Kaushal can certainly articulate the value of broadband connectivity when it comes to improving health care. “Through the mechanism of broadband, time and geography barriers are reduced,” Dr. Kaushal said. “Within the health care world we monitor people in hospitals and at primary care physician offices, and what is exciting is that we could continuously monitor people via broadband, gathering info all of the time.”

    That could mean fewer hospital visits, but also a better quality of life for many people who live with chronic disease. Much like broadband has reduced the distance between people, enabling parents to communicate with their children at college or people to telecommute, broadband will help deliver health care as soon as the FCC and related agencies figure out how to drive fatter pipes and doctor adoption. But connectivity is merely a first step.

    “Building connectivity doesn’t improve outcomes,” Dr. Kaushal said. “Networks, the applications on those networks, and the data capture and the analytics can improve outcomes.” That’s a job for the technology companies. I know they’re eager to get started.

  • Austin Still Pinning Its Hopes on Hardware Startups

    Back during the first technology boom, Austin was a force in silicon and enterprise software startups, but the escalating cost of chip manufacturing plants and the SaaS-ification of enterprise software hit Austin’s key sectors hard. Aside from cursing Salesforce.com and semiconductor foundries, Austin has tried to make a tech comeback with wireless, clean energy and even a few consumer web startups, but the city is in the midst of an identity crisis, and its startups are lagging those built in Silicon Valley.

    Austin has Gowalla, which is trailing Foursquare in location-based services, and Heliovolt, which is trailing Valley startup Solyndra in the race to develop better solar technology. But Bart Bohn, a director at the Austin Technology Incubator, is hoping to resurrect Austin’s hardware past with a collection of data center infrastructure and wireless startups in the University of Texas-backed incubation program. Given the emphasis on cloud computing and wireless innovation, ATI’s bet may pay off.

    In the video below Bohn talks about how infrastructure startups are raising money in Texas, and names some of the exciting companies at ATI, from Smooth-Stone, which is building servers with ARM chips, to Savara, a biomed startup that has developed a method for delivering drugs via an inhaler. Maybe, with a newly active ATI, plus a new hardware-focused venture fund in Austin, the city can regain some of its former glory.

    Thumbnail courtesy of Flickr user turtlemom4bacon

  • ARM Launches a Smarter Brain for The Internet of Things

    ARM makes more than just smartphone cores.

    The Internet of Things is coming, but in addition to ubiquitous broadband connecting devices wherever they are, we also need low-power, cheap chips that are smart enough to collect information and then communicate it back to the web. ARM, the chip licensing company, has figured out one way to do this with a new microcontroller, one of those low-end chips that reside in devices from your microwave to your Bluetooth headset. ARM has added some higher-level math functionality to its microcontroller line with its Cortex-M4 chip. If we’re gonna connect everything to the web, even the tiny brains inside relatively dumb devices need a boost.

    So ARM added digital signal processing capabilities to the chip. Think of it as sending the silicon to school so it can learn algebra after realizing that basic math doesn’t cut it anymore. Already NXP, Texas Instruments and ST Microelectronics have licensed the core and expect products containing the chip to hit the market in 2011. Some areas we’ll see it in are smart appliances that talk to the smart grid (GigaOM Pro sub. req’d), and as a way to add better audio quality to everything from headsets to MP3 players without adding a lot of cost.
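    To make the “algebra” concrete: the bread-and-butter workload of a DSP is the multiply-accumulate loop inside a finite impulse response (FIR) filter, the kind of audio-cleanup math described above. Here is a plain-Python sketch of the operation for illustration only; real Cortex-M4 code would use its single-cycle MAC instructions, not Python.

```python
def fir_filter(samples, coefficients):
    """FIR filter: each output sample is a dot product of the most recent
    inputs with fixed coefficients -- one multiply-accumulate (MAC) per tap,
    the exact operation DSP extensions accelerate in hardware."""
    taps = len(coefficients)
    output = []
    for n in range(len(samples)):
        acc = 0.0
        for k in range(taps):
            if n - k >= 0:
                acc += coefficients[k] * samples[n - k]  # the MAC step
        output.append(acc)
    return output

# A two-tap moving average smoothing a step in a signal:
print(fir_filter([0, 0, 2, 2, 2], [0.5, 0.5]))  # [0.0, 0.0, 1.0, 2.0, 2.0]
```

    An audio stream at 44,100 samples per second with a modest 32-tap filter already needs over a million MACs a second, which is why dedicated hardware for this loop matters on a cheap, low-power chip.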

    Like all ARM chips, the M4 was developed for low-power usage, which will be essential for building out a web-connected sensor network in places where electrical outlets don’t reach. Machine-to-machine connectivity isn’t just something the carriers are counting on for growth; it’s also a potential boon for the chip industry. ARM’s new chips will likely compete with stand-alone microcontrollers from companies such as Atmel or Renesas. The microcontroller market is a $15 billion industry, according to NXP, and adding the brainpower to enable this low-end silicon to reach out to the web is good for ARM and good for our connected future.

  • Broadband Fans, We Have an Innovation Problem

    Google last week said it plans to build an experimental fiber-to-the-home network that would deliver speeds of up to 1 Gbps. And this week FCC chairman Julius Genachowski outlined a goal of delivering 100 Mbps broadband to 100 million homes as part of a “2020 vision” associated with the National Broadband Plan. However, amid what many perceive as good news for the wired broadband industry, the Telecommunications Industry Association and United States Telecom Association said they would not produce Supercomm, an industry trade show, due to “financial projections.” Translation: Wired broadband is in trouble. And it’s the fault of ISPs and Silicon Valley.

    Despite a rollout of faster technology from some cable providers, and Verizon’s continued fiber-to-the-home buildout, the wired broadband world isn’t looking terribly exciting outside Google’s testbed project. A close inspection of the long-range FCC plan doesn’t have me overly inspired, especially as other areas of the world invest in 1 Gbps networks today.

    Meanwhile, in the same two-week period as all of this wired broadband news, the mobile industry’s largest trade show, Mobile World Congress, took place. It was chock-full of the usual mobile players as well as a who’s who of the tech scene. And issues associated with mobile broadband, from new networks to spectrum shortages (GigaOM Pro, sub req’d) and how to build applications for mobile handsets (GigaOM Pro), were all anyone could talk about.

    Who Needs Wires Anyway?

    Wired was tired, and mobile was basking in the glow of the spotlight and investment. But even amid the mobile lovefest, a few discordant notes were sounded. For example, Stephen Bye, VP of wireless services at Cox — a cable company that’s deploying a 3G and later a 4G wireless network — emphasized the limits of wireless broadband.

    Sure, Cox has a wired network to sell, but Bye has a point when he notes the shortfalls of wireless when compared to wired broadband. Cox’s wireless LTE tests offered speeds between 10 and 25 Mbps, much slower than Cox’s wired DOCSIS 3.0 network, which can deliver 50 Mbps or more. He also mentioned the increased demands Cox has seen on its wired network and said that sending that kind of traffic over wireless networks wouldn’t work. And wireless broadband traffic is only going to rise. AT&T already saw its mobile data traffic double from 2008 to 2009 and doesn’t expect that rate of growth to slow, even as it uses more and more of its spectrum. And Cisco released information this month forecasting mobile traffic to reach 3.6 exabytes per month — 39 times what it was in 2009.

    I happen to agree with Bye, and I don’t have a network to sell, but I think the events of the last few weeks paint a pretty depressing picture of broadband in America. And we can only place some of the blame for the lackluster state of broadband on carriers. Some belongs with Silicon Valley and the tech community at large.

    Wireless Isn’t The Answer.

    For example, the idea that wireless broadband could be a real substitute for wired broadband showcases how crappy our current quality of broadband is. I’ve even weighed whether LTE or WiMAX would make a good substitute.

    How could I not, when I’m stuck with residential broadband service that delivers 7 Mbps down and 400 kbps up? Wireless services are within striking range of that offering right now. It’s possible I might even get better upload speeds on some wireless networks within the year. My husband is even preparing to dump his pricey T-1 at the office in exchange for WiMAX service from Sprint.

    Yes, I have limited choice on the wired side thanks to ISPs failing to invest, but why haven’t tech innovators and entrepreneurs given me something so compelling, and so bandwidth-hungry, that I wouldn’t even consider dumping my wired connection?

    Think Big. Build Big.

    So I will blame my willingness to cut the broadband cord on the ISPs’ failure to invest in their networks, but also on a failure of innovation and imagination from technology firms trying to deliver services over fat pipes. Give me something that needs 100 Mbps, so everyone knows why faster broadband is important. Much like Foursquare gets everyone stoked about location, we need an application that requires multiple megabits per second. I understand that there’s a bit of a chicken-and-egg issue here, since delivering a service before too many people have fat pipes will slow adoption, but at least 55 million homes already have the infrastructure to get 100 Mbps. Build something for them.

    An emphasis on building products for fat pipes will help make wired broadband exciting again. And despite the investment required by ISPs, many — especially those with mobile networks — will win. After all, as wireless speeds get faster, consumers think they should be able to do just about everything on a mobile network that they can on a wired one.

    And for most of today’s applications, that’s actually true. But if we had a bigger performance divide and different applications between wired broadband and mobile broadband then consumers might have an understanding that sometimes the mobile web just can’t compete with the wired one, and that we really need both. For carriers and consumers, that could be a winning proposition.

    To learn more about this topic, join GigaOM Pro on Wed., Feb. 24, for the latest Bunker Session event: The New Broadband Buildout.

  • Finally, a PSA for Geeks

    Connect a Million Minds, a $100 million initiative from Time Warner Cable that tries to hook kids up with after-school activities that promote math and science, has released a public service announcement showing geeks as, well, if not the inheritors of the Earth, certainly its rulers. The clip is cute, and as someone who conducts experiments with ice cubes during my 3-year-old’s bath time in an effort to introduce her to the scientific method, I wholeheartedly support encouraging math and science education.

    Sure, it might be better if these kids were celebrating a love of technology, rather than the fact that they will one day get to boss their peers around, but whatever works. For those interested in participating, Connect a Million Minds has a web site that offers appropriate events by ZIP code. Truly ambitious folks can also pledge to host a program or event themselves. Here’s hoping that along with its commitment to math and science projects, Time Warner can boost its efforts to provide better broadband. I know all the young geeks would appreciate it.

  • Connected Gadgets Need a Business Model That Works

    We’re big fans of adding connectivity to everything — from GPS systems to thermostats — but for every wireless connection there’s a price, and figuring out who pays that price and how they pay it is a roadblock when it comes to enabling smart appliances and gadgets, according to a survey by Accenture. The consulting firm surveyed businesses and found that 89 percent are interested in adding connectivity, but 63 percent of companies were concerned about the business models.

    So far we’ve seen two examples of successful business models for adding wireless connectivity: buying service monthly from a cell phone company such as for a data card, or a device maker pricing the cost of wireless into the goods it sells as Amazon does on the Kindle. But buying additional subscriptions for a smarter photo frame or a connected navigation system hasn’t really panned out, never mind connected refrigerators. Consumers don’t want 20 different bills for wireless service associated with their devices, nor do they want a refrigerator that uses the T-Mobile network if they don’t have T-Mobile coverage at their home.

    We’ve written about this problem before and touched on a possible solution: Wi-Fi. Personal hotspots that use the cellular network for connectivity and convert that signal to Wi-Fi are slowly creeping into the consumer world as a way to turn an iPod touch into an iPhone or merely replace a data card. That covers a range of devices with Wi-Fi chips on the go, and even in the car.

    Inside a home, Wi-Fi is even easier to defend, as it’s a technology many already have. Your large appliances never leave the house, so a refrigerator or washing machine with Wi-Fi that talks to a Wi-Fi-enabled box connected to the smart grid to monitor energy usage is a pretty safe bet for consumer appliance vendors to make. Why shell out big bucks for a cellular connection for devices that stay home?

    At a 4G conference in Florida, a Verizon executive gave a presentation outlining a possible use case in which GE would put LTE inside a refrigerator. The refrigerator could monitor things like the water filter and, through the LTE connection, offer broadband to a screen in the fridge and tell GE when the filter needed replacing. Then GE could ask the customer to click to buy a filter on the fridge. In a situation like that, the consumer might pay for broadband access on the screen and GE might pay for the access that enables it to make more filter sales. That sounds great — for GE — but as a refrigerator-buying consumer, I’m not sold.

    So while I’m glad to hear that device makers want to add connectivity to everything, I’m equally glad that they’re thinking hard about how to do it. Broadband will add value to a bunch of different devices, but it may not always have to come from the high-priced cellular network, especially inside the home. And even when the cellular network provides the backhaul, perhaps Wi-Fi is still the best way to deliver the connection to devices.

    Image courtesy of Flickr user fihu

    Related GigaOM Pro Research (sub. req’d):

    Broadband Service Providers Are About to Ride the Home Energy Wave

  • Comcast Gives the Gift of Storage: Does Anyone Want That?

    Like the aunt who always gave you underwear at Christmas, Comcast is offering an unwanted (although useful) service to customers: access to automatic online storage through a partnership with Mozy. Qwest and Verizon also have similar online backup services, but it’s not clear how many consumers want a storage service from their ISP.

    Like ISP email addresses, how many people really will use such a service on a regular basis? Based on an admittedly unscientific poll of our readers a month ago, only 10 percent of readers wanted storage services from their ISP. Here is Comcast’s pricing for backup:

    • 2 GB free with broadband subscription
    • 50 GB for $4.99 a month or $49.99 per year
    • 200 GB for $9.99 a month or $99.99 per year

    Comcast helpfully tells people that with 2 GB of storage, a person could store one of the following: 200 high-resolution photos, 480 music files, one standard definition movie file or 10,000 average MS Word documents. Readers, what’s your take?
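    The paid tiers also price out quite differently per gigabyte; a quick sketch using the numbers above:

```python
# Cost per gigabyte for Comcast's paid backup tiers (prices listed above).
tiers = {50: 4.99, 200: 9.99}  # capacity in GB -> monthly price in dollars

for gb, monthly in tiers.items():
    print(f"{gb} GB: ${monthly / gb:.3f} per GB per month")
# 50 GB works out to ~$0.100/GB; 200 GB to ~$0.050/GB,
# so the larger tier is half the price per gigabyte.

# Paying annually instead of monthly saves roughly 17 percent on either tier:
print(round(1 - 49.99 / (4.99 * 12), 2))  # 0.17
```

    In other words, the pricing nudges subscribers toward the big tier and the annual bill.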

    Related GigaOM Pro content (sub req’d):

    Who Owns Your Data in the Cloud?

    Image courtesy of Flickr User MarcinMoga

  • Time Warner Cable Is Nuts: $300 for 20 Mbps Broadband Connection

    Updated: Time Warner Cable’s super-fast broadband rollout to the rest of its markets is under way — although we have no idea how fast, since the company has not responded to my questions about the deployment details. However, earlier this week it said it was offering DOCSIS 3.0-supported business-class service in Cincinnati with two tiers that cost more than $300 a month. The tiers are:

    • Up to 20 megabits per second (Mbps) downstream/2 Mbps upstream: $309.95
    • Up to 50 Mbps downstream/5 Mbps upstream: $349.95

    For that kind of money a customer gets the fast broadband, static IP addresses, more email accounts and better customer service, but is still on a shared line. However, I’m pretty sure plenty of business customers will take it, if only for the improved upload speeds. For example, in my market a T-1 line costs about that much and delivers 1.5 Mbps on a dedicated line. Update: However, Om points out that Comcast charges business customers $189 for 50 Mbps in San Francisco, which makes TWC’s pricing crazy high. Gotta love that lack of competition.
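    Dollars per downstream megabit make the comparison stark. A sketch with the figures above; note the T-1 price is my ballpark reading of “about that much,” roughly $310 a month, not a quoted rate:

```python
# Downstream price-per-Mbps for the offers discussed above.
# The T-1 entry assumes ~$310/month for 1.5 Mbps -- an estimate, not a quote.
offers = {
    "TWC 20 Mbps": (309.95, 20),
    "TWC 50 Mbps": (349.95, 50),
    "Comcast 50 Mbps": (189.00, 50),
    "T-1 (1.5 Mbps)": (310.00, 1.5),
}

for name, (price, mbps) in offers.items():
    print(f"{name}: ${price / mbps:.2f} per Mbps")
# TWC's 50 Mbps tier runs ~$7.00/Mbps vs. Comcast's ~$3.78,
# while the T-1 is over $200/Mbps -- dedicated, but dear.
```

    Even at TWC’s prices, shared DOCSIS 3.0 is an order of magnitude cheaper per megabit than a T-1, which is why businesses will bite anyway.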

    Meanwhile, residential pricing and service haven’t been announced, which has me thinking that Time Warner, which has said it will deploy DOCSIS 3.0 “surgically,” is still cherry-picking its markets and customers for super-fast broadband. By contrast, Comcast has deployed DOCSIS 3.0 to 38 million homes and will cover 100 percent of its footprint by the end of the year.

    Related GigaOM Pro content (sub required):

    When It Comes to Pain at the Pipe, Upstream Is the New Downstream

  • Spectrum Shortage Will Strike in 2013

    The demand for mobile broadband will surpass the spectrum available to meet it in mid-2013, according to wireless analyst Peter Rysavy. In a report on the looming spectrum crisis, sponsored by Research in Motion for the Mobile World Congress in Barcelona, Rysavy explains how, absent new spectrum allocations, bandwidth-consuming services used by more and more people will lead to a crappy user experience, or to heavy-handed pricing (GigaOM Pro, sub req’d) and limitations on mobile applications from carriers.

    He begins with data showing the increasing demand on mobile networks and lays out how much bandwidth a variety of services need, from as little as 12 kbps for voice calls to 1-2 Mbps for streaming HD YouTube videos. After illustrating the capacity crunch, he ties it to spectrum. In many cities, carriers have 55-90 MHz of spectrum, only some of which is allocated to data services. Even today, some providers such as AT&T have indicated they’re using up to half of their spectrum resources in heavily populated markets.
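    To see how fast video eats spectrum, here is a rough sketch of how many simultaneous users one 10 MHz data carrier supports at the per-service rates cited above. The 1.5 b/s/Hz average cell efficiency is my assumption; real capacity varies with technology, load and distance from the tower.

```python
# How many simultaneous streams fit in a slice of spectrum?
# Assumes an average cell spectral efficiency of 1.5 b/s/Hz (hypothetical).
def concurrent_streams(spectrum_mhz, efficiency_bps_per_hz, stream_kbps):
    cell_capacity_kbps = int(spectrum_mhz * efficiency_bps_per_hz * 1000)
    return cell_capacity_kbps // stream_kbps  # whole streams only

print(concurrent_streams(10, 1.5, 2000))  # HD video at 2 Mbps -> 7 streams
print(concurrent_streams(10, 1.5, 12))    # 12 kbps voice -> 1250 calls
```

    Seven HD viewers can saturate a carrier that would otherwise handle over a thousand voice calls, which is why video-heavy usage makes the crunch arrive so quickly.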

    But more spectrum is only part of the issue. Operators have options ranging from more cell sites (either towers or even femtocells) to data offloading (sub req’d) to next-generation radio technologies such as Long Term Evolution to even antenna optimization. I detail many of these measures as well as AT&T’s spectrum shortage in a GigaOM Pro report published today called Everybody Hertz: the Looming Spectrum Crisis.

    The Federal Communications Commission and Congress are playing a role in the spectrum issue, but it would be insane to think that handing over more airwaves will be enough, or that it will happen quickly. Useful spectrum for mobile broadband isn’t an infinite resource, so everyone from the developers building more resource-aware mobile applications to the folks in Washington allocating the spectrum and dictating regulations around mobile broadband will have to work together in order to make sure our desires for the mobile web are met.

  • Stop Cramming the Mobile Web Into the PC Box

    For a while the consensus has been that the mobile web is the same as the PC web, in that a person should be able to access on their phone whatever content they can via a wired PC connection, without suffering through WAP browsers or limits. I disagree. The mobile web is still different from the wired web, and it’s far more important.

    Which means that developers shouldn’t only think of the web in terms of a wired connection for a PC. Google Chairman and CEO Eric Schmidt said it best when he discussed the future of the mobile web during his speech at the Mobile World Congress today.

    In essence, Schmidt was saying that in many ways, your smartphone is about to supplant your computer — with huge implications for content providers and operators. For while PCs are about content creation (blogging, graphic design, etc.), and both current-generation smartphones and future devices like the iPad are about content consumption, tomorrow’s smartphones — or superphones — will be about both, and may even do both equally well.

    So those developing applications with an eye toward consumption on personal computers over wired connections should stop thinking about mobile as an addendum to the web-based project and start thinking of it as a separate application in its own right. Note how your application uses data and how it will perform on relatively slower mobile networks. Good navigation is essential. Consider how your application (or web page) deals with connection errors and how it delivers data over a thin pipe, as Elizabeth Churchill, a principal research scientist and manager of the Internet Experiences Group at Yahoo, describes in the video below from a GigaOM event last October. The mobile section is about five and a half minutes in, and hits on some really good design tips.
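    One concrete version of “deal with connection errors” is retrying failed requests with exponential backoff rather than assuming the always-on pipe a wired PC enjoys. A generic sketch; the function names are mine, not from any toolkit mentioned here:

```python
import time

def fetch_with_backoff(fetch, retries=3, base_delay=0.5, sleep=time.sleep):
    """Call fetch(); on a ConnectionError, wait and retry, doubling the
    delay each attempt -- a polite pattern on flaky mobile links."""
    for attempt in range(retries + 1):
        try:
            return fetch()
        except ConnectionError:
            if attempt == retries:
                raise  # out of retries; let the caller degrade gracefully
            sleep(base_delay * 2 ** attempt)

# Simulate a link that drops twice before a request gets through:
attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("signal dropped")
    return "page"

print(fetch_with_backoff(flaky, sleep=lambda s: None))  # prints: page
```

    The same idea applies whatever the platform: treat the network as unreliable by default, and the app keeps working when the signal doesn’t.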

    Other than the difference in the speed and stability of the connection, developers will also need to think about the platform. Will the content or applications be used on an iPad, a phone, inside a moving car? If everything has a connection, and multiple operating systems such as Android or the newly launched MeeGo are the framework for a variety of end devices, then the mobile web isn’t a singular experience, or screen, anymore.

    For example, there’s no reason that my connected car has to offer the Internet in a web-biased format. Navigation in a car shouldn’t be mouse-based; it should be voice-based. Viewed through such a lens, it becomes clear that mobile connectivity sets developers free — free to play with different user interfaces, site navigation and experiences that will make surfing the mobile web a background activity in service of the user, rather than the primary one designed strictly to entertain.

    Related GigaOM Pro Content (sub req’d):

  • Global Crossing Counts on Smaller Price Declines and Cloud for Growth

    Global Crossing, the provider of bandwidth and IP-based services to corporations around the world, today reported fourth-quarter and full-year 2009 results that included a 6 percent boost in revenue and lowered losses for the year. Still, the company has a hard slog ahead of it as the world recovers from an economic crisis that stifled business spending. The nature of business communications and the value of services to an enterprise are in flux.

    I chatted for a few minutes today with John Legere, the CEO of Global Crossing, and Gary Breauninger, its chief financial officer for North America and Worldwide Carrier Services. I sought to understand how Global Crossing was planning to grow, and how the overall market for bandwidth will evolve. The bad news is that prices are declining; the less bad news is that they’re not declining as rapidly as they were in the years after the telecom bust, when capacity was cheap and plentiful.

    GigaOM: The services you provide are dropping in price across all of your business lines. The company is barely profitable if we take out your debt, but with prices dropping, how will that affect your margins and growth prospects?

    Legere: Prices are declining at a declining rate, which is good. And compared to the incumbents, market price declines are less of an issue for us because our revenue growth is replacing old technology for our customers. So when customers come to us they are converging on IP services with us at a significantly higher margin, and we’re taking market share from the major players.

    GigaOM: There is a seeming surge in demand for IP services and connectivity. How is Global Crossing taking advantage of, and profiting from, this demand?

    Legere: The stats show that customers start adopting IP as a platform for some of what they do, then move into a second phase where they converge applications on IP, and then move to a third phase where adoption grows wildly. We are late in the early cycle, somewhere between adoption and convergence, and this move to IP has years and years left associated with it. Customers are moving from legacy technologies to IP voice and collaboration. We are offering them services with cost savings of around 50 percent, at high margins of between 60 and 80 percent for us. The contracts are long-term and the services are sticky to the customer.

    GigaOM: So what are the macro opportunities around these IP services? Cloud computing?

    Breauninger: With the Impsat acquisition in Latin America we have some data center operations and have since expanded into London and Amsterdam. Up to this point we have a set of on-demand data center products, and a clear opportunity would be to venture into the cloud computing arena. That’s clearly an area of growth in the next 18-24 months.

    GigaOM: Will you make acquisitions as part of that expansion? Other providers like AT&T already have existing cloud products today, how will you compete in a year or two?

    Breauninger: We have grown the business organically from the data center business we have. And there’s no set definition for what constitutes cloud computing, and it’s an evolving space so we can still grow with products around our network and data centers as accessed through the cloud.

    Related GigaOM Pro Content (sub req’d):

    Report: Delivering Content in the Cloud

  • FCC Promotes 100 Mbps for 100M Americans

    Julius Genachowski, chairman of the Federal Communications Commission, has outlined his vision for broadband in America: 100 Mbps connections to 100 million homes. As part of an update on the National Broadband Plan due to Congress in mid-March, Genachowski sketched out a plan that would keep the U.S. competitive with other nations and enable 90 percent of the population to have and use broadband, up from about 65 percent today.

    The proposed speeds seem pretty damn exciting, but the devil is in the details. Currently, at least 55 million homes have the infrastructure to support 100 Mbps via fiber-to-the-home or cable DOCSIS 3.0 deployments (the ISP may not offer 100 Mbps to the home, but it could be delivered). The time frame for getting 100 Mbps connections to 100 million homes was undefined, although Genachowski called this a “2020 vision.” While I think a decade is too long to wait for 100 Mbps to a third of the nation, getting that deployed is by far the easiest aspect of the plan.

    Speaking at the National Association of Regulatory Utility Commissioners Conference in Washington, D.C., Genachowski said:

    “Our plan will set goals for the U.S. to have the world’s largest market of very high-speed broadband users. A “100 Squared” initiative — 100 million households at 100 megabits per second — to unleash American ingenuity and ensure that businesses, large and small, are created here, move here, and stay here.

    “And we should stretch beyond 100 megabits. The U.S. should lead the world in ultra-high-speed broadband testbeds as fast, or faster, than anywhere in the world. In the global race to the top, this will help ensure that America has the infrastructure to host the boldest innovations that can be imagined. Google announced a one gigabit testbed initiative just a few days ago — and we need others to drive competition to invent the future.”

    In addition to delivering 100 Mbps to almost a third of the population, he laid out several areas where the FCC would act to provide small businesses and rural areas with broadband. There were also hints as to how the FCC will convince laggards that broadband is a good thing. It sounds like some of that convincing will come from lower access costs in some areas, combined with an overall shift toward delivering services, from medicine to schooling, via broadband networks. The plan outlined by Genachowski includes the following recommendations:

    • To improve the E-Rate program for Internet connections in classrooms and libraries.
    • To modernize the FCC’s rural telemedicine program to connect thousands of additional clinics and eliminate bureaucratic barriers to telehealth.
    • To take the steps necessary to deploy broadband for the smart grid.
    • To develop public/private partnerships to increase Internet adoption, so children can use the Internet proficiently and safely. Programs like the NCTA’s new A+ program are a model.
    • To free up a significant amount of spectrum in the years ahead for ample licensed and unlicensed use.
    • To use government rights of way and conduits to lower the cost of wired and wireless broadband deployments.
    • To build an interoperable public safety network to replace the current system.

    Genachowski also said that while other countries with broadband plans have universality goals of 1-2 megabits per second, the U.S. goal for universal service will be higher. He talked up digital literacy as well, saying every child must be digitally literate by the time he or she leaves high school. He also offered scary stats that Om and I called on the FCC to address last year:

    • Right now, roughly 14 million Americans do not have access to broadband.
    • The U.S. broadband adoption rate is about 65 percent, compared with 88 percent in Singapore and 95 percent in South Korea.
    • The U.S. adoption rate is even lower than 65 percent among low-income, minority, rural, tribal, and disabled households.
    • Some 26 percent of rural business sites do not have access to a standard cable modem and 9 percent don’t have DSL.
    • More than 70 percent of small businesses have little or no mobile broadband.

    So this appears to be a decent sketch, although it’s far less revolutionary than it might seem. Filling in the details around lowering costs and delivering actual services is where the plan could have the most impact. Getting a 100 Mbps pipe to a few million more people over the next decade will happen regardless of whether the FCC puts it in the National Broadband Plan. Delivering faster universal service to rural and low-income areas, real telehealth, a smart grid, broadband to schools and creating digital literacy programs will be the real challenges.

    Related GigaOM Pro content (sub req’d):

    When It Comes to Pain at the Pipe, Upstream Is the New Downstream


    This article also appeared on BusinessWeek.com.

  • Can Qualcomm Compete As Smartphones Become Computers?

    Our mobile devices are getting smarter, faster and are increasingly mimicking the functionality of a full-fledged PC. New capabilities such as multicore processors in phones and the ability to send HDMI video out mean that the brains inside our phones need more performance while they sip power. To that end, several chipmakers are coming to market with chipsets that combine multiple processors, high-end graphics cores and other design features to make truly killer end devices. As the top wireless chipmaker, Qualcomm has long been the “Intel inside” for mobile phones, but can it compete against a host of new processors with better graphics and more performance?

    Qualcomm’s Snapdragon processor is the brains behind the Nexus One phone and will also star as the processor inside some small yet powerful computers called smartbooks, but rivals such as Texas Instruments, Nvidia and Marvell are gunning for those same design wins. And from a feature perspective, it looks like Qualcomm’s competitors may bring more to the party. Its current 1 GHz Snapdragon (a 1.5 GHz version with 1080p will be in later handsets) delivers 720p video, and has a 3-D graphics engine that’s less impressive than those from Marvell or Nvidia.

    Yes, Qualcomm has won big so far. At the 2008 Mobile World Congress, Nvidia launched an application processor that was one sexy hunk of silicon. Later, it became the foundation of the Tegra chipset for mobile devices. After seeing what that could do, I predicted it would revolutionize computing and graphics consumption on the phone. So far, it’s in the Zune, but hasn’t taken off like I expected.

    Last year Texas Instruments talked up its OMAP 4 chipset, which seemed to exceed Tegra in terms of graphics performance (1080p, support for up to a 20-megapixel camera, etc.), and actually had me giddy with excitement. The chip launched at MWC this year, with TI talking up its ability to enable gesture recognition on handsets. Also today, Marvell, which has made a big push into application processors for mobile devices in the last year or so, launched its own 1 GHz chip capable of delivering 1080p HD video and hosting real-time, graphics-intensive applications.

    Not to be outdone, ST Ericsson, another top wireless chipmaker, announced at the MWC show a dual-core smartphone chip that runs at 1.2 GHz on each core. That’s about what my laptop offered five years ago, and seems like far more performance than any phone needs, until you take into consideration that the phone form factor is just one of many mobile connected form factors, and that ST Ericsson has also created a chip for mobile devices that allows for HDMI out of the phone.

    We predicted such a port in our phone of the not-too-distant-future, but ST Ericsson has the silicon to make it happen. That means with an HDMI cable your phone becomes a DVD player for any content downloaded from the web. One hopes that online stores can get their act together when it comes to selling HD versions of video on mobile devices.

    But the question still remains: in a world offering silicon that enables HDMI content to be stored and processed on a handset, or gesture recognition thanks to a high-end camera and a powerful processor, can Qualcomm compete? For the last two years I’ve waited for Qualcomm to be dethroned, but I’m still waiting. Maybe 2010 is the year.

    Related content from GigaOM Pro (sub req’d):


    Better Battery Life Motivates Mobile Chipmakers

  • Check Out the FCC’s Useless Broadband Competition Map

    The Federal Communications Commission released data today detailing the spread of high-speed Internet connections across the nation as of the end of 2008, including this map (click on image for an expanded view). You might be thinking, “Wow, that’s awesome — so why are we spending $350 million to create such a map as part of the broadband stimulus bill?” It’s because the FCC map is worthless.

    The map defines broadband as any technology (excluding mobile broadband providers) delivering speeds of 200 kbps down. I challenge folks to surf to Facebook, the new video-heavy CNN site or even get their Gmail over such a connection. It’s not a fun experience. Plus, at those speeds video streaming isn’t going to happen at all.
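
    Some back-of-the-envelope arithmetic (my own, using a hypothetical 2 MB media-heavy page) shows why 200 kbps is such a miserable floor for "broadband":

```python
def transfer_seconds(size_megabytes, speed_kbps):
    """Seconds to move size_megabytes over a link of speed_kbps kilobits/sec."""
    bits = size_megabytes * 8 * 1000 * 1000  # decimal megabytes -> bits
    return bits / (speed_kbps * 1000)

# A 2 MB media-heavy page over the old 200 kbps "broadband" floor:
print(round(transfer_seconds(2, 200)))  # prints 80 (seconds)
# The same page at the newer 768 kbps minimum:
print(round(transfer_seconds(2, 768)))  # prints 21 (seconds)
```

    Real-world pages load in parallel and cache aggressively, so treat these as rough upper bounds, but the order of magnitude is the point.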

    That said, only a few areas of the nation lacked access to at least 200 kbps at the end of 2008, and according to the map many folks have a choice of between four and six providers. But given that some of those providers are undoubtedly just meeting the old minimum standard of 200 kbps, or even the new minimum standard of 768 kbps, I can’t say this map really proves a competitive broadband market for anyone who wants to do anything more than get email.

    Even if the map’s not your thing, the report does have some good data, such as the nifty chart below that shows the distribution of access technologies based on the speeds they provide. Given that we’re moving toward a video-centric web that’s going to require faster download and upload speeds, I think the title of this chart should be, “Why DSL is Doomed.”

    Related GigaOM Pro Content (subscription required): 

    When It Comes to Pain at the Pipe, Upstream Is the New Downstream

  • Coders Get a New Colleague — Barbie

    There’s not a lot to say about this, but Barbie today got two new careers, and in one she’s a computer engineer! That’s right, coders of the world can now count Mattel’s best-selling toy among their ranks. Sorry, systems administrators!

    Barbie’s new role was chosen through a contest Mattel created this year that let consumers pick what Barbie’s next job would be. Computer Engineer Barbie won the popular vote, while girls selected news anchor as Barbie doll’s next career (what’s wrong with blogger, girls?). Computer Engineer Barbie will go on sale sometime this winter. That’s a binary code patterned T-shirt she’s wearing, and a smartphone, Bluetooth headset and laptop bag she’s toting. But my question is, does Barbie use a Mac or a PC?

  • Time Warner Cable to Boost Broadband in Texas, Ohio and New York?

    Time Warner Cable plans to expand its DOCSIS 3.0 broadband upgrades in portions of Texas, Ohio and upstate New York during the first half of this year, according to Light Reading. I’ve emailed the company for confirmation, but really I’m hoping that it’s true. Currently TWC offers me the best broadband deal I can get in terms of speed: a decent 7 Mbps (12-13 Mbps with Turbo Boost) on the download side and a glacial 400 kbps on the upload side. I can consume, but sending up a video is a time-consuming nightmare.
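
    To put that 400 kbps upload in perspective, here is a rough calculation (mine, with a hypothetical 100 MB home video) comparing today’s upstream with the 5 Mbps DOCSIS 3.0 promises:

```python
def upload_minutes(size_megabytes, upstream_kbps):
    """Minutes to push size_megabytes upstream at upstream_kbps kilobits/sec."""
    bits = size_megabytes * 8 * 1000 * 1000  # decimal megabytes -> bits
    return bits / (upstream_kbps * 1000) / 60

# A 100 MB home video over a 400 kbps upload:
print(round(upload_minutes(100, 400)))   # prints 33 (minutes)
# The same video over a 5 Mbps (5,000 kbps) upstream:
print(round(upload_minutes(100, 5000)))  # prints 3 (minutes)
```

    Protocol overhead would stretch the real numbers a bit further, but the tenfold gap is what matters.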

    The upgrade to DOCSIS 3.0, which Time Warner has implemented in New York City already, would offer speeds of 50 Mbps down and 5 Mbps up for about $100 a month. Everyone knows that I have incredible pipe envy thanks to my colleagues with FiOS or Comcast’s wideband services, but maybe my time spent on the outside looking in is over. Light Reading suggests that Time Warner will deploy D3 in Dallas, Austin, San Antonio, and maybe the unlucky Beaumont in Texas; Akron, Columbus and Dayton in Ohio; and Albany, Buffalo, and Schenectady in upstate New York. The roll-out would coincide with areas where Time Warner Cable is feeling pressure from Verizon’s FiOS or AT&T’s U-verse service.

    Of course, given that the last time I asked about DOCSIS 3.0 upgrades a Time Warner Cable spokesman implied the upgrade would come with tiered broadband, I may change my tune when I learn the details of Time Warner Cable’s roll-out. But for now, I’m hoping that by mid-2010 Austin will be one of the lucky portions of Texas to get superfast broadband. Especially the faster upload speeds.

    Related content from GigaOM Pro (sub req’d):

    In Fox-Time Warner Cable Blackout, Will Hulu Keep the Lights On?