Author: Mike Kirkwood

  • BlueLock Lets You Customize Your Cloud Infrastructure

If you’re a software-as-a-service company, you are probably thinking about how cloud computing can save you money and time. The same is true for departments in the enterprise that want to spin up a new service for customers.

We had a chance to sit down with BlueLock, one of the leading cloud-based infrastructure providers. Its solutions range from quick provisioning using an online form to becoming your infrastructure team for mission-critical applications. BlueLock represents part of the trend in virtualization that not only extends physical servers but also lets companies leverage infrastructure investments to meet the needs of application developers.


One of the benefits of cloud computing is the ability to tap outside expertise and save money on hosting applications. However, one of the stubbornly difficult areas in the enterprise is configuration management across the layers of OS, storage, security and network. This is getting more and more focus from the biggest providers in IT – Cisco, Microsoft, EMC, NetApp – but where the rubber hits the road, IT leaders still do the majority of the work tuning configurations to find the right mix of infrastructure for their needs.

    Custom is King

Fully outsourced configuration management of the infrastructure is one of the services BlueLock offers. One benefit of this service is the opportunity for an enterprise to configure a cloud that is production-ready and also has a mirror environment for testing and development. Not all applications are created equal, and a key part of the job is optimizing across layers.

This area seems ripe for change, and the future is unknown in the sense that it is unclear today which traditional infrastructure tools will evolve into the holy grail of “one-click” deploy-and-customize solutions. So, if you want a solution that is cloud-hosted and configurable today, it’s worth giving BlueLock a look.

This week at VMware Partner Exchange, BlueLock announced its CloudSuite. One key feature in the mix is a tailored selection of infrastructure environments.

    Instead of just cloning infrastructure, BlueLock focuses on tailoring it around the successful patterns we’ve seen in the enterprise. Additionally, its offerings are based on VMware virtualization technology.

The mix of cloud computing with customization seems like an ideal marriage for IT managers looking to build toward the cloud.

    Is customization a key capability that you look for in a cloud hosting provider for your applications?

    Photo credits: thebestofmyself & Phillip Pessar



  • Mobile Health: Will Network Applications Help Us Get Healthy?

Last week, we were at the mHealth Initiative conference in Washington, D.C. The keynotes were all about the impact mobile health applications are having in shaping the future of the health care system. Nothing demonstrates that more than the iPhone. In the 18 months since it was released, it has been perhaps the biggest thing to happen to electronic health records, a field that has seen billions of dollars of investment in past decades.

Mobile and wireless health applications directly impact an individual’s health and promise to ensure that when patients leave a doctor’s visit, they don’t become “lost” in the system. They let consumers engage with health and wellness in their daily lives and connect back to their health care providers.


    For citizens in the United States, this movement could offer a future where there is allocated wireless spectrum that brings a wealth of health information into our homes and to our personal devices. This could be in the form of streaming health record transactions and content targeted to us where we consume our daily media and social interactions.

Dr. Mohit Kaushal, health care director for the National Broadband Task Force, gave a summary of the issues surrounding health care and mobile health in a keynote today. We had a chance to catch up with him afterward and dig deeper into a few of the key considerations of health IT as part of FCC investment.

    Barriers

    First, he described a few barriers that have existed in the past.

    • Connectivity between systems is a major issue and challenge in health care.
    • Adoption of electronic health has been slowed by reimbursement incentives and regulatory issues.
• Data usage is becoming a growing part of the mobile spectrum, driven by intensive applications such as video and imaging.

    Key Challenges

Next, Dr. Kaushal shared a few of the hurdles a national broadband policy must overcome to support health applications.

    • The US needs to invest in infrastructure to meet the growing needs of a mobile-enabled population.
• Spectrum must be allocated (or reallocated) to meet those needs and target the right areas of growth.
    • Regulations need to be designed to maximize incentives for innovation in care delivery.
• There must be reimbursement incentives and viable business models for companies to succeed in delivering profitable services. In the health care system, we know that fee-for-service doesn’t work nearly as well as an outcome-based approach: delivering health rather than more procedures.

The question is how we can take this learning and apply it to spectrum or infrastructure allocated to consumer-facing health care solutions. Should the U.S. include mobile health care in its considerations for the next phase of spectrum allocation?



• VMware Partner Exchange 2010: What Happens in Vegas Comes to Your Enterprise

If you’re releasing products integrated into the VMware ecosystem, you’re likely enjoying the Las Vegas Strip this week. VMware Partner Exchange 2010 kicked off at the Mandalay Bay hotel today, and it is the place to learn about the current state of affairs and how to quantify the tangible benefits of virtualization for partners and customers.

We’ve found that the virtualization layer is becoming a key place to launch enterprise products. All of this momentum is being translated into how to more effectively sell virtualization into the enterprise – and VMware isn’t holding back in building the relationships to sell into the channel.


    So, we ask, is the virtual layer the new platform for delivering value to the enterprise? If so, what tangible benefits are being offered today?

First, an analysis of the momentum. According to this release from VMware, despite the down economy there has been a 60% increase in participants this week, from countries all around the world. What happens when we crunch the language in that release? We find that “partners”, “exchange”, “customers”, “network”, “cloud”, and “desktop” are the most important subjects.

    VMwarePartnerExchangeWordle.jpg

    Using those ideas, let’s condense that statement into our own words: Virtualization partners are exchanging information on how to win customers by leveraging the network and cloud to reach the desktop.
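Incidentally, the word-crunching behind a Wordle like this is just term-frequency counting, which is easy to reproduce. Here is a minimal Python sketch; the stop-word list and sample text are our own illustration, not VMware’s actual release:

```python
import re
from collections import Counter

# A tiny stop-word list; a real analysis would use a fuller one.
STOP_WORDS = {"the", "and", "of", "to", "a", "in", "for", "on", "is", "as"}

def top_terms(text, n):
    """Return the n most frequent non-stop-words in text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [word for word, _ in counts.most_common(n)]

# Made-up snippet standing in for the press-release text.
release = ("Partners and customers exchange ideas as partners build "
           "cloud and network offerings; partners reach the desktop "
           "through the cloud.")
print(top_terms(release, 2))  # ['partners', 'cloud']
```

Run over the full release text, this is exactly the kind of count that surfaces “partners” and “cloud” as the dominant subjects.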

At the partner exchange, there are early previews of next-generation products and programs as well as in-depth technical training. The conference agenda is packed with sessions including practical training and a host of supported discussions from over 50 sponsors, including HP, IBM, Intel, Cisco, NetApp and EMC.

    Looking through the sessions, several things stand out. First, networking and storage are merging together. Second, security is catching up – quickly – to support and define how virtualization and cloud applications are deployed and managed for business-critical applications. VMware is being baked into partner go-to-market strategies and product releases. There is more partner surface area and more angles for sales. All of this bodes well for virtualization.

    Are VMware partners building an ecosystem that makes you want to move faster with your deployment of business-critical applications? What is missing?



  • Walking Among Giants: Who Wins With Virtualization?

In this short analysis, we take a snapshot of a handful of key American technology leaders and what they stand to gain from virtualization. We believe this trend is becoming a building block for dynamic infrastructure deployments in the enterprise and wanted to check in with some of our favorite technology brands to see what they are doing to grow the space.

Instead of looking at the virtualization software vendors themselves, we’ll look at what drives the current virtualization momentum of companies like Intel, Cisco, and IBM that are already entrenched in the data center.


    Investors in VMware

Several companies have a vested interest in the outcome of virtualization through their direct investment in VMware and involvement with the VMware board of directors. We’ll cover them first.

Intel seems to be gaining on all sides of the virtualization trend. The Intel architecture is relied on by Macs that run Windows and by Windows hosts that run Hyper-V. It’s also dominant in the virtualization options being deployed en masse, including VMware, XENServer and KVM, which ships with many Linux distributions today. The processor is finally free to flex its muscle, and it is clear that it’s fluent in many languages and customs. Intel is investing in core technology to increase the ability for virtualization to be deployed in high-reliability settings.

    Cisco Systems is positioned as a dominant networking provider in the core of many data centers and in telcos around the world. Optimizing networks and network management around deployment of virtual hosts is a clear benefit to Cisco.

    Who Else?

    Cisco may be in a position to drive even more power into the network with the ability to further streamline the connections between data, processing, applications and network. In a world where a company can spin up new hosts at a whim, it is clear that network management and configuration is key to success. Cisco maintains a virtualization blog to share its progress in bringing this technology to networks.

EMC is clearly tied to the history of server virtualization through its purchase and subsequent spinout of VMware. Optimizing the data storage fabric to recognize and serve virtualized data centers is critical for the management and concurrency of data across systems. Promoting virtualization is intrinsically tied to data resources and the ability to connect the two seamlessly.

EMC has a set of products designed to help organizations manage and configure their virtual environments. Cisco, EMC and VMware have put together an enterprise cloud destination, called PrivateCloud, that shares information on the momentum of virtualization and cloud infrastructure.

    Industry Leaders Who Benefit From Virtualization

IBM, with its massive investments in application servers, blade computing, open systems and Linux, seems an obvious candidate to benefit from this trend. IBM seems serious about helping enterprises focus their efforts on efficient and cost-effective data centers, and it is positioned to benefit from its role in the key technology trends that ride alongside this movement. IBM shares information and white papers on its capabilities to lead the enterprise charge toward virtualization.

HP is benefiting from its dominant position in delivering hosts into the enterprise, its cozy relationship with Microsoft and its growing relationships with Linux distributions. HP has recently announced its ambition to win network business from Cisco and is in a position to deliver turnkey systems for deploying applications. HP is leveraging its ProLiant line of servers to offer companies a VMware-enabled architecture that saves time and money. This can be a huge win for companies that rely on HP today and want to jump into virtualization with a partner at their side.

    ‘Mac Hardware: Best Place to Run Windows’

Apple has been dipping its toe into the virtualization game too, supporting VMware Fusion and Parallels so that both Mac OS X and Windows can run on Apple hardware. One of our favorite trends in computing is hearing users say, “Mac hardware: best place to run Windows”. On a side note, Apple is also doing interesting work in another area of core optimization with its release of Grand Central Dispatch in Snow Leopard. It’s designed to create high-performing systems for processing graphics and other computing functions by joining the massive power of the GPU to the CPU and optimizing the computing power of the OS based on the number of cores in the hardware.

Microsoft is involved in both sides of the virtualization trend; on its site it asks, “How far will you take virtual?” One big opportunity for Microsoft has been selling more instances of Windows Server running on the same machine, whether under VMware or under Microsoft Hyper-V. More licenses sold equals more profit and happy customers. However, given Microsoft’s strategic position as the dominant layer sitting on top of the hardware, the company is also working hard on both software and server virtualization to prove it is a strong contender against VMware and other solutions in the market. Virtualization is making Microsoft a stronger competitor, and that is rippling across Microsoft’s approach to the enterprise.

In a surprisingly short time, all of these key technology leaders have adapted their offerings for a virtualized world. Through products, alliances and technology, we will see even more support from them in the future.

    In future posts, we’ll take a look at how these companies use the technology in their own operations. Also, we’ll be looking at other companies that will benefit from virtualization, including small innovators and dominant global brands such as Hitachi, Sony and Samsung.

Tell us what you think. Which companies are going to be the big winners in the virtualized computing world? Are there pieces that may be overlooked by these giants?

    Photo credit: miguelvieira



  • Will The Cloud Deliver A ‘Computer’ to Every Person and Education to Every Child?

These days, computer skills are an important part of a person’s resume. Likewise, they are important to a country’s ability to navigate the economic opportunities of our world. In recent years, countries like India have changed their position in the world based on their population’s ability to deliver computer skills and support to the rest of the world.

At the same time that these skills are becoming increasingly needed, prices are dropping dramatically for computing hardware. The netbook enjoyed a huge year in 2009. New form factors of computing devices also gained the attention of the masses, and 2010 started with a bang with the $499 iPad tablet computing device. The technology world continues to innovate computing systems that keep getting smaller and more portable. This video from a researcher in Japan shows an entirely new form factor of computer coming to market, in the shape of a pen with a projector and camera at the tips.

    So we’re asking this question: For the world to live in harmony do we all need to own one (or more) personal computers?


    Does the Cloud Redefine What a Computer Is?

A computer is a machine that manipulates data according to a set of instructions. That definition was written for a time when the components were all co-located within a single piece of hardware. But now there’s a more important question: does a computer have to be “a machine”?

With virtualization and cloud computing, the definition is being stretched in both directions. Virtualization is a concept where one machine can be “multiple computers”. With cloud computing, it is possible for multiple machines and networks that aren’t co-located to operate as one computer.

    Maybe that’s the reason that the cloud is such an important concept in technology today – that it blurs the lines intentionally between the data, the instructions and the data manipulation. Conveniently, as a concept, clouds come in many shapes and forms, just like the current crop of computing systems being built and innovated upon.

    What About 6 Billion Nodes Instead of 6 Billion PCs?

In 2010, several powerful trends are taking hold that are changing the paradigm of computing forever. One is the reality that there will be more mobile computing devices than PCs, as mobile phone shipments today are much larger than PC shipments. And mobile phones, especially smartphones, are becoming more capable because they’re tied together with pervasive communication technology like SMS and MMS, and are powering applications like Facebook and Twitter that grow more powerful with location-aware services.

    Delivering Computers in the Cloud

Nivio, a startup from India, is an example of the cloud evolving into a personal computing platform. Instead of delivering each person a physical device as the center of the computing experience, Nivio provides a combination of storage, applications and a “view anywhere” operating system that frames a cloud-powered vision of personal computing. Nivio assumes that its customers don’t each have a local machine with personal storage. Instead, these users have network storage and applications and access their computing power with whatever device they have on hand at the moment.

Take, for instance, the One Laptop Per Child project. It has struggled to keep its hardware as cheap as it once hoped, but one way it has kept costs down is by embedding cloud tools, slimming the machine’s overall profile and pointing its users toward a future of cloud-based services.

    How many children are there in the world? How many nodes is that?

    Where is your company positioned in this continuum of cloud computing?

    Image credit: juhansonin



  • Extraordinary Measures: Computing in the Cloud for Cancer

    caBIGSmallJan2010.jpg

One of the promises of the cloud is the power to join computing resources to solve the scientific mysteries of our time. Against the backdrop of biomedical research, the challenges of joining minds and computers together are immense. Not only is the subject material complicated, it is also sensitive from both a time and a privacy point of view. It is critical to get it right, as people’s lives are at stake, and any new discovery requires comprehensive peer review and an unerring trail of evidence.

    With these considerations in mind, the National Cancer Institute has been making significant progress with the caBIG (Cancer Biomedical Informatics Grid) project. It is focused on setting standards for sharing computing resources and data in the effort to cure cancer.


    The caBIG charter is enormous and visionary: “The National Cancer Institute is launching a 21st century information initiative that will transform the way we do cancer research. We are creating a network that will freely connect the entire cancer community. In doing so, we are leveraging valuable resources and saving precious time toward new discoveries.”

    Imaging

A number of peer-reviewed articles on the use of imaging with caBIG have been released. A paper titled “e-Science, caGrid, and Translational Biomedical Research” offers insight into the mindset of biomedical researchers.

    “Translational research projects target a wide variety of diseases, test many different kinds of biomedical hypotheses, and employ a large assortment of experimental methodologies. Diverse data, complex execution environments, and demanding security and reliability requirements make the implementation of these projects extremely challenging and require novel e-Science technologies.”

Making this shared research work across the industry requires unique approaches to computing architecture. An architecture diagram gives a peek into the system that has been designed to meet the challenge.

    caBIGArchitecture.jpg

    Tools, Federation, and Semantics

One of the most impressive things about the caBIG project is the focus on metadata vocabularies and semantics. This organization is one of the first to move beyond shared computing resources to a shared conceptual workspace. Not only is this one of the hardest things left to do in computing, it also may yield the greatest results in building a common understanding of the work going on across the world in biomedical research.

caBIG resources include open-source tools provided by partners. Also, the data-sharing framework is the core capability of caBIG.

    caBIG shows us a glimpse of the future in cloud computing, where the computing resources are a given and power is harnessed by having a powerful data-sharing framework.

    Social Networking Meets the Cloud for Research

Margaret Anderson, executive director of FasterCures / The Center for Accelerating Medical Solutions, recently wrote a post for The Huffington Post titled “Top 10 Medication Research Trends for 2010.”

    The number four item on her list was caBIG, and she elaborates on the power of the grid plus human inputs. Not surprisingly, connecting raw computing power to the power of Facebook social networking yields some amazing results.

    “It takes an Army, and some new methods. More than 300,000 women from across the U.S. have signed up for the Love/Avon Army of Women as potential volunteers for breast cancer clinical research studies. Eighty percent of them have never had breast cancer, and most were recruited through social networking tools like Facebook. These numbers speak to the power of social media to spread the word, fast.

    The Army has recently partnered with the National Cancer Institute’s Cancer Biomedical Informatics Grid (caBIG) to create a cohort for an online longitudinal study and to make its data available via caBIG to the cancer research community. Is this kind of standing Army the answer to our perennial clinical trials recruitment challenge? And what alternatives are there if you’re not likely to gather an army of 300,000?”

    Cancer is a big issue for the health of our citizens. With new resource coordination and teamwork, researchers are starting to learn how it works and what can be done to treat it. Improving our overall quality of life may be the best reason to pool our resources and minds together.

What do you think: is caBIG a model that the enterprise could use for sharing information across teams and partners?



  • Getting Started With Virtualization: What You Can Do On a Shoestring

Virtualization technology can improve the cost efficiency of the data center by running servers at higher utilization. Deploying virtual servers can yield a more energy-efficient data center and reduce the total footprint of a computing environment.

Although IT managers have started adopting virtualization for critical infrastructure, reports such as the poll conducted by Network Instruments in 2009 and reported on by Information World suggest that many IT managers could not confirm they got the expected savings from virtualizing their environments. In that context, it is good news that all of the key virtualization software vendors provide free or low-cost solutions that can be deployed and managed by IT teams. These solutions can run in production environments, so teams can get started and determine for themselves whether hosting applications on virtual servers is appropriate for their environment.
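Whether the expected savings materialize comes down to simple consolidation arithmetic, which is worth running before any deployment. A back-of-the-envelope sketch, with entirely hypothetical figures:

```python
import math

def consolidation_savings(servers, vms_per_host, cost_per_box_year,
                          hypervisor_cost_per_host=0):
    """Estimate yearly savings from consolidating one-app-per-box
    servers onto virtualization hosts. All inputs are illustrative."""
    hosts = math.ceil(servers / vms_per_host)
    before = servers * cost_per_box_year
    after = hosts * (cost_per_box_year + hypervisor_cost_per_host)
    return before - after

# 40 lightly loaded servers at $3,000/year each, 10 VMs per host:
print(consolidation_savings(40, 10, 3000))        # 108000
# The same math with a $500/host management add-on:
print(consolidation_savings(40, 10, 3000, 500))   # 106000
```

The point of the exercise is that the numbers are sensitive to the per-host add-on costs (management tools, licensing), which is exactly where the free offerings below differ.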


    VMware

Virtualization technology has evolved considerably over the past several years. The market leader, VMware, has been adding features and improving the performance of its offerings, including addressing applications with intensive I/O operations and offering robust management tools.

The free ESXi version is a low-cost entry point that is useful for many scenarios. While ESXi does not include vCenter (formerly Virtual Center, the centralized tool for managing multiple VMware physical servers), VMotion (dynamic migration) or clustering, in most cases these gaps are easily addressed with proper planning. These features are clearly on the upgrade path for teams that need them.

We interviewed Tom Moore, an IT leader who has managed virtual production environments for a Global Fortune 100 company for over eight years, and asked him to characterize his team’s use of the free VMware ESXi. When his organization faced a budget crunch, he said, it was a “no brainer” to switch from full ESX to ESXi rather than to the open-source Xen that’s baked into various Linux distributions.

“Also, for the minimal charge of a couple hundred dollars you can add vCenter support to ESXi, which is cheap compared to the thousands of dollars per CPU to get the full VMware product, now called vSphere,” he said.

Moore continued, “As a standard approach, our company has switched to ESXi when high-availability features like VMware clustering and VMotion are not required. Use of a load balancer in front of your VMware farm is sufficient in most cases. This was our approach even before VMware had clustering built into ESX.”

ESXi proves to be a good starting place for organizations that want higher density and whose teams are ready to design the environment around the pieces the free offering lacks.

    Citrix XENServer

Citrix states XENServer’s path into the enterprise plainly: Enterprise-class. Cloud-proven. Free. A quote from the company’s website:

    “Citrix XenServer is the only enterprise-class, cloud-proven server virtualization platform that delivers the critical features of live migration and centralized multi-server management at no cost.”

Citrix XENServer’s free edition supports clustering and dynamic migration. Including more features in the free product is a competitive edge Citrix brings to managers considering XENServer for the data center. This alone might be a good reason to consider it for a high-availability environment on a budget.

One key area to consider is XENServer’s management tools. These are still evolving toward the level of sophistication that VMware offers. For some, that might not be worth the trade-off. However, Citrix XENServer does provide a lot of attractive features in the free product.

    CitrixXENVMWAREESXICompare.jpg

    Windows Server 2008 R2 Hyper-V

    As the dominant operating system provider, Microsoft is well positioned to be a lead provider of solutions in the area of server virtualization. The company has been playing catch up in this area and is diligently working through hard issues such as pricing, licensing and product features.

Microsoft’s Windows Server 2008 R2 Hyper-V has gained traction with key customers, and Microsoft has focused on key features like live migration as its draw for adoption in the data center.

    “With Hyper-V, it’s easier than ever to take advantage of the cost savings of virtualization through Windows Server 2008 R2. Optimize your server hardware investments by consolidating multiple server roles as separate virtual machines running on a single physical machine, efficiently run multiple different operating systems in parallel, on a single server, and fully leverage the power of x64 computing.”

    While Hyper-V is included with the cost of Windows Server 2008 R2, centralized management features and dynamic migration will cost extra through Microsoft’s Virtual Machine Manager (VMM).
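The free-tier trade-offs described across these three sections can be jotted down as a small checklist. The entries below are our reading of this article, not vendor feature matrices:

```python
# Our summary of what each free tier includes, per the discussion above:
# ESXi lacks vCenter/VMotion, XENServer bundles both capabilities,
# and Hyper-V needs the separate VMM product for them.
FREE_TIERS = {
    "VMware ESXi":       {"live_migration": False, "central_mgmt": False},
    "Citrix XENServer":  {"live_migration": True,  "central_mgmt": True},
    "Microsoft Hyper-V": {"live_migration": False, "central_mgmt": False},
}

def free_with(feature):
    """Products whose free tier includes the given feature."""
    return [name for name, feats in FREE_TIERS.items() if feats[feature]]

print(free_with("live_migration"))  # ['Citrix XENServer']
```

Viewed this way, Citrix’s competitive pitch is clear: it is the only free tier in this lineup that checks both boxes out of the box.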

Hyper-V is making significant progress as a product, but it seems it will take Microsoft a few years to evolve Hyper-V into a real contender in the market. It may first gain traction in shops that predominantly use Microsoft technology today and have deep relationships with the mothership.

Last week, Microsoft announced a significant partnership and investment with HP that focuses on optimizing hardware and software solutions together. Microsoft and HP are developing patterns where virtualization, SQL Server and Exchange Server are deployed together and optimized across hardware, software and OS. This is a key competitive offering for enterprises.

    Summary

    Microsoft is a logical leader in this space and can’t be considered out of the race. The partnerships and investments in the channel will likely pay off. Hyper-V is growing features and has a distinct advantage in reaching customers through the natural relationships Microsoft has in the enterprise.

    Citrix XENServer is a real option and is competing on price and features. The product is backed by a leader in the desktop virtualization space, Citrix. Considering the array of features offered in the free product, it is definitely worth taking a close look at.

For full-featured, end-to-end server virtualization, it is hard to beat VMware. VMware solutions have the features, the performance and the price. Whether your needs are on the low end or the high end, VMware is there. On the high end it won’t come cheap, but VMware knows it still has a considerable lead in the features customers want.

    Considering the price to get started, it may be worth it to try each of them out with your team to see which fits best with your environment.

What do you think: is free a good price for server virtualization?

    Photo credit: pfala



  • The Healthcare System: An Apple Tablet’s Biggest Opportunity

Apple’s “iTablet” – whatever it may be – could be destined to transform our care delivery system in a major way. For years, key hardware vendors like Panasonic, Toshiba, HP and Intel have been working hard to embed tablet computers in hospitals.

The promise of improved clinical information systems, based on real-time updates across patient touchpoints, could be a workflow game changer. If the tablet becomes the tool a nurse or doctor carries from patient to patient, it will save time, money and lives by enabling the first “always updated” system.


    Unfulfilled Opportunity

Considering the massive expense of implementing an electronic health record (EHR) system – for example, the $4 billion spent by Kaiser Permanente – data synchronization is a huge investment for the health care system. At the national level, the Office of the National Coordinator (ONC) is administering billions of stimulus dollars to help systems move into the electronic realm.

But earlier today, the ONC’s Charles Friedman told an FDA interoperability meeting that in 2008, a mere 4% of systems in the United States qualified as “fully functional” electronic health record systems. For all the fantastic and innovative work that has gone into creating health care-specific devices, such as Panasonic’s series of tablet PCs, they’re not mainstream yet.

A big part of the reason is the usability of the software. Clearly, vendors have been building creative and durable machines. But much as earlier smartphones now seem clumsy compared to the iPhone, we haven’t yet seen a product that is amazing. Something like what we think the Apple tablet could be would change this landscape overnight, and it may be priced well below other medical devices on the market.

    Mobile Health Momentum

    The iPhone has already changed the face of healthcare, a fact Apple highlighted at last year’s iPhone OS 3.0 release and in the WWDC keynote. The momentum that started with consumer applications has spread to forward-looking doctors and health providers. We know that it is becoming common practice for some doctors and nurses to carry both their company-issued BlackBerry and their personally purchased iPhone.

    airstripOBJan2010.jpg There are already amazing applications in the market. AirStrip allows doctors to monitor patient vital signs and receive alerts from afar. There are now personal health records that can be carried and updated from anywhere.

    Additionally, there are information-rich applications that allow nurses, doctors and patients to look up health information in real time. Last week, during the Haiti tragedy, an injured individual was able to treat himself using a first aid application on his iPhone.

    Clinician Ready

    Haiku-3Jan2010Small.jpgApple and Epic Systems have been collaborating to release the first version of MyChartManager on the iPhone. Epic is a leading provider of EHR systems in the United States, and powers health systems such as Kaiser Permanente and the Palo Alto Medical Foundation in the Bay Area, to name a few. The application, named Haiku, was released on Jan. 13, 2010, and several health systems are in the process of testing it. It’s a clear contender for the “killer app” in the hospital setting. Looking at the screenshots, it’s clear that more screen real estate would be ideal – which means it may be just the right time for an iTablet-like device to emerge on the market.

    It’s the Apps

    It is nearly certain that iPhone OS 4.0 will create a path for existing applications to “upsize” to a tablet device’s larger screen. The medical category is already the highest-aggregate-priced category on the App Store, and with the promise of applications inside the clinical walls, the opportunity gets much larger.

    The iPhone-to-tablet combination may be the biggest reason a tablet succeeds in the market, since the entire iPhone developer community will be able to deliver on the new platform. Given Apple’s success with an integrated OS that shares core libraries across both the Mac and iPhone, it is likely that a tablet device will also connect with apps from both the iPhone and the Mac.

    Workflow Wish List

    Having had the opportunity to observe clinical workflow and talk with several healthcare providers – including Kaiser Permanente and the Lucile Packard Children’s Hospital – we’ve compiled a list of device capabilities that would change healthcare. Our wishlist includes:

    • Real-time observations, including vital signs: It is amazing that many systems still require doctors or nurses to take down vitals with pencil and paper, even when an EHR is in place.
    • Shift changes: Shift transitions between nurses can be greatly improved by a device that moves freely with each part of the staff, so that the shift exchange becomes a workflow-driven process rather than one tied to a physical location. Nurses move; the system should too.
    • Rich content delivery: The ability to share with a patient what is going to happen in rich detail, including video, can be a major force in improving readiness of the patient.
    • Video: Bringing remote feeds right into the emergency room, outpatient setting or other environment should be easier than ever before.
    • Family and friends: Offering a feature for family and friends to directly communicate with the patient is a huge opportunity. A tablet may be the perfect device to enable more personal discussion and check-ins with family members in the hospital, near or far.

    Prediction

    If Apple does in fact show a tablet device at the Jan. 27 event, hospitals around the country will react with pilot programs, and we will see tablets and Macs join the iPhone in helping deliver healthcare with a new era of style and grace. It is also true that Apple will have an uphill battle getting past corporate IT; getting support in the enterprise as a new class of device is a daunting challenge. But the “iTablet” will give visionary IT leaders more opportunity to change the status quo and look to the future.

    We can hear the doctor already: “Take two moments to fire up your iTablet, and teleconference me in the morning.”

    What do you think: could a tablet be the product that brings Apple inside the hospital walls and improves the system?

    Photo credit: Balazs Gal.

    Discuss


  • Is Virtualization Magic? (And Other Questions Your Manager May Ask)

    rabbitHat.jpgOne of the fun things about being a leader in IT is the opportunity to see new technology and explain it to others for the first time. We love to see people’s eyes get big and excited when some new wizardry is introduced and we’re the first one to explain it.

    In the simplest terms, a key outcome of virtualization that the virtual engine software layer divides either hardware or software into more pieces than originally existed. Instead of one operating system running, the same hardware can run several concurrently. For example, these can even be of different flavors such as Linux and Windows running on the same Intel hardware. Where it gets interesting is that each additional system running doesn’t divide computing power in half, like you might expect. Instead, an overall gain in system utilization is found in this approach. This “unused” power frees computing resources without the need to procure new hardware.

    Sponsor

    Since the technology is new, there are inevitable questions an IT leader will receive when explaining it to non-technical members of the team – a.k.a. management.

    A Quick Primer

    Here’s a short list of resources that give an overview of the technical underpinnings of virtualization:

    The Impact Question

    Does virtualization have a performance or operational impact on systems?

    Yes, there is some performance impact when running a virtual layer on software or hardware. However, that overhead costs far less than buying a new system. VMware’s cost analysis summary breaks down the increased system utilization like this:

    • Increased utilization rates from a typical 5-15% to, in some cases, as much as 80%.
    • Deferred datacenter construction costs of $1,000 per square foot (IDC’s Datacenter Trends Survey, 2007).
    • 50-70% higher VM density per host than is possible with commodity offerings.
    • 20-30% lower cost-per-application.

    Below is a visualization of some of the data-center expense reduction opportunities from a virtual environment.

    virtualSavings.jpg
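    To make those figures concrete, here is a rough back-of-envelope calculation in Python. Only the $1,000-per-square-foot and 20-30% figures come from the summary above; the square footage, application count and per-application cost are assumed inputs chosen purely for illustration.

    ```python
    # Assumed: 2,000 sq ft of datacenter build-out avoided.
    deferred_sq_ft = 2000
    deferred_construction = deferred_sq_ft * 1000   # $1,000/sq ft (IDC figure)

    # Assumed: 100 applications at $5,000 each before virtualization.
    apps = 100
    cost_per_app_before = 5000
    savings_rate = 0.25          # midpoint of the 20-30% claim above

    cost_per_app_after = cost_per_app_before * (1 - savings_rate)
    app_savings = apps * (cost_per_app_before - cost_per_app_after)

    print(deferred_construction)   # → 2000000
    print(app_savings)             # → 125000.0
    ```

    Under these assumptions, that is a one-time $2M construction deferral plus $125K in application-cost savings, which is the shape of the argument to bring to a budget conversation.
    
    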

    Perhaps one of the most important areas for consideration is how virtualization not only extends the investment in hardware, but also allows an IT department to be ready to ramp up more systems when needed, without waiting to procure more hardware.

    Virtualization is becoming mainstream, and it is worth the fight to earn the right to deploy and operate a virtual environment. It is also a nice gateway into cloud computing services, preparing a team for new processes and tools. One benefit we’ve seen is that IT teams can use virtualization to stand up and configure new projects quickly.

    Is it free? No. Is it magic? Yes, especially for the non-technical business executive. You’ll soon receive new questions, such as “How did you get that new system set up so quickly?” It will remain up to you, the IT leader, to decide whether to share the secrets up your sleeve.

    Do you have an interesting story explaining how virtual environments work in your company? Share it with us in the comments.

    Photo credit: pokpok313

    Discuss