Author: [email protected] (Tony Byrne)

  • Seeking UK/Europe Sales and Customer Supt Exec

We’re looking to hire a sales and support executive. Must be UK- or Europe-based (with native-level English). You can read more and apply here. If you know someone who would make a good match, we’d be grateful if you could forward this along.

     

  • Performance is a requirement, too

    Performance testing is a notoriously difficult undertaking. So much so, in fact, that it is sometimes not done at all, or only done when a performance problem arises in production, making some sort of investigation unavoidable.

    Testing the performance characteristics of a system in advance of its rollout is particularly difficult, because it’s hard to know how to simulate real-world usage situations. Developer and QA-lab setups rarely replicate real-world environments. In the real world, machines have fragmented hard disks, superfluous extra files on the file system, anti-virus and other software running, etc., while end-users do crazy things like start and stop applications, run hard-disk searches, flush the browser cache, leave Gmail, Skype, and other "chatty" applications running in the background, and so on.  This can all affect the performance of WCM or DAM applications in particular. The real world of end-users (and of real servers running in real data centers) is not easily duplicated in a sandbox environment.
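Short of a full lab environment, even a crude probe helps: time the operation repeatedly and report percentiles rather than averages, since real-world noise shows up in the tail. The sketch below is a generic illustration — the `render_page` stand-in and its simulated latencies are invented for the example, not taken from any CMS:

```python
import random
import time
from statistics import quantiles

def render_page() -> None:
    """Stand-in for a real CMS page render; replace with an HTTP fetch."""
    time.sleep(random.uniform(0.001, 0.005))  # simulated variable latency

def profile(operation, runs: int = 50):
    """Time `runs` invocations and return (median, p95) latencies in ms."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        samples.append((time.perf_counter() - start) * 1000.0)
    cuts = quantiles(samples, n=20)  # 19 cut points at 5% steps
    return cuts[9], cuts[18]         # 50th and 95th percentiles

if __name__ == "__main__":
    median_ms, p95_ms = profile(render_page)
    print(f"median: {median_ms:.1f} ms, p95: {p95_ms:.1f} ms")
```

Pointing `operation` at a real fetch of a rendered page, run from a machine that resembles an actual end-user desktop (anti-virus, background apps, and all), gets you closer to the numbers that matter.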

    Some CMS vendors (for example, PaperThin and Sitecore) provide built-in reporting capability for determining time-to-render for various content elements. But in general, onboard profiling capability is woefully lacking from most WCM and DAM systems.

    A few vendors are beginning to delve more deeply into this area. One that does is Day Software: The next release of its Communiqué offering (version 5.3, slated for March) has what I might call (tongue in cheek) pervasive thermometry. Almost any operation that takes (or can take) a noticeable length of time has a thermometer bar or other progress indicator associated with it, and in many cases a comparative bar-graph is available at the click of a button. The bar graphs are drawn using the Google Charts API, which means that a graph can be stored, sent, and managed as a bookmark — the charts are essentially REST URLs.
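Because each chart is just a URL, you can reconstruct one outside the product. Here is a minimal sketch of composing such a bar chart against Google's Chart Image API endpoint of that era — the timing figures and labels are invented for illustration, and the `cht`/`chs`/`chd`/`chl` parameters come from Google's API, not anything Day-specific:

```python
from urllib.parse import urlencode

# Illustrative timing data (ms) -- not real Communiqué measurements.
timings = {"render": 120, "query": 45, "cache": 8}

params = {
    "cht": "bvs",       # vertical stacked bar chart
    "chs": "300x200",   # chart size in pixels
    "chd": "t:" + ",".join(str(v) for v in timings.values()),  # data series
    "chl": "|".join(timings.keys()),                           # bar labels
}
chart_url = "https://chart.apis.google.com/chart?" + urlencode(params)
print(chart_url)  # the chart is entirely described by its URL
```

Opened in a browser, the URL renders the chart itself, which is why a graph can be bookmarked, stored, or mailed around exactly as described above.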

    As with any vendor — and especially Day — you need to be careful that the engineering vision of a reporting subsystem is matched by its usability.  So Day customers will want to test closely when it comes out. 

Until more CMS or DAM systems start offering good tools for profiling or performance monitoring, we recommend that you address those requirements up front. Make performance part of your requirements process, before undertaking a system implementation, or for that matter, before choosing a vendor. Determine ahead of time what your performance goals are — then put them in writing, in your RFPs and RFIs. Structure them into purchase agreements as well. "Pay for performance" isn’t a bad policy. But if you don’t spell it out, then like anything else, it won’t get implemented.

  • SDL moving into targeted marketing and e-commerce

For those vendors with solid profits, it’s a buyer’s market. SDL, the company behind products such as Tridion, Trados, Trisoft, and XySoft, today announced the acquisition of Dutch e-commerce vendor Fredhopper.

SDL’s streak of acquisitions is something I recently checked up on while writing the Tridion review for the Web CMS Report 2010. Notwithstanding its ever-expanding portfolio of products, you shouldn’t forget that in essence, SDL is largely a translation services company (rather than a software company), with the revenue from those services fueling expansion in the software market. After adding translation services companies, and then translation management software companies, SDL has focused the past few years on XML and component management companies.

At first glance, adding Fredhopper’s "marketing and merchandising optimization software for e-commerce" to that portfolio makes little sense. SDL, however, sees it as a logical progression: from multi-lingual consistency (translation) to brand consistency (XML and component management). And SDL Tridion, which accounts for about a third of SDL’s revenue, is now going to be the connection between that side and online marketing plus e-commerce.

    In fact, Tridion has already announced "SmartTarget," which will use Fredhopper in combination with Tridion’s personalization and statistics. Combinations like that are all the rage now in Europe (Sitecore’s Online Marketing Suite and EPiServer’s partnership with Mediachase come to mind), perhaps because targeted marketing and e-commerce across several countries and languages is, in fact, very hard to do.

    So how good is Fredhopper, anyway? Well, the company (now to become SDL’s "eCommerce Technologies" division) certainly has some impressive customers; mail-order companies like Otto and Neckermann are household names here in The Netherlands and Germany (and in fact, Otto Group is "the second largest e-commerce business in the world behind Amazon"). Unfortunately, Fredhopper’s results on, say, the otto.nl site are less impressive. For example, I entered "tshirt" as a query; then refined on "men’s wear"; and then refined on "suits". I got two results for my faceted, refined query. A hat and a belt.

Of course, a solution is only as good as its implementation. When I asked SDL about my Otto example, they suggested trying the same query on another Dutch site, which renders much more relevant results. And I doubt a U.S. vendor like Endeca (famous for its e-commerce implementations) would be able to do better. In fact, Endeca also lists the Otto Group as a reference, but tellingly, Endeca’s implementations are in English only. It all illustrates that marketing and e-commerce across multiple languages and countries is still very challenging.

    So the acquisition probably makes sense for SDL and Fredhopper. However, make no mistake: SDL still isn’t a one-stop-shop with complete off-the-shelf solutions integrating all its technology flawlessly. For you, the customer, it’s still going to be a lot of hard work to get it all working together right.

  • DAM moves – Tata acquires BT Mosaic

    Today the Indian IT services giant Tata announced that it was to acquire BT Mosaic.  It’s an acquisition worth examining in a little more detail, since we will likely see more of the same over the coming years.

BT Mosaic boasts extensive rich media services: its (SaaS-based) offering stretches beyond Digital Asset Management to production facilities and, most importantly, digital distribution. BT Mosaic’s customers are able to draw on BT’s (British Telecom’s) network expertise to link and push content throughout global broadcast networks.

BT Mosaic was an interesting spin-off from BT, one that had done fairly well commercially while building up its network reach and services. Most likely Tata will continue to support BT Mosaic’s focus on media companies, but I would expect Tata to absorb elements of the Mosaic services and fold them into broader enterprise IT offerings.

There will always be a market for standalone DAM software, meeting the needs of marketing departments, for example. Rich digital media makes up an increasing percentage of enterprise content, and it’s only going to grow. Not only is it easier to create digital media these days, but demand and expectations are growing exponentially. Digital media is becoming pervasive and needs to be managed alongside, and in lock step with, all our enterprise content, not as an exotic exception.

    The next couple of years for DAM will be fascinating to watch, as one way or another Digital Media is going to continue to grow in importance far beyond its traditional uses and constituents, and become a key element of any Enterprise Information strategy.

  • Have you considered the V in DAM?

    Last week, I blogged about the increasing trend toward specialization in the Search & Information Access space. As you may know if you’ve been reading our Digital & Media Asset Management Research, the DAM industry is yet another area where specialization is ongoing. One trend that’s helping drive the verticalization of DAM is the broader use of video in Web publishing and in enterprise scenarios.

    Video, as a digital mode of communication, is nearly ubiquitous. This means video asset management (as a capability within DAM) will assume ever-greater importance in the months to come. If you’re in the market for a DAM system, you’ll want to think about what this may mean for your overall content management strategy — and take video requirements into account when shopping for a DAM system.

    Depending on the business you’re in, your use of video may not be extensive today, but it may well become a key content category for you in the near future. Just as the podcast phenomenon suddenly found many companies in the business of managing MP3 files "overnight," pervasive video will likely find many Web CMS owners wishing they’d thought through the vicissitudes of Flash and MPEG4 ahead of time.

    Video is becoming more important in broad intra-enterprise cases as well.  Many (if not most) companies have security cameras stationed in their stores, offices, or on company grounds. What happens to all the security-video footage? In some cases, old material is simply destroyed after a certain amount of time. But it still has to be cataloged and stored short-term (then dispositioned appropriately). Is it safe to just manage such footage in ad-hoc fashion? Maybe. But maybe not. What happens if an employee sues the company after (for example) suffering an accident on the job? If the accident was caught on video, the video becomes a key piece of evidence. What if the employee’s lawyers claim that the accident was part of a series of similar events? If archival footage of all similar events, across time, is available (and can be found with the company’s search technology), it could decide the case.

    Video also plays an important (and increasingly critical) role in health care. Nowadays, at major hospitals, all surgical procedures are recorded, for legal reasons. This results in huge volumes of video files that need to be cataloged, archived, and dispositioned.

    Highway-patrol cars are (more often than not) videocamera-equipped. Every traffic ticket, every arrest, every roadside assist, is video-recorded. All of that material has to end up somewhere. It’s best if it ends up in a repository, managed.

    Law enforcement agencies routinely videotape suspect interrogations. Again, this creates enormous quantities of video information that needs to be cataloged and managed — preferably in such a way that footage can be semantically searched later on, if needed. According to Herndon, VA-based MediaSolv Corporation (which sells video asset management systems specifically designed for police use — a prime example of the increased verticalization we’re seeing in DAM), 28% of U.S. states currently require the recording of "custodial interviews" (i.e., police interrogations), and fully half of all states have already passed evidence-preservation legislation. This essentially amounts to state-mandated use of DAM.

    Take a look at your own organization. Do you see video management in your future? If the answer is "yes" (and it probably is), you’ll want to consider availing yourself of our Digital & Media Asset Management Research, where we rate each of the 20+ vendors we evaluate on their video-management capabilities. We can help you get a handle on your media management needs — even if you’re still deciding what they are.


  • Who remembers the Deep Web?

I heard the words "Deep Web" used this week at an industry gathering. It’s a term I have not heard in quite a while, and looking around the room, I figured that few people there had any idea what it actually refers to.

In essence (though you can go and read up further on this for yourself), the Deep Web refers to the non-public, non-indexed web, which amounts to a volume of content that is at least (depending on whose calculations you believe) 1,000 times bigger than the public web. It’s interesting and somewhat counter-intuitive to think that the vast bulk of the world’s web is not indexed on Google and likely never will be.

I am not going to cheerlead for the revival of Deep Web as a term, but I think it’s something we all need to think about at times, since that Deep stuff is the stuff we all too easily forget about.

    You might also ask yourself: what’s Deep within your enterprise, potentially accessible, but not easily found?

  • The trouble with WCM market-sizing

    Today we received an inquiry from a research customer, and I gave what felt like a rather unsatisfying response. Here’s the question and my answer. Perhaps you can suggest other ideas via comments, below.

    The Question

    "I am the marketing manager at a consulting firm that is a CMS Watch customer. I need to establish the monetary value of the UK CMS market. Can you help?"

    My Answer

    The short answer is, "no one really knows, but probably larger than we’d first guess."

    Challenges in coming up with meaningfully accurate data include:

    • The vast majority of Web CMS vendors are privately held and do not report revenues
    • Publicly-traded software vendors who sell WCM tools typically also sell many other products, but don’t break out their WCM income in their financial reports
    • Open source tools comprise a significant portion of the market
    • Likely 70-80% of buyers’ budgets end up getting spent with integration companies like yours
• Definitions of what constitutes a CMS vary… e.g., do you include SharePoint if someone deploys an intranet on it? What about social media technologies that incorporate some content management services?

Every year or so, IDC (a major analyst firm) comes out with well-regarded market-sizing estimates, but typically they cover broader marketplaces (like "content management" in general) and are global or regional in scope.

    We do know that nearly every major CMS vendor in the world wants to participate in the UK market, so at least for internationally-oriented suppliers, it is arguably the 2nd most significant national marketplace behind the USA. (Germany and Japan could also take that mantle, albeit for different reasons, but that’s another story…)

    Final Thoughts

    So, I could plausibly guess that the UK WCM marketplace totals £250m or just as easily estimate £1.5bn annually.  However, I’d reserve greatest confidence in the conclusion that both figures are wrong.

    In the end, market size is important for consultancies and investors considering where to allocate their resources. For end-user enterprises, this data is less important. Focus on the horses, and not the horse race or the total purse.

  • It is Document Management from here on in…

    At CMS Watch we frequently have to explain to people why we have separate research streams for WCM (Web Content Management) and ECM (Enterprise Content Management). The explanation is frequently a response to the question, "aren’t they just the same thing?" The simple answer is no, they are not.

ECM was originally (and quite usefully) coined by AIIM back in the day as a term to describe an overarching approach to managing all forms of content. Unfortunately, the term has been royally misused since. For some it is a rationale for installing a single big technology platform (or suite of content management tools). For others, ECM is a business practice that encompasses all the different methods and processes of managing any enterprise information. For yet others, it’s simply a really big WCM system. And for still others, ECM is a layer in an Enterprise Architecture diagram.

I think the term ECM still has a place in the acronym pantheon, but that place is an increasingly limited one. (It would seem that we are not the only ones thinking hard about this very set of issues; quite coincidentally, John Mancini of AIIM blogged on this here yesterday…)

    I believe that most buyers around the world actually want and buy document and records management systems. They don’t want or need a single system to manage all their enterprise content, no matter how wonderful or magical such a system may sound. Their specific needs include such applications as: the accounts payable process, handling medical records, managing legal case matter, and so on. In short, enterprises need to implement process-specific solutions.

    We call these solutions document and records management systems. That’s what they have always been, and likely always will be. That is the key reason why we changed the name of our "mid-market" enterprise content management vendor category to "Document Management." We’ve also begun to separate out our market overviews (see our recent slideshare presentation), and will work toward renaming our entire ECM research stream in the next quarter.

    ECM is an aspirational term for many, one that suggests a single layer/platform/system/methodology that will address your enterprise content needs no matter how complex, diverse, or voluminous. Some major vendors promote this approach, and buyers for such systems also exist, but they make up only a small minority in this market. So, though it may seem a little dull by comparison, from now on we will use the terms Document Management and Records Management where they apply, and will reserve the exotic ECM moniker for that rare breed of big, complex, and typically very expensive platforms that actually merit such a grandiose term.

  • Search gets (even more) specialized

The search technology marketplace is getting much more specialized, with important implications for you, the customer.

    There are at least two different ways that software can specialize: Functionally and Vertically. Functional specialization — sometimes referred to as "horizontal" specialization — entails targeting a particular, well-known business use-case, such as e-discovery, that spans multiple different industries. Verticalization happens when a product gets tailored for a specific market "vertical" or industry segment, such as Manufacturing. (Verticalization typically implies functional specialization of some kind as well, but different industries beget different types of functional needs.)

In the opposite direction, some vendors try to create omnibus platforms, making a product or product family more broadly capable, and therefore theoretically applicable to many different verticals and business scenarios. This can be an effective strategy, but at some level is the opposite of specialization. An example of this in the WCM world is Ektron’s Web CMS, which started out essentially as a rich-text editor and has broadened, functionally, in a manner that would make the designers of the Swiss Army knife proud. Many other WCM and ECM tools have followed a similar path.

    The search marketplace has been quite different. Our latest Search & Information Access Research has found such specialization more the rule than the exception. In fact, it is a major — maybe the major — industry trend right now.

For example, when a vendor such as Recommind continues to tailor its products in ways that favor addressing enterprise e-discovery use-cases and the legal marketplace, that’s an example of both functional and vertical specialization.

    Other examples of functional and vertical specialization in search are legion.

    • Endeca continues to pursue categorization, clustering, and other techniques particularly relevant to the online-retail market (and the intelligence community, as well).
• SurfRay is striving to differentiate itself as the "SharePoint search solution."
• Adobe Omniture’s SiteSearch has concentrated heavily on an analytics-rich approach to search that appeals to marketers.
    • Sinequa, with its emphasis on linguistic analysis, continues to tout its prowess as a multi-repository "knowledge access tool," applicable for KM use-cases and the law enforcement sector.
    • Coveo and ISYS, meanwhile, continue to strengthen their capabilities in multi-client (palm/lap/desktop) enterprise search. Coveo, in addition, continues to beef up its BI capabilities; it now offers a search solution tailored for call-centers.

    For more examples, consult our newly updated Search & Information Access Research.

Of course, the trend toward specialization in search is not universal. Mega-vendor Autonomy offers a dizzying array of diverse search and access tools. Even so, Autonomy increasingly emphasizes "meaning-based computing." I’m not sure what that term means, but we’ve seen demos for functional use cases like e-commerce.

    Overall, we view specialization as a positive trend. It means search vendors (and presumably their customers) understand that search is not really a one-solution-fits-all problem. Search is as much a knowledge-discovery problem as it is a problem of "looking things up." It’s about finding things you didn’t necessarily know existed. And the tactics for doing that are as varied as the information world itself. There is no one right way to find something, so it stands to reason there is no one general-purpose system that can do it all. Getting the right search solution in place means first and foremost understanding your needs. And that’s something we can help you with — through our research and consultation. Don’t hesitate to call on us when the time comes.

  • SurfRay Ontolica 2010

Regular readers of this blog know that SurfRay’s MondoSearch and Ontolica (search for SharePoint) products have a bit of a checkered past. First, vendor MondoSoft went bankrupt; it was then sold to SurfRay; then SurfRay went bankrupt; then it was restructured as SurfRay 2009. But I’m happy to report that the Danish company now seems to be doing a lot better.

It’s now owned by a Danish venture capital outfit called Vækstfonden (a "government backed investment fund"), with Vækstfonden’s Søren Pallesen as CEO. And even though the company went through several rough patches, it seems the revenue stream for its software remained solid enough; the company claims its U.S. subsidiary kept turning profits throughout.

    While that is promising for future stability, the vendor has remained relatively quiet on other fronts. Unsurprisingly, for some time no new products were coming out.

    But it seems that, too, is changing: in the past few months new minor versions of MondoSearch and Ontolica were announced. MondoSearch is still going to need a lot of work before it’ll be up to par with modern offerings, but Ontolica is still a pretty good alternative to SharePoint 2007’s rather basic search functionality. It now also includes search analytics (previously a separate product), something which alone might merit a license — assuming you have sufficient resources to analyze and act upon the potential goldmine of search metrics.

    As a company, SurfRay is now expanding beyond the Danish borders. Though it has always maintained a presence in the U.S. (and had a booth at last year’s SharePoint conference), we’ll now see an office here in The Netherlands, as well.

However, the elephant in Ontolica’s room, of course, is SharePoint 2010. Microsoft has finally addressed one of the oddest deficiencies: the lack of wildcard search. That’s always been one of the things Ontolica marketing pivoted on (for a while, you could get the free "Ontolica wildcard search add-on" as a teaser preview for the complete product). So how’s SurfRay going to stay competitive, wedged between SharePoint 2010 and FAST for SharePoint 2010? Well, Ontolica 2010 has already been announced, and it offers many of the modern search interface features that SharePoint lacks, without adding all of the complexity of a FAST server.

    While we’ll have to see whether Ontolica 2010 will deliver on its promises, SurfRay is certainly positioning the product in exactly the gap Microsoft’s coming line-up is leaving. We’ll be keeping tabs on how that works out, and sharing our findings with our Search & Information Access subscribers.