Tags: Captiva, Capture, EMC OnDemand, EMC World, Momentum 2011
These are my notes from Robert Frey’s Introducing Captiva 6.5. A number of people I spoke to today and yesterday were surprised that I was interested in Captiva. It’s probably true that ECM and its sexier cousin, Case Management, have the wider and possibly more interesting problems to solve. However capture projects have some compelling benefits too, namely the potential for short payback periods. Also, many interesting content management projects don’t work without capture (we are still a very paper-based world).
- 6.5 was released in March this year
- The key focus was on Performance and Intelligence:
- ‘Headline’ stats of 10M pages per day
- Better (ie more automatic) deployment of certain features
- Global capture
- Undertook benchmarking
- Used a simple 10 step process
- Used multiple clients to ensure client requests were not the bottleneck
- ‘Hammered’ Captiva server – identified and eliminated bottlenecks
- Stressed that this is indicative only as performance depends on a number of variables: ‘your mileage may vary’
- There is a performance tuning guide on Powerlink
- Key performance metric is number of tasks not number of pages
- The performance guide shows how to calculate tasks from the modules you have configured for your process
- A sizing guide can help you turn this information into concrete capacity recommendations
- Also some specific support for Documentum High Volume Server, e.g. the ability to create Lightweight SysObjects
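To illustrate the tasks-versus-pages point, a back-of-the-envelope calculation: the server workload scales with pages multiplied by the number of task-generating modules in the process. The module names and the one-task-per-module assumption below are illustrative only; use the performance and sizing guides for real numbers:

```python
# Rough, illustrative sizing sketch. Assumes (hypothetically) that each
# module a page passes through generates one server task for that page.
def tasks_per_day(pages_per_day, modules):
    """Total daily tasks = pages x number of task-generating modules."""
    return pages_per_day * len(modules)

# Hypothetical 6-module capture process handling 100,000 pages a day.
process_modules = ["Scan", "ImageProcessor", "Classify",
                   "Extract", "Validate", "Export"]
print(tasks_per_day(100_000, process_modules))  # 600000 tasks/day
```

The point is simply that adding a module to the process multiplies the task load, which is why sizing on pages alone understates the work the server must do.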
Production Auto Learning
- Applicable to Structured and Semi-Structured documents
- 2 new modules for use in processes:
- Dispatcher Collector
- Dispatcher Supervisor
- Use Case is as follows:
- You have a new document type (say a new type of supplier sending you invoices)
- This will exit from the classification stage as an exception that must be handled manually
- Prior to 6.5 you would need a process to manually create the new template, test and release to production
- Dispatcher Collector module ‘watches’ the operator during the manual classification stage and tries to create a new template based on the operator’s actions
- The new template is passed to the Dispatcher Supervisor, a UI step that allows a supervisor to view and approve the new template and, if required, release it into production
- Caveat: this won’t cover every single situation but it is another tool that can be used to reduce manual intervention.
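The collect-then-approve flow above can be sketched in miniature. This is purely conceptual; the class and function names are my own inventions, not the Captiva API:

```python
# Conceptual sketch of the Production Auto Learning flow. All names
# here are illustrative; this is not Captiva code.
from dataclasses import dataclass

@dataclass
class Template:
    doc_type: str
    field_locations: dict   # field name -> where the operator keyed it from
    approved: bool = False

def collector_watch(operator_actions):
    """Dispatcher Collector: build a candidate template from the
    operator's manual handling of an exception document."""
    return Template(doc_type=operator_actions["doc_type"],
                    field_locations=operator_actions["fields"])

def supervisor_review(template, approve):
    """Dispatcher Supervisor: a human approves (or rejects) the
    candidate before it is released into production."""
    template.approved = approve
    return template

candidate = collector_watch({"doc_type": "invoice-new-supplier",
                             "fields": {"total": "region(620, 940)"}})
released = supervisor_review(candidate, approve=True)
print(released.approved)  # True
```

The key design point is the human checkpoint: the machine proposes a template, but nothing reaches production without supervisor sign-off.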
InputAccel Capture Flow Designer
- A graphical design tool to replace the previous process designer, very much in the xCP mould
- Uses drag and drop to build the process
- Can drop down to code if necessary
- Will be the strategic development tool going forward
- Useful routing functionality to make it easy to send a processed document to different operators depending on e.g. language in the doc or security clearance required.
- No extra charge – bundled with InputAccel
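The routing functionality mentioned above is essentially a dispatch function over document attributes. The queue names and attribute keys below are my own illustrations, not anything from the Capture Flow Designer:

```python
# Illustrative routing rule: send a processed document to an operator
# queue based on detected language and required security clearance.
# Queue names and attribute keys are hypothetical.
def route(doc):
    if doc.get("clearance") == "secret":
        return "cleared-operators"          # clearance trumps language
    language_queues = {"de": "german-team", "fr": "french-team"}
    return language_queues.get(doc.get("language"), "default-queue")

print(route({"language": "de"}))                          # german-team
print(route({"language": "fr", "clearance": "secret"}))   # cleared-operators
```

A designer tool lets you express rules like this graphically rather than in code, but the underlying decision logic is the same.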
- Image converter module replaces Image generator
- Handles Asian languages
- recognises around 130 languages
Migration to 6.5
- 5.3’s life has been extended
- 5.3 SP3 and SP4 can be upgraded directly to 6.5
- Previous versions should upgrade to 5.3 SP4 first
EMC OnDemand
- Rick Devenuti announced EMC OnDemand today
- Captiva one of the products in the first launch
- Take away the pain of system management -> can now concentrate on developing and supporting the business processes
- Should allow customers to innovate and differentiate faster (instead of dealing with system management problems)
Tags: EMC World, Momentum 2011
Just finished breakfast at EMC World so it’s time to think about the week ahead. As usual there is so much I could see but there just won’t be time to cover everything. This is my first time at EMC World itself which compared to Momentum is just huge. So what will I be looking at and why?
First the future of Documentum. Primarily this will involve visits to Jeroen van Rotterdam’s architecture sessions, but I wonder to what extent it will permeate the other sessions? Jeroen gave a tantalising glimpse in Lisbon of the concept of a Next Generation Information Server to replace the venerable Content Server. This time we are promised a demo of NGIS! Also he will be discussing the new scalable architecture. Make no mistake, this is important stuff. Much progress is being made to make the Documentum suite VMware-ready, but that is not the same as cloud-ready, despite some of the messages that will probably come out this week. Assuming this ‘cloud thing’ will be more than just a buzzword, these developments will be essential to the future of Documentum.
Next Captiva. 6 months ago at Lisbon I was pleasantly surprised at the progress made with capture technologies. Most of my previous experience was with old Kofax versions, with VB release code, separator sheets, and operator-keyed keywords and metadata. Capture was usually labour-intensive. I’m interested in seeing document learning, automated metadata capture and advanced recognition technologies driving up the ROI to make many more capture projects possible.
The next item is xCP – this is more a watching brief, but I’m interested to see to what extent xCP is positioned as more than just case management.
Finally Atmos. This is a bit of a new one for me but cloud-based storage will become more and more important. It makes sense to get an understanding of what EMC has to offer in this area.
Very exciting, now for Joe Tucci’s keynote!
Tags: fatwire, Momentum, webpublisher
This post is the 3rd on Momentum but the 2nd on my thoughts on Mark Arbour’s road map session. It covers Web Content Management, WebPublisher and Fatwire.
The basic story is that 6 months ago, after surveying the market, EMC invested in a web experience software company, Fatwire. The first interesting point, I think, is that EMC invested in rather than bought the company. It’s too early to say whether this is a change of strategy for EMC, but it’s interesting to note that this is a possible tool EMC will consider in its quest for a complete offering. I don’t know how many other examples of this approach EMC has used before; maybe it’s a prelude to buying the company, maybe it’s a long-term approach to this segment of the market.
You might ask what is web experience management as opposed to web content management. In Fatwire’s case it involves all the standard WCM features such as authoring, approval, publishing and content management. However it also adds analytics, personalisation, customer engagement and segmentation, all, apparently, packaged together conveniently to allow an iterative cycle of segment-publish-analyse.
It’s clear that the prime use-case is large customer-focussed Internet sites where personalising and fine-tuning the customer experience is key to commercial success. It’s less obvious what this brings to the standard intranet site, which is what really interested me.
So the question was how does this tie up affect existing WebPublisher implementations? Many mature sites have considerable investment in WebPublisher templates which presumably can’t be just migrated to Fatwire.
It seems that WebPublisher will continue for a few years yet. A bit like Webtop, WebPublisher is now in ‘sustaining’ mode. There are likely to be platform-tracking versions (I’m sure D6.7 was mentioned and maybe D7), there will be new certifications as necessary (IE9 anyone?) and bug fixes. However there will be no investment in new features.
From a strategic point of view I can’t see a compelling reason to purchase a conversion to Fatwire right now if you already have an intranet or Internet site unless you specifically want to utilise the additional features made available. New sites and sites considering a rewrite are obvious choices for a Fatwire investment. Obviously this is an area that needs to be tracked as I’m sure the next couple of years will see plenty of development.
A last interesting message was a general one. It seems there is now a general recognition that it is not sensible to tie UI and application releases to back-end releases. UIs and business applications are likely to change rapidly, and perhaps unpredictably, over the next few years. The back-end platform is likely to be more stable (until Jeroen’s next generation information server starts to become reality, perhaps!).
Tags: centerstage, documentum, ECM, ECM vision, EMC World, xCP
Like Lee I wasn’t able to get to EMC World. Interestingly however I did experience much of it through twitter. Of course I didn’t get the first class, you-had-to-be-there type of experience but it was a significant experience nonetheless. Many people were tweeting during sessions and bloggers were putting up summaries of sessions almost immediately afterwards. What this meant was that not only did the facts come through but also some of the emotional reaction to announcements as well.
ECM vision required
I’ve watched (most of) the Mark Lewis keynote and I’ve read most of the blog summaries of the keynotes and other sessions. I have certainly been left with the following impressions:
- EMC appears to be retreating from core content management as a selling point
- As a corollary of the first point CenterStage is not getting the resources or attention it could
- Case Management seems to have become an over-riding priority
That’s the impression – it may not be what Mark Lewis intended but that is certainly what comes across. Given the above it is hardly surprising that EMC don’t have a particularly inspiring Enterprise Content Management vision.
So what should/could an Enterprise Content Management vision look like? First off I don’t like the idea of buying a Content Management platform, so the vision has to be more than ‘you have lots of information to manage so buy our software to solve your problems’. It certainly seems that core content management functionality has been commoditised, so that you can get content metadata, versioning, renditions, full-text and metadata querying and basic workflow from anywhere.
But content management functionality is not Enterprise Content Management. ECM needs arise when an organisation scales (in terms of people, numbers of teams or document volumes) such that additional problems or obstacles arise. Some of these problems are stuff like archiving or large-scale ingestion. It’s easy to see why these types of problems fit well for EMC as a primarily hardware company.
Other problems seem to require more finesse. They would include things like:
- discoverability – getting the right information to the right people
- rich content – going beyond mere content and metadata
- analytics – mining the information for enhanced value
- Building knowledge communities – to turn data and information into knowledge
- Incentives – providing some way of encouraging people to go to the trouble of making content available e.g. by tagging, writing blogs, contributing to Wikis and so on.
I would like to see EMC come out with something that shows how EMC might be the solution. That won’t solve all of these right now but I’d like to know, 3-5 years down the line, what their software might enable us to do.
One product that should be clearly at the centre (sic) of this strategy is CenterStage. For some reason this product seems to have lost management focus. It seems to have taken ages to get a GA release shipped and we are still waiting for some features that really should be there. However I think EMC should be proud of the type of product that is embodied in CenterStage and should be looking to push this as a major ECM product. I think it is much more than a simple Sharepoint competitor although that is how the marketing comes across.
One of the features of CenterStage that is not well sold is facets and in particular facets generated from analytical processing of content and comments. A facet is essentially a drill-down capability that allows the user to narrow down the results of a search. Obvious examples are the format of the document or the content size. This type of drill-down – based on author-supplied intrinsic metadata collected by any self-respecting content management system – seems so obvious you wonder why this type of feature hasn’t been standard in Content Management search for years.
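Mechanically a facet is just a per-value count over a document attribute, rendered as drill-down links in the search UI. A minimal sketch of the idea (my own illustration, not CenterStage code):

```python
# Minimal faceting sketch: count documents per value of one attribute.
# Each (value, count) pair becomes a drill-down link in the search UI.
from collections import Counter

documents = [
    {"title": "Q1 report",    "format": "pdf",  "author": "alice"},
    {"title": "Design notes", "format": "docx", "author": "bob"},
    {"title": "Q2 report",    "format": "pdf",  "author": "alice"},
]

def facet_counts(docs, attribute):
    """Group a result set by one attribute and count each value."""
    return Counter(d[attribute] for d in docs)

print(facet_counts(documents, "format"))  # Counter({'pdf': 2, 'docx': 1})
```

The interesting part in CenterStage is where the attribute values come from: here they are author-supplied, whereas analytically generated facets derive the values from the content itself.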
However 3 other facets are available with CenterStage.
These facets are not based on metadata recorded by content authors, they are generated from a textual analysis performed on each piece of content by Content Intelligence Services (which utilises Temis Luxid as the text analysis engine). Since discoverability – getting the right information to the right people – is one of the key issues/problems in effective information management, enhancing content in this way is important.
This kind of content enrichment is not something that is provided out of the box by Sharepoint. This really never came across in any presentations I have seen and I only really got this after downloading and playing around with CenterStage. Of course it needs some further development to really make this feature great but I can’t understand why EMC aren’t shouting this from the roof-tops.
xCP and Case Management
I really want to believe that EMC don’t think that ECM and Case Management are one and the same. My initial impression from Momentum Athens (Nov 2009) was that xCP was a way of developing EMC content-based applications using more configuration and less coding. Case Management was simply the first application area to get the xCP treatment.
I liked the implementation of ‘configure not code’ and it also appeared that a lot of effort and thought had gone into how to market this idea. It’s clear that a lot of resource has gone into Case Management, possibly at some expense to CenterStage, but I’d like to think that the xCP treatment will be passed on to CenterStage and other applications. I’d like EMC to show me this vision rather than leave me to assume all of this.
Tags: momentum, emcworld, strategy
I won’t be at EMC World/Momentum this week however I noted a number of trends at the last Momentum in November. Hopefully bloggers and tweeters at EMC World can comment on whether these observations still hold.
First I detected a refreshing emphasis on ‘execution’, things like not releasing too early, not being driven by macho release dates and so on. This is also evident in the desire to provide plenty of resources to support xCP such as best practices, reference applications and documentation /videos.
Secondly there was the sense that xCP really was CMA’s focal point, eclipsing CenterStage in terms of importance. How important does EMC consider xCP to be?
Finally, a development since November is the pair of tie-ups with Fatwire (WCM) and FirstDoc (compliance). What do these mean for EMC’s CMA strategy, and are they a sign of strength or weakness? More specifically, what do they mean for existing WCM and compliance customers?
Tags: Case Management, Momentum 2009
In this post I’ll be continuing the theme of looking at Pie’s EMC World posts as a reference point for what I see here at Momentum. I’ll try and get round to other EMC World posters as well! Andrew Chapman tells me I need to be on Twitter as well. Damn, I’ve been trying to avoid Twitter for as long as possible. Next he’ll be telling me I need an iPhone.
emc-and-mark-lewis-focus-on-return-on-information mentioned a seeming lack of vision at EMC World. Well it’s 6 months later and perhaps EMC have been working hard on a vision. In his keynote speech Mark Lewis talked about the 5 Cs (I can’t remember all of them, but they included Cloud and probably compliance) and also ROI (Return on Information). But here are some other things that I detected.
Case Management. The CMA division has been reorganised around 3 areas: Information Access (CenterStage, WCM, Captiva, MyDocumentum, MySAP), Information Governance (the compliance and discovery stuff) and Case Management. To me the first 2 are just a reorganisation of product suites that probably makes sense to product managers, and it certainly gives a certain coherence to disparate product sets. However the big thing here was a focus (big focus) on Case Management. In essence EMC sees Case Management as a halfway house between the old Knowledge Worker side (WCM, CenterStage, Web 2.0) and Transactional processing (BPM, Archiving, etc). The focus is on things like loan processing, account opening, HR on-boarding and many other things. These processes look a bit like BPM-type problems but they are not really amenable to traditional BPM technologies; they are too constricting. So products like Task Manager, BPM suite and Composer are being re-oriented to meet this need. It looks to me like EMC is taking a big bet on this area. As far as I know there are no large vendors offering comprehensive products in this area (I expect to get comments to the contrary and welcome the chance to become better educated). It was a very impressive vision, particularly a later talk by Dan Cirulli where he covered some design and development practices involved in Case Management solutions.
Yesterday evening there was a technical keynote speech and one of the more interesting things that transpired was a move to Controlled Releases of EMC products. Basically major product or architecture changes are released to a small number of clients for validation before a general access release. This may be why some of the releases mentioned last year don’t seem to have arrived as quickly as expected. If so I think it is a brave and commendable move. Software product companies always seem to have a rather ‘macho’ attitude to delivering products, possibly to satisfy stock market analysts. Personally I’d rather have faith that the product was going to work when it was installed. I hope the stock market analysts take note.
Tags: Momentum 2009
Well here I am at Momentum 2009 and as usual I will be blogging about what I see and some of my impressions of where EMC is heading. It’s now Wednesday afternoon and a large part of the conference has already taken place, so I have a lot of catching up to do.
It’s interesting when doing these posts to notice the trends from previous Momentums and also EMC World, which is usually scheduled about halfway between successive Momentums. As a reminder of what was happening at EMC World this year I took a look at a couple of Pie’s posts from that event. emc-world-2009-the-case-of-the-incredibly-shrinking-momentum commented on the shrinkage of CMA topics at EMC World (which covers all of EMC’s offerings, unlike Momentum which just covers CMA).
The number of topics seems to have stayed steady; there are 6 tracks with sessions running from Monday through to Thursday morning. However the size of the conference is somewhat smaller than previous years I have attended, just over a thousand attendees. I’m not sure if this is due to the recession or just that the venue is smaller (there are definitely fewer exhibitors and they tend to have smaller stands). For me this is no bad thing as there is definitely a cosier feel and you keep bumping into old friends. I just hope this is not a permanent trend. As you will probably see from my subsequent posts, I’m quite excited about some of the directions EMC is taking.
Tags: Composer, Continuous Integration
Ever since I got back from Momentum it’s been work, work, work. That’s what happens when you take 4 days off to look around at what’s going on. I recall that I was going to post some more thoughts on some of the other products that I saw.
I went to David Louie’s presentation on Composer. I have to say I was impressed with what I saw. This may be because I’ve been developing with Eclipse for a while now, so having something that integrates natively with this environment is a big plus. Whilst there are many interesting functional features of Composer, I was most interested in a single slide that compared Composer with Application Builder.
First, Composer doesn’t require a connection to the docbase to get your work done. You can of course import objects from a docbase, but you can also import from a DocApp archive.
Secondly, Composer can install your application (a DAR, similar in concept to a DocApp) into a docbase via a GUI installer, but you can also use something called Headless Composer, a GUI-less installer that runs from the command line. I’m not absolutely sure of the specifics at this point but it possibly uses Ant. David said that there are details in the documentation – I will be sure to try it out and post my findings at a later date.
This last point was of great interest to me as I’m currently investigating how to run Documentum development using a continuous integration approach. Being able to deploy your artifacts from the command line, and therefore from some overall automated controlling process, is essential to making continuous integration a reality. On this note I also spoke to Erin Samuels (SharePoint Product Manager) and Jenny Dornoy (Director, Customer Deployments). I hope that the SharePoint web parts SDK that is likely to integrate with MS Visual Studio will also have support for a headless installer, and that Documentum/EMC products generally will support the continuous integration approach.
Tags: centerstage, D6, momentum 2008
Of course the star of the show was CenterStage. If you don’t know what CenterStage is (where have you been?), in a single sentence: it’s the next generation of Documentum client, providing Web 2.0 features, a significantly different customisation model (compared with WDK) and a no-cost/low-cost licensing model.
I won’t go into too much detail about the features except to say they include basic content services, personal spaces, team spaces, blogs, wikis, RSS, tagging and faceted search. The timeline was set as 1.0 to be released in April 2009 (the beta version is available on the download site), 1.5 to be released after that, and then a D7 version released by the end of 2009.
The UI is composed from numerous separate components which, in concept at least, are like Sharepoint WebParts. Since each component needs to be rendered on the page separately, I wondered whether a page with, say, 20 components would need 20 separate network calls to display. In a high-latency network environment this could be a performance nightmare. Apparently the DWR library allows for batching of requests, meaning that a page with numerous components could be displayed using a smaller number of network requests.
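The batching pattern itself is generic: queue up the per-component requests and flush them in a single network call. A small sketch of the idea (DWR’s real API is JavaScript and differs in detail; the class below is purely illustrative):

```python
# Illustrative request batcher: instead of one network call per UI
# component, queue the render requests and send them as one batch.
class RequestBatcher:
    def __init__(self, transport):
        self.transport = transport  # callable taking a list of requests
        self.queue = []

    def render(self, component_id):
        """Queue a component render request instead of sending it now."""
        self.queue.append(component_id)

    def flush(self):
        """Send all queued requests in a single network call."""
        if self.queue:
            self.transport(self.queue)
            self.queue = []

calls = []  # stand-in for the network: records each call made
batcher = RequestBatcher(transport=calls.append)
for component_id in range(20):
    batcher.render(component_id)
batcher.flush()
print(len(calls))  # 1 network call instead of 20
```

On a high-latency link this turns 20 round trips into one, which is exactly the saving that matters for a component-heavy page.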
Tags: Advanced Site Caching Services, Momentum, XML Store
On Tuesday and Wednesday I attended a load more sessions covering XML Store, CenterStage, Composer, Sharepoint and Web Content Management. In the next few posts I’ll share some of my thoughts and impressions, starting with XML Store.
For those that don’t know, EMC purchased a company called X-Hive a while back. X-Hive had an XML database product and that has now been integrated into the full Content Server stack. The easiest way to picture this is to take the old picture of the repository as consisting of a relational database and a file system and add in a third element, the XML Store.
From 6.5 (possibly SP1, I don’t remember) all XML is stored in the XML Store. The XML Store is built around the many XML standards in existence, such as XQuery, XSL and the XML full-text query standard.
The XML is not stored in the usual textual XML format but in a DOM format, presumably to allow various types of index to be implemented and query access patterns to be optimised. The performance claims for the database are impressive, although they need to be taken with a pinch of salt. As with all benchmarking, vendors will target specific goals in the benchmark; your real-life workloads could be very different. If you are expecting high throughput from an application using the XML Store, I suggest you put some work into designing and executing your own benchmarks.
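Even a crude harness that times your own representative queries beats relying on vendor benchmarks. A minimal sketch (the lambda below is a stand-in for whatever query your application would actually run against the store):

```python
# Crude benchmark harness: time a representative workload and report
# throughput and average latency. The workload here is a stand-in;
# substitute the real query call your application makes.
import time

def benchmark(run_query, iterations=100):
    """Run the workload `iterations` times and summarise the results."""
    start = time.perf_counter()
    for _ in range(iterations):
        run_query()
    elapsed = time.perf_counter() - start
    return {"queries_per_sec": iterations / elapsed,
            "avg_ms": 1000 * elapsed / iterations}

# Stand-in workload; replace with your real query.
stats = benchmark(lambda: sum(range(1000)))
print(sorted(stats))  # ['avg_ms', 'queries_per_sec']
```

The design point is to measure your own access patterns and data shapes, since those are exactly the variables a vendor benchmark will not reproduce.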
In addition to indexes there is also a caching facility. This was only talked about at a high level; however, just as relational database performance experts made a career in the 1990s out of sizing the buffer cache properly, so we may see something similar with XML database installations. We may see them suffering poor performance as a result of under-sized hardware and mis-configuration. As always, don’t expect this to just work without a little effort and research.
One other point I should make is that the XML Store is not limited to the integrated Content Server implementation. You can also install instances of XML Store separately. For example the forthcoming Advanced Site Caching Services product provides for a WebXML target. This is essentially an XML Store database installed alongside the traditional file system target that you currently get with SCS. You can then use the published XML to drive all sorts of clever dynamic and interactive web sites.