Mixing and matching
- Article 3 of 6
- Database Marketing, October 2003
Robert Buckley considers the challenges involved in customer data integration and how careful planning is needed to make both initial and ongoing data management a success
But Jon Cano-Lopez, MD of bureau Attwood, says that if companies are prepared to invest sufficiently, they are more than capable of performing the integration themselves. “People do get it right if they invest enough and they employ the right sort of skillset. I would just recommend they put in the core of the system at an early stage, and deal with name and address specialists to gain as much information as possible, then make a decision about when or where to bring in data validation to deal with ambiguities.”
Outsourcing the initial integration also holds problems of its own. Andrew Greenyer, director of customer relationship solutions for Group 1 Software, believes that, as a result, the initial data analysis to determine the extent of the integration problem needs to be done entirely in-house. “An outside consultant needs briefing on the format, range, losses and time-scale of the project,” he points out. A bureau simply does not know an organisation’s data, or the uses to which it is put, as well as the organisation itself does, although that will change over time.
Chris Morgan, MD of information management consultants Data Liberation, also argues that attempts to keep initial set-up costs down when outsourcing result in processes being set in stone, without the flexibility to adapt as new feeds come online. “Organisations don’t get to ask new questions and update surveys as often as they’d like.”
Steve Clarke, MD of bureau and software developer CDMS, agrees. “Once the project is up and running, if there’s a change or an extra feed, a lot of development work goes into that. Many consultants will argue that these changes weren’t in the original brief or the requirement never came up; you’ll have to change this or that, which will take so many man-hours or days. You need to pick a technology that will keep down development costs and allow change.”
A hybrid approach, in which a flexible, single-view database is developed by an external bureau before being handed to an in-house team for daily maintenance, is usually the best fit for most companies. With rules and processes in place to ensure feeds arrive in a standardised, known format, ongoing integration ceases to be a significant problem.
If the system is flexible, new feeds are also manageable once they have been cleansed and run past suppression files. By handing this initial work to a bureau, the regular investment in external datasets and the associated processes can be reduced, while control of the database remains in-house.
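To illustrate the cleanse-and-suppress step described above, here is a minimal sketch in Python. It is not any vendor's product; the field names and matching key (normalised name plus postcode) are assumptions chosen for illustration — real bureaux match on far richer address data.

```python
# Sketch of the cleanse-and-suppress step: incoming feed records are
# normalised to a standardised, known format, then checked against a
# suppression file before being loaded into the marketing database.
# Field names ("name", "postcode") are illustrative assumptions.

def normalise(record):
    """Standardise fields so records from different feeds compare like-for-like."""
    return {
        "name": record.get("name", "").strip().upper(),
        "postcode": record.get("postcode", "").replace(" ", "").upper(),
    }

def load_feed(feed, suppression_file):
    """Return only the feed records that pass the suppression check."""
    suppressed = {
        (s["name"], s["postcode"]) for s in map(normalise, suppression_file)
    }
    return [
        r for r in map(normalise, feed)
        if (r["name"], r["postcode"]) not in suppressed
    ]

# Example: the second record matches a suppression entry and is dropped.
feed = [
    {"name": "Jane Smith", "postcode": "AB1 2CD"},
    {"name": "John Doe", "postcode": "XY9 8ZW"},
]
suppression_file = [{"name": "JOHN DOE", "postcode": "XY98ZW"}]
print(load_feed(feed, suppression_file))
# → [{'name': 'JANE SMITH', 'postcode': 'AB12CD'}]
```

The point of the normalisation step is exactly the one the article makes: once every feed is forced into the same known format, suppressing and merging new feeds becomes routine rather than a fresh development project each time.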
But David Haigh, CTO of integration software developer Avanade, warns that outsourcing the regular maintenance of the database to an external company leads to a slower update cycle. “Marketing campaign-planning slows to a crawl, typically taking weeks where it should be taking days and in some cases hours. In addition, meta-data and contextual information about customers may remain with the bureau rather than with the organisation.”
Clarke, who has built databases for Lloyds TSB and Nationwide, agrees the update cycle can suffer as a result of outsourcing. “It depends on what kind of database you want. Lloyds TSB have more than 40 feeds per month and occasionally they get another one. The problem of an external monthly update is that the data can get quite old. By the time they’ve pulled it off their systems, put it on CDs, and the bureau has done its load, you potentially have a database that’s two months old, which isn’t fresh enough for the marketers.” For the annual Christmas mail-out, however, that is usually sufficient, and as Peter Kempsey, MD of Intellidata, points out, the costs involved in creating real-time or near real-time marketing databases are “horrendously high”. Real-time integration is rarely feasible in any case, since around 10 per cent of the data almost always requires human intervention to integrate. The cost-effectiveness of outsourcing is therefore heavily influenced by the number and regularity of additional data feeds. Unless a company is as heavy a mailer as Reader’s Digest, it is unlikely to benefit significantly from running its own in-house suppression and cleansing system on an ongoing basis.
Nevertheless, managing a database internally requires discipline, resources and the right attitude. According to Cano-Lopez, the success of internal database management hinges on the politics and skills of the IT department. “The way the IT department works will determine the success or failure of a project. They need to give it the respect it deserves.” A marketing project will often be considerably further down the IT department’s priority list than the operational requirements of the business.
With the right sort of project leader, argues Intellidata’s Peter Kempsey, “the IT guys will sign on board with enthusiasm and devote a lot of energy. But if you don’t have that leadership, no end of training and teaching will make it work at the end of the day.”
