Mixing and matching
- Article 3 of 6
- Database Marketing, October 2003
Robert Buckley considers the challenges involved in customer data integration and how careful planning is needed to make both initial and ongoing data management a success
Data integration is one of those tasks that sound easier than they are. Just about any company without a CRM system from day one will have most of its customer data in separate silos spread across departments, subsidiaries, external bureaux and bought-in data feeds. And every time the company buys in a new feed or acquires a new company, thousands of new records become available outside the existing storage areas – is it any surprise that data inside companies doubles every 18-24 months, according to the Office for National Statistics? To achieve a single view of customers and prospects, these different sources have to be reconciled and brought together in a single database if the company is to avoid duplicate mailings, imprecise campaign targeting and the continual expense of trying to link these data sources.
But unless there is a simple way to match records in different data sources to each other and decide which data takes priority when there is conflict, the job of integrating suddenly becomes a lot more risky and a lot harder. In fact, according to analysis firm Forrester Research, while 98% of companies recognise the value of an accurate single customer view, only 2% claim to have built one successfully.
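Deciding which data takes priority in a conflict is usually expressed as field-level "survivorship" rules. The Python sketch below is a minimal, hypothetical illustration only – the source names, the priority ranking and the record layout are invented for the example and not taken from any product mentioned in this article:

```python
# Hypothetical field-level survivorship rules: when duplicate records
# conflict, each field survives from the highest-priority source, with
# the most recent update date breaking ties. Source names are invented.

SOURCE_PRIORITY = {"crm": 3, "web_signups": 2, "bought_in_list": 1}

def merge_records(records):
    """Merge duplicate customer records field by field."""
    merged = {}
    fields = {f for r in records for f in r["data"]}
    for field in fields:
        # only consider sources that actually hold a value for this field
        candidates = [r for r in records if r["data"].get(field)]
        # ISO date strings compare correctly as plain strings
        best = max(candidates,
                   key=lambda r: (SOURCE_PRIORITY.get(r["source"], 0),
                                  r["updated"]))
        merged[field] = best["data"][field]
    return merged

records = [
    {"source": "bought_in_list", "updated": "2003-06-01",
     "data": {"name": "S Brown", "email": "s.brown@example.com"}},
    {"source": "crm", "updated": "2003-09-01",
     "data": {"name": "Samuel Brown", "email": ""}},
]
merged = merge_records(records)
# the name survives from the CRM (higher priority); the email survives
# from the bought-in list because the CRM holds no value for it
```

Real systems add many more rules – per-field source rankings, validation before survivorship, audit trails – but the priority-plus-recency pattern is the core idea.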
So, is it better to step away from the problem and let the bureaux handle these difficulties, or can the IT department use its existing skills to bring the information together for you?
According to Lynne Clafton, product leader at data management company Acxiom, and other data integration specialists, once any company starts considering developing a single view of its customers and prospects, it needs to look at its data integration approach immediately. “It’s the absolute core. It’s the dull, boring, absolutely essential bit that starts right at the beginning,” says Clafton. “Everyone is sick to death of talking about CRM, mainly because they’re not seeing the benefits they thought they would. But largely, where it all tends to go wrong is at the very beginning of the process, which is the integration of the data. You absolutely have to get it right for everything else to be optimised and meet company objectives.”
But with data integration upfront in the process, a chicken-and-egg situation arises. Unless the company plans out in detail how it is to integrate the data and what processes will be needed afterwards to ensure it remains successful and will not have to be repeated, it cannot be sure who will integrate it or what tools and expertise might be needed. But until it knows who is to integrate the data and the tools it will buy, it cannot determine how to integrate the data and what processes it will need. So where does it start?
Usually it overcomes this problem by calling in consultants or, if it already has the experience in-house, by drawing on that. And if it takes a step back from the project, it can also see what it really needs. “Over the last few years, the focus has been completely on ROI,” says Ian Hitt, marketing director at Identex. “For any data integration project to get the go-ahead, it’s needed a watertight case taken to the board to get signed off. So there have been internal review processes looking at what to achieve, and they’ve almost always gone through a discovery process internally.”
These review processes have usually had one of three outcomes: inexperienced companies trying to integrate their datasets without fully understanding the issues involved, resulting in a botched integration and a return to the beginning of the process considerably lighter in pocket; a decision to outsource at least part of the project to bureaux because of a lack of internal expertise and resources; or a decision to buy in expertise and resources and complete the project internally.
According to Sophie Sail, customer management consultant for Experian's Truvue data integration service, the first of these approaches is by far the norm. “Most companies try it themselves. It’s very much standard in the UK marketplace. Software seems like a cheaper option, but companies often underestimate the resources needed to use it and maintain it. So the clients we tend to talk to have gone down that route, found it wasn’t meeting the requirements of the business because it wasn’t accurate enough and it was draining too many internal resources.”
The key to determining who will perform the data integration is the biggest potential pitfall: record matching. Using name and address rather than unique customer IDs to link databases is going to be more of an art than a science. But when integrating data from different departments or from external companies, there will probably be no other choice. Even after a concerted data-cleansing effort – standardising address formats and details to PAF, updating details using the national change of address file, and eliminating customers in suppression files, all of which are necessary on an ongoing basis as well as in the initial integration – there are still many problem areas that force companies without the resources and the right experience to outsource.
Experian’s Sail envisages situations where two people who live at the same address and have similar names – such as Mr S Brown and Sam Brown – could be merged during a project, whereas with access to the superior “reference” database an external bureau might hold, the company would never have combined the records of Mr Samuel Brown and Mrs Samantha Brown. Experian’s own Truvue service, since it contains data on the previous addresses and names of its subjects, can help integration projects match records that may have neither names nor addresses in common – the kinds of records that typically cause the single view to break down, with customer data ending up stored in separate records.
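A toy version of the trap Sail describes can be sketched in a few lines. This is a hypothetical illustration, not Experian's actual matching logic: it shows why surname-plus-initial matching happily links Mr S Brown with Sam Brown, but needs title (or richer reference) data to keep Samuel and Samantha apart:

```python
# Hypothetical name-matching rule for two records at the same address.
# Real bureau matching draws on far richer reference data; this shows
# only the surname/forename-prefix heuristic and a gender check on titles.

TITLE_GENDER = {"mr": "M", "mrs": "F", "ms": "F", "miss": "F"}

def could_match(a, b):
    """Crude same-person test for records sharing an address."""
    # refuse to merge when titles imply different genders
    ga = TITLE_GENDER.get(a.get("title", "").lower())
    gb = TITLE_GENDER.get(b.get("title", "").lower())
    if ga and gb and ga != gb:
        return False
    if a["surname"].lower() != b["surname"].lower():
        return False
    fa, fb = a["forename"].lower(), b["forename"].lower()
    # an initial or short form matches a longer form as a prefix:
    # "s" matches "sam", "sam" matches "samuel"
    return fa.startswith(fb) or fb.startswith(fa)

# Mr S Brown and Sam Brown at the same address: plausible merge
could_match({"title": "mr", "forename": "s", "surname": "brown"},
            {"title": "", "forename": "sam", "surname": "brown"})          # True
# Mr Sam Brown and Mrs Samantha Brown: the titles block the merge
could_match({"title": "mr", "forename": "sam", "surname": "brown"},
            {"title": "mrs", "forename": "samantha", "surname": "brown"})  # False
```

Without the title data – or the previous-name history a reference database supplies – the prefix rule alone would happily merge Sam with Samantha, which is exactly how a single view breaks down.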
But Jon Cano-Lopez, MD of bureau Attwood, says that if companies are prepared to invest sufficiently, they are more than capable of performing the integration themselves. “People do get it right if they invest enough and they employ the right sort of skillset. I would just recommend they put in the core of the system at an early stage, and deal with names and addresses specialists to gain as much information as possible, then make a decision about when or where to bring in data validation to deal with ambiguities.”
Outsourcing the initial integration also has its own problems. Andrew Greenyer, director of customer relationship solutions for Group 1 Software, believes the initial data analysis to determine the extent of the integration problem therefore needs to be done completely in-house. “An outside consultant needs briefing on the format, range, losses and time-scale of the project,” he points out. A bureau simply does not know an organisation’s data, and the uses to which it is put, as well as the organisation does itself, although that will change over time.
Chris Morgan, MD of information management consultants Data Liberation, also argues that attempts to keep the initial set-up costs down when outsourcing result in processes being set in stone, without the flexibility to adapt as new feeds come online. Then, “organisations don’t get to ask new questions and update surveys as often as they’d like.”
Steve Clarke, MD of bureau and software developer CDMS, agrees. “Once the project is up and running, if there’s a change or an extra feed, a lot of development work goes into that. Many consultants will argue that these changes weren’t in the original brief or the requirement never came up, and that you’ll have to change this or that, which will take so many man-hours or days. You need to pick a technology that will keep down development costs and allow change.”
A hybrid approach, in which a flexible, single-view database is developed by an external bureau before being handed to an in-house team for daily maintenance, is usually the best fit for most companies. With rules and processes set up to ensure feeds arrive in a standardised, known format, ongoing integration ceases to be a significant problem.
If the system is flexible, new feeds are also manageable once they have been cleansed and run past suppression files. If this is passed to a bureau initially, the regular investment in external datasets and the processes required can be reduced, while control of the database remains in-house.
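The “cleansed and run past suppression files” step each new feed goes through can be pictured as a fixed hygiene pipeline. The sketch below is an illustrative Python outline only – the surname-plus-postcode matching key and all the data shapes are simplifying assumptions, and real PAF and change-of-address processing is far more involved:

```python
# Illustrative feed-hygiene pipeline: standardise addresses, apply
# change-of-address updates, then suppress. The (surname, postcode) key
# and record shapes are assumptions, not a real PAF/NCOA specification.

def process_feed(feed, ncoa, suppression_keys):
    out = []
    for rec in feed:
        rec = dict(rec)  # don't mutate the caller's records
        # 1. standardise the address (stand-in for full PAF matching)
        rec["postcode"] = rec["postcode"].replace(" ", "").upper()
        key = (rec["surname"].lower(), rec["postcode"])
        # 2. apply national change-of-address updates
        if key in ncoa:
            rec["postcode"] = ncoa[key]
            key = (rec["surname"].lower(), rec["postcode"])
        # 3. drop anyone on the suppression files (gone-aways, opt-outs)
        if key in suppression_keys:
            continue
        out.append(rec)
    return out

feed = [{"surname": "Brown", "postcode": "sw1a 1aa"},
        {"surname": "Jones", "postcode": "EC1A 1BB"}]
ncoa = {("brown", "SW1A1AA"): "N11AA"}   # Brown has moved
suppression = {("jones", "EC1A1BB")}     # Jones must be suppressed
cleaned = process_feed(feed, ncoa, suppression)
# one record survives, with Brown's postcode updated to N11AA
```

The point of fixing the pipeline's order and input format up front is exactly the flexibility argument above: a new feed that arrives in the known format simply joins the queue, with no fresh development work.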
But David Haigh, CTO of integration software developer Avanade, warns that outsourcing the regular maintenance of the database to an external company leads to a slower update cycle. “Marketing campaign-planning slows to a crawl, typically taking weeks where it should be taking days and in some cases hours. In addition, meta-data and contextual information about customers may remain with the bureaux rather than with the organisation.”
Clarke, who has built databases for Lloyds TSB and Nationwide, agrees the update cycle can suffer as a result of outsourcing. “It depends on what kind of database you want. Lloyds TSB have more than 40 feeds per month and occasionally they get another one. The problem of an external monthly update is that the data can get quite old. By the time they’ve pulled it off their systems, put it on CDs, and the bureaux do their load, you potentially have a database that’s two months old, which isn’t fresh enough for the marketers.” For the annual Christmas mail-out, however, that is usually sufficient, and as Peter Kempsey, MD of Intellidata, points out, the costs involved in creating real-time or near real-time marketing databases are “horrendously high” – and real-time integration is usually impractical anyway, since around 10% of the data almost always requires human intervention to integrate. So the cost-effectiveness of outsourcing is heavily influenced by the regularity and number of additional data feeds you have. Unless a company is as heavy a mailer as Reader’s Digest, it is unlikely to benefit considerably from running its own in-house suppression and cleansing system on an ongoing basis.
Nevertheless, managing a database internally requires discipline, resources and the right attitude. According to Cano-Lopez, the success of internal database management hinges on the politics and skills of the IT department. “The way the IT department works will determine the success or failure of a project. They need to give it the respect it deserves.” A marketing project will often be considerably further down the IT department’s priority list than the operational requirements of the business.
With the right sort of project leader, argues Intellidata’s Peter Kempsey, “the IT guys will sign on board with enthusiasm and devote a lot of energy. But if you don’t have that leadership, no end of training and teaching will make it work at the end of the day.”
Data integration projects ultimately belong wherever the most expertise and the best results lie – at the lowest cost. An organisation that understands both its own data and the resources necessary to unite all its data stores will often see as many benefits as, if not more than, those that outsource the entire process. But organisations that do not appreciate the problems involved in integration, or the resources they will need to successfully achieve a single customer view, are likely to spend more and achieve less than if they let the bureaux handle the job.
