Data and Change Management

Oct. 1, 2010
Putting systems in place for validating and managing changes to product data is the key to capturing ongoing cost savings and other benefits of data synchronization.

One of the secrets to success in public speaking is to know your audience and tailor your content to their interests. The same principle holds true for producers of marketing data for catalogs and of electronic transactional data. Both types of data are usually produced by the product's manufacturer, but how they are handled has implications for the entire value chain. Both must be produced and maintained so that buyers have easy access to detailed product descriptions and specifications, and supply-chain partners have clean, up-to-date, detailed, high-quality transactional data.

Website content is carefully designed to make it easy for customers to find the products they need for estimates and projects, with linked specifications providing the details needed to confirm compatibility. Manufacturers and distributors know customers need to see and understand the products they are purchasing, so when technology transformed printed catalog books into websites full of marketing data, images and specifications, there was broad agreement that the effort was necessary to support manufacturers' business growth and service levels.

Transactional product data must also be built with the customers' business needs in mind. A change management process must be built around each category of data change (additions, changes and deletions) so that the process is sustainable and delivers consistent data quality. The benefits are not one-sided: when manufacturers produce consistent, high-quality data, they grow sales and improve their own return on investment. Data producers must focus on the needs of the data users and create systemic change processes so that an update is published whenever an actual product change occurs.
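As a simple illustration of what a delta-driven process looks like (a sketch only, not any particular manufacturer's or IDEA's system), the following Python example classifies additions, changes and deletions by comparing the current item master against the last published file. The record layout and field names are assumptions made for the example.

```python
# Illustrative sketch only: classify product-data deltas (adds, changes, deletes)
# by comparing the current item master to the previously published file.
# Records are assumed to be dicts keyed by UPC/GTIN; field names are hypothetical.

def classify_deltas(previous: dict[str, dict], current: dict[str, dict]):
    """Return the additions, changes and deletions between two publications."""
    additions = [item for upc, item in current.items() if upc not in previous]
    deletions = [item for upc, item in previous.items() if upc not in current]
    changes = [
        current[upc]
        for upc in current.keys() & previous.keys()
        if current[upc] != previous[upc]
    ]
    return additions, changes, deletions


if __name__ == "__main__":
    last_published = {"012345678905": {"catalog_number": "ABC-100", "price": 10.00}}
    item_master = {
        "012345678905": {"catalog_number": "ABC-100", "price": 10.50},  # price change
        "012345678912": {"catalog_number": "ABC-200", "price": 22.00},  # new item
    }
    adds, chgs, dels = classify_deltas(last_published, item_master)
    print(len(adds), "additions,", len(chgs), "changes,", len(dels), "deletions")
```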

Updating the Distributor's Data

One of the challenges of handling data in the electrical industry is that so much of the transactional data is manufacturer-specific and requires custom fields for certain product attributes. To improve data quality for those custom fields, an industry data standard called the Product Descriptor Database was created in the late 1990s. It evolved into the Industry Data Warehouse (IDW), managed by IDEA, Arlington, Va., and continues to be developed by the IDEA Standards Committee. There are also proprietary data-standard systems employed by third-party service providers such as Trade Service Corp., San Diego, and some manufacturers provide data directly to their distributors.

With so many data targets, each with varied attributes and requirements, is it possible to address all of these in the manufacturers' data processes for change management? Actually, yes it is.

There are some basic attributes common and necessary to all of them, and among those attributes is the concept of a life cycle in conjunction with flagging codes, which are used by popular distributor enterprise resource planning (ERP) and contractor estimating systems. Those life-cycle flags are key.

Here are the common attributes that most electrical industry users value and require in their transactional data:

  • Unique Identity Key (UPC/EAN/GTIN)

  • Unique Catalog Number

  • Detailed Description

  • Distributor Published Prices (only some manufacturers provide this, and it is available only to distributors)

  • Suggested Resale Prices

  • Price Unit of Measure

  • Packaging (at least 1 level)

  • Life-Cycle Flagging code

  • Product Categorization code

  • Price Categorization code (only some manufacturers provide this, and it is available only to distributors)

Data producers should visualize how their customers will use these common attributes and what the impact of data changes will be. This list is a starting point for setting up the change management process around the most commonly used and critical fields; data producers supply many more attributes, and the change process is needed for all of them. Ultimately it is about customers using your product data in their business operations to maintain accurate inventory, sell more product and provide better service to their own customers.
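To make the list above concrete, here is a minimal Python sketch of a transactional product record carrying these common attributes. The field names, types and defaults are assumptions for illustration, not an industry-standard schema.

```python
# Minimal sketch of a transactional product record carrying the common
# attributes listed above. Field names and types are illustrative only,
# not an industry-standard schema.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ProductRecord:
    gtin: str                     # Unique identity key (UPC/EAN/GTIN)
    catalog_number: str           # Unique catalog number
    description: str              # Detailed description
    price_unit_of_measure: str    # e.g. each, per hundred, per thousand
    packaging_qty: int            # Packaging, at least one level
    life_cycle_flag: str          # Life-cycle flagging code
    product_category: str         # Product categorization code
    suggested_resale_price: Optional[float] = None
    distributor_price: Optional[float] = None  # Only some provide; distributors only
    price_category: Optional[str] = None       # Only some provide; distributors only
```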

Staying Synchronized

A problem arises because many manufacturers have their own unique systems for identifying when products are active, when they are planned for obsolescence, or when they have been deleted (in data circles, the “retired” state). Standardized life-cycle states are spelled out in the IDEA Standards Committee's Product Change Management white paper, available on the IDEA web site at www.idea-esolutions.com. Although many life-cycle flagging codes are used in the electrical industry today, there are some common life-cycle states, and if these were adopted globally by all manufacturers, they would greatly help manufacturers stay continually synchronized with their customers.
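The actual standardized states and codes are defined in the IDEA white paper referenced above. Purely to illustrate the concept, a manufacturer's proprietary flag could be mapped onto a small set of common states along the following lines; the states and flag values shown are examples, not the IDEA standard.

```python
# Illustration only: mapping proprietary life-cycle flags onto a small set of
# common states. The states and codes below are examples, not the IDEA standard.
from enum import Enum


class LifeCycleState(Enum):
    ACTIVE = "active"                    # in production, orderable
    TO_BE_OBSOLETED = "to_be_obsoleted"  # planned for obsolescence
    RETIRED = "retired"                  # obsolete or deleted


# Hypothetical manufacturer-specific flag values mapped to common states.
FLAG_MAP = {
    "A": LifeCycleState.ACTIVE,
    "P": LifeCycleState.TO_BE_OBSOLETED,
    "D": LifeCycleState.RETIRED,
    "X": LifeCycleState.RETIRED,
}


def common_state(proprietary_flag: str) -> LifeCycleState:
    """Translate a proprietary flag into a common life-cycle state."""
    try:
        return FLAG_MAP[proprietary_flag.upper()]
    except KeyError:
        raise ValueError(f"Unknown life-cycle flag: {proprietary_flag!r}")
```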

When products are being phased out (retired), very often the corresponding product update data does not reflect the change. This common oversight is a primary cause of unsynchronized data between manufacturers and distributors. Here's what happens next:

  • The manufacturer may stop sending the retired item in their next update.

  • The distributor receives the new update but does not know why the item is not included in the file.

  • The distributor must contact the vendor's customer service to find out.

  • If the distributor doesn't receive a response, the distributor's database will continue to carry the item: the beginning of unsynchronized data.

  • The distributor may not learn about the retirement of an important product in time to buy additional stock for an important customer.

  • The distributor may not learn about the replacement for the retired item and keeps trying to order the previous product, leading to delays and erosion in customer-service levels.
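A distributor-side guard against this scenario can be sketched simply. Assuming plain records keyed by UPC/GTIN (the field names are illustrative), the check below flags any stocked item that silently disappears from an update without ever carrying a retirement flag, so it can be queued for vendor follow-up.

```python
# Illustrative distributor-side check: flag items that exist locally but are
# missing from the manufacturer's latest update and were never marked retired.
# Records are dicts keyed by UPC/GTIN; field names are assumptions.

RETIRED_FLAGS = {"obsolete", "deleted"}   # example retirement states


def find_silent_dropouts(local_items: dict[str, dict], update_items: dict[str, dict]):
    """Return UPC/GTINs that vanished from the update with no retirement flag."""
    dropouts = []
    for upc, item in local_items.items():
        if upc not in update_items and item.get("life_cycle_flag") not in RETIRED_FLAGS:
            dropouts.append(upc)
    return dropouts
```

Items returned by a check like this are candidates for vendor follow-up before the local database drifts further out of sync.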

Data producers carry a heavy responsibility when a mistake is not caught during error checking. What seems like a very small issue can balloon into a major time-waster, with distributors trying to figure out the right UPC for an EDI order when they could have been selling or helping a customer. Each component has an identity key, usually the UPC/GTIN number. If that key suddenly changes in an update, the distributor must spend extra time researching which UPC/GTIN and which catalog number are the correct ones to use.

At Western Extralite in Kansas City, Mo., after data is loaded into the company's Eclipse ERP system, the first order of business is to reconcile any discrepancies. “The new update file is synchronized against the historical data to identify the matches and non-matches,” says Sharon Todd, purchasing manager. “On the non-matches for key fields like UPC/GTIN, Catalog Number, Price Unit of Measure and Pricing to Price Unit of Measure integrity, if the solution is not apparent, these are sent back to the manufacturer for follow-up.” Sometimes this creates delays before the data can be used.
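A simplified sketch of that kind of reconciliation, not Western Extralite's actual Eclipse routine, might compare the key fields on matched items and queue the non-matches for manufacturer follow-up; the field names are assumptions.

```python
# Simplified reconciliation sketch (not the Eclipse ERP routine): compare key
# fields on matched items and queue non-matches for manufacturer follow-up.
KEY_FIELDS = ("gtin", "catalog_number", "price_unit_of_measure")


def reconcile(historical: dict[str, dict], update: dict[str, dict]):
    """Split an update into clean matches and discrepancies needing follow-up."""
    matches, follow_up = [], []
    for catalog_number, new_rec in update.items():
        old_rec = historical.get(catalog_number)
        if old_rec is None:
            follow_up.append((catalog_number, "not found in historical data"))
            continue
        diffs = [f for f in KEY_FIELDS if old_rec.get(f) != new_rec.get(f)]
        if diffs:
            follow_up.append((catalog_number, f"mismatch on {', '.join(diffs)}"))
        else:
            matches.append(catalog_number)
    return matches, follow_up
```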

Distributors are not just users of data; they are also creators and authors of it. “Employees entering a new vendor or new customer will be using a form to ensure data quality and must adhere to company business rules,” Todd says. Western Extralite also uses technology tools such as reports and dashboards to verify that data created or changed yesterday is accurate. According to Randy Collins, the company's IT manager, ensuring data quality required “a system of checks and balances, so daily reports are run for new customers, new vendors, new contacts, new items and more. Those responsible provide oversight of these, ensuring that mistakes are discovered and corrected quickly, which reduces errors reaching our customer-fulfillment process.”

At Elliott Electric Supply in Nacogdoches, Texas, Phil Hale, director of IT, relies on a system of rules to check incoming data for anomalies. “Inbound data is checked against historical business rules and other information to ensure high data quality before it is applied to the operational database,” he says. “Company business rules for data validation are integrated into data-entry forms to eliminate mistakes. If these kinds of mistakes are not caught at the entry points, there could be a domino effect and soon several departments are wasting time fixing the same error.”
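In the same spirit, here is a generic sketch (not Elliott Electric Supply's system) of entry-point validation that runs each inbound record through simple business rules before it touches the operational database. The rules and field names are invented for illustration.

```python
# Generic sketch of entry-point validation: run inbound records through simple
# business rules before they reach the operational database. The rules and
# field names are invented for illustration.

def validate_record(record: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = []
    gtin = record.get("gtin", "")
    if not (gtin.isdigit() and len(gtin) in (12, 13, 14)):  # UPC/EAN/GTIN lengths
        errors.append("GTIN must be 12, 13 or 14 digits")
    if not record.get("catalog_number"):
        errors.append("catalog number is required")
    price = record.get("suggested_resale_price")
    if price is not None and price <= 0:
        errors.append("price must be positive")
    return errors


def apply_update(records: list[dict], write_to_db) -> list[tuple[dict, list[str]]]:
    """Write only clean records; return the rejects with their rule violations."""
    rejects = []
    for rec in records:
        problems = validate_record(rec)
        if problems:
            rejects.append((rec, problems))  # route to a data steward, not the DB
        else:
            write_to_db(rec)
    return rejects
```

Catching a bad record at the form or import step, as in this sketch, is what prevents the domino effect Hale describes.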

It's important for distributors not only to find and correct errors but also to use information about the source of the errors to prevent the same kinds of errors from occurring in the future.

Implementing Data Change Management

Many manufacturers have implemented a centralized database process called master data management (MDM) to ensure consistency and control of data.

Cooper B-Line, a division of Cooper Industries, uses MDM to eliminate the chance of duplicated data. Mike Swiney, pricing manager, describes the company's change-management process. “To enforce consistency and accuracy in data authoring for adds, changes or deletes, only a few individuals are authorized to make database changes,” he says. “Data is internally scrubbed and synchronized before it's pushed to any of the data targets.”

Since the switch to MDM, those creating data changes follow the company data governance policy, which incorporates company rules and industry standards, and they've improved both their data quality and consistency, Swiney says.

“We know that it wastes time for distributors when trying to order components that were already obsolete and won't be manufactured anymore,” he says. To prevent this from happening, B-Line runs a synchronization check between the IDW data and its latest data-extraction file. “We next create an update to obsolete any items that fall into this group, and the same information is either sent to the IDW or provided to the other industry data services,” he adds.

At Shat-R-Shield Inc., Roger Leonard, IT and EDI manager, says the company began making changes to its data-authoring and change-management process after last year's IDEA E-Biz Forum, where several of its customers said, “We're really depending on your data.”

“We weren't managing our data standards as well as we could and that's where we started making changes,” Leonard says. Both Cooper B-Line and Shat-R-Shield have reviewed their data-authoring and change-management processes to identify and implement systemic improvements that raise data quality and greatly reduce the possibility of errors or inconsistencies.

Data Quality Management

Ensuring high data quality becomes a priority once data producers realize how small data-creation errors or missing change-management processes cause their product data to fall out of sync with distributors' databases. A few basic steps, once in place, can provide timely, accurate, consistent and valid data throughout the supply chain:

  • Use a centralized database system.

  • Have a systematic process to capture data changes (additions, changes and deletions) and publish the updates as soon as the product change occurs.

  • Learn about and use the life-cycle flagging codes in all of your outbound data files. This is critical for those items with UPC/GTIN changes and any type of retirement state (obsolete or deleted).

  • Check your data before and after an update.

  • Create written documentation about the change management process.

  • Seek feedback from all data targets on what could be better.

Summary

High data quality is not hard to achieve, but it does require a broader view of the actual users and their needs, along with systematic management of changed data. Making these improvements and adopting data governance will improve data and lower costs throughout the entire supply chain, freeing everyone to spend more time selling, satisfying and building relationships with customers.