
Sunday, March 8, 2009


Ten Commandments for MDM Implementation
Prabuddha Roy


Without wanting to labour the point about what MDM is, I nevertheless feel compelled to state my understanding of this relatively new technology area before I get into discussing integration of it with business intelligence. This is done mainly for the sake of clarity. Master data management is a set of processes, policies, services and technologies used to create, maintain and manage data associated with a company’s core business entities as a system of record (SOR) for the enterprise. Core entities include customer, supplier, employee, asset, etc. Note that master data management is not associated with transaction data such as orders, for example. Whether you build your own MDM system or buy a MDM solution from a MDM vendor in the marketplace, it should meet a number of key requirements. These include the ability to:

* Define and maintain metadata for master data entities in a repository.
* Acquire, clean, de-duplicate and integrate master data into a central master data store.
* Offer a common set of shared master data services for applications, processes and portals to invoke to access and maintain master data entities (i.e., system of entry [SOE] MDM services).
* Manage master data hierarchies including a history of hierarchy changes and hierarchy versions.
* Manage the synchronisation of changes to master data to all operational and analytical systems that use complete sets or subsets of this data.



Here is a brief elucidation of the ten golden principles that need to be appreciated during any MDM investment.



1. Getting Started: Irrespective of the industry, market segment, or existing IT environment, there is a hunt for the right way to launch an MDM initiative. Many due-diligence initiatives propose MDM as the right answer to the most critical business problems, and many corporates have started securing budgets for launching a sustainable MDM program. The idea is clear: "Start small -- with an initial project -- but think large-scale, long-term, and across subject areas." MDM requires new technologies, specialized skills, and a business focus. With all of these ingredients in place, the payoff is well worth the effort.

2. ROI: Perhaps the biggest issue is how to justify and get funding for an MDM project. As with data warehousing projects, some organizations are blessed with enlightened executives who understand the correlation between high-quality, consistent data and their strategic objectives; these executives may approve MDM projects without delay. Other project leaders must closely align MDM projects with business need and pain and perform a detailed cost-justification analysis. MDM project managers can easily identify and monetize cost savings, but the best approach is to align the MDM initiative with strategic objectives or shoehorn it into approved projects as a necessary underpinning for success.

3. Serendipity: MDM is a business solution that offers a myriad of unexpected benefits once implemented. Many MDM early adopters discovered that once they cleansed and reconciled data through an MDM initiative, they not only improved the quality and consistency of the data available among systems supporting key business processes, but they also enabled other business initiatives such as mergers and acquisitions support, customer relationship management, target marketing, and supply chain optimization. “Data is a corporate asset, and when carefully managed, it provides a strong, stable foundation to support any information-centric business initiative an organization wishes to pursue now or in the future," said Wayne Eckerson, director of research at TDWI. Without MDM, many organizations will spend millions of additional dollars executing information-centric strategic initiatives or won't even attempt them at all.

4. Change Management: Change management is key. From a technical perspective, understanding and socializing the impact of MDM development activities has everything to do with the perception of success. From a cultural perspective, managing the expectations of business and IT stakeholders is nothing less than a make-or-break proposition. Change management is hard and can derail an MDM project if you aren't careful: when you change the data that end users have become accustomed to receiving through reports or other means, it can cause significant angst. You have to anticipate this, implement a transition plan, and prepare the users.

5. Roadblocks: IT and business can pose significant roadblocks. IT stakeholders, many of whom are beholden to established technologies, often need to hear the MDM pitch as much as business users do. Distinguishing MDM from incumbent technologies is an often-underestimated step. Conversely, the business may not want to initiate or fund an MDM project when it already has many of the existing tools and technologies required to do MDM. A very common concern from CTOs is: "Our business has already funded our data warehouse and a customer relationship management solution. Wasn't the data warehouse supposed to solve these issues? Wasn't the CRM system supposed to reconcile customers? How can I now convince the stakeholders to take on an MDM initiative?" The answer is that MDM can optimize those existing solutions, reducing their overall expense and minimizing the risk that they will deliver bad data (which could be their kiss of death). It is also advisable, when purchasing an MDM solution, not to pay vendors for comparable technologies that you already have in house but that come bundled in their packages.

6. Enterprise Scope: It is true that enterprise MDM may be fraught with problems. With the vision of "start small, think big", the vast majority of corporates assert their wish to quickly broaden their initial implementation to encompass additional domains, systems, and data. Organizations have started supporting their CRM programs with better data, which lets them perform business functions they could never have performed with native CRM alone. Companies that started small now plan to extend their MDM capabilities to additional operational systems. Once an organization has implemented an MDM project and it takes root, the organization can then decide whether to widen the road by adding more domains to the existing environment or extend the road by using MDM to address other business problems.

7. Data Governance: Data governance is a critical path to MDM. To be effective, data governance must be designed. A company's cultural norms, established development processes, and incumbent steering committees must all factor into its data governance framework. It is recommended to grow data governance organically and in lockstep with an MDM architecture, which evolves over time. First, one should define what policies and rules the business needs to formulate to support an MDM project that solves a business problem. Then, one can formalize the requirements needed to sustain the initiative. That way, the business is working in its own perceived interest, not IT's.

8. Cross-System Data Analysis: One major issue is the time and costs involved in understanding and modeling source data that spans multiple, heterogeneous systems. Microsoft had 10 people working for 100 days to analyze source systems targeted for MDM integration, while the European Patent Office has 60 people analyzing and managing patent data originating around the world. Estimates show that the services-to-software ratio in MDM deployments is 10 to 1, with cross-source data analysis consuming the lion's share of the services. Just as early data warehousing adopters in the 1990s underestimated the quality and condition of the source data required to deliver an enterprise data warehouse, many MDM implementers have not thought much about the challenges of understanding and reconciling source data.

9. Matching: Calibrating matches between records generated by disparate systems is both art and science. Many speakers acknowledged that these matching engines -- which are typically delivered within a data quality tool -- are the brains of the MDM system.

It often takes consultants a few go-arounds to configure their matching rules. Many consultants shared the consequences of "over-matching" records and thus consolidating two products or customers into one faulty record (the small sketch below illustrates the risk). The point was that matching needs to be refined over time. One must remember that MDM is as much about business and data rules as it is about the data itself.
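To make the over-matching risk concrete, here is a small plain-Java sketch (illustrative only, not any vendor's matching engine; the product descriptions and threshold are invented). With a naive token-overlap score and a threshold that is set too loosely, two genuinely different products clear the merge threshold -- exactly the kind of faulty consolidation that iterative rule tuning is meant to prevent.

import java.util.*;

// Illustrative only: a naive token-overlap (Jaccard) matcher, not a real MDM matching engine.
public class OverMatchingDemo {

    // Tokenize a description into lower-case words.
    static Set<String> tokens(String s) {
        return new HashSet<>(Arrays.asList(s.toLowerCase().split("\\s+")));
    }

    // Jaccard similarity: |intersection| / |union|.
    static double similarity(String a, String b) {
        Set<String> ta = tokens(a), tb = tokens(b);
        Set<String> inter = new HashSet<>(ta);
        inter.retainAll(tb);
        Set<String> union = new HashSet<>(ta);
        union.addAll(tb);
        return (double) inter.size() / union.size();
    }

    public static void main(String[] args) {
        String p1 = "Steel Bolt M8 x 20";
        String p2 = "Steel Bolt M8 x 25";   // a different product
        double score = similarity(p1, p2);  // ~0.67
        double threshold = 0.6;             // too loose: these two distinct products would be merged
        System.out.println("score=" + score + " merge=" + (score >= threshold));
    }
}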


10. Logical Data Models: An existing logical data model can propel you forward. Data administration skills are imperative in MDM. While some vendors maintain that a logical data model isn't required to make their tools work, most agree that the exercise itself can help a company understand definitions and usage scenarios for enterprise data, which makes it easier to gain consensus around data policies. Carl Gerber, senior manager of data warehousing at Tween Brands, shared how his company's data modeling and stewardship skills were a large part of his team's successful MDM delivery. Tween has created a series of data integration hubs to manage various subject areas, such as product, inventory, suppliers, and merchandising hierarchies. All data exchanged between systems passes through these hubs to ensure data consistency and reconciliation. The architecture has eliminated numerous point-to-point interfaces and improved operational efficiency, decision making, and revenue generation.


Monday, February 16, 2009


Using MDM Java APIs to add Text(type) taxonomy attribute allowable values
Paras Arora



Now consider this a continuation of my previous blog on MDM Java APIs (taxonomy attributes):

Using MDM Java APIs to retrieve Taxonomy attribute Values - this was while attempting to replicate all capabilities of MDM Data Manager using MDM Java APIs.

During the same exercise I made up my mind to develop a highly customised data manager using MDM Java APIs and Portal UI technologies. As of now, my custom Data Manager is still under development, so it will be a while before I can share it with the community.

For now, I am sharing the following solution for a scenario that came up during a brief conversation over a cup of tea.

"MDM taxonomy attributes of type text can have a set of allowable values, e.g. different country names for an attribute 'Country', or different values for an attribute 'Type' of a material. While MDM Data Manager provides an option to add allowable values for a taxonomy attribute at design time, which are then utilised in custom-built MDM-Portal applications or MDM business packages, what if the user wants to add a value to the set of allowable values for a text-type taxonomy attribute at runtime, giving him an additional option to select from the list of allowable values? I couldn't find an out-of-the-box set of API methods/interfaces that would help us achieve this."

After a lot of analysis and an in-depth study of the MDM Java APIs, I found a mechanism with which we can replicate this design-time functionality of MDM Data Manager at runtime using the MDM Java APIs.

The algorithm/code snippet given below can be re-used after further customizing or extending it as needed.

We start at the point where we have retrieved the attribute object (retrieved using the attribute ID, which in turn is retrieved from the current record object; refer to the embedded blog link for details).

Depending upon the screen design, i.e. the number of entries you want the end user to add to the list of allowable values for the text-type taxonomy attribute, one can use the following lines of code.

int textAttrValueCount = 10; // number of values the user may add to the set of allowable values
String baseName;

for (int i = 0; i < textAttrValueCount; i++)
{
    // Build the new allowable value and give it a (multilingual) name
    TextAttributeValueProperties textValue = new TextAttributeValueProperties();
    baseName = "TextValue" + i + System.currentTimeMillis();
    MultilingualString textValueName = MultilingualHelper.createMultilingualString(regDefs, baseName);
    textValue.setName(textValueName);

    // A tooltip string is prepared here as well; attach it to the value if your scenario requires it
    baseName = "Tooltip" + i + System.currentTimeMillis();
    MultilingualString tooltip = MultilingualHelper.createMultilingualString(regDefs, baseName);

    // Add the new allowable value to the text-type taxonomy attribute
    attr.addTextAttributeValue(textValue);
}

Here, attr is the text-type taxonomy attribute object to which the allowable values are to be added, and regDefs refers to the repository's region (language) definitions retrieved earlier.

Using the code piece or the approach outlined above, one can customize and extend the MDM business packages (so that the end user gets the option to add to the allowable values for a text-type taxonomy attribute), or integrate it into a Web Dynpro application built on top of the MDM repository, giving the end user an option that is otherwise available only at design time, i.e. via MDM Data Manager.

Thursday, February 5, 2009


MDM Java API 2 an introductive series part IV
Tobias Grunow


Introduction

In my last Blog I showed you how to get an authenticated user session using a trusted connection. In this part I want to show you how to gather MDM system information from a SAP Portal system object.
After that I will introduce the basic concept behind working with MDM and the Java API 2, showing you how the system works (RecordID, Lookup Values, Display Fields...).

If you build a solution based on MDM you will sooner or later face a two- or three-tier MDM system landscape, meaning that there will be a development system (D-system), possibly a quality assurance system (Q-system), and a productive system (P-system). This scenario can involve multiple MDM servers, or merely different repositories on the same server, and possibly separate D-, Q- and P-clients. So you face the problem of how to address the different repositories from your code without having to take care of picking the right one depending on which landscape you are in.
In case you are using a SAP Portal I want to point to a very useful thread I have found on SDN.

Hard-code credentials - any other solution exists?

This forum post shows how to work with SAP Portal system objects and how to retrieve information from them. I have used this concept, in combination with the trusted-connection user session authentication, to address the problem I have just described, and it worked fine for me.
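Where no portal system object is available, a minimal plain-Java sketch of the same idea is to keep the landscape-specific connection data in a properties file next to the deployment, so the code itself stays identical across the D-, Q- and P-systems. The file name and property keys below are invented for illustration:

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Minimal sketch: keep landscape-specific MDM connection data outside the code.
// The file name and property keys (mdm.server, mdm.repository, mdm.dbserver) are made up for illustration.
public class MdmLandscapeConfig {

    private final Properties props = new Properties();

    public MdmLandscapeConfig(String propertiesFile) throws IOException {
        try (FileInputStream in = new FileInputStream(propertiesFile)) {
            props.load(in);
        }
    }

    public String getServerName()     { return props.getProperty("mdm.server"); }
    public String getRepositoryName() { return props.getProperty("mdm.repository"); }
    public String getDbServerName()   { return props.getProperty("mdm.dbserver"); }

    public static void main(String[] args) throws IOException {
        // Point each landscape (D/Q/P) at its own file, e.g. mdm-dev.properties vs. mdm-prod.properties
        MdmLandscapeConfig cfg = new MdmLandscapeConfig("mdm-dev.properties");
        System.out.println("Connecting to " + cfg.getServerName() + " / " + cfg.getRepositoryName());
    }
}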

Let us continue with the next topic...
MDM Java API 2 introducing the concept

First of all I want to give you a brief introduction to the concepts of MDM so that you can better understand my future coding. As a coder you need a much deeper understanding of the concepts behind MDM than anyone else; plain GUI users (Graphical User Interface) don't need to understand the technical details behind data storage and operations the way Java programmers do. So the most important task at the beginning is to understand how MDM stores data and connects the data with each other.
So let’s start with the first graphic:
Figure 1: Table concept in MDM

The first graphic shows the table concept of MDM. The central table, called the "Main table", is the centre of the model. From the main table, references point to various kinds of sub-tables. The main table contains fields; each field can either hold a value stored directly in the table or hold a reference to a record of a sub-table. Sub-tables can likewise store data values or hold references to other sub-tables.
To illustrate the possible layout of a main table, take a look at figure 2.

Figure 2: Possible layout in main table (MDM Console view)

As you can see, the main table stores some values (e.g. text) directly as well as references to other sub-tables (lookups of different kinds). To find out where the lookups point, you can open the MDM Console and click on an entry in the list of fields of the main table to see its details (Figure 3).


Figure 3: Main table field details (Bottom part of MDM Console)


If we look at the details we have to notice two important things.
First, figure 3 shows us a field detail named “CODE”. This code is very important for us because it is the name we will use in our code to address this field in the table. Second, we have to notice that the field is of “Type” Lookup [Flat]. This tells us, that the value we will find in this field will be of type RecordID.
A RecordID is the ID of the record in the sub table (e.g. Sales Product Key – Table [Detail: Lookup Table]). This means that we will not be able to access the value directly by accessing the field in the main table. We will only get the reference to the sub table which holds the actual value desired. In my future Blogs I will give more details on the concepts behind MDM and give examples of other techniques used.
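To make the reference idea more tangible, here is a small plain-Java analogy (deliberately not using the SAP MDM API classes): the main-table record carries only the RecordID of the sub-table entry, and the human-readable value has to be resolved in a second step against the sub-table, just as a lookup field in MDM has to be resolved against its lookup table.

import java.util.HashMap;
import java.util.Map;

// Plain-Java analogy of the MDM table concept: a main-table record only stores
// the ID of a sub-table entry, and the display value must be resolved separately.
public class LookupAnalogy {

    public static void main(String[] args) {
        // "Sub-table" (e.g. Sales Product Key): record ID -> display value
        Map<Integer, String> salesProductKeyTable = new HashMap<>();
        salesProductKeyTable.put(101, "KEY-A");
        salesProductKeyTable.put(102, "KEY-B");

        // "Main-table" record: the lookup field holds only the record ID (here 102), not the value
        int lookupFieldValue = 102;

        // Resolving the reference is a second step, analogous to reading the lookup table in MDM
        String displayValue = salesProductKeyTable.get(lookupFieldValue);
        System.out.println("Lookup field points to record " + lookupFieldValue + " = " + displayValue);
    }
}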
So enough of MDM concepts and let’s get to the Java API and some examples.
Searching in MDM

Searching is one of the most commonly used pieces of functionality in MDM.
Searching in a repository has some prerequisites: first we need a connection to the MDM server, and second we need an authenticated user session for a repository. In my previously published blogs I showed you how to set up those prerequisites. In the class provided below I have combined all the necessary steps to get the connection and the user session. So now let's get to the code.

package com.sap.sdn.examples;

import java.util.Locale;

import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.SetUnicodeNormalizationCommand;
import com.sap.mdm.commands.TrustedUserSessionCommand;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.data.ResultDefinition;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionAccessor;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.SimpleConnectionFactory;
import com.sap.mdm.search.FieldSearchDimension;
import com.sap.mdm.search.Search;
import com.sap.mdm.search.TextSearchConstraint;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;

public class SearchExamples {

// Instance variables needed for processing
private ConnectionAccessor mySimpleConnection;
// Name of the server that MDM runs on
private String serverName = "IBSOLUTI-D790B6";
// Name of the repository shown in the mdm console
private String RepositoryNameAsString = "SDN_Repository";
// Name of the DB-Server this could be an IP address only
private String DBServerNameAsString = "IBSOLUTI-D790B6\\SQLEXPRESS";
// Define the Database type (MS SQL Server)
private DBMSType DBMSTypeUsed = DBMSType.MS_SQL;
// Create a new data region
private RegionProperties dataRegion = new RegionProperties();
// Session which will be used for searching
private String userSession;
// Default user name
private String userName = "Admin";
// Password is empty on default setup
private String userPassword ="";
// result we will get from mdm
public RecordResultSet Result;

/**
* Constructor for class
*/
public SearchExamples(){
// Set the Data Region
dataRegion.setRegionCode("engUSA");
// Set the locale on data region
dataRegion.setLocale(new Locale("en", "US"));
// Set the name of data region
dataRegion.setName("US");
// get a connection to the server
this.getConnection();
// Authenticate a user session
try {
this.getAuthenticatedUserSession();
} catch (ConnectionException e) {
// Do something with exception
e.printStackTrace();
} catch (CommandException e) {
// Do something with exception
e.printStackTrace();
}
// Get resulting records
Result = this.SearchTypes();

}

// ResultDefinition Main Table declaration
private ResultDefinition rdMain;


/**
* Method that will search for all records in main table of a certain type
*
* @return RecordResultSet that holds all resulting records from search
*/
public RecordResultSet SearchTypes() {

/**
* 1. First we create the Result Definition. This result definition will
* tell the search which fields are of interest to us. The list could
* include all fields of the table or only the ones we are interested
* in.
*/
// Define which table should be represented by this ResultDefintion
// In my repository this is the table MAINTABLE
rdMain = new ResultDefinition(new TableId(1));
// Add the desired FieldId's to the result definition
// In my repository this is the field PRODUCT_NAME
rdMain.addSelectField(new FieldId(2));
// In my repository this is the field TYP
rdMain.addSelectField(new FieldId(27));

/**
* 2. Create the needed search parameters.
* Define what to search for and where.
*/
// Create the field search dimension [Where to search!?]
FieldSearchDimension fsdMaintableType = new FieldSearchDimension(new FieldId(27));
// Create the text search constraint [What to search for?! (Every record that contains ROOT)]
TextSearchConstraint tscTypeRoot = new TextSearchConstraint("ROOT", TextSearchConstraint.CONTAINS);

/**
* 3.
* Create the search object with the given search parameters.
*/
// Create the search
Search seSearchTypeRoot = new Search(new TableId(1));
// Add the parameters to the search
seSearchTypeRoot.addSearchItem(fsdMaintableType, tscTypeRoot);

/**
* 4.
* Create the command to search with and retrieve the result
*/
// Build the command
RetrieveLimitedRecordsCommand rlrcGetRecordsOfTypeRoot = new RetrieveLimitedRecordsCommand(mySimpleConnection);
// Set the search to use for command
rlrcGetRecordsOfTypeRoot.setSearch(seSearchTypeRoot);
// Set the session to use for command
rlrcGetRecordsOfTypeRoot.setSession(this.userSession);
// Set the result definition to use
rlrcGetRecordsOfTypeRoot.setResultDefinition(rdMain);
// Try to execute the command
try {
rlrcGetRecordsOfTypeRoot.execute();
} catch (CommandException e) {
// Do something with the exception
e.printStackTrace();
}
// Return the result
return rlrcGetRecordsOfTypeRoot.getRecords();
}

/**
* Create and authenticate a new user session to an MDM repository.
*
* @param mySimpleConnection
* The connection to the MDM Server
* @param RepositoryNameAsString
* name of the repository to connect to
* @param DBServerNameAsString
* name of DBServer
* @param DBMSType
* Type of DBMS that MDM works with
* @param dataRegion
* RegionProperties defining the language the repository should
* be connected with.
* @param userName
* Name of the user that should make the connection to repository
* @param userPassword
* password of user that should be used if connection is not trusted
* @throws ConnectionException
* is propagated from the API
* @throws CommandException
* is propagated from the API
*/
public String getAuthenticatedUserSession(
) throws ConnectionException, CommandException {
/*
* We need a RepositoryIdentifier to connect to the desired repository
* parameters for the constructor are: Repository name as string as read
* in the MDM Console in the "Name" field DB Server name as string as
* used while creating a repository DBMS Type as string - Valid types
* are: MSQL, ORCL, IDB2, IZOS, IIOS, MXDB
*/
RepositoryIdentifier repId = new RepositoryIdentifier(
RepositoryNameAsString, DBServerNameAsString, DBMSTypeUsed);
// Create the command to get the Session
CreateUserSessionCommand createUserSessionCommand = new CreateUserSessionCommand(
mySimpleConnection);
// Set the identifier
createUserSessionCommand.setRepositoryIdentifier(repId);
// Set the region to use for Session - (Language)
createUserSessionCommand.setDataRegion(dataRegion);
// Execute the command
createUserSessionCommand.execute();
// Get the session identifier
this.userSession = createUserSessionCommand.getUserSession();

// Authenticate the user session
try {
// Use command to authenticate user session on trusted connection
TrustedUserSessionCommand tuscTrustedUser = new TrustedUserSessionCommand(
mySimpleConnection);
// Set the user name to use
tuscTrustedUser.setUserName(userName);
tuscTrustedUser.setSession(this.userSession);
tuscTrustedUser.execute();
this.userSession = tuscTrustedUser.getSession();
} catch (com.sap.mdm.commands.CommandException e) {
/* In Case the Connection is not Trusted */
AuthenticateUserSessionCommand authenticateUserSessionCommand = new AuthenticateUserSessionCommand(
mySimpleConnection);
authenticateUserSessionCommand.setSession(this.userSession);
authenticateUserSessionCommand.setUserName(userName);
authenticateUserSessionCommand.setUserPassword(userPassword);
authenticateUserSessionCommand.execute();
}
// For further information see:
// http://help.sap.com/javadocs/MDM/current/com/sap/mdm/commands/SetUnicodeNormalizationCommand.html
// Create the normalization command
SetUnicodeNormalizationCommand setUnicodeNormalizationCommand = new SetUnicodeNormalizationCommand(
mySimpleConnection);
// Set the session to be used
setUnicodeNormalizationCommand.setSession(this.userSession);
// Set the normalization type
setUnicodeNormalizationCommand
.setNormalizationType(SetUnicodeNormalizationCommand.NORMALIZATION_COMPOSED);
// Execute the command
setUnicodeNormalizationCommand.execute();
// Return the session identifier as string value
return this.userSession;
}

/*
 * This method obtains a ConnectionAccessor, which is needed every time you
 * want to execute a Command (searching or any other command), and stores it
 * in the instance variable mySimpleConnection.
 */
public void getConnection() {
String sHostName = serverName;
// We need a try / catch statement (or a throws clause on the method) because the factory can throw a ConnectionException
try {
/*
* retrieve connection from Factory The hostname can be the name of
* the server if it is listening on the standard port 20005 or a
* combination of Servername:Portnumber eg. MDMSERVER:40000
*/
mySimpleConnection = SimpleConnectionFactory.getInstance(sHostName);
} catch (ConnectionException e) {
// Do some exception handling
e.printStackTrace();
}
}
}


To test the code we also need a simple test class that instantiates the sample class and prints the number of retrieved records.

package com.sap.sdn.examples;

public class Test {

/**
* @param args
*/
public static void main(String[] args) {
// Create instance of search class
SearchExamples test = new SearchExamples();
// Print out the amount of found records
System.out.println("Total of " + test.Result.getCount() + " Records found that match criteria TYPE=ROOT");
}
}



I wrote a lot of comments in the code but I will give you some more details on what is happening there.

First of all there are some instance variables that hold the connection information to the MDM Server and the MDM Repository.
The constructor will set up some variables which are needed for the search.
A connection to the MDM server will be created.
A Session will be created and will be authenticated, if there is a trusted connection we will use it to authenticate and if we only have a normal connection to the server normal authentication will be used.
The search will be triggered and the result will be stored in instance variable.
The search needs some elements to work with.

At the beginning of the SearchTypes() method [Comment 1. in code] we set up a ResultDefinition, which tells the search which fields are of interest to the program and should be accessible in the result. If this definition does not include a field that you later want to access in your code, an exception will be thrown.

[Comment 2. in code] Define a search dimension and a search constraint to tell the search where to look and what to look for. There are many search constraint types to work with; the full list can be found in the API documentation.

[Comment 3. in code] Define the search itself and use the dimension and constraint to set it up.
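If you need to narrow the result further, additional dimension/constraint pairs can be added to the same Search object. The sketch below reuses only the classes from the listing above; FieldId(3) is merely a placeholder for another main-table field in your repository, and as far as I recall multiple search items are combined with a logical AND by default (please verify against the Javadoc).

// Search on the main table (TableId 1, as in the listing above)
Search seSearch = new Search(new TableId(1));

// First condition: the TYP field (FieldId 27) must contain "ROOT"
seSearch.addSearchItem(
    new FieldSearchDimension(new FieldId(27)),
    new TextSearchConstraint("ROOT", TextSearchConstraint.CONTAINS));

// Second condition: another main-table field (FieldId 3 is only a placeholder here)
// must contain "SAP" -- both conditions then have to hold for a record to be returned
seSearch.addSearchItem(
    new FieldSearchDimension(new FieldId(3)),
    new TextSearchConstraint("SAP", TextSearchConstraint.CONTAINS));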

[Comment 4. in code] Build the command that is needed to execute the search.
For more details on commands, please see this link.

So this is just a very simple example; I will share more advanced code in future blogs.
If you have any questions about what the code does, or need more detailed explanations, please feel free to comment on this blog. If this code helped you a little, please feel free to comment as well.


So this will be the last Blog for this year since I am rebuilding my flat at the moment. The topic of the next Blog needs to be defined and I will update my agenda in my first Blog next year.


So I wish you a happy new year and, as you would say in German: "Einen guten Rutsch ins neue Jahr!" (have a good start into the new year).

Best regards,
Tobi

Tobias Grunow is a Solution Consultant for IBSolution GmbH Heilbronn, Germany. He is a member of the research and development team.




Resolve Consolidated Reporting Problem in multiple BI systems through SAP MDM
Ankur Ramkumar Goe

Summary

Management uses BI reports for planning, strategy, and making business decisions. BI systems get data from multiple systems to produce consolidated views. However, each transactional system maintains its own master data, which is in turn sent to BI. As a result, master data in BI is quite redundant and does not show a single version of the truth. Because of this, management gets wrong reports and information about the organization's performance, which leads to wrong planning and decisions.

We will try to address this issue with the help of a master data management solution, so that management gets a true view of the global business, with unified business information that enables optimized operations and faster, more accurate decisions.



Current Scenario

For various reasons, such as operating across geographies, mergers and acquisitions, and best-of-breed products, organizations have ended up with scattered, distributed IT landscapes. With the introduction of BI for reporting, organizations now have separate BI systems according to geography, system, unit, or functionality. Every transactional system needs to create its own master data, because master data drives transactions. These transactional systems are usually connected to their own BI systems and do not share their data with other systems for various reasons: complexity, the number of connections, the cost of maintaining connections, lack of governance, and so on. Thus every BI system ends up getting master data from its own transactional systems, which is in turn fed to the corporate BI system for consolidated reporting. Without consistent master data, data warehousing becomes garbage in, garbage out.

Because of all this, master data in BI is quite redundant and does not show a single version of the truth. Hence management gets wrong reports presenting wrong information about the organization's performance, which leads to wrong planning and decisions. Management might be looking at reports that show one single customer as three or more customers (e.g. John Kennedy, John F Kennedy, J F Kennedy) or the same with vendors (e.g. Satyam, Satyam Computers, Satyam Computer Services Ltd). Surveys show that more than 70% of decisions are made wrongly because of incomplete or wrong reports and information, and organizations spend 30% of their time verifying reports and information. One organization found that 40% of its orders were stuck because of mismatched master data.
BI systems have an ETL layer which is made specifically for reporting use; unfortunately it is not configured and optimized for cleansing master data. Organizations have also been maintaining and trying to solve master data problems for many years with their own methods and tools; in doing so they were treating the symptoms but not solving the root cause of the master data problem. The industry has recognized this problem and has come up with tools to manage master data. With the evolution of MDM tools, organizations can benefit from best practices, reduced effort, and greater ease. Maintaining master data outside of the BI system also helps.

Below are two scenarios showing organizations distributed IT landscapes.

Scenario 1 – Organization's BI landscape as per geography

Scenario 1

This scenario describes an organization landscape organized by geography, distributed across geographies to cater to local and corporate reporting requirements.
Scenario 2 – Organization's BI landscape as per functionality

Scenario 2

This scenario describes an organization landscape organized by functionality, such as finance and logistics handled separately.


Suggested Approach

A single version of the truth for master data across the organization can be achieved through the introduction of an MDM system. One organization found that 37% of the vendors in its systems were duplicates.

Since we are handling the organization's master data, the introduction of an MDM system should not be disruptive to other systems or to the business. A small-steps approach is therefore recommended when introducing master data management in an organization. To get consolidated reporting that produces correct reports, two approaches are suggested below. The second approach has two steps; an organization can go directly to approach 2.2, but that decision has to be taken based on how much disturbance to the existing landscape is acceptable. A strong governance mechanism also has to be in place for CMDM.

Approach 1 Harmonization across BI

Approach 2.1 Master Data from Legacy Systems

Approach 2.2 Central Master Data Management (CMDM)
Approach 1 – Harmonization across BI systems only

Here, we do not interfere with the organization's current master data flows or mechanisms, and thus are not disruptive to the current landscape and processes. We take master data from the BI systems only, cleanse it, map it, and send it back to the BI systems. The MDM system sends back the mapping of local BI IDs to global MDM IDs. With this approach, management is able to get reports on both local BI IDs and global MDM IDs, as illustrated in the sketch below.
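As a purely illustrative sketch (plain Java, not an SAP MDM artifact; the system names and IDs are invented), the mapping sent back by MDM can be thought of as a key-mapping table that lets the corporate BI layer group records from different local BI systems under one global ID:

import java.util.HashMap;
import java.util.Map;

// Illustrative key mapping: (local BI system, local customer ID) -> global MDM ID.
// System names and IDs below are invented for the example.
public class KeyMappingExample {

    private final Map<String, String> crosswalk = new HashMap<>();

    // Register the mapping that MDM sends back to each BI system
    public void register(String localSystem, String localId, String globalMdmId) {
        crosswalk.put(localSystem + ":" + localId, globalMdmId);
    }

    // Consolidated reporting groups on the global ID instead of the local ones
    public String globalIdFor(String localSystem, String localId) {
        return crosswalk.get(localSystem + ":" + localId);
    }

    public static void main(String[] args) {
        KeyMappingExample mapping = new KeyMappingExample();
        mapping.register("BI_EUROPE", "C-1001", "MDM-42"); // John Kennedy
        mapping.register("BI_APJ",    "C-7730", "MDM-42"); // John F Kennedy -- same customer
        System.out.println(mapping.globalIdFor("BI_APJ", "C-7730")); // MDM-42
    }
}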


Approach 1 Harmonization across BI systems to achieve consolidated reporting
Benefits:

. Derive immediate benefits from consolidated master data through BI reports

. Mitigated risk of using MDM in the transactional landscape

. Immediate benefit by control on Master Data of newly Merged and Acquired Companies.

. No Interface is required between MDM and BI

. Not disruptive to existing landscape
Limitations:

. Master Data integrity is not achieved throughout the organization and only for Analytics


Approach 2.1 – Master Data from Legacy Systems

In this approach, we introduce the MDM system to the transactional systems, before the BI layer. The MDM system gets all the master data from the source systems and maintains it. This cleansed master data is then passed to the BI systems, so that reports are based on consolidated and cleansed master data. Here, too, we are not interfering with the source systems; they continue with their own master data creation and processes.

However, this still does not solve the root cause of the problem, so after this step organizations should move on to the next approach.


Approach 2.1 Master Data is coming from source systems and fed to BI
Benefits

. Immediate benefit by control on Master Data of newly Merged and Acquired Companies.

. Ensures data integrity across Transactional landscape

. Sets stage for easy re-use via distribution or business processes that directly leverage consolidated data.

. Sets stage to achieve CMDM


Approach 2.2 – Central Master Data Management (CMDM)

This is true master data management. However, organizations need to prepare themselves first and plan carefully before approaching it. This approach will fail if a strong governance mechanism is not put in place.

CMDM is the central master data management system and maintains all master data across the organization's landscape. This results in a single version of the truth across the landscape, and therefore in reporting as well.


Approach 2.2 Consistent Master Data is coming through CMDM
It is up to the organization to implement CMDM as per its convenience, readiness, and governance mechanism. There are two ways to achieve CMDM:
1. Organizations can continue to create master data in local systems and then have it cleansed and synchronized to the respective systems with the help of MDM.
2. Organizations create master data centrally in MDM only, which is then sent back to the respective systems to maintain a synchronized version of master data across the landscape.

Benefits

. Centralized management of master data

. Ensures data integrity through Enriching and Standardization of master data across landscape

. Centralized Governance - Ensures that when data changes in one application, other business applications that depend on that data are updated with a consistent view of the key data as it changes, in real time.



MDM Benefits

Below are the benefits which organization will get with the help of master data management.

· Global View of Business

· Single view of truth

· Optimized Operations

· Easy Integration of newly acquired systems

· Elimination of manual and Redundant processes

· Full Interoperability

· Greater Visibility

· Better Accuracy

· Reduction in maintenance cost

· Faster cycles

SAP MDM Benefits

· Easy Integration with non-SAP, SAP and BI systems

· Predefined content for configuration

· Driving eSOA

· Java enabled content for EP and other external systems

· No coding required, easy configuration

· Workflow in windows based Visio


Company: Satyam Computer Services Ltd.

Role: Projects Lead / Project Manager

The views shared above are my personal views and may not reflect my company's views.






Ankur Ramkumar Goel has over 7 years of SAP experience in ABAP, BI and MDM, spanning implementations, roll-outs, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has been focused on MDM for 3 years.




Tuesday, February 3, 2009


MDM Import Manager capability
Ravi Kumar


In my previous project we had a requirement to run a matching strategy for finding duplicate material masters. The matching rule was pretty simple, based on Material Description, but the repository size was on the order of 500K records with around 250 attributes.

To enable better matching results we had used a lot of transformations on the Material Description field for normalizing and standardizing the data. We were using a combination of two rules in the duplicate-matching strategy:

1. Rule 1: Equals (on the transformed Material Description field)

2. Rule 2: Token Equals (on the Material Description field)

On average the description had 1.5 tokens per record. We found that performance was very poor because we had both Token Equals and a huge list of transformations (a few of them replacing string XXX with AAA, deleting blanks, removing other special characters, etc.; the kind of string normalization these transformations perform is sketched below). Even after restricting the total number of records considered for matching (we used Material Group for clustering), it was taking 20-40 minutes to get matching results.
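For readers unfamiliar with what such transformations do, here is a minimal plain-Java sketch of the kind of string normalization involved (this is only an illustration of the logic, not the MDM transformation engine; the replacement pairs are invented, not the project's actual rules):

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative normalization of a material description before matching.
// The replacement pairs are invented examples, not the project's actual rules.
public class DescriptionNormalizer {

    private static final Map<String, String> REPLACEMENTS = new LinkedHashMap<>();
    static {
        REPLACEMENTS.put("STAINLESS STEEL", "SS"); // harmonize common abbreviations
        REPLACEMENTS.put("MTR", "M");
    }

    static String normalize(String description) {
        String s = description.toUpperCase();
        for (Map.Entry<String, String> e : REPLACEMENTS.entrySet()) {
            s = s.replace(e.getKey(), e.getValue());
        }
        s = s.replaceAll("[^A-Z0-9 ]", " ");  // drop special characters
        s = s.replaceAll("\\s+", " ").trim(); // collapse blanks
        return s;
    }

    public static void main(String[] args) {
        System.out.println(normalize("Bolt, Stainless Steel  M8x20 MTR"));
        // -> "BOLT SS M8X20 M"
    }
}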

Solution: We improved the data quality by re-importing the same set of records used for the initial load from ECC, but this time we harnessed MDM Import Manager's capabilities to reduce the number of transformations.

How: After mapping the Material Description field, apply a value conversion filter on the mapped field; almost all of the powerful Excel-like functions are available, such as Replace, Append, and Prepend. Multiple such conversion rules can even be applied to the same mapped field.

We also set the Keyword property of the Material Description field to Normal, which optimized the token-based comparisons.

This effectively reduced the number of transformations from 22 to 5 and improved the performance of the matching strategy.



Ravi Kumar is a Consultant with Infosys.



SAP MDM integration with R/3 system
Ravi Kumar


We all know the different IT scenarios supported by MDM, namely:

• Master Data Consolidation
  – Cleansing and de-duplication
  – Data normalization including categorization and taxonomy management
  – New interactive consolidation capabilities

• Master Data Harmonization
  – Automated synchronization of globally relevant master data information
  – New interactive distribution capabilities

• Central Master Data Management
  – One-stop data maintenance
  – Ongoing master data quality

In this blog I will try to cover the different steps required for integrating MDM with an R/3 system, which is the source of master data. After cleansing and de-duplication of the data in MDM, all changes and updates are reflected back in SAP R/3. This step-by-step procedure should help in understanding how to implement the MDM IT scenarios. :)



Different settings required for doing this are:

* Settings in R/3
* Settings in XI
* Settings in MDM
  – MDM Console
  – Import Manager
  – Syndication Manager

MDM process flow

Process flow: Trigger an IDoc containing the master data from R/3. XI converts these IDocs to XML files and places them in the Ready folder of the MDM inbound port. The files are received by Import Manager into MDM; after changes etc., the Syndicator places an XML file in the Ready folder of the outbound port, from where it is picked up by XI and sent back to R/3 as IDocs.
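As a small side aid while testing this flow (purely illustrative plain Java, not part of MDM or XI; the folder path below is an assumption and must be adjusted to your server's distribution directory layout), one can quickly check whether XML files are actually arriving in a port's Ready folder:

import java.io.File;

// Illustrative check of a port's Ready folder; the path below is an assumption
// and must be adjusted to your MDM server's distribution directory layout.
public class ReadyFolderCheck {

    public static void main(String[] args) {
        File readyFolder = new File("/mdm-distributions/SDN_Repository/Inbound/R3_PORT/Ready");
        File[] xmlFiles = readyFolder.listFiles((dir, name) -> name.toLowerCase().endsWith(".xml"));

        if (xmlFiles == null) {
            System.out.println("Folder not found: " + readyFolder);
        } else {
            System.out.println(xmlFiles.length + " XML file(s) waiting in " + readyFolder);
            for (File f : xmlFiles) {
                System.out.println(" - " + f.getName());
            }
        }
    }
}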





Settings in R/3:

Step 1: T.Code SALE - Define the logical systems (a sender logical system and a receiver logical system).

Step 2: T.Code SALE - Assign the client to the sender logical system.

Step 3: Create an RFC destination of type '3' for the SAP XI system using transaction SM59.

Step 4: Create a distribution model through T.Code SALE/BD64. Here we need to mention the sender client (DEVCLNT500), the receiver client (XICLNT100) and the message type (MATMAS).

Step 5: Maintain ports for IDoc processing using T.Code WE21.

Step 6: Maintain the partner profile using T.Code WE20.

Step 7: Send the customer or material IDocs using T.Code BD12/BD10.

Using T.Code WE02 we can confirm and view the IDocs that were sent.

Now we are able to generate IDocs from R/3 containing the master data.

Settings in MDM:

Select the repository for which you want to do the settings.

Go to "Admin", select "Client Systems", right-click on it and create your client system.

Then go to "Ports" to create a port for the client system. Here you define the outbound port for the client system (MDC R/3) created in the previous step. You have the option of processing the data automatically or manually.

After saving the "Port" and "Client System", make sure the corresponding folders for the repository have been created on the server, because "Ready" is the folder through which all the files are exchanged.

Similar steps should be repeated for the inbound port.

MDM Import Manager: Assuming that all the settings have also been done in XI, we move to Import Manager, where we select the file to be imported and follow the steps for importing the data into MDM.

Step 1: Log in to Import Manager and connect to the source file. Select the type as PORT; the system automatically connects to the inbound port of the repository you are logged into.

Step 2: Do all field mapping and value mapping in the Map Fields/Values tab. We can use the standard maps provided in the business content, or do all mappings manually and save the map.

Step 3: Go to the Match Records tab, select the field used for matching records and select the import action. PS: for each record we can manually override the import action to Create/Skip.

Step 4: Once all mappings have been done and the import action shows Ready to Import in the Import Status tab, execute the import.

This will import the records contained in the XML file from the Ready folder into MDM. Go to Data Manager and check that all the records from R/3 have been created.

MDM Syndicator:

Any record changed in MDM Data Manager on the basis of validations, assignments and business rules should be syndicated back to R/3, which is the data source.

Step 1: Log in to the Syndicator, giving the repository name. Select File > Destination Properties and select the port: choose the remote system (R/3 in this case) and the outbound port we created, where the syndicated file will be placed in XML format.

Step 2: Do all the mappings again. Use the standard maps provided in the business content or do them manually, as in Import Manager. We have the option of selecting only a few records based on search parameters. We also have the option of suppressing all unchanged records in the map properties; this selects only those records that have been changed in Data Manager instead of all existing records.

Step 3: In the Destination Preview we can see all records with their field values before syndicating. This should always be done before executing, to reduce erroneous or incomplete data flow. Once everything looks correct, execute the syndication.

Step 4: Check the IDoc list in SAP. In case of status 51, analyse further why the IDoc has failed.

Result: data changed in MDM Data Manager will be updated in the SAP R/3 system, provided all the mappings are correct.

Ravi Kumar is a Consultant with Infosys.

