SAP NetWeaver Master Data Management Enrichment Controller
Wednesday, August 5, 2009
SAP MDM Online Trainings with Real-Time Consultants
Hi,
We offer the following online training courses:
SAP MDM, EP, BW/BI 7.0, FI/CO, HR, ABAP, BASIS & Security, SD/CRM, MM/PP, ABAP-HR, PI/XI, and QA.
For further details please reach us at
Sree
+91-9379914378
www.sapmdm.co.in
Sunday, March 8, 2009
Ten Commandments for MDM Implementation
Prabuddha Roy
Without wanting to labour the point about what MDM is, I nevertheless feel compelled to state my understanding of this relatively new technology area before I get into discussing its integration with business intelligence. This is done mainly for the sake of clarity. Master data management is a set of processes, policies, services and technologies used to create, maintain and manage data associated with a company’s core business entities as a system of record (SOR) for the enterprise. Core entities include customer, supplier, employee, asset, etc. Note that master data management is not associated with transaction data such as orders. Whether you build your own MDM system or buy an MDM solution from a vendor, it should meet a number of key requirements. These include the ability to:
* Define and maintain metadata for master data entities in a repository.
* Acquire, clean, de-duplicate and integrate master data into a central master data store.
* Offer a common set of shared master data services for applications, processes and portals to invoke to access and maintain master data entities (i.e., system of entry [SOE] MDM services); a sketch of such a service contract follows this list.
* Manage master data hierarchies including a history of hierarchy changes and hierarchy versions.
* Manage the synchronisation of changes to master data to all operational and analytical systems that use complete sets or subsets of this data.
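As an illustration of the shared, system-of-entry services meant above, here is a hypothetical Java sketch. The interface and type names are inventions for this article, not any product's API; a real hub would expose one such contract per master data entity.

import java.util.List;

// Hypothetical record type; a real hub would model each core entity in full.
class CustomerRecord {
    String name;
    String address;
}

// Illustrative SOE-style service contract an MDM hub might expose to
// applications, processes and portals.
interface MasterDataService {
    CustomerRecord getCustomer(String customerId);           // read from the system of record
    String createCustomer(CustomerRecord candidate);         // create through the system of entry
    void updateCustomer(String customerId, CustomerRecord changes);
    List<CustomerRecord> findDuplicates(CustomerRecord candidate);
}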
Here is a brief elucidation of the ten golden principles that need to be appreciated during any MDM investment:
1. Getting Started: Irrespective of industry, market segment, or existing IT environment, there is a hunt for the right way to launch an MDM initiative. Many due-diligence exercises propose MDM as the answer to the most critical business problems, and many corporations have started securing budgets for launching a sustainable MDM program. The idea is clear: "Start small -- with an initial project -- but think large-scale, long-term, and across subject areas." MDM requires new technologies, specialized skills, and a business focus. With all of these ingredients in place, the payoff is well worth the effort.
2. ROI: Perhaps the biggest issue is how to justify and get funding for an MDM project. As with data warehousing projects, some organizations are blessed with enlightened executives who understand the correlation between high-quality, consistent data and their strategic objectives; these executives may approve MDM projects without delay. Other project leaders must closely align MDM projects with business needs and pain points and perform a detailed cost-justification analysis. MDM project managers can easily identify and monetize cost savings, but the best approach is to align the MDM initiative with strategic objectives or shoehorn it into approved projects as a necessary underpinning for their success.
3. Serendipity: MDM is a business solution that offers a myriad of unexpected benefits once implemented. Many MDM early adopters discovered that once they cleansed and reconciled data through an MDM initiative, they not only improved the quality and consistency of the data available among systems supporting key business processes, but they also enabled other business initiatives such as mergers and acquisitions support, customer relationship management, target marketing, and supply chain optimization. “Data is a corporate asset, and when carefully managed, it provides a strong, stable foundation to support any information-centric business initiative an organization wishes to pursue now or in the future," said Wayne Eckerson, director of research at TDWI. Without MDM, many organizations will spend millions of additional dollars executing information-centric strategic initiatives or won't even attempt them at all.
4. Change Management: Change management is key. From a technical perspective, understanding and socializing the impact of MDM development activities has everything to do with the perception of success. From a cultural perspective, managing the expectations of business and IT stakeholders is nothing less than a make-or-break proposition. Change management is hard and can derail an MDM project if you aren't careful: when you change the data that end users have become accustomed to receiving through reports or other means, it can cause significant angst. You have to anticipate this, implement a transition plan, and prepare the users.
5. Roadblocks: IT and business can both pose significant roadblocks. IT stakeholders, many of whom are beholden to established technologies, often need to hear the MDM pitch as much as business users do. Distinguishing MDM from incumbent technologies is an often-underestimated step. Conversely, the business may not want to initiate or fund an MDM project when it already has many of the tools and technologies required to do MDM. A common query from CTOs is: "Our business has already funded our data warehouse and a customer relationship management solution. Wasn't the data warehouse supposed to solve these issues? Wasn't the CRM system supposed to reconcile customers? How can I now convince the stakeholders to take on an MDM initiative?" The answer is that MDM can optimize those existing solutions, reducing their overall expense and minimizing the risk that they will deliver bad data (which could be their kiss of death). It is also advisable, when purchasing an MDM solution, not to pay vendors for comparable technologies you already have in house but which come bundled in their packages.
6. Enterprise Scope: It is true that enterprise-wide MDM may be fraught with problems. With the vision of "start small, think big", the vast majority of corporations want to quickly broaden their initial implementation to encompass additional domains, systems, and data. Organizations have started supporting their CRM programs with better data, which lets them perform business functions they could never have performed with native CRM alone, and companies that started small now plan to extend their MDM capabilities to additional operational systems. Once an organization has implemented an MDM project and it takes root, the organization can decide whether to widen the road by adding more domains to the existing environment or extend the road by using MDM to address other business problems.
7. Data Governance: Data governance is on the critical path to MDM, and to be effective it must be designed. A company's cultural norms, established development processes, and incumbent steering committees must all factor into its data governance framework. It is recommended to grow data governance organically and in lockstep with an MDM architecture, which evolves over time. First, define the policies and rules the business needs to formulate to support an MDM project that solves a business problem. Then formalize the requirements needed to sustain the initiative. That way, the business is working in its own perceived interest, not IT's.
8. Cross-System Data Analysis: One major issue is the time and cost involved in understanding and modeling source data that spans multiple, heterogeneous systems. Microsoft had 10 people working for 100 days to analyze source systems targeted for MDM integration, while the European Patent Office has 60 people analyzing and managing patent data originating around the world. Estimates put the services-to-software ratio in MDM deployments at 10 to 1, with cross-source data analysis consuming the lion's share of the services. Just as early adopters in the data warehousing world of the 1990s underestimated the quality and condition of the source data required to deliver an enterprise data warehouse, many implementers have not thought much about the challenges of understanding and reconciling source data.
9. Matching: Calibrating matches between records generated by disparate systems is both art and science. Many speakers acknowledged that these matching engines -- which are typically delivered within a data quality tool -- are the brains of the MDM system.
Consultants often need a few go-arounds configuring their matching rules, and many shared the consequences of "over-matching" records and thus consolidating two products or customers into one faulty record. The point is that matching needs to be refined over time. Remember that MDM is as much about business and data rules as it is about the data itself.
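To make the over-matching risk concrete, here is a small, self-contained Java sketch. It is not any vendor's matching engine; the names, the token-overlap score, and the threshold are purely illustrative. It shows how a crude similarity measure can give a genuine duplicate and a false merge the very same score, which is why rules need tuning over time.

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class NameMatcher {
    // Jaccard similarity over lower-cased whitespace tokens: 1.0 = identical token sets.
    static double similarity(String a, String b) {
        Set<String> tokensA = new HashSet<String>(Arrays.asList(a.toLowerCase().split("\\s+")));
        Set<String> tokensB = new HashSet<String>(Arrays.asList(b.toLowerCase().split("\\s+")));
        Set<String> union = new HashSet<String>(tokensA);
        union.addAll(tokensB);
        tokensA.retainAll(tokensB); // tokensA now holds the intersection
        return (double) tokensA.size() / union.size();
    }

    public static void main(String[] args) {
        // Both pairs score 0.5 -- a naive "merge at 0.5" rule would consolidate
        // a true duplicate and an unrelated company alike into one faulty record.
        System.out.println(similarity("ACME Corp Ltd", "ACME Corporation Ltd")); // 0.5, likely duplicate
        System.out.println(similarity("ACME Corp Ltd", "ACME Widgets Ltd"));     // 0.5, false merge
    }
}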
10. Logical Data Models: An existing logical data model can propel you forward, and data administration skills are imperative in MDM. While some vendors maintain that a logical data model isn't required to make their tools work, most agree that the exercise itself can help a company understand definitions and usage scenarios for enterprise data, which makes it easier to gain consensus around data policies. Carl Gerber, senior manager of data warehousing at Tween Brands, shared how his company's data modeling and stewardship skills were a large part of his team's successful MDM delivery. Tween has created a series of data integration hubs to manage various subject areas, such as product, inventory, suppliers, and merchandising hierarchies. All data exchanged between systems passes through these hubs to ensure data consistency and reconciliation. The architecture has eliminated numerous point-to-point interfaces and improved operational efficiency, decision making, and revenue generation.
Monday, February 9, 2009
How to work with Command in the new MDM Java API
Vijendra Singh Bhanot
I am new to the MDM Java API. For an existing project I was asked to explore the new MDM Java API. I have no experience with the earlier MDM4J, but I was still able to understand and explore the new MDM Java API. I think this API is excellent, and I have almost fallen in love with it. The part I like best about this API is the Commands. Initially it took me some time to understand the concept, but very soon it was all clear. With this blog I would like to share my knowledge about how to work with Commands. You will also see an example of the Validation Command.
Why do you need a Command? What is a Command?
I command the MDM Java API to get me the list of Validations …..
Well, that’s what the idea is.
A Command is a special class that instructs the MDM Java API to perform some action. All actions -- managing repository content, searching records, managing the repository schema, etc. -- have dedicated commands.
All these commands are logically organized into packages. You can always refer to the Javadoc to identify which command you need to use. (https://help.sap.com/javadocs/MDM/current/index.html)
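For example, the imports used in the sample code later in this blog (and the next one) already show the package pattern; these three are taken directly from those listings:

import com.sap.mdm.commands.CreateUserSessionCommand;              // general and session commands
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;    // record and data commands
import com.sap.mdm.validation.commands.RetrieveValidationsCommand; // validation commands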
How to Use a Command?
All commands are used in the following way:
// (1) pass the ConnectionAccessor (here: connections, as in the full sample below)
RetrieveValidationsCommand objRetrieveValidationsCommand = new RetrieveValidationsCommand(connections);
// (2) set an appropriate session id
objRetrieveValidationsCommand.setSession(sessionId);
// (3) set the table to retrieve validations for
objRetrieveValidationsCommand.setTableId(mainTableId); // Required
// (4) execute inside a try block
try {
    objRetrieveValidationsCommand.execute();
} catch (CommandException e) {
    e.printStackTrace();
    return;
}
(1) You create a new instance of a particular command by passing it the ConnectionAccessor object.
(2) The second step is usually setSession(). Here it is important to use the right type of session (Server Session, Repository Session or User Session). For example, you may not be able to get the list of validations if you use a Repository Session instead of a User Session.
As I mentioned, setSession() is not always the second step. The commands responsible for creating sessions are precisely the ones that do not require a session. Thus these commands (CreateServerSessionCommand, CreateRepositorySessionCommand and CreateUserSessionCommand) do not require setSession(); a short sketch of one of them follows this list.
(3) Some commands require specific setter methods to be called before the command is executed. These setters are marked as "Required" in the Javadoc; a few are marked optional.
(4) Inside a try block you execute the command. If there is an error then CommandException is thrown.
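For contrast, here is what one of those session-creating commands might look like. This is a hedged sketch: CreateServerSessionCommand is named in the API, but the getServerSession() accessor is my assumption, mirroring getUserSession() on CreateUserSessionCommand in the full sample below, so verify it against the Javadoc.

CreateServerSessionCommand createSessionCmd = new CreateServerSessionCommand(connections);
try {
    // No setSession() call here: this command is the one that produces a session.
    createSessionCmd.execute();
} catch (CommandException e) {
    e.printStackTrace();
    return;
}
String serverSession = createSessionCmd.getServerSession(); // assumed accessor name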
Here is a sample of RetrieveValidationsCommand in action…
/*
* Created on Feb 7, 2008
*
* To change the template for this generated file go to
* Window>Preferences>Java>Code Generation>Code and Comments
*/
package demo.validation;
import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.GetRepositoryRegionListCommand;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.ConnectionPool;
import com.sap.mdm.net.ConnectionPoolFactory;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;
import com.sap.mdm.validation.ValidationProperties;
import com.sap.mdm.validation.ValidationPropertiesResult;
import com.sap.mdm.validation.commands.RetrieveValidationsCommand;
/**
*
*
* To change the template for this generated type comment go to
* Window>Preferences>Java>Code Generation>Code and Comments
*/
public class GetListOfValidations {
public static void main(String[] args) {
// create connection pool to a MDM server
String serverName = "LOCALHOST";
ConnectionPool connections = null;
try {
connections = ConnectionPoolFactory.getInstance(serverName);
} catch (ConnectionException e) {
e.printStackTrace();
return;
}
// specify the repository to use
// alternatively, a repository identifier can be obtained from the GetMountedRepositoryListCommand
String repositoryName = "INQDemo";
String dbmsName = "localhost";
RepositoryIdentifier reposId =
new RepositoryIdentifier(repositoryName, dbmsName, DBMSType.ORACLE);
// get list of available regions for the repository
GetRepositoryRegionListCommand regionListCommand =
new GetRepositoryRegionListCommand(connections);
regionListCommand.setRepositoryIdentifier(reposId);
try {
regionListCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
RegionProperties[] regions = regionListCommand.getRegions();
// create a user session
CreateUserSessionCommand sessionCommand =
new CreateUserSessionCommand(connections);
sessionCommand.setRepositoryIdentifier(reposId);
sessionCommand.setDataRegion(regions[0]); // use the first region
try {
sessionCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
String sessionId = sessionCommand.getUserSession();
// authenticate the user session
String userName = "admin";
String userPassword = "admin";
AuthenticateUserSessionCommand authCommand =
new AuthenticateUserSessionCommand(connections);
authCommand.setSession(sessionId);
authCommand.setUserName(userName);
authCommand.setUserPassword(userPassword);
try {
authCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
// the main table, hard-coded
TableId mainTableId = new TableId(1);
// Get the list of validations
RetrieveValidationsCommand objRtvVldCmd =
new RetrieveValidationsCommand(connections);
// set the user session
objRtvVldCmd.setSession(sessionId);
// get validations for the following table
objRtvVldCmd.setTableId(mainTableId);
try {
objRtvVldCmd.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
ValidationPropertiesResult objVldPropRslt =
objRtvVldCmd.getValidationPropertiesResult();
ValidationProperties[] validations = objVldPropRslt.getValidations();
// display --> Validation ID | error/warning message | Validation Name
for (int i = 0; i < validations.length; i++) {
System.out.println(
validations[i].getId()
+ " | "
+ validations[i].getMessage()
+ " | "
+ validations[i].getName());
}
}
}
Vijendra Singh Bhanot is a certified XI Consultant
Thursday, February 5, 2009
MDM Java API 2 an introductive series part IV
Tobias Grunow
Introduction
In my last Blog I showed you how to get an authenticated user session using a trusted connection. In this part I want to show you how to gather MDM system information from a SAP Portal system object.
After that I will introduce the basic concept behind working with MDM and the Java API 2, showing you how the system works (RecordID, Lookup Values, Display Fields...).
If you build a solution based on MDM, you will sooner or later face a two- or three-tier MDM system landscape: there will be a development system (D-system), possibly a quality assurance system (Q-system), and a productive system (P-system). This scenario can involve multiple MDM servers, or perhaps just different repositories on the same server, plus possibly D-, Q- and P-clients. So you face the problem of how to address the different repositories from your code without having to hard-code which one is right for the landscape you are in.
In case you are using an SAP Portal, I want to point you to a very useful thread I found on SDN.
Hard-code credentials - any other solution exists?
This forum post shows how to work with SAP Portal system objects and how to retrieve information from them. I used this concept, combined with the trusted-connection user-session authentication, to address the problem just described, and it worked fine for me.
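If no portal system object is available, one simple alternative is to externalize the connection data into one properties file per landscape and load the right one at runtime. The following is a minimal sketch; the file names and property keys are assumptions for illustration, not part of the MDM API.

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class RepositoryConfig {
    // landscape is "D", "Q" or "P"; expects files like mdm-D.properties on the classpath root
    public static Properties load(String landscape) throws IOException {
        Properties props = new Properties();
        FileInputStream in = new FileInputStream("mdm-" + landscape + ".properties");
        try {
            props.load(in); // assumed keys: mdm.server, mdm.repository, mdm.dbms
        } finally {
            in.close();
        }
        return props;
    }
}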
Let us continue with the next topic...
MDM Java API 2 introducing the concept
First of all I want to give you a brief introduction to the concepts behind MDM, so that my future coding is easier to understand. As a coder you need a much better understanding of these concepts than anyone else: plain GUI (graphical user interface) users don't need to understand the technical details behind data storage and operations the way Java programmers do. So the most important task at the beginning is to understand how MDM stores data and connects the data with each other.
So let’s start with the first graphic:
Figure 1: Table concept in MDM
The first graphic shows the table concept of MDM. The central table, called the "Main table", is the centre of the model. From the main table, references point towards various kinds of sub tables. The main table contains fields; these fields can hold values stored directly in the table, or hold a reference to a record of a sub table. Sub tables can store data values or hold references to other sub tables.
To illustrate the possible layout of a main table, take a look at Figure 2.
Figure 2: Possible layout in main table (MDM Console view)
As you can see, the main table stores some values (e.g. text) directly, as well as references to other sub tables (lookups of different kinds). To find out where the lookups point, you can open the MDM Console and click on an entry in the list of fields in the main table to see its details (Figure 3).
Figure 3: Main table field details (Bottom part of MDM Console)
If we look at the details we have to notice two important things.
First, Figure 3 shows a field detail named "CODE". This code is very important for us because it is the name we will use in our code to address this field in the table. Second, notice that the field is of type Lookup [Flat]. This tells us that the value we will find in this field will be of type RecordID.
A RecordID is the ID of a record in the sub table (e.g. the Sales Product Key table [Detail: Lookup Table]). This means we will not be able to access the value directly by reading the field in the main table; we will only get a reference to the sub table record which holds the actual value. In my future blogs I will give more details on the concepts behind MDM and give examples of other techniques.
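As a hedged sketch of what that means in code: the FieldId below is the one from this example repository, and the LookupValue accessor name is my assumption based on the API's value-type classes (com.sap.mdm.valuetypes), so verify it against the Javadoc.

// Reading a Lookup [Flat] field from a record returned by a search
Record record = Result.getRecord(0);                    // first record of the RecordResultSet below
MdmValue value = record.getFieldValue(new FieldId(27)); // the Lookup [Flat] field "TYP"
if (value instanceof LookupValue) {
    RecordId referencedId = ((LookupValue) value).getLookupId(); // assumed accessor
    // referencedId points into the sub table; a further search against that
    // table is needed to read the actual display value it refers to.
}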
So much for MDM concepts; let's get to the Java API and some examples.
Searching in MDM
Searching is one of the most commonly used pieces of functionality in MDM.
Searching a repository has some prerequisites: first, a connection to the MDM server, and second, an authenticated user session on a repository. In my previously published blogs I showed you how to set up both. In the class provided I have combined all the steps necessary to get the connection and the user session. So now let's get to the code.
package com.sap.sdn.examples;
import java.util.Locale;
import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.SetUnicodeNormalizationCommand;
import com.sap.mdm.commands.TrustedUserSessionCommand;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.data.ResultDefinition;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionAccessor;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.SimpleConnectionFactory;
import com.sap.mdm.search.FieldSearchDimension;
import com.sap.mdm.search.Search;
import com.sap.mdm.search.TextSearchConstraint;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;
public class SearchExamples {
// Instance variables needed for processing
private ConnectionAccessor mySimpleConnection;
// Name of the server that MDM runs on
private String serverName = "IBSOLUTI-D790B6";
// Name of the repository shown in the mdm console
private String RepositoryNameAsString = "SDN_Repository";
// Name of the DB-Server this could be an IP address only
private String DBServerNameAsString = "IBSOLUTI-D790B6\\SQLEXPRESS";
// Define the Database type (MS SQL Server)
private DBMSType DBMSTypeUsed = DBMSType.MS_SQL;
// Create a new data region
private RegionProperties dataRegion = new RegionProperties();
// Session which will be used for searching
private String userSession;
// Default user name
private String userName = "Admin";
// Password is empty on default setup
private String userPassword ="";
// result we will get from mdm
public RecordResultSet Result;
/**
* Constructor for class
*/
public SearchExamples(){
// Set the Data Region
dataRegion.setRegionCode("engUSA");
// Set the locale on data region
dataRegion.setLocale(new Locale("en", "US"));
// Set the name of data region
dataRegion.setName("US");
// get a connection to the server
this.getConnection();
// Authenticate a user session
try {
this.getAuthenticatedUserSession();
} catch (ConnectionException e) {
// Do something with exception
e.printStackTrace();
} catch (CommandException e) {
// Do something with exception
e.printStackTrace();
}
// Get resulting records
Result = this.SearchTypes();
}
// ResultDefinition Main Table declaration
private ResultDefinition rdMain;
/**
* Method that will search for all records in main table of a certain type
*
* @return RecordResultSet that holds all resulting records from search
*/
public RecordResultSet SearchTypes() {
/**
* 1. First we create the Result Definition. This result definition will
* tell the search which fields are of interest to us. The list could
* include all fields of the table or only the ones we are interested
* in.
*/
// Define which table should be represented by this ResultDefinition
// In my repository this is the table MAINTABLE
rdMain = new ResultDefinition(new TableId(1));
// Add the desired FieldId's to the result definition
// In my repository this is the field PRODUCT_NAME
rdMain.addSelectField(new FieldId(2));
// In my repository this is the field TYP
rdMain.addSelectField(new FieldId(27));
/**
* 2. Create the needed search parameters.
* Define what to search for and where.
*/
// Create the field search dimension [Where to search!?]
FieldSearchDimension fsdMaintableType = new FieldSearchDimension(new FieldId(27));
// Create the text search constraint [What to search for?! (Every record that contains ROOT)]
TextSearchConstraint tscTypeRoot = new TextSearchConstraint("ROOT", TextSearchConstraint.CONTAINS);
/**
* 3.
* Create the search object with the given search parameters.
*/
// Create the search
Search seSearchTypeRoot = new Search(new TableId(1));
// Add the parameters to the search
seSearchTypeRoot.addSearchItem(fsdMaintableType, tscTypeRoot);
/**
* 4.
* Create the command to search with and retrieve the result
*/
// Build the command
RetrieveLimitedRecordsCommand rlrcGetRecordsOfTypeRoot = new RetrieveLimitedRecordsCommand(mySimpleConnection);
// Set the search to use for command
rlrcGetRecordsOfTypeRoot.setSearch(seSearchTypeRoot);
// Set the session to use for command
rlrcGetRecordsOfTypeRoot.setSession(this.userSession);
// Set the result definition to use
rlrcGetRecordsOfTypeRoot.setResultDefinition(rdMain);
// Try to execute the command
try {
rlrcGetRecordsOfTypeRoot.execute();
} catch (CommandException e) {
// Do something with the exception
e.printStackTrace();
}
// Return the result
return rlrcGetRecordsOfTypeRoot.getRecords();
}
/**
* Create and authenticate a new user session to an MDM repository.
*
* Uses the following instance variables (the method itself takes no parameters):
* mySimpleConnection - the connection to the MDM Server;
* RepositoryNameAsString - name of the repository to connect to;
* DBServerNameAsString - name of the DB server;
* DBMSTypeUsed - type of DBMS that MDM works with;
* dataRegion - RegionProperties defining the language the repository should
* be connected with;
* userName - name of the user that should make the connection to the repository;
* userPassword - password of the user, used if the connection is not trusted.
*
* @return the authenticated session identifier
* @throws ConnectionException propagated from the API
* @throws CommandException propagated from the API
*/
public String getAuthenticatedUserSession() throws ConnectionException, CommandException {
/*
* We need a RepositoryIdentifier to connect to the desired repository
* parameters for the constructor are: Repository name as string as read
* in the MDM Console in the "Name" field DB Server name as string as
* used while creating a repository DBMS Type as string - Valid types
* are: MSQL, ORCL, IDB2, IZOS, IIOS, MXDB
*/
RepositoryIdentifier repId = new RepositoryIdentifier(
RepositoryNameAsString, DBServerNameAsString, DBMSTypeUsed);
// Create the command to get the Session
CreateUserSessionCommand createUserSessionCommand = new CreateUserSessionCommand(
mySimpleConnection);
// Set the identifier
createUserSessionCommand.setRepositoryIdentifier(repId);
// Set the region to use for Session - (Language)
createUserSessionCommand.setDataRegion(dataRegion);
// Execute the command
createUserSessionCommand.execute();
// Get the session identifier
this.userSession = createUserSessionCommand.getUserSession();
// Authenticate the user session
try {
// Use command to authenticate user session on trusted connection
TrustedUserSessionCommand tuscTrustedUser = new TrustedUserSessionCommand(
mySimpleConnection);
// Set the user name to use
tuscTrustedUser.setUserName(userName);
tuscTrustedUser.setSession(this.userSession);
tuscTrustedUser.execute();
this.userSession = tuscTrustedUser.getSession();
} catch (com.sap.mdm.commands.CommandException e) {
/* In Case the Connection is not Trusted */
AuthenticateUserSessionCommand authenticateUserSessionCommand = new AuthenticateUserSessionCommand(
mySimpleConnection);
authenticateUserSessionCommand.setSession(this.userSession);
authenticateUserSessionCommand.setUserName(userName);
authenticateUserSessionCommand.setUserPassword(userPassword);
authenticateUserSessionCommand.execute();
}
// For further information see:
// http://help.sap.com/javadocs/MDM/current/com/sap/mdm/commands/SetUnicodeNormalizationCommand.html
// Create the normalization command
SetUnicodeNormalizationCommand setUnicodeNormalizationCommand = new SetUnicodeNormalizationCommand(
mySimpleConnection);
// Set the session to be used
setUnicodeNormalizationCommand.setSession(this.userSession);
// Set the normalization type
setUnicodeNormalizationCommand
.setNormalizationType(SetUnicodeNormalizationCommand.NORMALIZATION_COMPOSED);
// Execute the command
setUnicodeNormalizationCommand.execute();
// Return the session identifier as string value
return this.userSession;
}
/*
* Retrieves a ConnectionAccessor from the factory and stores it in the
* instance variable mySimpleConnection. The accessor is needed every time
* you want to execute a command, be it a search or any other command.
*/
public void getConnection() {
String sHostName = serverName;
// We need a try/catch block here (or a throws clause on the method) because getInstance can throw a ConnectionException
try {
/*
* retrieve connection from Factory The hostname can be the name of
* the server if it is listening on the standard port 20005 or a
* combination of Servername:Portnumber eg. MDMSERVER:40000
*/
mySimpleConnection = SimpleConnectionFactory.getInstance(sHostName);
} catch (ConnectionException e) {
// Do some exception handling
e.printStackTrace();
}
}
}
To test the code we also need a simple test class that instantiates the sample class and prints the number of retrieved records.
package com.sap.sdn.examples;
public class Test {
/**
* @param args
*/
public static void main(String[] args) {
// Create instance of search class
SearchExamples test = new SearchExamples();
// Print out the amount of found records
System.out.println("Total of " + test.Result.getCount() + " Records found that match criteria TYPE=ROOT");
}
}
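The test class only prints the record count. To connect the result back to the RecordID discussion above, here is a minimal sketch of reading the two selected fields from each record. It assumes that RecordResultSet offers an index-based getRecord(int) and that Record.getFieldValue(FieldId) returns an MdmValue, as described in the MDM Java API 2 Javadoc, so treat it as a sketch rather than tested code.
package com.sap.sdn.examples;
import com.sap.mdm.data.Record;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.valuetypes.MdmValue;
public class ResultPrinter {
// Prints the two fields selected in the ResultDefinition for every record found
public static void print(RecordResultSet result) {
for (int i = 0; i < result.getCount(); i++) {
// Assumption: RecordResultSet exposes an index-based accessor
Record record = result.getRecord(i);
// PRODUCT_NAME (FieldId 2) is a text field stored directly in the main table
MdmValue name = record.getFieldValue(new FieldId(2));
// TYP (FieldId 27) is a Lookup [Flat]: the raw value carries a RecordID
// reference, not the display text of the sub table record
MdmValue type = record.getFieldValue(new FieldId(27));
System.out.println(name + " / " + type);
}
}
}
Calling ResultPrinter.print(test.Result) from the Test class above would then list every record found.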
I wrote a lot of comments in the code but I will give you some more details on what is happening there.
First of all there are some instance variables that hold the connection information to the MDM Server and the MDM Repository.
The constructor will set up some variables which are needed for the search.
A connection to the MDM server will be created.
A session will be created and authenticated: if there is a trusted connection we use it to authenticate; if we only have a normal connection to the server, normal password authentication is used.
The search will be triggered and the result will be stored in an instance variable.
The search needs some elements to work with.
At the beginning of the SearchTypes() method [comment 1 in code] we set up a ResultDefinition, which tells the search which fields are of interest to the program and should be accessible in the result. If this definition does not include a field that you later try to access in your code, an exception will be thrown.
[Comment 2. in code] Define a search dimension and a search constraint to tell the search where to look and what to look for. There are many more search constraints to work with besides CONTAINS; see the short sketch below.
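As a hedged illustration, this sketch builds a search with two constraints. Two assumptions go beyond the listing above: that TextSearchConstraint also defines an EQUALS constant alongside CONTAINS, and that multiple search items added to one Search object are combined with AND. The value "Pump" is made up.
package com.sap.sdn.examples;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.search.FieldSearchDimension;
import com.sap.mdm.search.Search;
import com.sap.mdm.search.TextSearchConstraint;
public class CombinedSearchExample {
// Builds a search for records whose TYP contains "ROOT" and whose PRODUCT_NAME equals "Pump"
public static Search buildSearch() {
Search search = new Search(new TableId(1)); // main table, as in the listing above
// TYP (FieldId 27) contains "ROOT" - identical to the constraint in the listing
search.addSearchItem(new FieldSearchDimension(new FieldId(27)), new TextSearchConstraint("ROOT", TextSearchConstraint.CONTAINS));
// PRODUCT_NAME (FieldId 2) equals "Pump" - assumption: the EQUALS constant exists
search.addSearchItem(new FieldSearchDimension(new FieldId(2)), new TextSearchConstraint("Pump", TextSearchConstraint.EQUALS));
return search;
}
}
The Search built this way can be handed to RetrieveLimitedRecordsCommand exactly as in the listing above.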
[Comment 3. in code] Define the search itself and use the dimension and constraint to set it up.
[Comment 4. in code] Build the command that is needed to execute the search.
For more details on commands, please see the SAP MDM Javadoc referenced in the code comments above.
So this is just a very simple example, and I will give you more advanced code in future blogs.
If you have any questions on what the code does or need more detailed explanations please feel free to comment on this Blog. If this code helped you a little please feel free to comment as well.
So this will be the last Blog for this year since I am rebuilding my flat at the moment. The topic of the next Blog needs to be defined and I will update my agenda in my first Blog next year.
So I wish you a happy new year and, as you would say in German: "Einen guten Rutsch ins neue Jahr!"
Best regards,
Tobi
Tobias Grunow is a Solution Consultant for IBSolution GmbH Heilbronn, Germany. He is a member of the research and development team.
Resolve Consolidated Reporting Problem in multiple BI systems through SAP MDM Ankur Ramkumar Goel
Resolve Consolidated Reporting Problem in multiple BI systems through SAP MDM
Ankur Ramkumar Goel
Summary
Management uses BI reports for planning, strategy and business decisions. BI systems get data from multiple systems to provide consolidated data. However, each transactional system maintains its own master data, which in turn is sent to BI. Master data in BI is therefore highly redundant and does not show a single version of the truth. As a result, management gets wrong reports and information about the organization's performance, which leads to wrong planning and decisions.
We will try to address this issue with the help of a master data management solution, so that management gets a true view of the global business and is equipped with unified business information for optimized operations and faster, more accurate decisions.
Current Scenario
Organizations have ended up with scattered, distributed IT landscapes for various reasons: operations across geographies, mergers and acquisitions, and best-of-breed products. With the introduction of BI for reporting, organizations now run separate BI systems per geography, system, business unit or function. Every transactional system needs to create its own master data, as master data drives transactions. These transactional systems are usually connected to their own BI systems and do not share their data with other systems for various reasons: complexity, the number of connections, the cost of maintaining connections, lack of governance, and so on. Thus every BI system ends up getting master data from its own transactional systems, which is in turn fed to the corporate BI system for consolidated reporting. Without consistent master data, data warehousing becomes garbage in, garbage out.
Because of all this, master data in BI is highly redundant and does not show a single version of the truth. Management therefore gets reports presenting wrong information about the organization's performance, which leads to wrong planning and decisions. Management might be looking at reports that show one single customer as three or more customers (e.g. John Kennedy, John F Kennedy, J F Kennedy) or the same with vendors (e.g. Satyam, Satyam Computers, Satyam Computer Services Ltd). Surveys suggest that more than 70% of decisions are made wrongly because of incomplete or wrong information, and that organizations spend 30% of their time verifying reports. One organization found that 40% of its orders were stuck because of mismatched master data.
BI systems have an ETL layer, but it is built for reporting use only; it is not configured or optimized for cleansing master data. Organizations have also been trying to solve master data problems for many years with their own methods and tools, treating the symptoms rather than the root cause. The industry has recognized this problem and has come up with dedicated tools to manage master data. With the evolution of MDM tools, organizations can benefit from best practices, reduced effort and greater ease. Maintaining master data outside the BI system will also help.
Below are two scenarios showing organizations distributed IT landscapes.
Scenario 1 – Organization's BI landscape by geography
image
Scenario 1
This scenario shows an organization's landscape distributed across geographies to cater to local and corporate reporting requirements.
Scenario 2 – Organization's BI landscape by function
image
Scenario 2
This scenario shows an organization's landscape split by function, such as finance and logistics.
Suggested Approach
A single version of the truth for master data across the organization can be achieved by introducing an MDM system. One organization found that 37% of the vendors in its systems were duplicates.
Since we are handling the organization's master data, the MDM system must be introduced without disrupting other systems or the business. A small-steps approach is therefore recommended when bringing master data management into an organization. To get correct consolidated reports, two approaches are suggested below; the second approach has two stages. An organization can go directly for approach 2.2, but that decision has to weigh the disturbance to the existing landscape, and a strong governance mechanism must be in place for CMDM.
Approach 1 Harmonization across BI
Approach 2.1 Master Data from Legacy Systems
Approach 2.2 Central Master Data Management (CMDM)
Approach 1 – Harmonization across BI systems only
Here we do not interfere with the organization's current master data flow or mechanisms at all, so we are not disruptive to the current landscape and processes. We take master data from the BI systems only, cleanse and map it, and send it back to the BI systems. The MDM system also sends back the mapping of local BI IDs to global MDM IDs (a small sketch of such a key mapping follows the figure below). With this approach, management can get reports on both local BI IDs and global MDM IDs.
image
Approach 1 Harmonization across BI systems to achieve consolidated reporting
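To make the ID mapping tangible, here is a purely illustrative sketch in plain Java (all system names and IDs are made up): three local BI records that represent the same customer are related to one global MDM ID, which is what enables grouping in a consolidated report.
import java.util.HashMap;
import java.util.Map;
public class KeyMappingIllustration {
public static void main(String[] args) {
// Local BI IDs (system:id) mapped to the global MDM ID they were matched to
Map<String, String> localToGlobal = new HashMap<String, String>();
localToGlobal.put("BI_EU:100234", "MDM:CUST-000017");
localToGlobal.put("BI_US:558812", "MDM:CUST-000017");
localToGlobal.put("BI_APJ:77121", "MDM:CUST-000017");
// A consolidated report can now group the three local records as one customer
System.out.println(localToGlobal.get("BI_US:558812")); // prints MDM:CUST-000017
}
}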
Benefits:
. Derive immediate benefits from consolidated master data through BI reports
. Mitigated risk of using MDM in the transactional landscape
. Immediate benefit from control of the master data of newly merged and acquired companies
. No Interface is required between MDM and BI
. Not disruptive to existing landscape
Limitations:
. Master data integrity is achieved only for analytics, not throughout the organization
Approach 2.1 – Master Data from Legacy Systems
In this approach, we introduce the MDM system at the level of the transactional systems, before the BI layer. The MDM system receives all master data from the source systems and maintains it. This cleansed master data is then passed on to the BI systems, so that reports are based on consolidated, cleansed master data. Here too we do not interfere with the source systems; they continue with their own master data creation and processes.
However, this still does not solve the root cause of the problem, so organizations should afterwards move on to the next approach.
image
Approach 2.1 Master Data is coming from source systems and fed to BI
Benefits
. Immediate benefit from control of the master data of newly merged and acquired companies
. Ensures data integrity across Transactional landscape
. Sets stage for easy re-use via distribution or business processes that directly leverage consolidated data.
. Sets stage to achieve CMDM
Approach 2.2 – Central Master Data Management (CMDM)
This is true master data management. However, organizations need to prepare themselves first, and careful planning is required; this approach will fail if a strong governance mechanism is not put in place.
CMDM becomes the central master data management system and maintains all master data across the organization's landscape. The result is a single version of the truth across the landscape, and in particular in reporting.
image
Approach 2.2 Consistent Master Data is coming through CMDM
It is up to the organization to implement CMDM as per its convenience, readiness and governance mechanism. There are two ways to achieve CMDM:
1. Organizations can continue to create master data in their local systems and have it cleansed and synchronized to the respective systems with the help of MDM.
2. Organizations create master data centrally in MDM only; it is then sent back to the respective systems to maintain a synchronized version of master data across the landscape.
Benefits
. Centralized management of master data
. Ensures data integrity through Enriching and Standardization of master data across landscape
. Centralized governance - ensures that when data changes in one application, the other business applications that depend on that data are updated with a consistent view of it in real time
MDM Benefits
Below are the benefits an organization will get with the help of master data management.
· Global View of Business
· Single view of truth
· Optimized Operations
· Easy Integration of newly acquired systems
· Elimination of manual and Redundant processes
· Full Interoperability
· Greater Visibility
· Better Accuracy
· Reduction in maintenance cost
· Faster cycles
SAP MDM Benefits
· Easy Integration with non-SAP, SAP and BI systems
· Predefined content for configuration
· Driving eSOA
· Java enabled content for EP and other external systems
· No coding required, easy configuration
· Workflow in windows based Visio
Company: Satyam Computer Services Ltd.
Role: Projects Lead / Project Manager
The views shared above are my personal views and may not reflect my company's views.
Ankur Ramkumar Goel has over 7 years of SAP experience in ABAP, BI and MDM, with implementation, rollout, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has been instrumental in MDM for 3 years.
Tuesday, February 3, 2009
Central MDM Business Partner Creation Scenario with Galaxy (BPM) - Part I Satyajit Chakraborty
Central MDM Business Partner Creation Scenario with Galaxy (BPM) - Part I
Satyajit Chakraborty
With NetWeaver CE EhP1, SAP has announced the availability of the new BPM tool called Galaxy. There have been a lot of blogs on SDN talking about Galaxy and showcasing a few use cases. This blog is the first in a series of two (or maybe three) about a new scenario: managing a master data management process with Galaxy and the new rules framework.
The first part details the scenario, lists its notable features and finally shows it as modeled in Galaxy. The second part will contain more implementation details.
The process is that of new business partner creation in MDM in the central MDM scenario, as shown in the following diagram:
Process Specification
Note that even though the diagram is for a vendor creation scenario, it can be reused for the business partner creation scenario as well.
Before we look at the process modeling in Galaxy here are a few things to note about the process:
§ 3 roles: Vendor Manager, Accounting Agent, Procurement Agent
§ Parallel enrichment of data by the Accounting Agent and the Procurement Agent
§ Manual/Auto approval of a record is based on a rule
§ DUNS number assignment is automatic (not entered by either of the agent roles)
From the MDM perspective, the things to note are:
§ In the first step of the process the Vendor Manager searches for a record in MDM, and the absence of appropriate results allows him/her to create a new record in MDM from the search parameters
§ All enrichments happen on an already created and checked-out record
§ Before persisting any change the record should undergo MDM validations defined in the repository
§ When a record is approved it is checked-in and the process ends
A few implementation considerations that are worth mentioning are:
§ Usage of the standard SAP delivered Business Partner repository
§ Web Dynpro for Java for the user interaction steps in the process
§ Auto or manual approval of a record is based on the presence or absence, respectively, of the record's SSN entry
§ Usage of MDM web services and Java APIs in cases where there were no web services
§ Role based access to the application for the two different agent roles
With all that in mind let's look at the process model in Galaxy. This blog is not about how to model processes in Galaxy and so I'll just put the final completed process model here. If you are interested in learning modeling with Galaxy you can find a lot of good resources about both Galaxy modeling and BPMN modeling (quick hint: search SDN for "Ginger Gatling Galaxy").
BPM Model
Looking at the process model you might have already noticed that there is a disparity between the original process specification and the modeled process. The first step of searching for a business partner by the Vendor Manager is missing from the Galaxy model! The reason is this: the process to create a new business partner is started only if the Vendor Manager chooses to do so.
Hence there is a need to start the process remotely rather than manually via the NWA tool. The future parts of this blog series will show the implementation details of how to start a Galaxy process remotely.
Satyajit Chakraborty is part of the BST Innovation Center team in Palo Alto.
Saturday, January 31, 2009
Using SAP MDM 7.1 effectively and efficiently vinay swarup
Using SAP MDM 7.1 effectively and efficiently
vinay swarup
What is new in SAP MDM 7.1 is no longer hot news; everyone interested in SAP MDM must be aware of it by now. This blog mainly aims at discussing those features and their possible uses. These are only my own ideas and suggestions.
Well, to begin with the most important feature: multiple main tables. What an important feature this can be! Many times in real-life scenarios we need to share master data across loosely or tightly coupled objects, such as customer and vendor, or product and material, as the business requires. Now, when one has finished modeling the vendor and has a customer model in the pipeline, the vendor model can be reused by simply adding one more main table named Customer; all the common supporting tables (regions, countries, taxes, etc.) are shared. A further feather in the cap is that these two main tables can actually talk to each other through the new lookup type Lookup [Main]. There are vendors who are also customers of an organization, and this scenario can now be modeled better. Other complex, non-traditional objects, say a bank, can now have relational data modeling: a bank might have several schemes modeled as products and linked to customers, all in one repository. So far so good, but some cautions should be kept in mind before actually freezing the model in such cases.
In my view, the following points are worth a thought:
a) How much reusability is there? Are the objects defined as main tables in one repository well coupled, and can the other supporting data tables be reused effectively? How much effort is actually saved?
b) Most importantly, how bulky would the repository be? Overloading a repository with tables results in bad performance, especially while creating a record from the portal, loading and unloading, searching for a record, and so on. So be careful while making your final call, and justify the need.
c) Suppose I have customer and vendor as two main tables in a single repository, with a portal use case for each object. Unloading the repository to work on one table stops the portal working for the other table as well. This means that although one object has no problem of its own, it still suffers downtime because of the other object, as the complete repository is down.
Now coming to the next feature: the introduction of tuples. This data structure can be a dark horse, as what MDM was missing was a way to define complex data structures by reusing already created fields. Again, the only challenge is to first justify the need for a tuple, as it is a complex data structure and directly affects the repository's performance.
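As a purely conceptual illustration (plain Java, not MDM repository syntax, and the field names are invented), a tuple behaves like a reusable nested structure that a main table record can hold one or many times:
import java.util.ArrayList;
import java.util.List;
public class TupleIllustration {
// The "tuple": a group of fields defined once and reused wherever needed
static class PhoneNumber {
String countryCode;
String number;
}
// A main-table record holding a multi-valued tuple field
static class BusinessPartner {
String name;
List<PhoneNumber> phones = new ArrayList<PhoneNumber>();
}
}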
Next come trusted connections. Although they were launched earlier, the MDM standard iViews did not support trusted connections (correct me if I am wrong). I hope this is now resolved, as it is helpful and lets us get rid of storing passwords while transporting content from one environment to another.
Moving on, transport of MDM objects is better: you can now see your validations, expressions and workflows being transported as well, and CTS+ promises to transport MDM content. Again, a welcome move from SAP.
Last but not least, SAP has promised considerably better repository performance, with reduced downtime and an extended list of online activities.
So let's wait and watch this much-awaited blue-eyed product.
vinay swarup is a Netweaver MDM EP Consultant with Accenture Services
vinay swarup
It is no more a hot news as what all is new in SAP MDM 7.1.Everyone interested in SAP MDM must have been aware of it by now. This blog mainly aims at discussing those features and their possible use .This is only my idea of representation and suggestion.
Well to begin with the most important feature ---having Multiple main tables. Wow! what important use it can be of !Many times in real life scenarios we need to share master data across different loosely or tightly coupled objects like Customer and Vendor ,Products and material as per the business needs .Now when one has actually finished modeling for Vendor and has a model for customer in pipeline ,he can reuse the vendor model by simply adding one more main table with name customer. In this case all the common tables supplying information like regions ,Countries ,Taxes etc. Also feather in the cap is these two main tables can actually talk to each other with new Lookup type Lookup[main].There are vendors who are also customers to an organization and this scenario can now be modeled better. Other complex and untraditional objects say Bank can now have relational data modeling .egs Bank might have several schemes modeled as product and it can be linked to customers in one repository only. So far so good .But some cautions should be kept in mind before actually freezing the modeling in such typical cases.
According to me following points might be worth giving a thought
a)How much reusability is there? Are objects defined as main tables in one repository well coupled and other supporting data tables can be reused effectively? Also how much efforts are saved?
b)Most importantly How bulky would repository be ?Overloading a repository with tables would result in bad performance especially while creating a record from portal, loading and unloading, searching for a record etc.So be careful while making your final call and justify your need
c) Suppose I have customer and vendor as two main tables in a single repository and I have two portal use cases for each object (customer and vendor) .But downloading a repository for any table would stop the portal working for another table .This means although my one object has no problem , it still suffers a downtime because of other object as complete repository is down .
Now coming to next feature --Introduction of Tuple. This data structure can be a dark horse as what MDM was missing was defining complex data structure by reusing the already created fields. Again only challenge is to first justify your need for tuple creation as it is a complex data structure and is proportionally connected to repository’s performance
Next come trusted connections. Although these were launched earlier, the MDM standard iViews did not support trusted connections (correct me if I am wrong). I hope this is now resolved, as it can be helpful: we can get rid of storing passwords while transporting content from one environment to another.
Moving on, transport of MDM objects is better: you can now see your validations, expressions and workflows getting transported as well, and CTS+ promises to transport MDM content. Again, a welcome move from SAP.
Last but not least, SAP has promised considerably better performance from the repository: downtime is reduced, and the list of activities that can be performed while the repository stays online has been extended.
So let's wait and watch this much-awaited, blue-eyed product.
Vinay Swarup is a NetWeaver MDM/EP Consultant with Accenture Services.
Thursday, January 29, 2009
Central MDM Business Partner Creation Scenario with Galaxy (BPM) - Part III
The first two parts of this series dealt with an overview of the process (Part I) and some implementation details (Part II). In this third and final part I'll talk about the rules integration as well as remote invocation of the process.
As a memory refresher: we wanted a rule such that if the new business partner record doesn't have an SSN entered, the record should be approved manually; otherwise it should be approved automatically.
Think of this rule as a function which takes two arguments: "context", which passes the entire Web Dynpro context to the function, and "return" of type Boolean, which is set to true if manual approval is needed and false otherwise. The rule engine picks up the SSN entry from the context, evaluates it and sets the return value accordingly.
Here's the rule at design time:
[Image: the rule definition at design time]
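To make the rule's behavior concrete, here is what it effectively computes, written out as plain Java. This is illustrative only: the real rule is modeled in the Rules Composer, and the context type and field name below are my own assumptions, not the generated interface.

public class ApprovalRuleSketch {

    // Stand-in for the Web Dynpro context handed to the rule.
    static class PartnerContext {
        String ssn;
        PartnerContext(String ssn) { this.ssn = ssn; }
    }

    // Returns true when manual approval is needed (no SSN entered),
    // false when the record can be auto-approved.
    static boolean checkApprovalMethod(PartnerContext context) {
        return context.ssn == null || context.ssn.trim().length() == 0;
    }

    public static void main(String[] args) {
        System.out.println(checkApprovalMethod(new PartnerContext(null)));          // true: manual approval
        System.out.println(checkApprovalMethod(new PartnerContext("123-45-6789"))); // false: auto approval
    }
}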
Since the return type is Boolean, we can conveniently assign this function as a condition on the non-default gate of the exclusive choice gateway. Check the process model in Part I for clarification.
For example, if you have called the ruleset "checkApprovalMethod" and the non-default gate of the exclusive choice is the one for auto-approval, then as a condition for this gate you can write: not(checkApprovalMethod(Context,IsAutoApproval)). The process takes this gate if the condition evaluates to true. Here Context and IsAutoApproval are two data objects that you have created; IsAutoApproval is of type Boolean.
So much for the rule; let's look at how to start the process remotely. True to the BPMN standard, every Galaxy process model has a Start event which can be bound to a web service interface and an operation of that interface. We can also handcraft WSDL files to suit our needs and bind them to the Start event.
For example, in this scenario I had to pass the record ID into the process context when the process is started, so I added the following XML element to the default process-start web service WSDL provided by Galaxy.
[Image: the XML element added to the process-start WSDL]
Logically, if we invoke this web service operation manually, it should start the process, and that is exactly what happens! All you need to do is find the URL of this web service's WSDL in the NWA, and the rest is easy. Since I was using Web Dynpro, I simply created an adaptive web service model from this web service and invoked the operation on the click of a button.
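Since the adaptive web service model classes are generated, the exact names vary from project to project; the following is only a sketch of how that button-click action might look inside the Web Dynpro view controller. Request_StartProcess, StartProcess, setRecordId and the RecordId context attribute are all hypothetical names standing in for whatever your model generator produced; substitute your own.

// Sketch only: this lives inside a generated Web Dynpro view controller,
// where wdContext and wdComponentAPI are provided by the framework.
public void onActionStartProcess(com.sap.tc.webdynpro.progmodel.api.IWDCustomEvent wdEvent) {
    try {
        // Build the request for the process-start operation and pass in
        // the MDM record ID that was added to the WSDL earlier.
        Request_StartProcess request = new Request_StartProcess();   // hypothetical generated class
        StartProcess input = new StartProcess();                     // hypothetical generated class
        input.setRecordId(wdContext.currentContextElement().getRecordId());
        request.setStartProcess(input);

        // Bind the request to the context and execute the call; this is
        // what starts the Galaxy process instance remotely.
        wdContext.nodeRequest_StartProcess().bind(request);
        wdContext.currentRequest_StartProcessElement().modelObject().execute();
    } catch (Exception e) {
        // Surface the failure to the user rather than swallowing it.
        wdComponentAPI.getMessageManager().reportWarning(e.getMessage());
    }
}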
That brings us to the end of this series. Let me know if you have any questions regarding the implementation or about this scenario by posting your comments.
Satyajit Chakraborty is part of the BST Innovation Center team in Palo Alto.