Tuesday, October 20, 2009

What is Data Modelling in SAP MDM

Data Modeling:

Data modeling means organizing data just as we organize files in a file cabinet. Placing the right data in the right place makes it easy to fetch, and organizing data into effective data structures leads to effective performance.

Wednesday, August 5, 2009

SAP MDM ONLINE TRAININGS WITH REALTIME CONSULTANTS

Hi,

We offer the following Online Training Courses :

SAP MDM,
EP, BW/BI 7.0, FI/CO, HR, ABAP, BASIS & Security, SD/CRM, MM/PP, ABAP-HR, PI/XI, QA

For further details please reach us at

Sree
+91-9379914378
www.sapmdm.co.in

Sunday, July 19, 2009

SAP MDM ONLINE TRAINING WITH REALTIME EXPERTS

HI,

We provide SAP MDM ONLINE training with realtime consultants at reasonable cost.

For further details reach us at

Sree
www.sapmdm.co.in
+91-9379914378

Wednesday, July 1, 2009

SAP MDM ONLINE TRAINING

Hi,

We offer the following Online Training Courses :

SAP MDM,
EP, BW/BI 7.0, FI/CO, HR, ABAP, BASIS & Security, SD/CRM, MM/PP, ABAP-HR, PI/XI, QA

For further details please reach us at

Sree
+91-9949512008
www.sapmdm.co.in

Friday, June 5, 2009

SAP Master Data Management Jobs

Monday, May 18, 2009

Getting Informed on SAP NetWeaver MDM 7.1 Documentation Updates Markus Ganser

To provide news on SAP NetWeaver MDM documentation updates in a timely manner, the MDM documentation team has provided a dedicated space in the MDM documentation center at SAP Service Marketplace for quite some time now (see SAP NetWeaver MDM 7.1 documentation updates, or SAP NetWeaver MDM 5.5 documentation updates).

To further streamline the publication channel for SAP NetWeaver MDM 7.1 documentation updates toward the MDM community on SDN, the MDM documentation team has decided to open a dedicated SDN thread in addition to the specified space at SAP Service Marketplace, and is committed to using the SDN discussion forum to continuously post information about documentation updates and new documentation. So it's up to you to select the channel that suits you best.

Simply watch this new SDN thread, or include it in your favourite links if you'd like to stay tuned to what's going on in SAP NetWeaver MDM 7.1 documentation.

Tribute goes to the MDM documentation team.

Regards,

Markus

Google maps inside MDM Vito Palasciano

Introduction

Sometimes it can be useful to locate a vendor in order to understand in which part of the world it carries out its business activities. This blog describes how to embed Google Maps directly into MDM so that it shows the vendor's address each time a vendor is selected.
Getting to work…

Using the Google Maps web site (http://maps.google.com/) it is possible to search for an address anywhere in the world. Once an address has been located, the “Link” functionality (the hypertext indicated by the red arrow) can be used to obtain the full link to the map, as highlighted in the red frame.

Google Maps

In the example above the link is http://maps.google.com/maps?f=q&source=s_q&hl=en&geocode=&q=Italy,+MILAN,+VIA+CALDERA+21
where the parameters are language (English, in this example) and address (Italy, MILAN, VIA CALDERA 21, in this example).

SAP MDM provides us with a tool named Web Tab, which can be used to call an external application simply by adding a URL to the Links table in the Console and specifying the same parameters.

It is necessary to paste a new version of the link into the Web Tab URL:
http://maps.google.com/maps?f=q&source=s_q&hl=en&geocode=&q=
where geo_map, the value appended as the q parameter, is a calculated field in MDM that contains the country name, city and company name, separated by commas.
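As an illustration (reusing the sample address from above, and assuming the calculated field simply produces a comma-separated string), once geo_map evaluates to "Italy, MILAN, VIA CALDERA 21" for the selected vendor, the Web Tab effectively opens:

http://maps.google.com/maps?f=q&source=s_q&hl=en&geocode=&q=Italy,+MILAN,+VIA+CALDERA+21

So the only MDM-side work is making sure the calculated field produces a comma-separated country/city/address string for each record.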

A new link version into the Web Tab URL

Additionally, it is necessary to set the Web Pane URL in the configuration option in MDM Data Manager:

Web Pane URL configuration option

From now on, it will be possible to select a vendor and see the related map, as in the following examples:

Techedge Milan

Techedge Chicago

And, finally, hoping that Techedge will sooner or later open a branch at Beau Vallon Beach…

Techedge Seychelles

Enjoy your maps!

Vito Palasciano

Vito Palasciano is a Solution Architect at TechEdge currently focused on SAP Master Data Management.

MDM: Expectations v/s Realities

Master Data Management as a concept has existed as long as the ERP market or even before. The concept has been to help efficiently manage transactional and reporting functions within an application by having a separate repository for master data. The master data function gets called dynamically within the execution of a transaction.

This concept has evolved over the years due to the growing complexity of businesses and hence of the IT systems required to support them. In recent years the concept has been gaining maturity by looking at master data as a function that feeds systems enterprise-wide, not the ERP alone. Hence MDM is now being viewed as an application de-coupled from the ERP applications.


Many vendors, some niche players and some ERP vendors, have products to address this space. Though Gartner has reported the relative capability of the various products in the ‘Magic Quadrant’, consultants and clients alike are aware that none of the products has matured enough to warrant the name ‘MDM package solution’. The products are still in the form of a tool/platform and a basic framework over which clients are expected to build a custom solution. Depending on what the client wants to achieve, the solution build-up can be as expensive as a custom solution on any other platform.


SAP’s MDM product addresses master data across the enterprise, such as material, vendor and customer. It also ships certain content that is well suited to an SAP environment, so a good amount of data modeling effort can be saved if the company has an SAP-based landscape. These are some of the key differentiators of SAP MDM against its contemporaries.

The other features, like de-duplication, merging, harmonization and so on, are also part of the product offering.


However, the above alone does not make the offering a rich or mature one. Today, companies need to orchestrate and manage master data as an application central to the enterprise. This needs a robust, user-friendly and configurable package with a host of functionalities. The requirements from the industry are the following:

* User-friendly screens
* Complex validations that may need to be called from other applications
* Easy-to-configure, manageable workflows for data approval



What can SAP do to meet the above requirements with the current MDM package?

* SAP needs to bring the key components of Enterprise Portal and MDM together. This would mean that the configuration of screens and fields is in sync and the two do not appear as disjoint applications.
* Since the companies preferring SAP MDM are predominantly SAP shops, SAP needs to have EP-MDM ready for an SAP landscape. The lead time to configure and set up central master data management should come down drastically. This will increase the adoption and usability of SAP MDM.
* SAP has tried to merge BO with SAP MDM, but these again should not be viewed and communicated as two different components; they should be communicated and projected as a single component. Currently there is a lot of confusion about which ETL is recommended: is it BOBJ's or SAP MDM's? This confusion in messaging adds to the reluctance in adoption.
* Workflow is a major component of central master data management. The workflow in some cases can be as complex as running across SAP MDM and SAP ECC or other SAP applications. This can be done using the new BPM, but this again needs to be offered as part of the product.



SAP needs to work on maturing the product into a robust single unit, rather than expecting customers to work with multiple components such as BPM, EP, Web Dynpro and MDM to put up an MDM solution. Essentially, the concept can be borrowed from ECC, wherein you have a transactional processing UI, configuration, and a developer workbench. The skills required for that are functional and ABAP technical, and it is easy to implement and use.

The skills required for setting up the current MDM solution, using a number of SAP components, are too many and can act as a big entry barrier to larger customer adoption.


SAP needs to put in a lot of resources and get this going fast before customers start looking at other products to meet their MDM needs. Most of the vendors are just catching up, and the faster they move the larger the share of the market they will acquire.
Historically SAP has developed a number of products and matured them quickly to gain market share; a more recent example is CRM. I would still place my bets on SAP maturing the MDM solution into the leadership space of the Magic Quadrant.

Ravishankar Hossur, Principal Consultant, SAP Technologies

SAP Business Objects and SAP NetWeaver MDM - Bringing Together The Best of Two Worlds--Markus Ganser

This is really good news for data stewards and other IT people dedicated to information management and data quality management: SAP Business Objects and SAP NetWeaver continue to team up in a common approach to establish sustained enterprise strategies.

Attentive community readers may already have noticed that the SAP NetWeaver Information Management space and the SAP Business Objects Information Management space are semantically closely related, both covering one and the same hot spot in current enterprise strategies applied to gain the competitive edge. This has already been addressed in the Extending the Value of SAP NetWeaver Business Intelligence with the Business Objects Business Intelligence Platform presentation. And - the common ground can also be observed in the master data management space.

As already depicted in the SAP NetWeaver Master Data Management roadmap graphic below, you can see one example of how these areas have started to interact, combining the best of two worlds featuring end-to-end data quality driven master data management.



Roadmap2008

[*) Please note that possible future developments are subject to change and may be changed by SAP at any time for any reason without notice]

With the Data Quality Management (DQM) for SAP NetWeaver MDM package at hand, you can leverage the powerful address cleansing capabilities of SAP Business Objects Data Services in SAP NetWeaver MDM*.

For an architectural overview of SAP NetWeaver MDM and SAP Business Objects interacting through the DQM-MDM Enrichment Adapter see the following graphic.

image"

From the package you will get the following benefits:

1. Ready-to-run sophisticated address cleansing capabilities on an international scale.
2. Master data excellence by combining master data quality and master data persistency
3. Streamlined service-based integration via the MDM Enrichment Architecture.
4. Workflow-embedded data enrichment process.

The related user guide provides a general overview of BusinessObjects™ Data Quality Management software, version for SAP NetWeaver MDM, as well as specific information for installing and integrating this product with SAP NetWeaver MDM.

SAP Business Objects Data Services customers can download the Data Quality Management (DQM) for SAP NetWeaver MDM package from the SAP Download Center (Path: Download > Installations and Upgrades > Entry by Application Group > SAP Business Objects packages and products > BOBJ DQM FOR SAP NW MDM).

*) refers to SAP NetWeaver MDM 5.5 SP6; version for SAP NetWeaver MDM 7.1 is coming soon (will be announced separately)

Thursday, May 14, 2009

SAP MDM TRAINING

Wednesday, April 1, 2009

LDAP Support For MDM And All About It

Sunday, March 8, 2009

Ten Commandments for MDM Implementation Prabuddha Roy

Ten Commandments for MDM Implementation
Prabuddha Roy


Without wanting to labour the point about what MDM is, I nevertheless feel compelled to state my understanding of this relatively new technology area before I get into discussing integration of it with business intelligence. This is done mainly for the sake of clarity. Master data management is a set of processes, policies, services and technologies used to create, maintain and manage data associated with a company’s core business entities as a system of record (SOR) for the enterprise. Core entities include customer, supplier, employee, asset, etc. Note that master data management is not associated with transaction data such as orders, for example. Whether you build your own MDM system or buy a MDM solution from a MDM vendor in the marketplace, it should meet a number of key requirements. These include the ability to:

* Define and maintain metadata for master data entities in a repository.
* Acquire, clean, de-duplicate and integrate master data into a central master data store.
* Offer a common set of shared master data services for applications, processes and portals to invoke to access and maintain master data entities (i.e., system of entry [SOE] MDM services).
* Manage master data hierarchies including a history of hierarchy changes and hierarchy versions.
* Manage the synchronisation of changes to master data to all operational and analytical systems that use complete sets or subsets of this data.



Here is a brief elucidation of the ten golden principles that need to be appreciated during any MDM investment:



1. Getting Started: Irrespective of the industry, market segment, or existing IT environment, there is a hunt for the right way to launch an MDM initiative. Many due diligence initiatives propose that MDM is the only right answer to the most critical business problems. Many corporates have started securing budgets for launching a sustainable MDM program. The idea is clear: "Start small -- with an initial project -- but think large-scale, long-term, and across subject areas." MDM requires new technologies, specialized skills, and a business focus. With all of these ingredients in place, the payoff is well worth the effort.

2. ROI : Perhaps the biggest issue is how to justify and get funding for an MDM project. As with data warehousing projects, some organizations are blessed with enlightened executives who understand the correlation between high-quality, consistent data and their strategic objectives; these executives may approve MDM projects without delay. Other project leaders must closely align MDM projects with business need and pain and perform a detailed cost-justification analysis. MDM project managers can easily identify and monetize cost savings, but the best approach is to align the MDM initiative with strategic objectives or shoehorn it into approved projects as a necessary underpinning for success.

3. Serendipity: MDM is a business solution that offers a myriad of unexpected benefits once implemented. Many MDM early adopters discovered that once they cleansed and reconciled data through an MDM initiative, they not only improved the quality and consistency of the data available among systems supporting key business processes, but they also enabled other business initiatives such as mergers and acquisitions support, customer relationship management, target marketing, and supply chain optimization. “Data is a corporate asset, and when carefully managed, it provides a strong, stable foundation to support any information-centric business initiative an organization wishes to pursue now or in the future," said Wayne Eckerson, director of research at TDWI. Without MDM, many organizations will spend millions of additional dollars executing information-centric strategic initiatives or won't even attempt them at all.

4. Change Management: Change management is key. From a technical perspective, understanding and socializing the impact of MDM development activities has everything to do with the perception of success. From a cultural perspective, managing the expectations of business and IT stakeholders is nothing less than a make-or-break proposition. Change management is hard and can derail an MDM project if you aren't careful: when you change the data that end users have become accustomed to receiving through reports or other means, it can cause significant angst. You have to anticipate this, implement a transition plan, and prepare the users.

5. Roadblocks: IT and business can pose significant roadblocks. IT stakeholders, many of whom are beholden to established technologies, often need to hear the MDM pitch as much as business users do. Distinguishing MDM from incumbent technologies is an often-underestimated step. Conversely, the business may not want to initiate or fund an MDM project when they already have many of the existing tools and technologies required to do MDM. A very common concern from the majority of CTOs is: "Our business has already funded our data warehouse and a customer relationship management solution. Wasn't the data warehouse supposed to solve these issues? Wasn't the CRM system supposed to reconcile customers? How can I now convince the stakeholders to take on an MDM initiative?" The answer is that MDM can optimize those existing solutions, reducing their overall expense and minimizing the risk that they will deliver bad data (which could be their kiss of death). It is also advisable that, when purchasing an MDM solution, you shouldn't pay vendors for comparable technologies you already have in house but which come bundled in their packages.

6. Enterprise Scope: It is true that enterprise MDM may be fraught with problems. With the vision of "start small, think big", the vast majority of corporates assert their wish to quickly broaden their initial implementation to encompass additional domains, systems, and data. Organizations have started supporting their CRM programs with better data, which facilitates business functions they could never have performed with native CRM alone. Companies that started small now plan to extend their MDM capabilities to additional operational systems. Once an organization has implemented an MDM project and it takes root, the organization can then decide whether to widen the road by adding more domains to the existing environment or extend the road by using MDM to address other business problems.

7. Data Governance: Data governance is a critical path to MDM. In order to be effective, data governance must be designed. A company's cultural norms, established development processes, and incumbent steering committees must all factor into its data governance framework. It is recommended to grow data governance organically and in lockstep with an MDM architecture, which evolves over time. First, one should define what policies and rules the business needs to formulate to support an MDM project that solves a business problem. Then, one can formalize the requirements needed to sustain the initiative. That way, the business is working in its own perceived interest, not IT's.

8. Cross-System Data Analysis: One major issue is the time and costs involved in understanding and modeling source data that spans multiple, heterogeneous systems. Microsoft had 10 people working for 100 days to analyze source systems targeted for MDM integration, while the European Patent Office has 60 people analyzing and managing patent data originating around the world. Estimates show that the services-to-software ratio in MDM deployments is 10 to 1, with cross-source data analysis consuming the lion's share of the services. Just as in the data warehousing world, where early adopters in the 1990s underestimated the quality and condition of the source data required to deliver an enterprise data warehouse, many implementers have not thought much about the challenges of understanding and reconciling source data.

9. Matching: Calibrating matches between records generated by disparate systems is both art and science. Many speakers acknowledged that these matching engines -- which are typically delivered within a data quality tool -- are the brains of the MDM system.

Consultants often need a few go-arounds configuring their matching rules. Many consultants shared the consequences of "over-matching" records and thus consolidating two products or customers into one faulty record. The point is that matching needs to be refined over time. One must remember that MDM is as much about business and data rules as it is about the data itself.


10. Logical Data Models: An existing logical data model can propel you forward. Data administration skills are imperative in MDM. While some vendors maintain that a logical data model isn't required to make their tools work, most agree that the exercise itself can help a company understand definitions and usage scenarios for enterprise data, which makes it easier to gain consensus around data policies. Carl Gerber, senior manager of data warehousing at Tween Brands, shared how his company's data modeling and stewardship skills were a large part of his team's successful MDM delivery. Tween has created a series of data integration hubs to manage various subject areas, such as product, inventory, suppliers, and merchandising hierarchies. All data exchanged between systems passes through these hubs to ensure data consistency and reconciliation. The architecture has eliminated numerous point-to-point interfaces and improved operational efficiency, decision making, and revenue-generation objectives.


Friday, February 27, 2009

Controlled\Restricted Access to Data in MDM sai charan singh



MDM provides security through Users, Roles and Privileges.

Each user has his own user access, with his own user name and password.

Each role defines the permitted and restricted areas of the repository.

Privileges are defined on tables, fields and functions as execute, read-only or read/write.


User:

When you create a new user you have to give a user name and password and assign a role to the user. By default the 'Default' role is assigned; when you assign any other role, the Default role is replaced. When you create a repository, an Admin user with a blank password is created by default, which you use to log into the repository. You can set a custom password for Admin, but you cannot delete the Admin user.



image




Roles:

When you create a new role you have to give it a name and assign different users to it. By default all functions are enabled for execute, and tables/fields are enabled for read/write. When you create a repository, two roles are created by default. The first is Admin, with all functions enabled for execute and all tables/fields enabled for read/write; this role cannot be edited. The second is Default, which also has all functions enabled for execute and all tables/fields enabled for read/write, but this role can be edited and changes can be made to its functions and tables/fields. Remember, this is the role assigned when you create a new user. Neither of these roles can be deleted.



image

Privileges:

Creating users and roles might be child's play, but setting the right privileges is where the real work starts: it is very important to assign the right functionality to each role.

On the second tab of role creation there is a list of functions. Differentiate each role and understand why it is required. For example, if you want the users assigned to a role only to read and write data, then do not grant them permission to delete records. Go through each and every function and set the right access. Note that if you change the first row, Functions [Default], then all rows are affected by default.

The third tab of role creation lists the different tables; when you expand them you will find their fields. Here you can set access at the table level or for individual fields. Tables and fields can be set to read-only or read/write access.


image


Constraints:

One of the most important parts, and actually the reason I started this blog, is constraints. You can find a Constraints column as the last column on the Tables and Fields tab while creating a role. Generally you do not want to give a role access to a complete table; you want to filter a group of records and then give access to them. In that case you should create a mask or named search, select read-only access for all rows, and only for the required mask or named search select read/write, or select a constraint on a lookup table.


image



1. By default, the 'ALL' option is selected for all constraints. By selecting the drop-down list you can select your own options; it is a multi-valued field.

2. Previously only masks and lookup tables could be used as constraints, but from SP6 onwards even named searches can be constrained.

3. For lookup tables, only non-multi-valued fields with respect to the main table are allowed; this means qualified tables and multi-valued lookup fields in the main table are not available as constraints.

4. When you select a lookup table value as a constraint, both the main table and the lookup table are automatically shortlisted (both tables show only the records matching that constraint).
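As a minimal illustration (the repository, table and mask names here are hypothetical), a role that should only maintain European vendor records might end up with settings like these on the Tables and Fields tab:

Vendors (main table) - Read/Write - Constraint: EU_Vendors [Mask]
Countries (lookup table) - Read-Only - Constraint: Europe
All other tables - Read-Only - Constraint: ALL

With such a setup the users assigned to the role see and edit only the vendor records covered by the mask, while everything else in the repository stays read-only for them.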

Monday, February 16, 2009

Using MDM Java APIs to add Text(type) taxonomy attribute allowable values Paras Arora




Now consider this a continuation of my previous blog on MDM Java APIs (taxonomy attributes):

Using MDM Java APIs to retrieve Taxonomy attribute Values - this was while attempting to replicate all capabilities of MDM Data Manager using MDM Java APIs.

During the same exercise I made up my mind to develop a highly customised Data Manager using MDM Java APIs and Portal UI technologies. As of now, my custom Data Manager is still under construction, so it will have to wait a while before I share it with the community.

For now, I am sharing the following solution for a scenario that cropped up as a result of a brief conversation over a cup of tea.

"MDM Taxonomy attributes can have a set of allowable values for taxonomy attributes of type text for e.g. different country names for attribute 'Country', different values for attribute 'type' of a Material etc.While MDM Data Manger provides option to add allowable values for a Taxonomy attribute at design time, which are utilised in custom built MDM - Portal applications or MDM Business packages. What if the user wants to add a value to the set of allowable values for a text type taxonomy attribute at runtime? , which would enable him an additional option to select from a list of allowable values for a Taxonomy attribute. I couldn't find an out of the box set of API methods/Interfaces, which can help us achieve the same"

After a lot of analysis and in-depth study of the MDM Java APIs, I found a mechanism with which we can replicate this design-time functionality of MDM Data Manager at runtime using the MDM Java APIs.

The code snippet given below can be reused after further customizing or extending it as needed.

We start at the point where we have retrieved the attribute object (retrieved using the attribute ID, which in turn is retrieved from the current record object; refer to the embedded blog link for details).

Depending upon the screen design, i.e. the number of entries you want the end user to be able to add to the list of allowable values for the text-type taxonomy attribute, one can utilize the following lines of code.

int textAttrValueCount = 10; // number of values the user may add to the set of allowable values

for (int i = 0; i < textAttrValueCount; i++) {
    // build a multilingual name for the new allowable value
    // (baseName is a String and regDefs holds the region definitions; both are assumed to exist already)
    TextAttributeValueProperties textValue = new TextAttributeValueProperties();
    baseName = "TextValue" + i + System.currentTimeMillis();
    MultilingualString textValueName = MultilingualHelper.createMultilingualString(regDefs, baseName);
    textValue.setName(textValueName);

    // a tooltip string is built the same way (note: it is not attached to textValue in this snippet)
    baseName = "Tooltip" + i + System.currentTimeMillis();
    MultilingualString tooltip = MultilingualHelper.createMultilingualString(regDefs, baseName);

    // add the new allowable value to the text-type taxonomy attribute
    attr.addTextAttributeValue(textValue);
}

Here attr is the text-type taxonomy attribute object to which the allowable values are to be added; baseName is a String, and regDefs holds the region definitions used to build the multilingual strings.

Utilizing the code or the approach outlined above, one can customize and extend the MDM Business Packages (so that the end user gets the option to add to the allowable values of a text-type taxonomy attribute), or integrate the same into a Web Dynpro application built on top of the MDM repository, giving the end user an option that is otherwise available only at design time, i.e. in MDM Data Manager.

Sunday, February 15, 2009

SAP NetWeaver TEP12 Questions - Part 2

1) Accessing a Portal Component in the Default Mode

Ans : doContent();

2) Extending this class when Developing your Portal Components

Ans : AbstractPortalComponent.

3) Portal Runtime call the methods in the Life Cycle

Ans : init(),

service()

destroy()

4) What are the parameters that we have to pass to doContent()?

Ans : IPortalComponentRequest ,IPortalComponentResponse;

5) How do you access a resource from a request object?

Ans : request.getResource();

6) In the personalization concept, what data types does the type attribute support?

Ans : String , Date , Select , Boolean

7) How do you get a property from the IPortalComponentProfile?

Ans : getProperty(String)

8) Which method has to be overridden by a class that extends PageProcessorComponent?

Ans : getPage();

9) Give the sequence of method execution in DynPage

Ans :

1) doInitialization()

2) doProcessAfterInput()

3) doProcessBeforeOutput()

10) sequence of method calls when an event occurs

Ans :

1) doProcessAfterInput()

2) on<Event>() (the event handling method for the raised event)

3) doProcessBeforeOutput()

11) How do you get the current event?

Ans :

IPageContext myContext = PageContextFactory.createPageContext(request, response);

Event event = myContext.getCurrentEvent();

12) If onClientClick() and onClick() are both specified, which method will be called first?

Ans : onClientClick();

13) JSPDynPage uses _________ type of approach

Ans : Model View Controller

14) The two properties in the component profile that indicate a JSP needs to be compiled into a portal component:

Ans : Property name = “JSP”

Property name = “ComponentType”

15) Which one is true in the following statement

a)

<%@ taglib uri = “hbj” prefix = “htmlb”>

b)

<%@ taglib uri = “hbj” prefix = “hbj”>

c)

<%@ taglib uri = “hbj” prefix = “htmlb”>

d)

<%@ taglib uri = “htmlb” prefix = “hbj”>



Ans : a



16) How do you call a JSP file?

Ans : setJspName();



17) If JavaScript is used, the _______ tag is necessary for the page

Ans : Page tag



18) Which tag is used for including a bean in the JSP file?

Ans :



19) What is the scope of the bean

Ans : Session



20) Give the objects that extend from IPrincipal

Ans :

IGroup, IRole, IUser, IUserAccount, IUserMaint



21) ____is the Central object from which all UME object factories are obtained

Ans : UMFactory

22) IUser user = UMFactory.getUserFactory().getUserByLogonID(uid);

String userName = user.getDisplayName();

String email = user.getEmail();

response.write("userName: " + userName + " Email: " + email);



1) Displays the username and Email ID

2) Throws an exception

3) Doesn’t Compile



Ans : 1.

23) List the methods used to create a user successfully

Ans :

1) newUser(uid)

2) setFirstName()

3) setLastName()

4) setEmail()

5) setPassword();





24) Can we construct a unique ID manually?

Ans : False; the unique ID is created by the system, not constructed manually.



25) Unique IDs are used to identify objects across data sources.

Ans : True



26) How do you retrieve logon information?

Ans : umdata.enrich(map);



27) What is the return type of map.get(“”);

Ans : String



28) How do you load data for client eventing across iViews?

Ans : EPCM.loadClientData();



29) What is the object available in the pages

Ans : EPCM

30) What problems of servlets does HTMLB overcome?

Ans :

Visualization and business logic are not separate

Development has to take care of different web clients and versions

Namespace conflicts with form elements



31) Stored data is identified by the key …..

Ans : Namespace+name



32) Framework levels

Ans :

Level 0: supported by neither JavaScript nor Java

Level 1: browser only (JavaScript)

Level 2: both JavaScript and Java



33) Features of portal Services in the portal

Ans :

1) Portal services are a way to provide functionality to portal components

2) portal services implement no user interface

3) portal service may be accessed from out side portal framework



34) Why do we need custom portal Services in the portal

Ans :

1) Can be used by other Portal Application

2) Provide commonly used Functionality

3) Can be exposed as webservice

35) To build a new portal service ……interface must be implemented

Ans :

IService



36) Life cycle of a portal service

Ans :

init()

afterInit()

destroy()



37) Service name =

Ans :




38) Portal service name is myService …what would be the name of interface that extends IService

Ans : IMyService



39) The JCA/J2EE Connector Architecture is not an API. True/False

Ans : true

40) The Connector Framework is SAP's extension of the CCI API; all methods in the Connector Framework have a variant with the suffix Ex().

Ans : False; only some methods do



41) What is the method used to get connection in the Java Connectors

Ans : Service.getConnection();



42) How do you get the locale from the request object?

Ans : request.getLocale();



43) What is the return type for the table type structure .

Ans : IRecordSet.



44) Give the name of the method that returns resource bundle for the current locale.

Ans : getResourceBundle()



45) Localization.properties

Localization_de_DE.properties.

Localization_en_EN.properties.

What is the Resource bundle name : ?

Ans : Localization



46) What data type is returned by the method getString(key)?



Ans : String



47) How do you access a key in the properties file (xyz = abc)?

Ans : getString("xyz")

48) What type of objects can be translated

Ans : Text

49) the portal translation process is supported by tools

Ans :

Translation worklist coordination

Worklist translation



50) To customize the logoff screen of the portal, which file is changed?

Ans : the masthead

51) SAP recommends not modifying the SAP code; what are the processes to customize it?

Ans :

1) copy the existing file and rename it according to customer name space

2) create new custom component



52) how can we customize the company Branding



Ans :

1) masthead

2) through a customized application





53) There was also one question on the Desktop Innerpage.




54) What are the components that are added to Portal Desktop

Ans :

Default framework page

Themes



55) What is the jsp name that contains log on page

Ans : umLogonPage.jsp

56) authschemes.xml is modified to use a custom logon component



57) How do u access portal services from WebDynpro applications

Ans : WDPortalUtils



58) Cache Level :

Ans : none ,session ,user,shared




59) getCachingLevel() is used to get the Cachelevel



60) When is the doInitialization() method called?

Ans :

1 When the page is loaded for the first time

2 When the page is refreshed

3 When the object is called from another object



61) Cached objects are retrieved using the --- method

Ans : get(key)

62) How can a portal service access an external web service?

Ans :

Generate a Java proxy out of the WSDL file with the PDK

You can execute the Java proxy as a portal service

SAP NetWeaver TEP12 Questions - Part 1


1. Portal Applications ..
* Can be assigned to portal roles
* Are typically developed using NWDS
* Are stored in PAR files
* Are bundles of one or more portal components or portal services
* Are developed in portal content studio

2. Roles can be assigned to
* WorkSets
* Groups
* Roles
* iViews
* Users



3. Which of the following statements about the Software Architecture of Enterprise Portal are true
* Portal Service acts like a interface that are enabled to exchange data and procedures
* The User Management Services is an interface between portal run time and the UME
* Page Builder is the portal component responsible for assembling the pages
* PRT is executed according to user requests, generating HTML output to display on client
* PRT service can be exposed as Web service



4. Business Packages are
* Always free of charge
* Are exclusively developed by SAP
* Typically contains iviews and worksets
* Can be downloaded from sdn.sap.com
* Can be downloaded from service.sap.com

5. What are the three main building blocks of SAP Enterprise Portal?
* Portal Runtime, Portal Server and Portal Framework
* Portal Platform, Unification and content Management
* Portal Platform, KM and Collaboration
* Portal Framework, Content Management and Collaboration

6. In Which Functional area is KM positioned
* Lifecycle Management
* Composite Application Framework
* Information Integration
* People Integration
* Process Integration



7. Which Functional Areas of SAP NetWeaver are delivered by SAP EP
* Multi Channel Access
* Knowledge Management
* Collaboration
* Portal
* Integration Broker



8. Pages can be assigned to
* Worksets
* Groups
* Roles
* iViews
* Users
* Pages
9. Worksets can be assigned to
* Worksets
* Groups
* Roles
* iViews
* Users
* Pages



10. Which of the following statements are correct with regard to Portal Content Studio?
* The PCD is the tool to access PCD Objects offering browse and search interface
* PCD shows all PCD objects to every content administrator user
* Different editors respectively wizards are offered according to the view used to access Portal Content Catalog
* Both in Browse and Search, the view of the objects is organized by the type of object

11. What's the difference between PCD folders and folders within a workset/role?
* The MergeID property can be maintained only for PCD Objects
* The name can be maintained in different languages only for workset/role folders
* The sort priority property can be maintained only for workset/role folders
* ACL’s can be maintained only for PCD folders
* Only PCD Folder names appear in Top Level Navigation and detailed navigation





12. You can integrate the following SAP Applications into SAP NetWeaver Portal as iViews
* SAP Transactions
* IAC Applications
* BSP Applications
* BEx Web Applications
* WebDynpro for Java and ABAP
* Web Services



13. A Portal role determines
* Entries in Top Level Navigation
* Entries in Detail Level Navigation
* Portal Content user can access
* Authorizations in BackEnd Systems

14. Please Choose the type of Database required for VC connection when not accessing SAP System
* Any SAP DGDB compliant Database
* Any JDBC Compliant Database
* Any Oracle Database



15. What form of output is generated when a VC iView is created
* HTMLB
* WebDynpro for ABAP
* XML
* Java Applet

16. When using VC which of the following could be used to retrieve information from SAP System
* Java API
* BADI
* RFC
* BAPI
* Function Module



17. Please select the option that does not describe the portal runtime
* Provides a runtime for all non JAVA Applications
* Defines and manages the objects that makes up the portal environment
* Provides runtime and its corresponding services
* PRT is one basic part of portal environment integrated into Web AS
* Provides Development environment

18. Please select the correct alternative with respect to Portal Applications
* Portal Applications are bundles of Portal components and API code
* Portal Application are bundles of Portal Services and API code
* Portal Applications are bundles of Deployment Descriptors and Portal Components
* Portal Applications are bundles of Portal Components and Portal Services



19. Which class is used to determine locale dependent strings?
* getLocale
* GoupResource
* BundleResource
* ResourceBundle
* ResourceGroup

20. Please select the correct form of localization file?
* XXX.de.DE.properties
* DE_de_xxx.properties
* EN_XXX_DE.properties
* XXX_Prop_EN.property

21. Which syntax is most likely to give value of single property?
* Profile.getPropertyAttributes(“name”);
* Profile.setParameter(“name”);
* Profile.getValue(“name”);
* Profile.getProperty(“name”);
* Profile.getParameter(“name”);



22. Which syntax is most suitable to get value of parameter?
* Request.getParameter(“name”);
* Request.getValue(“name”);
* Response.getParameter(“name”);
* Response.getPValue(“name”);

23. Which method would be used to retrieve the value of the text?
* getValue(key)
* getGroup(key)
* getText(key)
* getString(key)
* getLocale(key)

24. Which method returns the current users locale?
* setLanguage
* setResourceBundle
* setLocale
* getLanguage
* getLocale


25. Which method from the following methods return the resource bundle?
* getLocale
* getResourceGroup
* setResourceBundle
* getResourceBundle
* setResourceGroup

26. Please select the three types of personalization within the portal?
* Folder
* Workset
* Portal
* iView
* Page

27. When describing the personalization properties – what are the four types?
* Date, Select, Boolean, String
* Time, Date, Boolean, Select
* Time, Date, Select and String
* Date, Time, String and Boolean



28. Localization property files must be accessible by the Java class loading mechanism and must be packaged in the folder of the PAR files accessible by the Java Class Loader. Please select the correct examples of these files.
* JAR file in API section
* Properties file in the PORTAL-INF/private/classes
* JAR file in PORTAL-INF/private/lib
* JAR file in PORTAL-INF/lib
* Properties file in the PORTAL-INF/classes

29. What gives the Portal Service a view on the Portal Environment?
* ServiceContext
* IPortalService
* IPortalServiceContext
* IServiceContext
* IService
* IMyService

30. What interface needs to be implemented to build a new Portal Service?
* IPortalService
* IService
* MyService
* IMyService
* Service


31. The extension of our portal applications after uploading into a portal could be
* .rtf
* .txt
* .zip
* .par
* .doc



32. What portal development view could we use to add a new property without physically typing the code into the portalapp.xml?
* EP perspective
* Package explorer
* Debugging
* Outline
* Console

33. Please choose the methods that do not belong to the IUser interface
* setFirstGirlFriend
* setlastName
* setFirstName
* setHomeAddress
* setWorkAddress
* setEmail

34. Please select the non LDAP directory from the following?
* IPlanet
* Seimens
* MS-ADS
* SAP DB
* Novell

35. What service creates users in the external systems?
* User Management Engine
* Persistence Adapters
* Replication Manager
* Persistent Manager
* LDAP directory
* Database



36. Choose the interface from the following list that enables user maintenance.
* IUserMaint
* IUser
* IPrincipal
* IPusher
* IRole
* IGroup

37. After the following search : ISearchResult result = userFact.searchUsers(userFilt), what state would determine the search was successful?
* SEARCH_RESULT_CORRECT
* SEARCH_RESULT_PASS
* SEARCH_RESULT_FOUND
* SEARCH_RESULT_OK
* SEARCH_RESULT_SUCCESS

38. After the following search : ISearchResult result = userFact.searchUsers(userFilt), what state would determine the search was unsuccessful?
* SEARCH_RESULT_UNDEFINED_STATE
* SEARCH_RESULT_BAD
* SEARCH_RESULT_INCORRECT
* SEARCH_RESULT_UNKOWN
* SEARCH_RESULT_INCOMPLETER



39. Which of the following role is to be given to Portal Developer
* PortalDeveloper
* ContentDeveloper
* JavaDeveloper
* SystemDeveloper
* JavaAnalyst
* JavaProgrammer
* PortalApp

40. A J2EE application server vendor is responsible for shipping a set of Java API’s these would be called
* Java Connector
* J2EE Application server
* JCA API
* EIS Systems
* Common Client Interface
* Resource Adapters

41. EIS vendors are responsible for developing what to shield the developer from the complexity of the EIS API’s?
* Java Connector
* J2EE Application server
* JCA API
* EIS Systems
* Common Client Interface
* Resource Adapters

42. Name of the architecture which defines interface for connection?
* JCA
* JTA
* JMS
* EIS
* ERP
* MYDB

43. BAPI stands for
* Business Additional Procedures Interface
* Business Applications Programming Interface
* Backward Applied Program Interface
* Business Applied Procedural Interface
* Business to Business Application programming interface



44. Which entry in the deployment descriptor deals with Connector Framework
* sharingReference
* ConnectorDB
* System_Alias
* ServicesReference
* ServiceReference
* privateSharingReference

45. What is an interaction when used in the following syntax: Interaction Ix = connection.createInteractionEx();
* Establishes the interaction with the EIS system
* Obtains the actual connection
* Describes the data needed to call a specific function
* Destroys the connection
* Establishes the connection
* Creates an interface to jar files

46. To iterate through the record what method could be used?
* Next()
* Iterate()
* getMore()
* getAnother()
* getNext()




47. In a JDBC connection – what format is the returned data held in ?
* IRecordData
* IRecordHeadings
* ISet
* IRecordSet
* IRecord

48. Please select the correct statements from the following
* EPCF has following levels 1,2,3
* EPCF have following levels 1,2,3 and 4
* EPCF have following levels 0,1,2
* EPCF have no levels
* Represents the sourceID

49. When subscribing to an event with these parameters what is the event handler?
(“urn”,”ABC”,eventHandler)
* Is an optional parameter
* Represents the data object
* Represents the JavaScript function that is called
* Contains the data needed for the client side communication
* Represents the sourceid

50. Please choose the correct syntax to subscribing for an event.
* EPCF.raiseEvent(“urn”,”ABC”, eventHandler);
* EPCM.subscribeEvent(“urn”,”ABC”,eventHandler);
* EPCM.raiseEvent(“urn”,”ABC”,eventHandler);
* EPCF.subscribeEvent(“urn”,”ABC”,eventHandler);

51. Please select the correct statement from the following?
* ClientEventing: common communication channel for java application communicating on the client side
* ClientEventing: common communication channel for javascript application communicating on the client side
* ClientEventing: common communication channel for javascript application communicating on the server side
* ClientEventing: common communication channel for java application communicating on the server side



52. Please select the correct statement from the following?
* Client Data Bag: a client side JavaScript object which serves as cross-iView storage mechanism
* Client Data Bag: a client side Java object which serves as cross-iView storage mechanism
* Client Data Bag: a server side JavaScript object which serves as cross-iView storage mechanism
* Client Data Bag: a server side Java object which serves as cross-iView storage mechanism



53. Please select the correct navigation syntax within the WebDynpro iView
* WD.navigateAbsolute(“ROLES://portal_content/Portal_Role/SimplePage.
* Portal.navigateAbsolute(“PCD://portal_content/Portal_Role/SimplePage.
* WDPortalNavigation.navigateAbsolute(“ROLES://portal_content/Portal_Role/SimplePage.
* WDPortalNavigation.navigateAbsolute(“PCD://portal_content/Portal_Role/SimplePage.
* PortalNavigation.navigateAbsolute(“PCD://portal_content/Portal_Role/SimplePage.



54. Please select the correct statements from the following?
* Instead of EPCM the WebDynpro uses WDPortalEventing e.g WDPortalEventing.subscribe
* Eventing between other Portal iviews is supported
* WebDynpro does not have a wizard for eventing
* The System used is normally “SAP_localSysten” with WD iViews in the portal
* PortalEventing with WebDynpro iViews can interact with Events from different URL domains
* WebDynpro applications can be embedded in iViews

Monday, February 9, 2009

How to work with Command in the new MDM Java API Vijendra Singh Bhanot



I am new to the MDM Java API. For an existing project I was asked to explore the new MDM Java API. I have no experience with the earlier MDM4J, but I was still able to understand and explore the new API. I think this API is excellent, and I have almost fallen in love with it. The part I like best is the Commands. Initially it took me some time to understand the concept, but it soon became clear. In this blog I would like to share what I have learned about working with Commands, and you will also see an example of a validation command.




Why do you need a Command? What is a Command?


I command the MDM Java API to get me the list of Validations …..

Well, that’s what the idea is.


A Command is a special class that instructs the MDM Java API to perform some action. All actions, such as managing repository content, searching records, and managing the repository schema, have dedicated commands.


All these commands are logically organized into packages. You can always refer to the Javadoc to identify which command you need to use (https://help.sap.com/javadocs/MDM/current/index.html).
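
For example, the command classes used in the sample below and in the other code listings in this document come from several of these packages. The following imports are simply collected from those listings to give a feel for how the API is organized:

import com.sap.mdm.commands.CreateUserSessionCommand;              // session handling
import com.sap.mdm.commands.AuthenticateUserSessionCommand;        // session authentication
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;    // record retrieval and searching
import com.sap.mdm.validation.commands.RetrieveValidationsCommand; // validation metadata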

How to Use a Command?



All commands are used in the following way:



1 RetrieveValidationsCommand objRetrieveValidationsCommand = new RetrieveValidationsCommand(connection);

2 objRetrieveValidationsCommand.setSession(sessionId);

3 objRetrieveValidationsCommand.setTableId(<tableId>); // Required

try {

4     objRetrieveValidationsCommand.execute();
} catch (CommandException e) {
    e.printStackTrace();
    return;
}



(1) You create a new instance of a particular command by passing it the ConnectionAccessor object.

(2) The second step is usually setSession(). Here it is important that you use the right type of session (server session, repository session or user session). For example, you may not be able to get the list of validations if you use a repository session instead of a user session.

As mentioned, setSession() is not always the second step. The commands responsible for creating a session are the ones that do not require a session themselves. Thus these commands (CreateServerSessionCommand, CreateRepositorySessionCommand and CreateUserSessionCommand) do not require setSession().

(3) Some commands require specific setter methods to be called before the command is executed. These setter methods are marked as “Required” in the Javadoc; a few others are marked as optional.

(4) Inside a try block you execute the command. If there is an error, a CommandException is thrown.


Here is a sample of RetrieveValidationsCommand in action…

/*
* Created on Feb 7, 2008
*
* To change the template for this generated file go to
* Window>Preferences>Java>Code Generation>Code and Comments
*/
package demo.validation;

import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.GetRepositoryRegionListCommand;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.ConnectionPool;
import com.sap.mdm.net.ConnectionPoolFactory;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;
import com.sap.mdm.validation.ValidationProperties;
import com.sap.mdm.validation.ValidationPropertiesResult;
import com.sap.mdm.validation.commands.RetrieveValidationsCommand;

/**
*
*
* To change the template for this generated type comment go to
* Window>Preferences>Java>Code Generation>Code and Comments
*/
public class GetListOfValidations {

public static void main(String[] args) {
// create connection pool to a MDM server
String serverName = "LOCALHOST";
ConnectionPool connections = null;
try {
connections = ConnectionPoolFactory.getInstance(serverName);
} catch (ConnectionException e) {
e.printStackTrace();
return;
}

// specify the repository to use
// alternatively, a repository identifier can be obtained from the GetMountedRepositoryListCommand
String repositoryName = "INQDemo";
String dbmsName = "localhost";
RepositoryIdentifier reposId =
new RepositoryIdentifier(repositoryName, dbmsName, DBMSType.ORACLE);

// get list of available regions for the repository
GetRepositoryRegionListCommand regionListCommand =
new GetRepositoryRegionListCommand(connections);
regionListCommand.setRepositoryIdentifier(reposId);
try {
regionListCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
RegionProperties[] regions = regionListCommand.getRegions();

// create a user session
CreateUserSessionCommand sessionCommand =
new CreateUserSessionCommand(connections);
sessionCommand.setRepositoryIdentifier(reposId);
sessionCommand.setDataRegion(regions[0]); // use the first region
try {
sessionCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
String sessionId = sessionCommand.getUserSession();

// authenticate the user session
String userName = "admin";
String userPassword = "admin";
AuthenticateUserSessionCommand authCommand =
new AuthenticateUserSessionCommand(connections);
authCommand.setSession(sessionId);
authCommand.setUserName(userName);
authCommand.setUserPassword(userPassword);
try {
authCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}

// the main table, hard-coded
TableId mainTableId = new TableId(1);

// Get the list of validations

RetrieveValidationsCommand objRtvVldCmd =
new RetrieveValidationsCommand(connections);
// set the user session
objRtvVldCmd.setSession(sessionId);
// get the validations for the following table
objRtvVldCmd.setTableId(mainTableId);

try {
objRtvVldCmd.execute();

} catch (CommandException e) {
e.printStackTrace();
return;
}

ValidationPropertiesResult objVldPropRslt =
objRtvVldCmd.getValidationPropertiesResult();

ValidationProperties[] validations = objVldPropRslt.getValidations();


// display --> Validation ID | error/warning message | Validation Name
for (int i = 0; i < validations.length; i++) {

System.out.println(
validations[i].getId()
+ " | "
+ validations[i].getMessage()
+ " | "
+ validations[i].getName());
}

}
}

Vijendra Singh Bhanot is a certified XI Consultant

Thursday, February 5, 2009

MDM Java API 2 an introductive series part IV Tobias Grunow



Introduction

In my last blog I showed you how to get an authenticated user session using a trusted connection. In this part I want to show you how to gather MDM system information from an SAP Portal system object.
After that I will introduce the basic concept behind working with MDM and the Java API 2, showing you how the system works (RecordID, lookup values, display fields...).

If you build a solution based on MDM you will sooner or later face a two- or three-tier MDM system landscape, meaning there will be a development system (D system), possibly a quality assurance system (Q system), and a productive system (P system). This scenario can involve multiple MDM servers, or just different repositories on the same server, and perhaps separate D, Q and P clients in addition. So you face the problem of how to address the different repositories from your code without having to hard-code which one to use depending on the landscape you are in.
If you are using an SAP Portal, I want to point you to a very useful thread I found on SDN.

Hard-code credentials - any other solution exists?

This forum post shows how to work with SAP Portal system objects and how to retrieve information from them. I have used this concept, together with trusted-connection user session authentication, to address the problem described above, and it worked fine for me.

Let us continue with the next topic...
MDM Java API 2: introducing the concept

First of all I want to give you a brief introduction to the concepts behind MDM so that my future coding is easier to follow. As a coder you need a much deeper understanding of how MDM works than anyone else. Plain GUI users (working through the graphical user interface) do not need to understand the technical details behind data storage and operations the way Java programmers do. So at the beginning the most important task is to understand how MDM stores data and how the data is connected.
So let’s start with the first graphic:
Figure 1: Table concept in MDM

The first graphic shows the table concept of MDM. The table called the “Main table” is the centre of the model. From the main table, references point to various kinds of sub tables. The main table contains fields; those fields can either hold values stored directly in the table or hold a reference to a record in a sub table. Sub tables can likewise store data values or hold references to other sub tables.
To illustrate the possible layout of a main table take a look at figure 2.

Figure 2: Possible layout in main table (MDM Console view)

As you can see, the main table stores some values (e.g. text) directly, as well as references to sub tables (lookups of different kinds). To find out where the lookups point, you can open the MDM Console and click an entry in the list of main table fields to see its details (Figure 3).


Figure 3: Main table field details (Bottom part of MDM Console)


If we look at the details, there are two important things to notice.
First, Figure 3 shows a field detail named “CODE”. This code is very important for us because it is the name we will use in our code to address this field in the table. Second, the field is of type Lookup [Flat]. This tells us that the value we will find in this field is a RecordId.
A RecordId is the ID of a record in the sub table (e.g. the Sales Product Key table [Detail: Lookup Table]). This means we cannot access the desired value directly through the field in the main table; we only get a reference to the sub table record that holds the actual value (the small sketch below illustrates this). In my future blogs I will give more details on the concepts behind MDM and examples of other techniques.
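
To make this concrete, here is a minimal sketch of what reading such fields could look like once you hold a Record (how records are retrieved is shown in the search example further down). The field IDs (2 for PRODUCT_NAME, 27 for TYP) are taken from my example repository below; the accessor Record.getFieldValue(FieldId) returning an MdmValue is how I recall the API from the Javadoc, so please double-check it against your API version. Treat this as an illustration of the idea, not a definitive listing.

import com.sap.mdm.data.Record;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.valuetypes.MdmValue;

public class LookupFieldExample {

    // Prints one directly stored field and one Lookup [Flat] field of a record.
    // Assumption: "record" comes from a RecordResultSet as in the search example
    // below, and FieldId(2) / FieldId(27) match the PRODUCT_NAME and TYP fields
    // of my example repository.
    public static void printFields(Record record) {
        // Text field: the value lives directly in the main table.
        MdmValue productName = record.getFieldValue(new FieldId(2));
        System.out.println("Product name: " + productName);

        // Lookup [Flat] field: what comes back is a reference into the sub table
        // (a RecordId), not the display text. To show the text you have to resolve
        // the referenced sub table record, or work with the lookup table's display fields.
        MdmValue typeReference = record.getFieldValue(new FieldId(27));
        System.out.println("Type (reference into the sub table): " + typeReference);
    }
}
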
So enough of MDM concepts and let’s get to the Java API and some examples.
Searching in MDM

Searching will be one of the most commonly used pieces of MDM functionality.
Searching a repository has some prerequisites: first, we need a connection to the MDM server, and second, we need an authenticated user session on a repository. In my previous blogs I showed you how to set up those prerequisites. In the class provided here I have combined all the necessary steps to get the connection and the user session. So now let's get to the code.

package com.sap.sdn.examples;

import java.util.Locale;

import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.SetUnicodeNormalizationCommand;
import com.sap.mdm.commands.TrustedUserSessionCommand;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.data.ResultDefinition;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionAccessor;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.SimpleConnectionFactory;
import com.sap.mdm.search.FieldSearchDimension;
import com.sap.mdm.search.Search;
import com.sap.mdm.search.TextSearchConstraint;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;

public class SearchExamples {

// Instance variables needed for processing
private ConnectionAccessor mySimpleConnection;
// Name of the server that MDM runs on
private String serverName = "IBSOLUTI-D790B6";
// Name of the repository shown in the mdm console
private String RepositoryNameAsString = "SDN_Repository";
// Name of the DB-Server this could be an IP address only
private String DBServerNameAsString = "IBSOLUTI-D790B6\\SQLEXPRESS";
// Define the Database type (MS SQL Server)
private DBMSType DBMSTypeUsed = DBMSType.MS_SQL;
// Create a new data region
private RegionProperties dataRegion = new RegionProperties();
// Session which will be used for searching
private String userSession;
// Default user name
private String userName = "Admin";
// Password is empty on default setup
private String userPassword ="";
// result we will get from mdm
public RecordResultSet Result;

/**
* Constructor for class
*/
public SearchExamples(){
// Set the Data Region
dataRegion.setRegionCode("engUSA");
// Set the locale on data region
dataRegion.setLocale(new Locale("en", "US"));
// Set the name of data region
dataRegion.setName("US");
// get a connection to the server
this.getConnection();
// Authenticate a user session
try {
this.getAuthenticatedUserSession();
} catch (ConnectionException e) {
// Do something with exception
e.printStackTrace();
} catch (CommandException e) {
// Do something with exception
e.printStackTrace();
}
// Get resulting records
Result = this.SearchTypes();

}

// ResultDefinition Main Table declaration
private ResultDefinition rdMain;


/**
* Method that will search for all records in main table of a certain type
*
* @return RecordResultSet that holds all resulting records from search
*/
public RecordResultSet SearchTypes() {

/**
* 1. First we create the Result Definition. This result definition will
* tell the search which fields are of interest to us. The list could
* include all fields of the table or only the ones we are interested
* in.
*/
// Define which table should be represented by this ResultDefinition
// In my repository this is the table MAINTABLE
rdMain = new ResultDefinition(new TableId(1));
// Add the desired FieldId's to the result definition
// In my repository this is the field PRODUCT_NAME
rdMain.addSelectField(new FieldId(2));
// In my repository this is the field TYP
rdMain.addSelectField(new FieldId(27));

/**
* 2. Create the needed search parameters.
* Define what to search for and where.
*/
// Create the field search dimension [Where to search!?]
FieldSearchDimension fsdMaintableType = new FieldSearchDimension(new FieldId(27));
// Create the text search constraint [What to search for?! (Every record that contains ROOT)]
TextSearchConstraint tscTypeRoot = new TextSearchConstraint("ROOT", TextSearchConstraint.CONTAINS);

/**
* 3.
* Create the search object with the given search parameters.
*/
// Create the search
Search seSearchTypeRoot = new Search(new TableId(1));
// Add the parameters to the search
seSearchTypeRoot.addSearchItem(fsdMaintableType, tscTypeRoot);

/**
* 4.
* Create the command to search with and retrieve the result
*/
// Build the command
RetrieveLimitedRecordsCommand rlrcGetRecordsOfTypeRoot = new RetrieveLimitedRecordsCommand(mySimpleConnection);
// Set the search to use for command
rlrcGetRecordsOfTypeRoot.setSearch(seSearchTypeRoot);
// Set the session to use for command
rlrcGetRecordsOfTypeRoot.setSession(this.userSession);
// Set the result definition to use
rlrcGetRecordsOfTypeRoot.setResultDefinition(rdMain);
// Try to execute the command
try {
rlrcGetRecordsOfTypeRoot.execute();
} catch (CommandException e) {
// Do something with the exception
e.printStackTrace();
}
// Return the result
return rlrcGetRecordsOfTypeRoot.getRecords();
}

/**
* Create and authenticate a new user session to an MDM repository.
*
* The connection, repository name, DB server, DBMS type, data region, user
* name and password are taken from the instance fields of this class.
*
* @return the authenticated session identifier as a string
* @throws ConnectionException
* is propagated from the API
* @throws CommandException
* is propagated from the API
*/
public String getAuthenticatedUserSession(
) throws ConnectionException, CommandException {
/*
* We need a RepositoryIdentifier to connect to the desired repository
* parameters for the constructor are: Repository name as string as read
* in the MDM Console in the "Name" field DB Server name as string as
* used while creating a repository DBMS Type as string - Valid types
* are: MSQL, ORCL, IDB2, IZOS, IIOS, MXDB
*/
RepositoryIdentifier repId = new RepositoryIdentifier(
RepositoryNameAsString, DBServerNameAsString, DBMSTypeUsed);
// Create the command to get the Session
CreateUserSessionCommand createUserSessionCommand = new CreateUserSessionCommand(
mySimpleConnection);
// Set the identifier
createUserSessionCommand.setRepositoryIdentifier(repId);
// Set the region to use for Session - (Language)
createUserSessionCommand.setDataRegion(dataRegion);
// Execute the command
createUserSessionCommand.execute();
// Get the session identifier
this.userSession = createUserSessionCommand.getUserSession();

// Authenticate the user session
try {
// Use command to authenticate user session on trusted connection
TrustedUserSessionCommand tuscTrustedUser = new TrustedUserSessionCommand(
mySimpleConnection);
// Set the user name to use
tuscTrustedUser.setUserName(userName);
tuscTrustedUser.setSession(this.userSession);
tuscTrustedUser.execute();
this.userSession = tuscTrustedUser.getSession();
} catch (com.sap.mdm.commands.CommandException e) {
/* In Case the Connection is not Trusted */
AuthenticateUserSessionCommand authenticateUserSessionCommand = new AuthenticateUserSessionCommand(
mySimpleConnection);
authenticateUserSessionCommand.setSession(this.userSession);
authenticateUserSessionCommand.setUserName(userName);
authenticateUserSessionCommand.setUserPassword(userPassword);
authenticateUserSessionCommand.execute();
}
// For further information see:
// http://help.sap.com/javadocs/MDM/current/com/sap/mdm/commands/SetUnicodeNormalizationCommand.html
// Create the normalization command
SetUnicodeNormalizationCommand setUnicodeNormalizationCommand = new SetUnicodeNormalizationCommand(
mySimpleConnection);
// Set the session to be used
setUnicodeNormalizationCommand.setSession(this.userSession);
// Set the normalization type
setUnicodeNormalizationCommand
.setNormalizationType(SetUnicodeNormalizationCommand.NORMALIZATION_COMPOSED);
// Execute the command
setUnicodeNormalizationCommand.execute();
// Return the session identifier as string value
return this.userSession;
}

/*
* The method obtains a ConnectionAccessor, which is needed every time you
* want to execute a command (searching or any other command), and stores it
* in the instance variable mySimpleConnection.
*/
public void getConnection() {
String sHostName = serverName;
// We need a try/catch block (or a throws clause) because the factory can throw a ConnectionException
try {
/*
* retrieve connection from Factory The hostname can be the name of
* the server if it is listening on the standard port 20005 or a
* combination of Servername:Portnumber eg. MDMSERVER:40000
*/
mySimpleConnection = SimpleConnectionFactory.getInstance(sHostName);
} catch (ConnectionException e) {
// Do some exception handling
e.printStackTrace();
}
}
}


To test the code we also need a simple test class that instantiates the sample class and prints out the number of retrieved records.

package com.sap.sdn.examples;

public class Test {

/**
* @param args
*/
public static void main(String[] args) {
// Create instance of search class
SearchExamples test = new SearchExamples();
// Print out the amount of found records
System.out.println("Total of " + test.Result.getCount() + " Records found that match criteria TYPE=ROOT");
}
}



I wrote a lot of comments in the code, but here are some more details on what happens there.

First of all, there are some instance variables that hold the connection information for the MDM server and the MDM repository.
The constructor sets up some variables that are needed for the search.
A connection to the MDM server is created.
A session is created and authenticated: if there is a trusted connection we use it for authentication, otherwise normal user/password authentication is used.
The search is triggered and the result is stored in an instance variable.
The search needs a few elements to work with.

At the beginning of the SearchTypes() method [Comment 1 in the code] we set up a ResultDefinition, which tells the search which fields are of interest to the program and should be accessible in the result. If the definition does not include a field and you try to access that field later in your code, an exception will be thrown (the sketch after these notes shows how the result is read).

[Comment 2 in the code] Define a search dimension and a search constraint to tell the search where to look and what to look for. There are a lot of search constraints to work with; the full list can be found in the Javadoc of the com.sap.mdm.search package.

[Comment 3. in code] Define the search itself and use the dimension and constraint to set it up.

[Comment 4 in the code] Build the command that is needed to execute the search.
For more details on commands, please see this link.
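
As a small follow-up to the notes above, here is a sketch of how the records returned by the RetrieveLimitedRecordsCommand could be processed. It reuses the SearchExamples class from above and assumes that the accessor methods on RecordResultSet and Record (getCount, getRecord, getFieldValue) behave as I recall them from the Javadoc; check your API version if in doubt.

package com.sap.sdn.examples;

import com.sap.mdm.data.Record;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.ids.FieldId;

public class PrintSearchResult {

    public static void main(String[] args) {
        // Run the search class from above; its public Result field holds the records.
        SearchExamples search = new SearchExamples();
        RecordResultSet result = search.Result;

        // Walk through the result set and print the PRODUCT_NAME field (FieldId 2
        // in my repository). Only fields that were added to the ResultDefinition
        // earlier can be read here (see note 1 above).
        for (int i = 0; i < result.getCount(); i++) {
            Record record = result.getRecord(i);
            System.out.println(record.getFieldValue(new FieldId(2)));
        }
    }
}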

So this is just a very simple example; I will show you more advanced code in future blogs.
If you have any questions about what the code does, or need more detailed explanations, please feel free to comment on this blog. If this code helped you a little, please feel free to comment as well.


So this will be the last Blog for this year since I am rebuilding my flat at the moment. The topic of the next Blog needs to be defined and I will update my agenda in my first Blog next year.


So I wish you a happy new year, or as you would say in German: “Einen guten Rutsch ins neue Jahr!” (roughly: have a good start into the new year).

Best regards,
Tobi

Tobias Grunow is a Solution Consultant for IBSolution GmbH Heilbronn, Germany. He is a member of the research and development team.



Resolve Consolidated Reporting Problem in multiple BI systems through SAP MDM Ankur Ramkumar Goel


Summary

Management uses BI reports for planning, strategy, and business decisions. BI systems receive data from multiple systems in order to provide a consolidated view. However, each transactional system maintains its own master data, which in turn is sent to BI. Master data in BI is therefore quite redundant and does not present a single version of the truth. As a result, management gets incorrect reports and information about the organization's performance, which leads to poor planning and decisions.

We will try to address this issue with the help of a master data management solution, so that management gets a true view of the global business and is enabled with unified business information for optimized operations and faster, more accurate decisions.



Current Scenario

Organizations, for various reasons such as operating across geographies, mergers and acquisitions, and best-of-breed products, have ended up with scattered and distributed IT landscapes. With the introduction of BI for reporting, organizations now typically have separate BI systems for reporting by geography, system, unit, or functionality. Every transactional system needs to create its own master data, since master data drives transactions. These transactional systems are usually connected to their own BI systems and do not share their data with other systems for various reasons such as complexity, the number of connections, the cost of maintaining connections, and a lack of governance. Thus every BI system ends up getting master data from its own transactional systems, which is in turn fed to a corporate BI system for consolidated reporting. Without consistent master data, data warehousing becomes garbage in, garbage out.

Because of all this, master data in BI is redundant and does not show a single version of the truth. Management therefore gets reports presenting incorrect information about the organization's performance, which leads to poor planning and decisions. Management might be looking at reports that show a single customer as three or more customers (e.g. John Kennedy, John F Kennedy, J F Kennedy) or the same with vendors (e.g. Satyam, Satyam Computers, Satyam Computer Services Ltd). Surveys suggest that more than 70% of decisions are made wrongly because of incomplete or incorrect reports and information, and organizations spend around 30% of their time verifying reports. One organization found that 40% of its orders were stuck because of mismatched master data.
BI systems have an ETL layer that is designed for reporting use only; unfortunately it is not configured or optimized for cleansing master data. Organizations have also been maintaining and trying to solve master data problems for many years with their own methods and tools. In doing so they were treating the symptoms but could not solve the root cause of the master data problem. The industry has recognized this problem and has come up with dedicated tools to manage master data. With the evolution of MDM tools, organizations can benefit from best practices, reduced effort, and greater ease. Maintaining master data outside the BI system also helps.

Below are two scenarios showing organizations' distributed IT landscapes.

Scenario 1 – Organization's BI landscape by geography

Scenario 1

This scenario describes an organization landscape distributed by geography, catering to both local and corporate reporting requirements.
Scenario 2 – Organization's BI landscape by functionality

Scenario 2

This scenario describes an organization landscape split by functionality, such as finance and logistics handled separately.


Suggested Approach

A single version of the truth for master data across the organization can be achieved by introducing an MDM system. One organization found that 37% of the vendors in its systems were duplicates.

Since we are handling the organization's master data, the introduction of an MDM system should not be disruptive to other systems or to the business. A small-steps approach is therefore recommended when introducing a master data management system. To get correct consolidated reports, two approaches are suggested below; the second approach has two steps. An organization can go directly to approach 2.2, but that decision has to weigh the disruption to the existing landscape. A strong governance mechanism also has to be in place for CMDM.

Approach 1 Harmonization across BI

Approach 2.1 Master Data from Legacy Systems

Approach 2.2 Central Master Data Management (CMDM)
Approach 1 – Harmonization across BI systems only

Here, we will not interfere with the organization's current master data flow or mechanisms, and therefore we do not disrupt the current landscape and processes. We take master data from the BI systems only, cleanse it, map it, and send it back to the BI systems. The MDM system sends back the mapping of local BI IDs to global MDM IDs. With this approach, management is able to get reports on both local BI IDs and global MDM IDs.


Approach 1 Harmonization across BI systems to achieve consolidated reporting
Benefits:

. Derive immediate benefits from consolidated master data through BI reports

. Mitigated risk of using MDM in the transactional landscape

. Immediate benefit through control over the master data of newly merged and acquired companies.

. No Interface is required between MDM and BI

. Not disruptive to existing landscape
Limitations:

. Master data integrity is achieved only for analytics, not throughout the organization


Approach 2.1 – Master Data from Legacy Systems

In this approach, we introduce the MDM system between the transactional systems and the BI layer. The MDM system gets all the master data from the source systems and maintains it. This cleansed master data is then passed to the BI systems, so that reports are based on consolidated and cleansed master data. Here, too, we are not interfering with the source systems; they continue with their own master data creation and processes.

However, this still does not solve the root cause of the problem. Hence, organizations should afterwards move on to the next approach.


Approach 2.1 Master Data is coming from source systems and fed to BI
Benefits

. Immediate benefit through control over the master data of newly merged and acquired companies.

. Ensures data integrity across Transactional landscape

. Sets stage for easy re-use via distribution or business processes that directly leverage consolidated data.

. Sets stage to achieve CMDM


Approach 2.2 – Central Master Data Management (CMDM)

This is true master data management. However, organizations need to prepare themselves first, and careful planning needs to be done. This approach will fail if a strong governance mechanism is not put in place.

CMDM will be the central master data management system; it will maintain all the master data across the organization's landscape. This results in a single version of the truth across the landscape, and certainly in reporting.


Approach 2.2 Consistent Master Data is coming through CMDM
It is up to the organization to implement CMDM according to its convenience, readiness, and governance mechanism. There are two ways to achieve CMDM:
1. Organizations can continue to create master data in local systems and then have it cleansed and synchronized to the respective systems with the help of MDM.
2. Organizations create master data centrally in MDM only, and it is then sent back to the respective systems to maintain a synchronized version of master data across the landscape.

Benefits

. Centralized management of master data

. Ensures data integrity through Enriching and Standardization of master data across landscape

. Centralized Governance - Ensures that when data changes in one application, other business applications that depend on that data are updated with a consistent view of key data as it changes, in real time.



MDM Benefits

Below are the benefits an organization will get with the help of master data management.

· Global View of Business

· Single view of truth

· Optimized Operations

· Easy Integration of newly acquired systems

· Elimination of manual and Redundant processes

· Full Interoperability

· Greater Visibility

· Better Accuracy

· Reduction in maintenance cost

· Faster cycles

SAP MDM Benefits

· Easy Integration with non-SAP, SAP and BI systems

· Predefined content for configuration

· Driving eSOA

· Java enabled content for EP and other external systems

· No coding required, easy configuration

· Workflows designed in Windows-based Microsoft Visio


Company: Satyam Computer Services Ltd.

Role: Projects Lead / Project Manager

The views shared above are my personal views and may not reflect my company's views.






Ankur Ramkumar Goel has over 7 years of SAP experience in ABAP, BI and MDM, spanning implementation, roll-out, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has worked intensively with MDM for 3 years.



