Friday, February 27, 2009

Controlled/Restricted Access to Data in MDM
sai charan singh


MDM provides security through Users, Roles and Privileges.

Each user has individual access, with his own user name and password.

Each role defines the permitted and restricted areas of the repository.

Privileges are defined on tables, fields and functions as execute, read-only or read/write.


User:

When you create a new user you must provide a user name and password and assign a role; by default the 'Default' role is assigned, and it is replaced when you assign any other role. When you create a repository, an Admin user with a blank password is created automatically, which you use to log into the repository. You can set a custom password for Admin, but you cannot delete the Admin user.







Roles:

When you create a new role you give it a name and assign users to it; by default all functions are enabled for execute and all tables/fields for read/write. When you create a repository, two roles are created automatically. The first is Admin, with all functions enabled for execute and all tables/fields enabled for read/write; this role cannot be edited. The second is Default, which also has all functions enabled for execute and all tables/fields enabled for read/write, but this role can be edited and changes can be made to its functions and tables/fields. Remember, Default is the role assigned when you create a new user. Neither role can be deleted.




Privileges:

Creating users and roles might be child's play, but when it comes to setting the right privileges it is MEN AT WORK: it is very important to assign the right functionality to each role.

The second tab when creating roles lists the available functions. Differentiate each role and understand why it is required: for example, if you want the users assigned to a role only to read and write data, do not grant them permission to delete records. Go through each and every function and set the right access. If you change the first row, Functions [Default], then all rows are affected by default.

The third tab when creating roles shows the different tables; when you expand them you will find their fields. Here you can set access at the table level or for individual fields. Tables and fields can be set to read-only or read/write access.




Constraints:

One of the most important parts, and actually the reason I started this blog, is constraints. You can find the Constraints column as the last column of the tables/fields tab when creating a role. Often you do not want to give a role access to a complete table; you want to filter out a group of records and grant access only to them. In that case, create a Mask or Named Search, select read-only access for all rows, and select read/write only for the required Mask or Named Search, or select a constraint on a lookup table.





1. By default, the 'ALL' option is selected for every constraint. From the drop-down list you can select your own options; it is a multi-valued field.

2. Previously only Masks and lookup tables could be used as constraints, but from SP6 onward Named Searches can be constrained as well.

3. For lookup tables, only non-multi-valued fields of the main table are allowed; that means qualified tables and multi-valued lookup fields in the main table are not available for constraints.

4. When you select a lookup table value as a constraint, both the main table and the lookup table are automatically shortlisted (both tables show only the records matching that constraint).
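The effect of a constraint can be pictured as a filter on top of the role's table-level permission: the table stays readable, and read/write applies only to records matching the constraint value. A minimal sketch of that semantics in plain Java (these classes are hypothetical illustrations, not the MDM API):

```java
import java.util.Set;

// Hypothetical model: a role grants read-only access on a table, plus
// read/write on the subset of records matching a lookup-field constraint.
public class ConstraintDemo {

    static class Rec {
        final String id;
        final String country; // a non-multi-valued lookup field
        Rec(String id, String country) { this.id = id; this.country = country; }
    }

    static class Role {
        // null stands for the default 'ALL' option (no filtering)
        private final Set<String> allowedCountries;
        Role(Set<String> allowedCountries) { this.allowedCountries = allowedCountries; }

        boolean canRead(Rec r) { return true; }  // whole table stays readable
        boolean canWrite(Rec r) {                // writable only inside the constraint
            return allowedCountries == null || allowedCountries.contains(r.country);
        }
    }

    public static void main(String[] args) {
        Role deOnly = new Role(Set.of("DE"));
        System.out.println(deOnly.canWrite(new Rec("1", "DE"))); // true
        System.out.println(deOnly.canWrite(new Rec("2", "FR"))); // false
    }
}
```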

Monday, February 16, 2009

Using MDM Java APIs to add Text(type) taxonomy attribute allowable values
Paras Arora



Now consider this a continuation of my previous blog on MDM Java APIs (taxonomy attributes):

Using MDM Java APIs to retrieve Taxonomy attribute Values - this was part of an attempt to replicate all capabilities of MDM Data Manager using the MDM Java APIs.

During the same exercise I made up my mind to develop a highly customised data manager using MDM Java APIs and Portal UI technologies. My custom Data Manager is still under construction, so it will have to wait before I can share it with the community.

For now, I am sharing the following solution for a scenario that propped up as the result of a brief conversation over a cup of tea.

"MDM taxonomy attributes of type text can have a set of allowable values, e.g. different country names for an attribute 'Country', or different values for an attribute 'Type' of a material. While MDM Data Manager provides an option to add allowable values for a taxonomy attribute at design time, which are then utilised in custom-built MDM Portal applications or MDM business packages, what if the user wants to add a value to the set of allowable values for a text-type taxonomy attribute at runtime? That would give him an additional option to select from the list of allowable values for the attribute. I couldn't find an out-of-the-box set of API methods/interfaces that can achieve this."

After a lot of analysis and an in-depth study of the MDM Java APIs, I found a mechanism with which we can replicate this design-time functionality of MDM Data Manager at runtime using the MDM Java APIs.

The algorithm/code snippet given below can be reused after further customizing or extending it as needed.

We start at the point where we have retrieved the attribute object (retrieved using the attribute ID, which in turn is retrieved from the current record object; refer to the embedded blog link for details).

Depending on your screen design, i.e. the number of entries you want the end user to be able to add to the list of allowable values for the text-type taxonomy attribute, you can utilize the following lines of code.

int textAttrValueCount = 10; // allows the user to add 10 values to the set of allowable values

for (int i = 0; i < textAttrValueCount; i++)
{
    // build a new allowable value with a multilingual name
    TextAttributeValueProperties textValue = new TextAttributeValueProperties();
    baseName = "TextValue" + i + System.currentTimeMillis();
    MultilingualString textValueName = MultilingualHelper.createMultilingualString(regDefs, baseName);
    textValue.setName(textValueName);

    // a tooltip string can be prepared the same way and attached via the
    // corresponding setter, if your scenario requires one
    baseName = "Tooltip" + i + System.currentTimeMillis();
    MultilingualString tooltip = MultilingualHelper.createMultilingualString(regDefs, baseName);

    // add the new value to the attribute's set of allowable values
    attr.addTextAttributeValue(textValue);
}

attr = the text-type taxonomy attribute object to which the allowable values are to be added

Utilizing the code piece or the approach outlined above, one can customize and extend the MDM business packages (so that the end user gets the option to add to the allowable values of a text-type taxonomy attribute), or integrate it into a Web Dynpro application sitting on top of the MDM repository, giving the end user an option that is otherwise available only at design time, i.e. in MDM Data Manager.

Sunday, February 15, 2009

SAP NetWeaver TEP12 Questions - Part 2

1) Accessing a Portal Component in the Default Mode

Ans : doContent();

2) Extending this class when Developing your Portal Components

Ans : AbstractPortalComponent.

3) Portal Runtime call the methods in the Life Cycle

Ans : init(),

service()

destroy()

4) What are the parameters that we have to pass to doContent()?

Ans : IPortalComponentRequest ,IPortalComponentResponse;

5) How do you access a resource from the request object

Ans : request.getResource();

6) In the personalization concept, what data types does the type attribute support?

Ans : String , Date , Select , Boolean

7) How do you get a property from the IPortalComponentProfile

Ans : getProperty(String)

8) Which method has to be overridden by a class that extends PageProcessorComponent

Ans : getPage();

9) Give the sequence of methods execution of DynPage

Ans :

1) doInitialization()

2) doProcessAfterInput()

3) doProcessBeforeOutput()

10) Sequence of method calls when an event occurs

Ans :

1) doProcessAfterInput()

2) the on<Event>() handler

3) doProcessBeforeOutput()

11) How do you get the current event?

Ans :

IPageContext myContext = PageContextFactory.createPageContext(request, response);

Event event = myContext.getCurrentEvent();

12) onClientClick() and onClick() are specified then which method will be called first

Ans : onClientClick();

13) JSPDynPage uses _________ type of approach

Ans : Model View Controller

14) The two properties in the component profile that indicate a JSP needs to be compiled into a portal component:

Ans : Property name = “JSP”

Property name = “ComponentType”

15) Which one of the following statements is true

a)

<%@ taglib uri = "hbj" prefix = "htmlb" %>

b)

<%@ taglib uri = "hbj" prefix = "hbj" %>

c)

<%@ taglib uri = "hbj" prefix = "htmlb" %>

d)

<%@ taglib uri = "htmlb" prefix = "hbj" %>



Ans : a



16) How do you call a JSP file?

Ans : setJspName();



17) If JavaScript is used, the _______ tag is necessary for the page

Ans : Page tag



18) Which tag is used for including bean in the jsp file

Ans :



19) What is the scope of the bean

Ans : Session



20) Name the objects that extend IPrincipal

Ans :

IGroup, IRole, IUser, IUserAccount, IUserMaint



21) ____is the Central object from which all UME object factories are obtained

Ans : UMFactory

22) IUser user = UMFactory.getUserFactory().getUserByLogonID(uid);

String userName = user.getDisplayName();

String email = user.getEmail();

response.write("userName: " + userName + " Email: " + email);



1) Displays the username and Email ID

2) Throws an exception

3) Doesn’t Compile



Ans : 1.

23) List the methods used to create a user successfully

Ans :

1) NewUser(uid);

2) setFirstName()

3) setLastName()

4) setEmail()

5) setPassword();
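The sequence above can be pictured as a small builder flow: obtain a mutable user object, fill its attributes, and only then commit it. A generic sketch in plain Java (hypothetical classes, not the actual UME IUserFactory/IUserMaint signatures):

```java
import java.util.HashMap;
import java.util.Map;

// Generic sketch of the user-creation sequence: newUser(uid), then the
// setters, then a completeness check a commit step would enforce.
public class NewUserDemo {

    static class MutableUser {
        final String uid;
        final Map<String, String> attrs = new HashMap<>();
        MutableUser(String uid) { this.uid = uid; }
        void setFirstName(String v) { attrs.put("firstname", v); }
        void setLastName(String v)  { attrs.put("lastname", v); }
        void setEmail(String v)     { attrs.put("email", v); }
        void setPassword(String v)  { attrs.put("password", v); }
        boolean isComplete() {      // a commit would reject an incomplete user
            return attrs.containsKey("email") && attrs.containsKey("password");
        }
    }

    static MutableUser newUser(String uid) { return new MutableUser(uid); }

    public static void main(String[] args) {
        MutableUser u = newUser("jdoe");   // 1) newUser(uid)
        u.setFirstName("John");            // 2)
        u.setLastName("Doe");              // 3)
        u.setEmail("jdoe@example.com");    // 4)
        u.setPassword("secret");           // 5)
        System.out.println(u.isComplete()); // true
    }
}
```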





24) Can we construct a unique ID manually?

Ans : False; unique IDs cannot be created manually.



25) Unique IDs are used to identify objects across data sources.

Ans : True



26) How do you retrieve logon information

Ans : umdata.enrich(map);



27) What is the return type of map.get(“”);

Ans : String



28) How do you load the data in client eventing across the iViews

Ans : EPCM.loadClientData();



29) Which object is available on the pages

Ans : EPCM

30) What problems of servlets does HTMLB overcome

Ans :

Visualization and business logic are not separated

Development has to take care of different web clients and versions

Namespace conflicts with form elements



31) Stored data is identified by the key …..

Ans : Namespace+name



32) Framework levels

Ans :

Level 0: supported by neither JavaScript nor Java

Level 1: browser only (JavaScript)

Level 2: both JavaScript and Java



33) Features of portal services in the portal

Ans :

1) Portal services are a way to provide functionality to portal components

2) Portal services implement no user interface

3) Portal services may be accessed from outside the portal framework



34) Why do we need custom portal services in the portal

Ans :

1) Can be used by other portal applications

2) Provide commonly used functionality

3) Can be exposed as a web service

35) To build a new portal service ……interface must be implemented

Ans :

IService



36) Life cycle of a portal service

Ans :

Init()

Afterinit()

Destroy()



37) Service name =

Ans :




38) If the portal service name is myService, what would be the name of the interface that extends IService

Ans : IMyService



39) JCA/J2EE Connector Architecture is not an API. True/False

Ans : True

40) The Connector Framework is the SAP-extended API on top of the CCI. All methods in the Connector Framework have the suffix Ex().

Ans : False, only some methods do



41) What is the method used to get a connection with the Java connectors

Ans : service.getConnection();



42) How do you get the locale from the request object

Ans : request.getLocale();



43) What is the return type for a table-type structure?

Ans : IRecordSet.



44) Give the name of the method that returns the resource bundle for the current locale.

Ans : getResourceBundle()



45) Localization.properties

Localization_de_DE.properties.

Localization_en_EN.properties.

What is the Resource bundle name : ?

Ans : Localization
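The lookup order behind this answer follows the standard java.util.ResourceBundle naming scheme: the runtime tries the most specific file first and falls back to the plain base name. A small, self-contained sketch that computes the candidate file names for a locale (plain JDK, no portal classes):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

// Computes the .properties candidates a ResourceBundle lookup would try,
// from most specific to the plain base (bundle) name.
public class BundleNames {

    static List<String> candidates(String baseName, Locale locale) {
        List<String> names = new ArrayList<>();
        if (!locale.getCountry().isEmpty()) {
            names.add(baseName + "_" + locale.getLanguage() + "_" + locale.getCountry() + ".properties");
        }
        if (!locale.getLanguage().isEmpty()) {
            names.add(baseName + "_" + locale.getLanguage() + ".properties");
        }
        names.add(baseName + ".properties"); // final fallback: the bundle name itself
        return names;
    }

    public static void main(String[] args) {
        // Locale de_DE resolves against the bundle named "Localization"
        System.out.println(candidates("Localization", new Locale("de", "DE")));
        // prints [Localization_de_DE.properties, Localization_de.properties, Localization.properties]
    }
}
```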



46) What is the data type that returns by the method getString (key)



Ans : String



47) How do you access a key in the properties file containing xyz = abc

Ans : getString (“xyz”)

48) What type of objects can be translated

Ans : Text

49) The portal translation process is supported by these tools

Ans :

Translation worklist coordination

Worklist translation



50) To customize the portal logoff screen, the ____ file is changed

Ans : masthead

51) SAP recommends not to modify the SAP code; what are the processes to customize it?

Ans :

1) Copy the existing file and rename it according to the customer namespace

2) Create a new custom component



52) How can we customize the company branding



Ans :

1) masthead

2) through a customized application





53) There was one question on the Desktop Innerpage




54) What are the components that are added to the Portal Desktop

Ans :

Default framework page

Themes



55) What is the jsp name that contains log on page

Ans : umLogonPage.jsp

56) authschemes.xml is modified to get a custom logon component



57) How do you access portal services from Web Dynpro applications

Ans : WDPortalUtils



58) Cache levels:

Ans : none, session, user, shared




59) getCachingLevel() is used to get the Cachelevel



60) When is the doInitialization() method called

Ans :

1) When the page is loaded for the first time

2) When the page is refreshed

3) When the object is called from another object



61) Cached objects are retrieved using the --- method

Ans : get(key)

62) How can a portal service access an external web service

Ans :

Generate a Java proxy out of the WSDL file with the PDK

You can execute the Java proxy as a portal service

SAP NetWeaver TEP12 Questions - Part 1


1. Portal Applications ..
* Can be assigned to portal roles
* Are typically developed using NWDS
* Are stored in PAR files
* Are bundles of one or more portal components or portal services
* Are developed in portal content studio

2. Roles can be assigned to
* WorkSets
* Groups
* Roles
* iViews
* Users



3. Which of the following statements about the Software Architecture of Enterprise Portal are true
* Portal services act like interfaces that enable the exchange of data and procedures
* The User Management Services is an interface between portal run time and the UME
* Page Builder is the portal component responsible for assembling the pages
* PRT is executed according to user requests, generating HTML output to display on client
* PRT service can be exposed as Web service



4. Business Packages are
* Always free of charge
* Are exclusively developed by SAP
* Typically contains iviews and worksets
* Can be downloaded from sdn.sap.com
* Can be downloaded from service.sap.com

5. What are the three main building blocks of SAP Enterprise Portal?
* Portal Runtime, Portal Server and Portal Framework
* Portal Platform, Unification and content Management
* Portal Platform, KM and Collaboration
* Portal Framework, Content Management and Collaboration

6. In Which Functional area is KM positioned
* Lifecycle Management
* Composite Application Framework
* Information Integration
* People Integration
* Process Integration



7. Which Functional Areas of SAP NetWeaver are delivered by SAP EP
* Multi Channel Access
* Knowledge Management
* Collaboration
* Portal
* Integration Broker



8. Pages can be assigned to
* Worksets
* Groups
* Roles
* iViews
* Users
* Pages
9. Worksets can be assigned to
* Worksets
* Groups
* Roles
* iViews
* Users
* Pages



10. Which of the following statements are correct with regard to Portal Content Studio?
* The PCD is the tool to access PCD Objects offering browse and search interface
* PCD shows all PCD objects to every content administrator user
* Different editors or wizards are offered according to the view used to access the Portal Content Catalog
* Both in Browse and Search, the view of the objects is organized by the type of object

11. What's the difference between PCD folders and folders within a workset/role?
* The MergeID property can be maintained only for PCD Objects
* The name can be maintained in different languages only for workset/role folders
* The sort priority property can be maintained only for workset/role folders
* ACL’s can be maintained only for PCD folders
* Only PCD Folder names appear in Top Level Navigation and detailed navigation





12. You can integrate the following SAP Applications into SAP NetWeaver Portal as iViews
* SAP Transactions
* IAC Applications
* BSP Applications
* BEx Web Applications
* Web Dynpro for Java and ABAP
* Web Services



13. A Portal role determines
* Entries in Top Level Navigation
* Entries in Detail Level Navigation
* Portal Content user can access
* Authorizations in BackEnd Systems

14. Please Choose the type of Database required for VC connection when not accessing SAP System
* Any SAP DGDB compliant Database
* Any JDBC Compliant Database
* Any Oracle Database



15. What form of output is generated when a VC iView is created
* HTMLB
* WebDynpro for ABAP
* XML
* Java Applet

16. When using VC which of the following could be used to retrieve information from SAP System
* Java API
* BADI
* RFC
* BAPI
* Function Module



17. Please select the option that does not describe the portal runtime
* Provides a runtime for all non-Java applications
* Defines and manages the objects that make up the portal environment
* Provides runtime and its corresponding services
* PRT is one basic part of portal environment integrated into Web AS
* Provides Development environment

18. Please select the correct alternative with respect to Portal Applications
* Portal Applications are bundles of Portal components and API code
* Portal Application are bundles of Portal Services and API code
* Portal Applications are bundles of Deployment Descriptors and Portal Components
* Portal Applications are bundles of Portal Components and Portal Services



19. Which class is used to determine locale dependent strings?
* getLocale
* GoupResource
* BundleResource
* ResourceBundle
* ResourceGroup

20. Please select the correct form of localization file?
* XXX.de.DE.properties
* DE_de_xxx.properties
* EN_XXX_DE.properties
* XXX_Prop_EN.property

21. Which syntax is most likely to give value of single property?
* Profile.getPropertyAttributes(“name”);
* Profile.setParameter(“name”);
* Profile.getValue(“name”);
* Profile.getProperty(“name”);
* Profile.getParameter(“name”);



22. Which syntax is most suitable to get value of parameter?
* Request.getParameter(“name”);
* Request.getValue(“name”);
* Response.getParameter(“name”);
* Response.getPValue(“name”);

23. Which method would be used to retrieve the value of the text?
* getValue(key)
* getGroup(key)
* getText(key)
* getString(key)
* getLocale(key)

24. Which method returns the current user's locale?
* setLanguage
* setResourceBundle
* setLocale
* getLanguage
* getLocale


25. Which method from the following methods return the resource bundle?
* getLocale
* getResourceGroup
* setResourceBundle
* getResourceBundle
* setResourceGroup

26. Please select the three types of personalization within the portal?
* Folder
* Workset
* Portal
* iView
* Page

27. When describing the personalization properties – what are the four types?
* Date, Select, Boolean, String
* Time, Date, Boolean, Select
* Time, Date, Select and String
* Date, Time, String and Boolean



28. Localization property files must be accessible by the Java class loading mechanism and must be packaged in the folder of the PAR files accessible by the Java Class Loader. Please select the correct examples of these files.
* JAR file in API section
* Properties file in the PORTAL-INF/private/classes
* JAR file in PORTAL-INF/private/lib
* JAR file in PORTAL-INF/lib
* Properties file in the PORTAL-INF/classes

29. What gives the Portal Service a view on the Portal Environment?
* ServiceContext
* IPortalService
* IPortalServiceContext
* IServiceContext
* IService
* IMyService

30. What interface needs to be implemented to build a new Portal Service?
* IPortalService
* IService
* MyService
* IMyService
* Service


31. The extension of our portal applications after uploading into a portal could be
* .rtf
* .txt
* .zip
* .par
* .doc



32. What portal development view could we use to add a new property without physically typing the code into the portalapp.xml?
* EP perspective
* Package explorer
* Debugging
* Outline
* Console

33. Please choose the methods that do not belong to the IUser interface
* setFirstGirlFriend
* setlastName
* setFirstName
* setHomeAddress
* setWorkAddress
* setEmail

34. Please select the non LDAP directory from the following?
* IPlanet
* Siemens
* MS-ADS
* SAP DB
* Novell

35. What service creates users in the external systems?
* User Management Engine
* Persistence Adapters
* Replication Manager
* Persistent Manager
* LDAP directory
* Database



36. Choose the interface from the following list that enables user maintenance.
* IUserMaint
* IUser
* IPrincipal
* IPusher
* IRole
* IGroup

37. After the following search : ISearchResult result = userFact.searchUsers(userFilt), what state would determine the search was successful?
* SEARCH_RESULT_CORRECT
* SEARCH_RESULT_PASS
* SEARCH_RESULT_FOUND
* SEARCH_RESULT_OK
* SEARCH_RESULT_SUCCESS

38. After the following search : ISearchResult result = userFact.searchUsers(userFilt), what state would determine the search was unsuccessful?
* SEARCH_RESULT_UNDEFINED_STATE
* SEARCH_RESULT_BAD
* SEARCH_RESULT_INCORRECT
* SEARCH_RESULT_UNKOWN
* SEARCH_RESULT_INCOMPLETER



39. Which of the following role is to be given to Portal Developer
* PortalDeveloper
* ContentDeveloper
* JavaDeveloper
* SystemDeveloper
* JavaAnalyst
* JavaProgrammer
* PortalApp

40. A J2EE application server vendor is responsible for shipping a set of Java APIs; these would be called
* Java Connector
* J2EE Application server
* JCA API
* EIS Systems
* Common Client Interface
* Resource Adapters

41. EIS vendors are responsible for developing what, to shield the developer from the complexity of the EIS APIs?
* Java Connector
* J2EE Application server
* JCA API
* EIS Systems
* Common Client Interface
* Resource Adapters

42. Name of the architecture which defines interface for connection?
* JCA
* JTA
* JMS
* EIS
* ERP
* MYDB

43. BAPI stands for
* Business Additional Procedures Interface
* Business Applications Programming Interface
* Backward Applied Program Interface
* Business Applied Procedural Interface
* Business to Business Application programming interface



44. Which entry in the deployment descriptor deals with Connector Framework
* sharingReference
* ConnectorDB
* System_Alias
* ServicesReference
* ServiceReference
* privateSharingReference

45. What is an interaction when used in the following syntax: Interaction Ix = connection.createInteractionEx();
* Establishes the interaction with the EIS system
* Obtains the actual connection
* Describes the data needed to call a specific function
* Destroys the connection
* Establishes the connection
* Creates an interface to jar files

46. To iterate through the record what method could be used?
* Next()
* Iterate()
* getMore()
* getAnother()
* getNext()




47. In a JDBC connection – what format is the returned data held in ?
* IRecordData
* IRecordHeadings
* ISet
* IRecordSet
* IRecord

48. Please select the correct statements from the following
* EPCF has levels 1, 2, 3
* EPCF has levels 1, 2, 3 and 4
* EPCF has levels 0, 1, 2
* EPCF has no levels
* Represents the sourceID

49. When subscribing to an event with these parameters what is the event handler?
(“urn”,”ABC”,eventHandler)
* Is an optional parameter
* Represents the data object
* Represents the JavaScript function that is called
* Contains the data needed for the client side communication
* Represents the sourceID

50. Please choose the correct syntax to subscribing for an event.
* EPCF.raiseEvent(“urn”,”ABC”, eventHandler);
* EPCM.subscribeEvent(“urn”,”ABC”,eventHandler);
* EPCM.raiseEvent(“urn”,”ABC”,eventHandler);
* EPCF.subscribeEvent(“urn”,”ABC”,eventHandler);

51. Please select the correct statement from the following?
* ClientEventing: common communication channel for java application communicating on the client side
* ClientEventing: common communication channel for javascript application communicating on the client side
* ClientEventing: common communication channel for javascript application communicating on the server side
* ClientEventing: common communication channel for java application communicating on the server side



52. Please select the correct statement from the following?
* Client Data Bag: a client side JavaScript object which serves as cross-iView storage mechanism
* Client Data Bag: a client side Java object which serves as cross-iView storage mechanism
* Client Data Bag: a server side JavaScript object which serves as cross-iView storage mechanism
* Client Data Bag: a server side Java object which serves as cross-iView storage mechanism



53. Please select the correct navigation syntax within the WebDynpro iView
* WD.navigateAbsolute(“ROLES://portal_content/Portal_Role/SimplePage.
* Portal.navigateAbsolute(“PCD://portal_content/Portal_Role/SimplePage.
* WDPortalNavigation.navigateAbsolute(“ROLES://portal_content/Portal_Role/SimplePage.
* WDPortalNavigation.navigateAbsolute(“PCD://portal_content/Portal_Role/SimplePage.
* PortalNavigation.navigateAbsolute(“PCD://portal_content/Portal_Role/SimplePage.



54. Please select the correct statements from the following?
* Instead of EPCM, Web Dynpro uses WDPortalEventing, e.g. WDPortalEventing.subscribe
* Eventing between other portal iViews is supported
* Web Dynpro does not have a wizard for eventing
* The system used is normally “SAP_LocalSystem” with WD iViews in the portal
* Portal eventing with Web Dynpro iViews can interact with events from different URL domains
* Web Dynpro applications can be embedded in iViews

Monday, February 9, 2009

How to work with Command in the new MDM Java API
Vijendra Singh Bhanot


I am new to the MDM Java API. For one of our existing projects I was asked to explore the new MDM Java API. I have no experience with the earlier MDM4J, but I was still able to understand and explore the new MDM Java API. I think this API is excellent, and I have almost fallen in love with it. The part I like best about this API is the Commands. Initially it took me some time to understand the concept, but very soon it was all clear. With this blog I would like to share what I have learned about working with Commands. You will also see an example of a validation command in this blog.




Why do you need a Command? What is a Command?


I command the MDM Java API to get me the list of Validations …..

Well, that’s what the idea is.


Command is a special class that instructs the MDM Java API to perform some action. All actions, such as managing repository content, searching records, managing the repository schema and so on, have dedicated commands.
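The underlying idea is the classic Command design pattern: each unit of work is wrapped in an object that is configured through setters and then executed. A generic, self-contained sketch of that shape (plain Java; the class names here are illustrative stand-ins, not the SAP classes):

```java
import java.util.List;

// Generic command shape: configure via setters, then execute(); missing
// required parameters fail at execute time, as with the MDM commands.
public class CommandDemo {

    static class CommandException extends RuntimeException {
        CommandException(String msg) { super(msg); }
    }

    static class RetrieveItemsCommand {
        private String session;        // required parameter
        private List<String> result;

        void setSession(String session) { this.session = session; }

        void execute() {
            if (session == null) {
                throw new CommandException("session is required");
            }
            result = List.of("check-price", "check-stock"); // stand-in payload
        }

        List<String> getResult() { return result; }
    }

    public static void main(String[] args) {
        RetrieveItemsCommand cmd = new RetrieveItemsCommand();
        cmd.setSession("session-123");
        try {
            cmd.execute();
        } catch (CommandException e) {
            e.printStackTrace();
            return;
        }
        System.out.println(cmd.getResult());
    }
}
```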


All these commands are logically organized into packages. You can always refer to the Javadoc to identify which command you need to use (https://help.sap.com/javadocs/MDM/current/index.html).

How to Use a Command?



All commands are used in the following way:



1 RetrieveValidationsCommand objRetrieveValidationsCommand = new RetrieveValidationsCommand(connectionAccessor);

2 objRetrieveValidationsCommand.setSession(session);

3 objRetrieveValidationsCommand.setTableId(tableId); // Required

try {

4 objRetrieveValidationsCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}



(1) You create a new instance of a particular command, passing it the ConnectionAccessor object.

(2) The second step is usually setSession(). Here it is important to use the right type of session (server session, repository session or user session). For example, you may not be able to get the list of validations if you use a repository session instead of a user session.

As mentioned, setSession() is not always the second step. The commands responsible for creating sessions are exactly the ones that do not require a session; thus CreateServerSessionCommand, CreateRepositorySessionCommand and CreateUserSessionCommand do not require setSession().

(3) Some commands require specific setter methods to be called before they are executed. These setters are marked as "Required" in the Javadoc; a few are marked optional.

(4) Inside a try block you execute the command. If there is an error then CommandException is thrown.


Here is a sample of RetrieveValidationsCommand in action…

/*
* Created on Feb 7, 2008
*
* To change the template for this generated file go to
* Window>Preferences>Java>Code Generation>Code and Comments
*/
package demo.validation;

import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.GetRepositoryRegionListCommand;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.ConnectionPool;
import com.sap.mdm.net.ConnectionPoolFactory;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;
import com.sap.mdm.validation.ValidationProperties;
import com.sap.mdm.validation.ValidationPropertiesResult;
import com.sap.mdm.validation.commands.RetrieveValidationsCommand;

/**
*
*
* To change the template for this generated type comment go to
* Window>Preferences>Java>Code Generation>Code and Comments
*/
public class GetListOfValidations {

public static void main(String[] args) {
// create connection pool to a MDM server
String serverName = "LOCALHOST";
ConnectionPool connections = null;
try {
connections = ConnectionPoolFactory.getInstance(serverName);
} catch (ConnectionException e) {
e.printStackTrace();
return;
}

// specify the repository to use
// alternatively, a repository identifier can be obtained from the GetMountedRepositoryListCommand
String repositoryName = "INQDemo";
String dbmsName = "localhost";
RepositoryIdentifier reposId =
new RepositoryIdentifier(repositoryName, dbmsName, DBMSType.ORACLE);

// get list of available regions for the repository
GetRepositoryRegionListCommand regionListCommand =
new GetRepositoryRegionListCommand(connections);
regionListCommand.setRepositoryIdentifier(reposId);
try {
regionListCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
RegionProperties[] regions = regionListCommand.getRegions();

// create a user session
CreateUserSessionCommand sessionCommand =
new CreateUserSessionCommand(connections);
sessionCommand.setRepositoryIdentifier(reposId);
sessionCommand.setDataRegion(regions[0]); // use the first region
try {
sessionCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}
String sessionId = sessionCommand.getUserSession();

// authenticate the user session
String userName = "admin";
String userPassword = "admin";
AuthenticateUserSessionCommand authCommand =
new AuthenticateUserSessionCommand(connections);
authCommand.setSession(sessionId);
authCommand.setUserName(userName);
authCommand.setUserPassword(userPassword);
try {
authCommand.execute();
} catch (CommandException e) {
e.printStackTrace();
return;
}

// the main table, hard-coded
TableId mainTableId = new TableId(1);

// Get the list of validations

RetrieveValidationsCommand objRtvVldCmd =
new RetrieveValidationsCommand(connections);
// set the user session
objRtvVldCmd.setSession(sessionId);
// get validation for the following tables.
objRtvVldCmd.setTableId(mainTableId);

try {
objRtvVldCmd.execute();

} catch (CommandException e) {
e.printStackTrace();
return;
}

ValidationPropertiesResult objVldPropRslt =
objRtvVldCmd.getValidationPropertiesResult();

ValidationProperties[] validations = objVldPropRslt.getValidations();


//disply --> Validation ID | error/warning message | Validation Name
for (int i = 0; i < validations.length; i++) {

System.out.println(
validations[i].getId()
+ " | "
+ validations[i].getMessage()
+ " | "
+ validations[i].getName());
}

}
}

Vijendra Singh Bhanot is a certified XI Consultant

Thursday, February 5, 2009


MDM Java API 2 an introductive series part IV
Tobias Grunow


Introduction

In my last blog I showed you how to get an authenticated user session using a trusted connection. In this part I want to show you how to gather MDM system information from a SAP Portal system object.
After that I will introduce the basic concept behind working with MDM and the Java API 2, showing you how the system works (RecordID, lookup values, display fields...).

If you build a solution based on MDM, you will sooner or later face a two- or three-tier MDM system landscape, meaning that there will be a development system (D-system), possibly a quality assurance system (Q-system), and a productive system (P-system). This scenario can involve multiple MDM servers, or simply different repositories on the same server, and possibly separate D-, Q- and P-clients. So you face the problem of how to address the different repositories from your code without having to take care of addressing the right one depending on which landscape you are in.
In case you are using a SAP Portal, I want to point to a very useful thread I have found on SDN.

Hard-code credentials - any other solution exists?

This forum post shows you how to work with SAP Portal system objects and how to retrieve information from them. I have used this concept, in combination with the trusted-connection user session authentication, to address the problem I have just pointed out, and it worked fine for me.

Let us continue with the next topic...
MDM Java API 2 introducing the concept

First of all I want to give you a brief introduction to the concept of MDM, to make my future coding easier to understand. As a coder you need a much better understanding of the concepts behind MDM than anyone else. Plain GUI (Graphical User Interface) users do not need to understand the technical details behind the data storage and operations the way Java programmers do. So at the beginning the most important task is to understand how MDM stores data and connects the data with each other.
So let’s start with the first graphic:
image
Figure 1: Table concept in MDM

The first graphic shows the table concept of MDM. The central table, called the "Main table", is the centre of the model. From the main table, references point towards various kinds of sub tables. The main table contains fields. Those fields can hold values which are stored directly in the table, or hold a reference to a record of a sub table. Sub tables can store data values or hold references to other sub tables.
To illustrate the possible layout of a main table take a look at figure 2.

[Click the picture to enlarge!]
image
Figure 2: Possible layout in main table (MDM Console view)

As you can see, the main table stores e.g. text values directly as well as references to other sub tables (lookups of different kinds). To find out where the lookups point to, you can open the MDM Console and click on an entry in the list of fields in the main table to see its details (Figure 3).


image
Figure 3: Main table field details (Bottom part of MDM Console)


If we look at the details we have to notice two important things.
First, figure 3 shows us a field detail named “CODE”. This code is very important for us because it is the name we will use in our code to address this field in the table. Second, we have to notice that the field is of “Type” Lookup [Flat]. This tells us that the value we will find in this field will be of type RecordID.
A RecordID is the ID of the record in the sub table (e.g. Sales Product Key – Table [Detail: Lookup Table]). This means that we will not be able to access the value directly by accessing the field in the main table. We will only get the reference to the sub table which holds the actual value desired. In my future blogs I will give more details on the concepts behind MDM and give examples of other techniques used.
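The indirection described above can be sketched in a few lines of plain, self-contained Java. To be clear, this is a conceptual model only, not the SAP MDM API, and the table contents and IDs below are made up for illustration:

```java
import java.util.HashMap;
import java.util.Map;

/** Minimal model of MDM's lookup indirection (illustrative data, not the SAP API). */
public class LookupModel {

    /** Sub table: record ID -> actual value (e.g. a Sales Product Key lookup table). */
    static final Map<Integer, String> SALES_PRODUCT_KEY = new HashMap<>();

    /** Main table field of type Lookup [Flat]: it stores only a RecordID. */
    static final Map<Integer, Integer> MAIN_TABLE_LOOKUP_FIELD = new HashMap<>();

    static {
        SALES_PRODUCT_KEY.put(17, "SPK-4711");
        MAIN_TABLE_LOOKUP_FIELD.put(1, 17); // main record 1 references sub record 17
    }

    /** Resolving a lookup field is a two-step read: field -> RecordID -> sub-table value. */
    static String resolveLookup(int mainRecordId) {
        Integer recordId = MAIN_TABLE_LOOKUP_FIELD.get(mainRecordId); // step 1: the reference
        return SALES_PRODUCT_KEY.get(recordId);                       // step 2: the value
    }
}
```

The two-step read mirrors what figure 3 tells us: the main-table field never holds the display value itself, only the key into the sub table.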
So enough of MDM concepts and let’s get to the Java API and some examples.
Searching in MDM

Searching will be one of the most commonly used pieces of MDM functionality.
Searching in a repository has some prerequisites: first, a connection to the MDM Server, and second, an authenticated user session on a repository. In my previous blogs I showed you how to set up those prerequisites. In the class provided I have combined all the necessary steps to get the connection and the user session. So now let's get to the code.

package com.sap.sdn.examples;

import java.util.Locale;

import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CommandException;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.SetUnicodeNormalizationCommand;
import com.sap.mdm.commands.TrustedUserSessionCommand;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.data.ResultDefinition;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.net.ConnectionAccessor;
import com.sap.mdm.net.ConnectionException;
import com.sap.mdm.net.SimpleConnectionFactory;
import com.sap.mdm.search.FieldSearchDimension;
import com.sap.mdm.search.Search;
import com.sap.mdm.search.TextSearchConstraint;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;

public class SearchExamples {

    // Instance variables needed for processing
    private ConnectionAccessor mySimpleConnection;
    // Name of the server that MDM runs on
    private String serverName = "IBSOLUTI-D790B6";
    // Name of the repository shown in the MDM Console
    private String RepositoryNameAsString = "SDN_Repository";
    // Name of the DB server; this could also be an IP address
    private String DBServerNameAsString = "IBSOLUTI-D790B6\\SQLEXPRESS";
    // Define the database type (MS SQL Server)
    private DBMSType DBMSTypeUsed = DBMSType.MS_SQL;
    // Create a new data region
    private RegionProperties dataRegion = new RegionProperties();
    // Session which will be used for searching
    private String userSession;
    // Default user name
    private String userName = "Admin";
    // Password is empty on default setup
    private String userPassword = "";
    // Result we will get from MDM
    public RecordResultSet Result;

    /**
     * Constructor for class
     */
    public SearchExamples() {
        // Set the data region
        dataRegion.setRegionCode("engUSA");
        // Set the locale on the data region
        dataRegion.setLocale(new Locale("en", "US"));
        // Set the name of the data region
        dataRegion.setName("US");
        // Get a connection to the server
        this.getConnection();
        // Authenticate a user session
        try {
            this.getAuthenticatedUserSession();
        } catch (ConnectionException e) {
            // Do something with exception
            e.printStackTrace();
        } catch (CommandException e) {
            // Do something with exception
            e.printStackTrace();
        }
        // Get resulting records
        Result = this.SearchTypes();
    }

    // ResultDefinition for the main table
    private ResultDefinition rdMain;

    /**
     * Method that will search for all records in the main table of a certain type
     *
     * @return RecordResultSet that holds all resulting records from the search
     */
    public RecordResultSet SearchTypes() {

        /**
         * 1. First we create the result definition. This result definition will
         * tell the search which fields are of interest to us. The list could
         * include all fields of the table or only the ones we are interested in.
         */
        // Define which table should be represented by this ResultDefinition
        // In my repository this is the table MAINTABLE
        rdMain = new ResultDefinition(new TableId(1));
        // Add the desired FieldIds to the result definition
        // In my repository this is the field PRODUCT_NAME
        rdMain.addSelectField(new FieldId(2));
        // In my repository this is the field TYP
        rdMain.addSelectField(new FieldId(27));

        /**
         * 2. Create the needed search parameters.
         * Define what to search for and where.
         */
        // Create the field search dimension [Where to search!?]
        FieldSearchDimension fsdMaintableType = new FieldSearchDimension(new FieldId(27));
        // Create the text search constraint [What to search for?! (Every record that contains ROOT)]
        TextSearchConstraint tscTypeRoot = new TextSearchConstraint("ROOT", TextSearchConstraint.CONTAINS);

        /**
         * 3. Create the search object with the given search parameters.
         */
        // Create the search
        Search seSearchTypeRoot = new Search(new TableId(1));
        // Add the parameters to the search
        seSearchTypeRoot.addSearchItem(fsdMaintableType, tscTypeRoot);

        /**
         * 4. Create the command to search with and retrieve the result.
         */
        // Build the command
        RetrieveLimitedRecordsCommand rlrcGetRecordsOfTypeRoot = new RetrieveLimitedRecordsCommand(mySimpleConnection);
        // Set the search to use for the command
        rlrcGetRecordsOfTypeRoot.setSearch(seSearchTypeRoot);
        // Set the session to use for the command
        rlrcGetRecordsOfTypeRoot.setSession(this.userSession);
        // Set the result definition to use
        rlrcGetRecordsOfTypeRoot.setResultDefinition(rdMain);
        // Try to execute the command
        try {
            rlrcGetRecordsOfTypeRoot.execute();
        } catch (CommandException e) {
            // Do something with the exception
            e.printStackTrace();
        }
        // Return the result
        return rlrcGetRecordsOfTypeRoot.getRecords();
    }

    /**
     * Create and authenticate a new user session to an MDM repository.
     * Uses the instance variables: the connection to the MDM Server, the
     * repository name, the DB server name, the DBMS type, the data region
     * (defining the language the repository should be connected with), the
     * user name and the user password (used if the connection is not trusted).
     *
     * @throws ConnectionException
     *             is propagated from the API
     * @throws CommandException
     *             is propagated from the API
     */
    public String getAuthenticatedUserSession() throws ConnectionException, CommandException {
        /*
         * We need a RepositoryIdentifier to connect to the desired repository.
         * Parameters for the constructor are: the repository name as string, as read
         * in the MDM Console in the "Name" field; the DB server name as string, as
         * used while creating a repository; the DBMS type - valid types
         * are: MSQL, ORCL, IDB2, IZOS, IIOS, MXDB
         */
        RepositoryIdentifier repId = new RepositoryIdentifier(
                RepositoryNameAsString, DBServerNameAsString, DBMSTypeUsed);
        // Create the command to get the session
        CreateUserSessionCommand createUserSessionCommand = new CreateUserSessionCommand(
                mySimpleConnection);
        // Set the identifier
        createUserSessionCommand.setRepositoryIdentifier(repId);
        // Set the region to use for the session - (language)
        createUserSessionCommand.setDataRegion(dataRegion);
        // Execute the command
        createUserSessionCommand.execute();
        // Get the session identifier
        this.userSession = createUserSessionCommand.getUserSession();

        // Authenticate the user session
        try {
            // Use command to authenticate the user session on a trusted connection
            TrustedUserSessionCommand tuscTrustedUser = new TrustedUserSessionCommand(
                    mySimpleConnection);
            // Set the user name to use
            tuscTrustedUser.setUserName(userName);
            tuscTrustedUser.setSession(this.userSession);
            tuscTrustedUser.execute();
            this.userSession = tuscTrustedUser.getSession();
        } catch (com.sap.mdm.commands.CommandException e) {
            /* In case the connection is not trusted */
            AuthenticateUserSessionCommand authenticateUserSessionCommand = new AuthenticateUserSessionCommand(
                    mySimpleConnection);
            authenticateUserSessionCommand.setSession(this.userSession);
            authenticateUserSessionCommand.setUserName(userName);
            authenticateUserSessionCommand.setUserPassword(userPassword);
            authenticateUserSessionCommand.execute();
        }
        // For further information see:
        // http://help.sap.com/javadocs/MDM/current/com/sap/mdm/commands/SetUnicodeNormalizationCommand.html
        // Create the normalization command
        SetUnicodeNormalizationCommand setUnicodeNormalizationCommand = new SetUnicodeNormalizationCommand(
                mySimpleConnection);
        // Set the session to be used
        setUnicodeNormalizationCommand.setSession(this.userSession);
        // Set the normalization type
        setUnicodeNormalizationCommand
                .setNormalizationType(SetUnicodeNormalizationCommand.NORMALIZATION_COMPOSED);
        // Execute the command
        setUnicodeNormalizationCommand.execute();
        // Return the session identifier as string value
        return this.userSession;
    }

    /*
     * This method obtains a ConnectionAccessor, which is needed every time you
     * want to execute a command (searching or any other command there is), and
     * stores it in an instance variable.
     */
    public void getConnection() {
        String sHostName = serverName;
        // We need a try / catch statement or a throws clause for the method
        try {
            /*
             * Retrieve the connection from the factory. The host name can be the
             * name of the server if it is listening on the standard port 20005, or
             * a combination of server name and port number, e.g. MDMSERVER:40000
             */
            mySimpleConnection = SimpleConnectionFactory.getInstance(sHostName);
        } catch (ConnectionException e) {
            // Do some exception handling
            e.printStackTrace();
        }
    }
}


To test the code we also need a simple test class that instantiates the sample class and prints out the number of retrieved records.

package com.sap.sdn.examples;

public class Test {

    /**
     * @param args
     */
    public static void main(String[] args) {
        // Create an instance of the search class
        SearchExamples test = new SearchExamples();
        // Print out the number of found records
        System.out.println("Total of " + test.Result.getCount() + " Records found that match criteria TYPE=ROOT");
    }
}



I wrote a lot of comments in the code but I will give you some more details on what is happening there.

First of all there are some instance variables that hold the connection information to the MDM Server and the MDM Repository.
The constructor will set up some variables which are needed for the search.
A connection to the MDM server will be created.
A session will be created and authenticated: if there is a trusted connection we will use it to authenticate, and if we only have a normal connection to the server, normal authentication will be used.
The search will be triggered and the result will be stored in instance variable.
The search needs some elements to work with.

At the beginning of the SearchTypes() method [comment 1. in the code] we set up a ResultDefinition, which tells the search what fields are of interest to the program and should be accessible in the result. If this definition does not include some desired field and you want to access it later in your code, an exception will be thrown.
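To make that behaviour concrete, here is a tiny self-contained sketch of the idea. The field names are invented for illustration, and the real API works with FieldId objects and its own exception types; this only models why an unselected field is unreachable in the result:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

/** Sketch of why a field must be in the ResultDefinition (conceptual, not the SAP API). */
public class ResultDefinitionSketch {

    /** Only the fields named in the result definition are fetched into the result. */
    static Map<String, String> fetch(Map<String, String> storedRecord, Set<String> selectedFields) {
        Map<String, String> result = new HashMap<>();
        for (String field : selectedFields) {
            result.put(field, storedRecord.get(field));
        }
        return result;
    }

    /** Reading a field that was never selected fails, as in the real API. */
    static String read(Map<String, String> result, String field) {
        if (!result.containsKey(field)) {
            throw new IllegalArgumentException("Field not in ResultDefinition: " + field);
        }
        return result.get(field);
    }
}
```

Selecting only the fields you actually need also keeps the result set small, which is the design reason the API makes you declare them up front.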

[Comment 2. in code] Define a search dimension and a search constraint to tell the search where to look and what to look for. There are a lot of search constraints to work with.

[Comment 3. in code] Define the search itself and use the dimension and constraint to set it up.

[Comment 4. in code] Build the command that is needed to execute the search.
For more details on commands, please see this link

So this is just a very simple example; I will give you more advanced code in future blogs.
If you have any questions about what the code does or need more detailed explanations, please feel free to comment on this blog. If this code helped you a little, please feel free to comment as well.


So this will be the last blog for this year, since I am rebuilding my flat at the moment. The topic of the next blog is yet to be defined, and I will update my agenda in my first blog next year.


So I wish you a happy new year, or as you would say in German: “Einen guten Rutsch ins neue Jahr!”.

Best regards,
Tobi

Tobias Grunow is a Solution Consultant for IBSolution GmbH Heilbronn, Germany. He is a member of the research and development team.




Resolve Consolidated Reporting Problem in multiple BI systems through SAP MDM
Ankur Ramkumar Goe

Summary

Management uses BI reports for planning, strategy and business decisions. BI systems get data from multiple systems in order to provide consolidated data. However, every transactional system maintains its own master data, which in turn is sent to BI. Thus master data in BI is quite redundant and does not show a single version of the truth. Because of this, management gets wrong reports and information about the organization's performance, which leads to wrong planning and decisions.

We will try to address this issue with the help of a master data management solution, so that management gets a true view of the global business, enabling them with unified business information for optimized operations and faster, more accurate decisions.



Current Scenario

For various reasons, such as operating across geographies, mergers and acquisitions, and best-of-breed products, organizations have ended up with scattered and distributed IT landscapes. With the introduction of BI for reporting, organizations currently have separate BI systems for reporting by geography, system, unit or functionality. Every transactional system needs to create its own master data, as master data drives transactions. These transactional systems are usually connected to their own BI systems and do not share their data with other systems for various reasons: complexity, the number of connections, the cost of maintaining connections, lack of governance, and so on. Thus every BI system ends up getting master data from its own transactional systems, which in turn is fed to the corporate BI system for consolidated reporting. Without consistent master data, data warehousing becomes garbage in, garbage out.

Due to all this, master data in BI is quite redundant and does not show a single version of the truth. Hence management gets reports presenting wrong information about the organization's performance, which leads to wrong planning and decisions. Management might be looking at reports showing one single customer as three or more customers (e.g. John Kennedy, John F Kennedy, J F Kennedy) or the same with vendors (e.g. Satyam, Satyam Computers, Satyam Computer Services Ltd). Surveys suggest that more than 70% of decisions are made wrongly because of incomplete or wrong reports and information, and that organizations spend 30% of their time verifying reports. One organization found that 40% of its orders were stuck because of mismatched master data.
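The "John Kennedy" problem above can be illustrated with a toy grouping sketch. Real MDM matching strategies are configurable and far more sophisticated; this only shows the idea of normalizing names into a match key and grouping candidate duplicates (names below are the examples from the text):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

/** Toy duplicate grouping for illustration (not a real MDM matching strategy). */
public class DuplicateSketch {

    /** Normalize a name: lower-case, drop punctuation, drop single-letter initials. */
    static String matchKey(String name) {
        StringBuilder key = new StringBuilder();
        for (String token : name.toLowerCase().replaceAll("[^a-z ]", " ").split("\\s+")) {
            if (token.length() > 1) {          // the "F" in "John F. Kennedy" is dropped
                key.append(token).append(' ');
            }
        }
        return key.toString().trim();
    }

    /** Group raw master-data entries by their match key; each group is a duplicate candidate set. */
    static Map<String, List<String>> group(List<String> names) {
        Map<String, List<String>> groups = new TreeMap<>();
        for (String n : names) {
            groups.computeIfAbsent(matchKey(n), k -> new ArrayList<>()).add(n);
        }
        return groups;
    }
}
```

A grouping like this only surfaces candidates; deciding which record survives as the single version of the truth is a governance decision, which is exactly why the approaches below stress governance.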
BI systems have an ETL layer which is made for reporting use only. Unfortunately it is not configured and optimized for cleansing master data. Organizations have also been maintaining and trying to solve master data problems for many years with their own methods and tools. In doing so they were treating the symptoms but not solving the root cause of the master data problem. The industry has recognized this problem and has come up with tools to manage master data. With the evolution of MDM tools, organizations can benefit from best practices, reduced effort and greater ease, and from maintaining master data outside of the BI system.

Below are two scenarios showing organizations distributed IT landscapes.

Scenario 1 – Organization's BI landscape by geography
bi 3

Scenario 1

This scenario describes an organization landscape distributed across geographies to cater to local and corporate reporting requirements.
Scenario 2 – Organization's BI landscape by functionality
image

Scenario 2

This scenario describes an organization landscape split by functionality, for example finance and logistics separately.


Suggested Approach

A single version of the truth for master data across the organization can be achieved by introducing an MDM system. One organization found that it had 37% duplicate vendors in its systems.

Since we are handling the organization's master data, the introduction of the MDM system should not be disruptive to other systems and the business. Thus a small-steps approach is recommended for introducing a master data management system. For consolidated reporting that yields correct reports, two approaches are suggested. The second approach has two steps; an organization can go directly for approach 2.2, but that decision has to be taken based on the disturbance to the existing landscape. A strong governance mechanism also has to be in place for CMDM.

Approach 1 Harmonization across BI

Approach 2.1 Master Data from Legacy Systems

Approach 2.2 Central Master Data Management (CMDM)
Approach 1 – Harmonization across BI systems only

Here we will not interfere with the organization's current master data flow or mechanisms, and thus will not be disruptive to the current landscape and processes. We will take master data from the BI systems only, cleanse it, map it and send it back to the BI systems. The MDM system will send back the mapping of local BI IDs to global MDM IDs. With this approach, management will be able to get reports on both local BI IDs and global MDM IDs.

image

Approach 1 Harmonization across BI systems to achieve consolidated reporting
Benefits:

. Derive immediate benefits from consolidated master data through BI reports

. Mitigated risk of using MDM in the transactional landscape

. Immediate benefit through control of the master data of newly merged and acquired companies

. No Interface is required between MDM and BI

. Not disruptive to existing landscape
Limitations:

. Master data integrity is achieved only for analytics, not throughout the organization
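The key mapping that Approach 1 sends back to the BI systems can be sketched as a simple table from (local system, local ID) to a global MDM ID. The system names and ID values below are invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of the local-to-global key mapping produced by Approach 1 (illustrative values). */
public class KeyMapping {

    /** (local system, local ID) -> global MDM ID. */
    private final Map<String, Integer> localToGlobal = new HashMap<>();

    void map(String system, int localId, int globalId) {
        localToGlobal.put(system + ":" + localId, globalId);
    }

    /** Reporting on global IDs lets one customer, keyed differently per BI system, roll up once. */
    int globalIdFor(String system, int localId) {
        return localToGlobal.get(system + ":" + localId);
    }
}
```

Because every local BI ID keeps existing, local reports are untouched; only reports built on the global ID see the consolidated customer.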


Approach 2.1 – Master Data from Legacy Systems

In this approach, we introduce the MDM system to the transactional systems, before the BI systems layer. The MDM system gets all the master data from the source systems and maintains it. This cleansed master data is then passed to the BI systems, to get correct reports based on consolidated and cleansed master data. Here also we are not interfering with the source systems; they continue with their own master data creation and processes.

However, this still does not solve the root cause of the problem. Hence, after this, organizations should go for the next approach.

image

Approach 2.1 Master Data is coming from source systems and fed to BI
Benefits

. Immediate benefit through control of the master data of newly merged and acquired companies

. Ensures data integrity across Transactional landscape

. Sets stage for easy re-use via distribution or business processes that directly leverage consolidated data.

. Sets stage to achieve CMDM


Approach 2.2 – Central Master Data Management (CMDM)

This is true master data management. However, organizations need to prepare themselves first, and careful planning needs to be done. This approach will fail if a strong governance mechanism is not put in place.

CMDM will be the central master data management system; it will maintain all the master data across the organization's landscape. This will result in a single version of the truth across the landscape, including reporting.

image

Approach 2.2 Consistent Master Data is coming through CMDM
It is up to the organization to implement CMDM as per its convenience, readiness and governance mechanism. There are two ways to achieve CMDM:
1. Organizations can continue to create master data in local systems and then have it cleansed and synchronized to the respective systems with the help of MDM.
2. Organizations create master data centrally in MDM only and then send it back to the respective systems, to maintain a synchronized version of master data across the landscape.

Benefits

. Centralized management of master data

. Ensures data integrity through Enriching and Standardization of master data across landscape

. Centralized governance - ensures that when data changes in one application, other business applications that depend on that data are updated with a consistent view of key data as it changes in real time.



MDM Benefits

Below are the benefits an organization will get with the help of master data management.

· Global View of Business

· Single view of truth

· Optimized Operations

· Easy Integration of newly acquired systems

· Elimination of manual and Redundant processes

· Full Interoperability

· Greater Visibility

· Better Accuracy

· Reduction in maintenance cost

· Faster cycles

SAP MDM Benefits

· Easy Integration with non-SAP, SAP and BI systems

· Predefined content for configuration

· Driving eSOA

· Java enabled content for EP and other external systems

· No coding required, easy configuration

· Workflow design in Windows-based Visio


Company: Satyam Computer Services Ltd.

Role: Projects Lead / Project Manager

The views shared above are my personal views and may not reflect my company's views.






Ankur Ramkumar Goel has over 7 years of SAP experience in ABAP, BI and MDM, with implementation, rollout, maintenance, due diligence and strategy projects across Europe, the USA and APJ, and has been instrumental in MDM for 3 years.





Enterprise Dimension Management (EDM): an MDM approach
S P
Enterprise Dimension Management (EDM):


I plan to undertake a journey of discovery whereby we shall try to use SAP MDM for chart of accounts maintenance - a critical component of EDM. I shall start by outlining the compelling business case for chart of accounts consolidation. Then I shall try to model the chart of accounts in SAP MDM. We shall also try to analyse shortcomings, if any, in the current release (hmm… hierarchy version creation and syndication of hierarchies need to be explored).


This is the first part of a series on chart of accounts management using MDM. The target audience is beginner level business users and architects in the financial domain.
Background:

The chart of accounts is a list of account names and numbers used in a company's general ledger. It is one of the most important business dimensions that the accounting team manages.

Larger corporations, particularly those made up of dissimilar businesses or that operate in different tax jurisdictions, can have multiple charts of accounts across these business units. Mergers further complicate the picture, since an acquired company almost always has accounting approaches and systems that diverge from the parent's. Hence the overhead required to manage this critical dimension increases with corporate growth and success, having an adverse effect on the quality of data, the resources needed to ensure accuracy, the timeliness of results and compliance with governance standards.

When two companies merge and try to merge their accounting systems that have different charts of accounts, lots of discrepancies have to be reconciled before the merged entity can even know for sure how much money it is making or losing. When the merged company finally agrees on the new chart of accounts, it may be almost impossible to compute performance trends because the historical numbers may not be convertible to the same structure as the current numbers.
Objective & Benefits:

The main objective of harmonizing the chart of accounts is to speed up the consolidation, closing and reporting cycles by reducing the amount of manual work involved. Other benefits include greater transparency and limiting the chance of fraud and errors that are the inevitable by-product of any manual system.

As the basic requirement for all statutory and internal reporting of information stored in the accounting ledgers, key drivers for standardizing the chart of accounts include:

1. Multiple lines of business/geographical operations—consolidation of different G/Ls and ERP systems for comparison reports and group level summaries

2. Organisation reshuffling—efficient chart of accounts management through mergers & acquisitions, restructuring, reorganization, and even new regulatory legislation or management reporting requirements

3. Business Performance initiatives—effective synchronization of chart of accounts reporting with BPM’s key performance indicators and metrics

Hence, taking an MDM approach to standardizing the chart of accounts empowers finance with a best-practice business management process to centralize and directly manage the structure of this crucial corporate data.

The promise of MDM is that by creating a software-defined abstraction level, top-down decisions about the “virtual” chart of accounts can be effected without having to change underlying systems. Moreover, having an abstraction level should enable companies to have parallel rather than sequential or iterative consolidation paths for statutory, management and tax accounting. This would produce a faster, cleaner financial reporting process, simplify and accelerate management reporting, and allow companies to centralize control over financial and managerial reporting if they prefer and manage tax implications far earlier in the closing cycle than is possible today.

Using master data management eliminates the often time-consuming activity related to mapping because it allows reporting directly from local systems. As a result, the management accounting cycle can be cut, often by days. Using MDM to resolve the chart of accounts also can improve data quality. Since it automates the data classification, it enforces consistency in the way data is aggregated so subsequent analysis will be more consistent, too. Reducing the ability to manipulate the system, via efficient data governance can enhance accountability. In addition, enforcing consistency in how data is treated can expose overlooked but persistent gaps in communications between regions, business units and headquarters, an issue all global companies face.
Part1:Chart of accounts hierarchy maintenance in SAP MDM 5.5 SP05

1.1 Let's start by trying to upload a chart of accounts into SAP MDM. The source data is downloaded from SAP R/3 into an Excel sheet.

image

1.2 Now, since Excel is not an updateable data source for MDM, this data needs to be uploaded to a database like Oracle or MS Access. In a live business scenario, the Import Manager would be used in conjunction with SAP PI to import this data into MDM.


So, let's import this Excel sheet into MS Access.

image
image

1.3 Then, let's log into the SAP MDM Console and create a hierarchy table, with two fields in this table for storing the account number and description.

image

Now starts the task of importing the chart of accounts hierarchy into this table.

1.4 Log into the import manager and specify type as :Access and provide the path for the access database.

image

1.5 After logging in, the source and destination files need to be specified in the pane below the main toolbar. A preview of the source data is available in the source preview tab.

image

1.6 We shall select the two fields that make up the hierarchy in the source file simultaneously and right-click. Then “create hierarchy field” needs to be selected.

image

1.7 In the “create hierarchy” dialog box, the child field and node name field names need to be selected from the drop-down. The parent field needs to be specified as none.

image

1.8 On pressing OK, MDM creates the hierarchy field using the default delimiter.

image
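As a rough illustration of what the "create hierarchy field" operation does conceptually, the sketch below joins source field values into a single delimited path per record and derives each node's parent by splitting on the delimiter. This is not MDM internals; the ";" delimiter and the level layout are assumptions for illustration only.

```python
# Illustrative sketch only: building a delimited hierarchy path from
# level names and deriving a node's parent from it. The ";" delimiter
# is an assumption, not necessarily MDM's default.

def build_hierarchy_path(levels, delimiter=";"):
    """Join hierarchy level names into one delimited path string."""
    return delimiter.join(levels)

def parent_of(path, delimiter=";"):
    """Return the parent path of a delimited node path, or None for a root node."""
    parts = path.split(delimiter)
    return delimiter.join(parts[:-1]) if len(parts) > 1 else None
```

With such paths, every record's position in the hierarchy tree can be reconstructed from plain tabular source data, which is essentially why the delimiter matters in this step.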

1.9 Let's move to the Map Fields/Values tab and map the hierarchy field to the long text field in the destination.

image

1.10 In the value mapping pane, the values in the source file need to be added as children of the main node in the destination file.

image

1.11 After selecting the import action, the records are imported into the hierarchy table.

image

image

1.12 The imported records can be viewed in the Data Manager.

image

In the next part, we shall delve into the next steps for chart of accounts maintenance and consolidation in SAP MDM.



S P is a project manager/lead consultant/senior enterprise architect with over a decade of experience in enterprise applications.



Enterprise Dimension Management (EDM): an MDM approach
S P


Using MDM WEB UI for tracking changes with Oracle
Faycal CHRAIBI
The MDM change tracking feature allows you to monitor any modification made to your master data repository and to retrieve useful information about each change, such as the old value, the person who made it, and the modification date.

Although this feature is activated through the MDM Console, you will need to deploy the MDM WEB UI (delivered as portal content) in order to visualize this information.

Unlike other portal content, the change tracking UI queries the database directly to get its information. The MDM RKT documents contain an excellent how-to on deploying and configuring this Web Dynpro, but they don't cover the Oracle implementation, which requires a specific configuration for the JDBC connector.

Make sure you have the latest support package of MDM (MDM 5.5 SP05 at the time of writing; SP06 will soon be released to the public). You will also need an SAP Web Application Server with a Java stack.

If you want to use this within a portal iView, deploy the Portal software units on the J2EE engine (note 883948).

As a prerequisite, download the Oracle JDBC driver from the Oracle website. Note 867176 may help you choose the appropriate driver.

First of all, download the latest MDM WEB UI build from the SAP Marketplace Software Distribution Center. Navigate to Support Packages and Patches -> SAP NetWeaver -> SAP MDM -> SAP MDM 5.5 -> Portal Content -> OS Independent, download the .sca file and deploy it through SDM.

SDM

The next step consists of configuring your JDBC connection.

Open the Visual Administrator and navigate to Cluster -> Server -> Services -> JDBC Connector.

JDBC Connector

Select drivers and click on the Create button.

JDBC Drivers

Name your driver (ex: Oracle_JDBC)

JDBC Name

Load the JDBC driver file you had downloaded

Load JDBC driver

You should then see your new driver in the list

Drivers list

Then select DataSources and create a new datasource.

Provide an application name (it should be unique) and create an alias for this datasource (it will be used later in the Web Dynpro configuration).

Datasource name

Fill the information according to your database settings.

Database settings

Click on the "Additional" tab and fill in this information:

* applicationName: this can be set to any name as long as it hasn't been used previously.



* databaseName: this needs to be set to <repository name>_Z000. In our case, the repository is called CATALOGRECETTE. You may find the name with the following SQL query: SELECT table_name FROM all_tables WHERE table_name LIKE '%Z000';



* user: this should be set to the owner of your A2i_CM_History table (this is where MDM stores the change tracking information). You may get the owner name through the SQL query: SELECT owner FROM all_tables WHERE table_name = 'A2i_CM_History';



* password: this is your user's password for Oracle



* portNumber: fill it according to your tnsnames or listener configuration.



* serverName: refers to the host of the Oracle instance



* url: this is the connection string the JDBC driver will use to connect to Oracle. It is of the form jdbc:oracle:thin:@<serverName>:<portNumber>:<SID>.

Additional properties
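To make the url property concrete, here is a small sketch (not SAP or Oracle code) that assembles the thin-driver connection string from the serverName, portNumber and SID values described above; the host and SID used below are placeholders.

```python
# Sketch: assemble the Oracle thin-driver JDBC URL from the datasource
# properties described above. The values are placeholders, not a real system.

def oracle_thin_url(server_name: str, port_number: int, sid: str) -> str:
    """Return a URL of the form jdbc:oracle:thin:@<host>:<port>:<SID>."""
    return f"jdbc:oracle:thin:@{server_name}:{port_number}:{sid}"
```

For example, a host "dbhost" listening on 1521 with SID "MDM" yields jdbc:oracle:thin:@dbhost:1521:MDM.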

You may keep the default information for connection pooling.

Connection pooling

And select "Vendor SQL" for the SQL engine.

SQL Engine

Further information on JDBC configuration can be found in the SAP NetWeaver Library at the following address: http://help.sap.com/saphelp_nw70/helpdata/en/ab/082484173ae045ab8dad8a41d33da3/frameset.htm

Make sure you have enabled change tracking in the MDM Console (Repository -> Admin -> Change Tracking), and make a few changes to your master data in order to have a few entries in your A2i_CM_History table.

Enable change tracking

Access your Web Dynpro through http://<host>:<port>/webdynpro/dispatcher/sap.com/tc~mdm~changetracker/MdmChangeTracker?jdbcAlias=<datasource alias>

In our case the Alias was : CATALOGRECETTE.

You may refer to the MDM Tracking changes RKT (available under Enhanced Generic Capabilities) which provides information on how to use this webdynpro with SAP MDM Data Manager or within an iView.




Tuesday, February 3, 2009


Central MDM Business Partner Creation Scenario with Galaxy (BPM) - Part I
Satyajit Chakraborty



With NetWeaver CE EhP1, SAP has announced the availability of the new BPM tool called Galaxy. There have been a lot of blogs on SDN talking about Galaxy and showcasing a few use cases. This blog is the first part in a series of two (or maybe three) blogs in which I talk about managing a master data management scenario with Galaxy and the new rules framework.



The first part details the scenario, lists its notable features and finally shows it as modeled in Galaxy. The second part will contain more implementation details.

The process is that of new business partner creation in the central MDM scenario. It can be illustrated with the following diagram:
Process Specification



Note that even though the diagram is for a vendor creation scenario, it can be re-used for the business partner creation scenario as well.

Before we look at the process modeling in Galaxy here are a few things to note about the process:

§ 3 roles: Vendor Manager, Accounting Agent, Procurement Agent

§ Parallel enrichment of data by the Accounting Agent and the Procurement Agent

§ Manual/Auto approval of a record is based on a rule

§ DUNS number assignment is automatic (not entered by either of the agent roles)

From MDM perspective the things to note are:

§ In the first step of the process the Vendor Manager searches for a record in MDM, and the absence of appropriate results allows him/her to create a new record in MDM with the search parameters

§ All enrichments happen on an already created and checked-out record

§ Before persisting any change the record should undergo MDM validations defined in the repository

§ When a record is approved it is checked-in and the process ends

A few implementation considerations that are worth mentioning are:

§ Usage of the standard SAP delivered Business Partner repository

§ Web Dynpro for Java for the user interaction steps in the process

§ Auto or manual approval of a record is based on the presence or absence, respectively, of an SSN entry for the record

§ Usage of MDM web services and Java APIs in cases where there were no web services

§ Role based access to the application for the two different agent roles
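The SSN-based approval rule from the considerations above can be sketched in a few lines; the dict-shaped record and the "ssn" key are illustrative assumptions, not the actual rules framework API.

```python
# Sketch of the approval rule: auto-approve when an SSN entry is present,
# otherwise route to manual approval. The record shape is an assumption.

def approval_route(record: dict) -> str:
    """Return 'auto' if the record carries an SSN entry, else 'manual'."""
    return "auto" if record.get("ssn") else "manual"
```

In the real scenario this decision sits in the rules framework rather than application code, so the threshold for auto-approval can be changed without redeploying the process.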

With all that in mind let's look at the process model in Galaxy. This blog is not about how to model processes in Galaxy and so I'll just put the final completed process model here. If you are interested in learning modeling with Galaxy you can find a lot of good resources about both Galaxy modeling and BPMN modeling (quick hint: search SDN for "Ginger Gatling Galaxy").

BPM Model

Looking at the process model you might have already noticed that there is a disparity between the original process specification and the modeled process. The first step of searching for a business partner by the Vendor Manager is missing from the Galaxy model! The reason is this: the process to create a new business partner is started only if the Vendor Manager chooses to do so.

Hence there is a need to start the process remotely rather than manually via the NWA tool. The future parts of this blog will also show the implementation details of how to start a Galaxy process remotely.

Satyajit Chakraborty is part of the BST Innovation Center team in Palo Alto.


MDM Import Manager capability
Ravi Kumar


In my previous project we had a requirement to run a matching strategy for finding duplicate material masters. The matching rule was pretty simple, based on Material Description, but the repository size was on the order of 500K records with around 250 attributes.

To enable better matching results we used lots of transformations on the Material Description field to normalize and standardize the data. We used a combination of two rules in the duplicate matching strategy:

1. Rule 1: Equals (on the transformed Material Description field)

2. Rule 2: Token Equals (on the Material Description field)
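As a hedged sketch of the two rules' semantics (not MDM's implementation), Equals compares whole descriptions while Token Equals works on shared tokens:

```python
# Sketch of the two matching rules' semantics. Not MDM internals.

def equals_match(a: str, b: str) -> bool:
    """Rule 1: exact match on the (transformed) description."""
    return a == b

def token_equals_score(a: str, b: str) -> int:
    """Rule 2: number of whitespace-separated tokens the descriptions share."""
    return len(set(a.split()) & set(b.split()))
```

Token-based comparison is the expensive part: every record's token set must be compared against many others, which is why the performance numbers below degrade so quickly at 500K records.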

On average each description had 1.5 tokens per record. We found that performance was very poor, as we had both Token Equals and a huge list of transformations (a few of them were replacing string XXX with AAA, deleting blanks, removing other special characters, etc.). Even after restricting the total number of records considered for matching (we used Material Group for clustering), it was taking 20-40 minutes to get matching results.

Solution: We improved the data quality by re-importing the same set of records that was used for the initial load from ECC, but this time the MDM Import Manager's capabilities were harnessed to reduce the number of transformations.

How: After mapping the Material Description field, apply a value conversion filter on the mapped field; almost all of Excel's powerful functions are available, such as Replace, Append and Prepend. Multiple such conversion rules can even be applied to the same mapped field.
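The kind of value-conversion chain applied at import time can be sketched like this; the specific rules (replacing "XXX" with "AAA", stripping special characters, collapsing blanks) are the examples mentioned in the text, implemented here in Python purely for illustration.

```python
import re

# Illustrative sketch of a normalization chain like the one applied to
# Material Description at import time. The specific rules are examples only.

def normalize_description(text: str) -> str:
    text = text.upper()
    text = text.replace("XXX", "AAA")         # example string replacement
    text = re.sub(r"[^A-Z0-9 ]", "", text)    # drop special characters
    return re.sub(r"\s+", " ", text).strip()  # collapse and trim blanks
```

Doing this once at import, instead of as matching-strategy transformations evaluated on every run, is exactly the gain described in the solution above.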

We also set the Material Description field's keyword property to Normal, which optimized token-based comparisons.

This effectively reduced our number of transformations from 22 to 5 and improved matching strategy performance.



Ravi Kumar is a consultant with Infosys.



SAP MDM integration with R/3 system
Ravi Kumar


We all know the different IT scenarios supported by MDM, namely:

Master Data Consolidation
* Cleansing and de-duplication
* Data normalization, including categorization and taxonomy management
* New interactive consolidation capabilities

Master Data Harmonization
* Automated synchronization of globally relevant master data information
* New interactive distribution capabilities

Central Master Data Management
* One-stop data maintenance
* Ongoing master data quality

In this blog I will try to cover the different steps required for integrating MDM with an R/3 system, which is the source of master data. After cleansing and de-duplication of data in MDM, all changes/updates will be reflected back in SAP R/3. This step-by-step procedure should help in understanding how to implement the MDM IT scenarios. :)



Different settings required for doing this are:

* Settings in R/3
* Settings in XI
* Settings in MDM
  * MDM Console
  * Import Manager
  * Syndication Manager

MDM Process flow

Process flow: IDocs containing master data are triggered from R/3. XI converts these IDocs to XML files and places them in the Ready folder of the inbound port. The files are received into MDM by the Import Manager; after changes etc. are made, the Syndicator places an XML file in the Ready folder of the outbound port, from where it is picked up by XI and sent to R/3 as IDocs.
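As a toy sketch of the folder exchange just described: each port exposes a Ready folder where files are dropped for pickup. The actual MDM distribution directory layout differs by installation, so every path segment here is a hypothetical assumption.

```python
import os

# Hypothetical sketch of the port folder convention: each port has a
# Ready folder used for file exchange. The path layout is an assumption,
# not the actual MDM server directory structure.

def ready_folder(distribution_root: str, repository: str,
                 direction: str, port: str) -> str:
    """Build the (hypothetical) path of a port's Ready folder."""
    return os.path.join(distribution_root, repository, direction, port, "Ready")
```

The point of the convention is that XI and MDM never talk to each other directly: each side only reads from and writes to the agreed Ready folders.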





Settings in R/3:

Step 1: T.Code: SALE - Define Logical System

Sender Logical System:

Receiver Logical System:
Step 2: T.Code: SALE - Assign Client to Sender Logical System
Step 3: Create an RFC destination of type '3' for the SAP XI system using transaction SM59.
Step 4: Create a distribution model through T.Code: SALE/BD64.

Here we need to specify the Sender Client (DEVCLNT500), Receiver Client (XICLNT100) and Message Type (MATMAS).
Step 5: Maintain ports for IDoc processing using T.Code: WE21.
Step 6: Maintain the partner profile using T.Code: WE20.
Step 7: Use T.Code: BD12/BD10 to send customer or material IDocs.

Using T.Code WE02 we can confirm the IDoc status and see the message transformations.
Now we are able to generate IDocs from R/3 containing the master data.

Settings in MDM:

Select the repository for which you want to do the settings.
Go to "Admin", select "Client Systems", right-click on it and create your client system.
Then go to "Ports" to create a port for the client system.
Here you are defining the outbound port for the client system (MDC R/3) defined in the previous step.

You have the option of processing the data automatically or manually.
After saving the "Port" and "Client System", you need to ensure the folders for the respective repository have been created on the server, because "Ready" is the folder through which all files are exchanged.
Similar steps should be repeated for the inbound port.
MDM Import Manager: Assuming that all the settings are also done in XI, we move to the Import Manager, where we select the file to be imported and the steps to be followed for importing data into MDM.



Step 1: Log into the Import Manager and connect to the source. Select the type as Port; the system automatically connects to the inbound port of the repository logged into.
Step 2: Do all field mapping and value mapping in the Map Fields/Values tab. We can use the standard maps provided in Business Content, or do all mappings manually and save the map.
Step 3: Go to the Match Records tab, select the field used for matching records and select the import action. PS: for each record we can manually override the import action with Create/Skip.
Step 4: After all mappings have been done and the status shows "Ready to Import", execute the import in the Import Status tab.
This will import the records contained in the XML file from the Ready folder into MDM. Go to the Data Manager and check the records created from R/3.
MDM Syndicator:

Any changes made in the MDM Data Manager based upon validations, assignments and business rules should be syndicated back to R/3, which is the data source.


Step 1: Log into the Syndicator, giving the repository name. Select File > Destination Properties and select the port as shown.

Select the remote system (R/3 in this case) and the outbound port we created, where the syndicated file can be placed in XML format.
Step 2: Do all the mappings again. Use the standard maps provided in Business Content or map manually, as in the Import Manager. We have the option of selecting only certain records based on search parameters. We also have the option of suppressing all unchanged records in Map Properties: this selects only those records that have been changed in the Data Manager, instead of all existing records.
Step 3: In the Destination Preview we can see all the records, with their field values, before syndicating. This should always be done before executing, to reduce erroneous/incomplete data flow. Once you have all the details, execute the syndication.
Step 4: Check the IDoc list in SAP. In case of status 51, analyze further why it has failed.
Result: Data changed in the MDM Data Manager will be updated in the SAP R/3 system, provided all the mappings are correct.

Ravi Kumar is a consultant with Infosys.

