Thursday, November 13, 2008

MDM Materials

The following links will help you understand MDM and its market trends and scenarios.

MDM overview: http://hosteddocs.ittoolbox.com/RD021507b.pdf
Demo on the MDM application: http://www.sap.com/community/int/innovation/esoa/demo/MDM_demo/index.html
ASUG download: http://www.asug.com/DesktopModules/Bring2mind/DMX/Download.aspx?TabId=66&DMXModule=370&Command=Core_Download&EntryId=3431&PortalId=0
MDM: http://www.asug.com/DesktopModules/Bring2mind/DMX/Download.aspx?TabId=66&DMXModule=370&Command=Core_Download&EntryId=1666&PortalId=0
SAP NetWeaver MDM Overview: https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b09b548d-7316-2a10-1fbb-894c838d8079
Leverage MDM in ERP Environments, an Evolutionary Approach (SAP NetWeaver MDM): https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/4059f477-7316-2a10-5fa1-88417f98ca93
Master Data Management architecture patterns: http://www-128.ibm.com/developerworks/db2/library/techarticle/dm-0703sauter/
MDM and Enterprise SOA: http://www.saplounge.be/Files/media/pdf/Lagae---MDM-and-Enterprise-SOA2007.10.10.pdf
Effective Hierarchy Management Using SAP NetWeaver MDM for Retail: https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/70ee0c9e-29a8-2910-8d93-ad34ec8af09b
MDM World: http://mdm.sitacorp.com/
MDM: Master Data for Global Business: http://www.sitacorp.com/mdm.html
MDM Master Data Management Hub Architecture: http://blogs.msdn.com/rogerwolterblog/archive/2007/01/02/mdm-master-data-management-hub-architecture.aspx
Improve Efficiency and Data Governance with SAP NetWeaver MDM: http://www.sapnetweavermagazine.com/archive/Volume_03_(2007)/Issue_02_(Spring)/v3i2a12.cfm?session=
Data Modeling in MDM: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5d4211fa-0301-0010-9fb1-ef1fd91719b6
SAP INFO articles: http://www.sap.info/public/INT/int/index/Category-28943c61b1e60d84b-int/0/articlesVersions-31279471c9758576df
SRM-MDM Catalog: http://help.sap.com/saphelp_srmmdm10/helpdata/en/44/ec6f42f6e341aae10000000a114a6b/frameset.htm

These will tell you about market trends:
http://events.techtarget.com/mdm-ent/?Offer=DMwn716mdm
http://viewer.bitpipe.com/viewer/viewDocument.do?accessId=6721869
http://searchdatamanagement.bitpipe.com/data/search?site=sdmgt&cr=bpres&cg=VENDOR&sp=site_abbrev%Asdmgt&cp=bpres&st=1&qt=Master+Data+Management
http://viewer.bitpipe.com/viewer/viewDocument.do?accessId=6721819
http://www.dmreview.com/channels/master_data_management.html
http://searchdatamanagement.techtarget.com/originalContent/0,289142,sid91_gci1287620,00.html?bucket=NEWS&topic=307330

Reference guides:
MDM Console: http://help.sap.com/saphelp_mdmgds55/helpdata/en/88/9f9c427055c66ae10000000a155106/frameset.htm
MDM Import Manager: http://help.sap.com/saphelp_mdmgds55/helpdata/en/43/120367f94c3e92e10000000a1553f6/frameset.htm
MDM Data Manager: http://help.sap.com/saphelp_mdmgds55/helpdata/en/43/e0615a82b40a2ee10000000a11466f/frameset.htm
MDM Syndicator: http://help.sap.com/saphelp_mdmgds55/helpdata/EN/43/5fe0e8a55f5f6be10000000a1553f6/frameset.htm

Centralized Repository

Product Catalog Management (PCM):
PCM focuses on the highly structured data of a product that needs to be published (attributes such as measurements, features-and-benefits text, list price, and images), which therefore has to be listed in a catalog to make it available to the end user. PCM data does not include transactional data (data about a specific order, inventory, or customer-specific pricing), but may include supplier and sourcing information. This product information needs to be stored in a centralized container with a single version ensured. SAP Master Data Management Extension (MDME) is a complete solution for structured product content management. Its components for content management, catalog publishing, importing, and exporting have been designed and rigorously developed to meet the demands of the most sophisticated manufacturing and distribution businesses. The goal of an MDM-PCM implementation is to create the "single version of the truth" for product content management, which means MDME should be the master repository for the majority of product data.
Landscape



Centralized Repository for MDM-PCM:
Need:
A single version of the truth in product information is essential for an augmented and streamlined product-management process. A home-grown legacy solution built on a custom DBMS may work well day to day, yet still hide a flood of complications caused by disparate ownership across the company: numerous owners and departments create and disseminate data, using many different tools and processes. It is broadly accepted that the practices in place in many of these areas are inefficient and not optimized. Because of the number of data sources and the fragmentation of the data, selling opportunities are lost when product relationships are not apparent or readily available.
Solution:
The recommended solution is to bring all data from the current DBMS and related data sources, such as replacement-parts information, into MDME. Data modelling is an essential part of this process, and data stewards will be involved in it. The central repository is built keeping the following points in mind:
1) The current state of the data.
2) The scope of cleansing required.
3) Any business process re-engineering that might be required.

SAP MDM Faqs

What platforms are supported with SAP Master Data Management?
Availability and supported-platform information is on SAP Service Marketplace, alias PAM (http://service.sap.com/pam); drill down into NetWeaver -> SAP MDM -> SAP MDM 5.5. Note that appropriate Service Marketplace authorization is required.

How integrated is SAP NetWeaver MDM 5.5 with SAP NetWeaver and applications?
SAP NetWeaver MDM 5.5 is an integral part of the NetWeaver stack. In the current feature release, enterprise application integration with both SAP and non-SAP applications is accomplished through SAP XI. Interoperability with other systems is possible via the SAP NetWeaver MDM 5.5 APIs (an ABAP API is currently in development). Tight, native integration is part of the SAP NetWeaver MDM 5.5 roadmap, and further pre-built integration points will be rolled out as development progresses. SAP MDM 5.5 SP2 will provide view-only iViews for SAP Enterprise Portal.

Is the Product Catalog Management application part of the SAP NetWeaver Integration and Application Platform? Does print publishing belong to this platform as well?
Yes, these are all part of the SAP NetWeaver platform and print publishing is an extension of the capability to product content management. By definition, this is the case since the former A2i xCat application, now further augmented and known as SAP NetWeaver MDM 5.5, is part of the SAP NetWeaver MDM family of products.

How will MDM fit into Enterprise Services Architecture? Which Web services will be provided and when?
MDM is integral to SAP's ESA strategy. The initial list of documented Web services was provided with the MDM 3.0 information release; these services expose master data held in MDM, for example to create records. New Web services will become available per the roadmap. With SAP MDM 5.5, in conjunction with SAP Exchange Infrastructure, you can create Web services by exposing MDM functions through the MDM Java or .NET APIs.

What tools are available to integrate SAP MDM and other non-SAP applications and platforms?
SAP MDM 5.5 exposes its core functions through published Java and .NET APIs, so any integration between MDM and non-SAP software can be handled with those APIs. MDM functions can also be exposed as Web services by using the APIs in conjunction with SAP Exchange Infrastructure. Broader integration between SAP MDM 5.5 and other SAP NetWeaver components will be delivered along the product roadmap.

Can Mask functionality be used for determining which BP records exist in R/3?
There is no need for a mask to be generated, as Syndicator can filter records to be sent according to the Agency and remote key stored within MDM. The “suppress records without key” option needs to be set to “Yes”.

Can a mask be recreated automatically from a saved search selection criteria?
This is not currently supported. Records can be hidden per role using “constraints” functionality in the console.

Can MDM send only changed fields and data and not the whole record?
There are two possible answers to this.
1. If you are extracting changed data through the API, you can set the timestamp field to change only when your key fields change. This will allow you to select only those records whose changes need to be sent to R/3.
2. Using the Syndicator you can use the timestamp technology in calculated fields or set up the relevant search criteria in the Syndicator to select only those records that have relevant changes.
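The timestamp-based selection described in both answers can be sketched as follows. This is an illustrative sketch only: the record structure and the `last_modified` field name are assumptions, not the actual MDM API.

```python
from datetime import datetime

def select_changed(records, last_syndication):
    """Return only the records modified after the previous syndication run.

    `records` stands in for rows read through the MDM API; the
    'last_modified' field name is illustrative, not an MDM identifier.
    """
    return [r for r in records if r["last_modified"] > last_syndication]

records = [
    {"id": 1, "last_modified": datetime(2008, 11, 1)},
    {"id": 2, "last_modified": datetime(2008, 11, 12)},
]

# Select records changed since the 10 Nov syndication run.
changed = select_changed(records, datetime(2008, 11, 10))
```

The same comparison works whether the cut-off comes from a stored "last syndication" timestamp or from a calculated field in the Syndicator's search criteria.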

What options are available for resending from MDM within XI or R/3 in case an update fails?
If the failure lies with XI or R/3, the same XML can be reprocessed (no resending is required). If there is a validation or data problem, the records need to be identified and modified in the MDM Data Manager client, and the Syndicator batch will resend them because they were updated since the last syndication.

How easy is it to maintain the front-end when the data model changes?
The effort depends on the number of fields required for the front-end. Fields that are added have no impact. Fields that are deleted (and maintained in the front-end), need to be removed. Fields that are renamed need to be updated.

Is it possible to develop web forms (outside of EP6) that link to standard Java MDM APIs and communicate with the MDM repository?
Yes, it is possible; you are not limited to the iViews that already exist. You can create your own application-specific iViews, and you can also access the server with direct calls to the API from the Java environment.

Is it possible to assign the saved search criteria to a role or person to restrict what he or she can view in the search?
The saved-search option is specific to the client computer; a user's search criteria are available only to that user and not to other users. Therefore a saved search is not an option in this case, but you can achieve the required results using role constraints.

Are adapters/extensions available in MDM for integrating monitoring tools? (ie. does Tivoli register if an exception occurs in MDM?)
MDM currently does not trigger external processes on errors. The system uses logging capabilities to register errors and there are specific log files for the various components of the system. If the monitoring system/s can be triggered on changes to the log files then the system can be monitored.
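The log-file approach described above can be sketched as a small polling routine. The `ERROR` marker and the offset-tracking scheme are assumptions for illustration, not an MDM log format guarantee.

```python
def poll_log(path, offset):
    """Read a component log file from a saved offset and return
    (new_offset, error_lines).

    A monitoring job (e.g. a Tivoli file adapter, as mentioned in the
    question) can call this periodically and raise an alert whenever
    error_lines is non-empty. Matching on the literal string 'ERROR'
    is an assumed convention; adjust it to the actual log layout.
    """
    with open(path) as f:
        f.seek(offset)          # resume where the previous poll stopped
        lines = f.readlines()
        new_offset = f.tell()   # remember how far we have read
    return new_offset, [ln.strip() for ln in lines if "ERROR" in ln]
```

Persisting `new_offset` between polls ensures each log line is inspected exactly once, even while MDM keeps appending to the file.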

Is it possible to hide certain fields and their values (depending on profile)?
The MDM security mechanism allows you to define constraints to be used for display and hide field values in MDM Data Manager Client. Currently the MDM capabilities do not allow you to entirely hide fields upon a constraint setting. However, you can use the APIs for building a User Interface to allow display/hide of fields and attributes as required.

Is it possible to trigger external processes depending on type of errors raised, for example alert management functionality?
Extended error handling with follow-up processing is not currently on the roadmap. However, the MDM Expression Language should be evaluated for this use.

MDM stores change history in a separate database which can track the selected fields in any table, the before and after state of a record for that field and the user performing the change. As a result, if you activate too many fields or have frequent updates to the same field, you experience performance problems. How can I better manage this?
Limit the number of tracked fields to the minimum required, and establish a daily archive-and-purge procedure on the change-tracking log/database to keep its size to a minimum and ensure optimal performance.

User Interface - Client and Web Front End
Are saved searches shared between users or roles?
The saved searches (produced from the top menu Search -> Save Current Search) in the client or the Syndicator are, in the current version, saved locally per repository. That means they are shared among different users working on the same workstation. Although this may seem a limitation, the approach makes saved searches more flexible: the saved-search files can be distributed to other workstations working with the same catalog, or accessed from a share.

Can a saved search be shared between the client and the syndicator?
Searches are saved locally to a file and hence can be shared between the client and the syndicator by copying files (having extension sqf) to the syndicator or the client directories.
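The file-copy step described above can be sketched as a small helper. The directory locations are installation-specific assumptions; only the `.sqf` extension comes from the answer itself.

```python
import shutil
from pathlib import Path

def share_saved_searches(client_dir, syndicator_dir):
    """Copy saved-search (.sqf) files from the Data Manager client
    directory into the Syndicator directory so both tools offer the
    same searches.

    Both directory paths are illustrative; locate the actual folders
    of your installation before running anything like this.
    """
    dst = Path(syndicator_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for sqf in sorted(Path(client_dir).glob("*.sqf")):
        shutil.copy2(sqf, dst / sqf.name)  # copy2 preserves timestamps
        copied.append(sqf.name)
    return copied
```

Run in the other direction, the same routine shares Syndicator searches back to the client.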

Do file-shared searches break security restrictions?
No. Searches are merely sets of query criteria; every user who opens a saved search gets as results only the records she is allowed to see.

There are too many search tabs in the Client's "Search parameters" pane. How can I show only the ones I want?
In the Console, every field has a "Display Field" parameter that accepts Yes/No values, and only fields with "Display Field" set to "Yes" appear in the Client's "Search parameters" pane. Alternatively, in the Client, right-click the search tab and choose "Hide".

MDM Server, Console and Repository
What are the relationships between Consoles, Servers, Repositories, and Databases?
Once an archive is deployed in a database it becomes a repository. So a repository exists in one database. A repository may be mounted on many servers. However, it can be loaded only on one server at a time. One server may be accessed (mounted) from many consoles. The server’s status is updated on all consoles where it’s loaded.

Can two servers run on the same computer?
The current version (MDM 5.5 SP1) doesn't allow you to run two instances on one computer.

Why in the Console’s security tabs do Constraints appear only against certain tables?
If a lookup table is referenced by only single-valued fields of the main table or other lookup tables, then its value is available in the Constraints fields. If the table is referenced either by a multi-valued lookup field or by a qualified lookup table’s field, its value is not available in the Constraint field.

Are Text Blocs and Text multilingual?
Text Blocs as well as PDF and Images are always multilingual. The Text and Large Text data type objects optionally may be defined as multilingual in the Console.

What is NULL and how is it used in MDM?
NULL is a special data marker that denotes missing or not-yet-populated data. It is not treated as an existing value and does not participate in uniqueness checks; that is, multiple records are permitted to have NULL in a unique field. To prevent records from holding these undefined values, use the validation functions IS_NULL and IS_NOT_NULL. Nulls can also be handled through the Import Manager.
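The uniqueness behaviour described above can be mirrored in a few lines, using Python's `None` as a stand-in for NULL. This is a model of the rule, not MDM code.

```python
def violates_uniqueness(values):
    """Check a unique field the way the answer above describes:
    NULL (modelled here as None) is ignored, so any number of records
    may leave the field empty without breaking the constraint; only
    repeated non-null values count as a violation.
    """
    non_null = [v for v in values if v is not None]
    return len(non_null) != len(set(non_null))
```

This is why a validation such as IS_NOT_NULL is still needed if empty values are undesirable: the uniqueness constraint alone will never reject them.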

What does “Required” stand for?
The Required field parameter is advisory, not mandatory; in other words, it differs from the "NOT NULL" definition of the RDBMS world. Its purpose is to let validation rules treat all "Required" fields at once. Such a validation expression is not available yet but will be exposed in coming versions. The advantage of this approach is that when your validation logic changes and you decide a field should or should not participate in the validation, you simply change the value of "Required"; otherwise, every validation expression would have to be changed manually.

Can LDAP property "MDMERoles" be changed to something else?
Yes, the name of an LDAP field listing MDME role is defined in the mds.ini file in field "MDME Roles Attributes" (see MDME Console guide p.255). MDMERoles is just a predefined name.

Can an attribute have additional parameters associated with it like status or userID timestamp?
Attributes can’t have any additional parameter except type, name, alias, description, and set of predefined values (for text type). For other purposes, you should probably use a lookup table.

Import Manager
Do validations work during the import process?
Validation rules are not executed while importing through the Import Manager. To execute a validation (or a group of validations) over newly imported records you can use the API: raise a special flag during import, and then run the validations against the easily found record set.

Can I import my data in several steps?
Yes, you may have different maps for populating the same records. These maps may bring data to different or even partially overlapping fields. The only thing to take into account is to match records correctly and to choose the right import action.

Does the Import Manager have to import all records of the source, or can I skip some of them?
The Import Manager deals with source records on an "all or nothing" basis in terms of mapping: every record must be mapped before the Import Manager will start importing. However, you can map undesirable records to NULL or to a flagged record to make them easily recognizable for future cleaning. Another option is to Skip some of the mapped records from the actual import. The Skip action may be applied to a group of records (grouped according to their Match Class) or individually, by changing their default inherited import action.

Can data be imported from two different sources concurrently?
The Import Manager works with only one source at a time. On the other hand, several tables of the same database connection can be joined online in the Import Manager. For separate XML files, all pre-upload data operations should be performed before the Import Manager gets them; in the case of XML files, an XSLT transformation can be used effectively.
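The kind of pre-upload consolidation the answer above describes (for which it suggests XSLT) can also be sketched with a plain XML merge: combine several single-record documents into one collection so the Import Manager sees a single source. Tag names here are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

def merge_sources(xml_documents, root_tag="Records"):
    """Combine several single-record XML documents into one collection
    document before handing it to the Import Manager.

    `root_tag` and the record element names are illustrative; match
    them to the XML schema your import map expects.
    """
    root = ET.Element(root_tag)
    for doc in xml_documents:
        root.append(ET.fromstring(doc))  # each document becomes one child
    return ET.tostring(root, encoding="unicode")

merged = merge_sources(['<Product id="1"/>', '<Product id="2"/>'])
```

An XSLT transform achieves the same result declaratively and is the better fit when the source documents also need restructuring, not just concatenation.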

When I import a qualified lookup table it duplicates records. What's the right way to avoid it?
Right-click on the field of the main table that points to the qualified lookup table and choose "Set Qualified Update". Three options for treating repeating values then appear: "Append" adds new records, "Replace" replaces them, and "Update" updates them. In all three cases an additional option appears where you can specify how to match existing records through the available qualifiers.

The Import Manager treats an XML file containing only one element and an XML file containing a collection of the same elements differently. How can I use the same map for both cases?
Create an XML schema (XSD file) and put it in the catalog (Console -> Admin -> XML Schema); such schemas are then available as source types in the Import Manager. When you create a map, the corresponding XSD file is saved with the map, allowing it to be reused for future imports.

What is the difference between Update and Replace in the Import Manager?
Replace erases the matching existing destination record and creates a new one based on the source record; as a consequence, all relationships with other fields are broken. Update really updates the fields, leaving all existing relationships in place (see also p. 296 of the Import Manager manual).

Is there a way to have a field displayed (in a lookup combo-box) but not participating in matching destination fields in the import process?
If a table has "key mapping" set to "Yes", then "Remote Key" appears in the destination fields list of the Import Manager. You may choose an existing field in the source fields tab, or create a new one, and map it to the remote key. A mapped "Remote Key" is sufficient to perform the actual import, so display fields may or may not participate in the matching.

Can I run two Import Managers at the same time?
Yes, several instances of the Import Manager can be open concurrently. The only limitation is that when the actual import starts, the tables involved are locked for write access. So if two instances import into different tables, no synchronization issue occurs; if they use the same table, the instance that gets access second has to wait until the first finishes.

Where are import maps stored and can maps be edited externally?
The maps are stored in the repository and can be edited only through the Import Manager. They can be exported to and imported from binary files, and they are archived and unarchived together with the other repository data.

Can a source field partake in mapping twice?
Yes; to achieve this you clone the field in the Import Manager. Take into account, though, that cloned fields can't start a partitioning string.

What is the most efficient format for source data when using Import Manager?
The efficient formats for massive uploads are Microsoft Excel and Access, with no difference in performance between them. Because Excel limits the number of records a single spreadsheet can hold, Access is the recommended option.

Is it possible to handle exceptions during import automatically (continue with import if 1 record fails, stop processing for a specific error but continue for another, or notify specific user if record update fails)? How are failed records reprocessed both in MDM and in the Business?
The import process generates log files containing the products and the reasons why these products failed to import. As in the previous process, these files can be collected and placed into a user’s process or email inbox. The user would then have the responsibility to fix the errors and re-import the data that was rejected by the batch process.
An archiving process needs to be in place to remove and store old files that have already been processed. How can I handle files that are partially processed?
The organization of the file import should be handled by the program calling the Batch import manager. This could be using the XI BPM or could be a simple program which checks the input files and invokes the import manager accordingly. Both scenarios are relatively easy to achieve as is the process of archiving and removing those files that were processed.
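The archiving half of the process above can be sketched as a small helper for the calling program. The inbox/archive folder layout and the idea of an explicit "fully processed" list are assumptions for illustration, not an MDM convention.

```python
import shutil
from pathlib import Path

def archive_processed(inbox, archive, fully_processed):
    """Move fully processed import files out of the inbox into an
    archive folder.

    Files not named in `fully_processed` (for example a partially
    processed file) stay in the inbox, so the calling program can
    pick them up again and re-invoke the Import Manager on them.
    Folder layout and naming are illustrative assumptions.
    """
    Path(archive).mkdir(parents=True, exist_ok=True)
    moved = []
    for name in fully_processed:
        src = Path(inbox) / name
        if src.exists():
            shutil.move(str(src), str(Path(archive) / name))
            moved.append(name)
    return moved
```

Whether the caller is an XI BPM process or a simple scheduler script, the key design point is the same: only files confirmed as fully imported leave the inbox.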

Syndicator
How do I handle a situation where the Syndicator produces one XML file with many elements or many XML files with one?
If you map the table name to the top element of your XML collection, then you’ll get one output file. Otherwise every element will be output in its own file. The best way to deal with this is to create an XML schema, upload it to the server (in the Console), and then use this schema for mapping.

Can repository data be syndicated as it is or is special output processing possible?
There is a "Custom Item" pane where a custom field may be created and then mapped as a regular source field.

MDM Interview Questions Mock Test

1. Data Manager
Masks allow you to partition a single master repository into many customized, virtual subset repositories. Which of the following correctly describe the use of masks in an MDM repository? More than one answer is correct.
Individual masks must exist before records can be added to, removed from or replaced in the masks. True
If a mask is created based on search criteria, the mask will automatically update whenever a record is added to or deleted from the repository which matches the search criteria. False
Masks are not used to publish a custom subset catalog, such as a sale catalog. False
Multiple masks can be modified at the same time using the Modify Mask command. True
Masks are defined at the individual record level rather than the query level to prevent performance penalties. True
2. Import Manager
The Import Manager takes data from the source file and maps it to the destination fields. Which of the following does this functionality include? More than one answer is correct.
The mapping of multiple source tables to multiple destination tables False
The mapping of source fields to destination fields True
The mapping of source values to destination values True
The saving of mapped tables, fields, and values for later use True
The application of MDM's matching strategy False
3. Syndicator
A reusable syndication map is a requirement for using the MDM Syndication Server. What elements are included in a reusable syndication map? More than one answer is correct.
How data should be formatted in the syndication file True
The search selections of the current record set prevent unchanged records from syndication. False
Which remote system is going to receive the syndication file True
What the field structure of the remote system may look like True
How source values maps to the field structure False
4. Data Modeling
An effective SAP NetWeaver MDM repository for a master data object must have which of the following characteristics? More than one answer is correct.
Stores data using MS Access False
Contains rich structured master data True
Contains an essential search tool designed to provide access to transactional data stored in an Oracle RDBMS False
Organizes and classifies records into a taxonomy consisting of an arbitrary hierarchy of categories and subcategories True
5. Relationships
Product relationships are useful for merchandizing and defining structural compatibility between different products. Which one of the following represents the most efficient manner of defining record relationships? Only one answer is correct.
Taxonomy with shared-attribute categories False
A product hierarchy with single assignment of records to hierarchy nodes False
Multiple taxonomies with shared-attribute categories False
Category-level product relationships True
Families pointing to a secondary taxonomy False
6. Print Publishing
The business scenario of rich product content management is based on the ability for MDM to support print publishing. Which of the following is true regarding this ability? Only one answer is correct.
The SAP MDM print publishing interface supports the following DTP tools and versions: Adobe InDesign CS1 and CS2, and QuarkXPress 5.01. True
DTP users can share DTP licenses and installations. False
The same operating system must be used for both the MDM Layout Server and the Plug-ins/Extensions. False
Plug-ins to InDesign and/or QuarkXPress are contained in a separate MDM delivery package. False
7. Global Data Synchronization
The business motivation for the global data synchronization (GDS) solution involves which of the following? More than one answer is correct.
Slow product updates via multiple channels True
Challenges of getting new products to the market True
Corrupted data during data transfer False
Conflicting data True
8. SAP NetWeaver Process Integration
An important aspect of the deployment of SAP NetWeaver MDM is its ability to operate with SAP NetWeaver Process Integration (SAP NetWeaver PI). SAP NetWeaver PI can facilitate the movement of data from SAP transactional systems by doing which of the following? More than one answer is correct.
Using the message types DEBMAS, MATMAS, CREMAS for both inbound and outbound messages for SAP ERP True
Using the message types DEBMDM, CREMDM, MATMDM for both inbound and outbound messages for SAP ERP False
Using DEBMDM and CREMDM as outbound messages and transforming the output into DEBMAS, CREMAS and ADRMAS for SAP ERP True
Using ARTMAS and BOMMAT for both inbound and outbound messages for SAP Retail True
9. Workflow
The MDM workflow enables the orchestration of parallel and sequential activities at the data management level, including user tasks, validations, and approvals. Which of the following activities must be accomplished before the execution of a workflow? More than one answer is correct.
Ensure that the MDM server is capable of executing and monitoring multiple job threads. False
Install the MDM workflowinstall application on the MDM server. True
Create a workflow record in the workflows table if there are no pre-delivered workflows. True
Ensure that the MDM workflow Visio stencil is included in the MDM server install. False
Install Microsoft Visio Professional for workflow designers in the Programs folder on the MDM server. True
10. SAP NetWeaver – Integration Portal
Included with the enterprise portal content for SAP NetWeaver MDM is an item details iView. Which one of the following statements correctly identifies the features of this iView? Only one answer is correct.
Allows the transfer of selected items to a shopping cart via OCI False
Allows the management of master data associated with workflow steps False
Allows the grouping of fields in a header area, and also in multiple tabs True
Allows the comparison of selected records False

MDM Certification Questions

User Interface - Client and Web Front End
• Are saved searches shared between users or roles?
• Can a saved search be shared between the client and the syndicator?
• Do file-shared searches break security restrictions?
• There are too many search tabs in the Client's "Search parameters" pane. How can I pick only the ones I want to display?

MDM Server, Console and Repository
• What are the relationships between Consoles, Servers, Repositories and Databases?
• Can two servers run on the same computer?
• Why in the Console's security tabs do Constraints appear only against certain tables?
• Are Text Blocs and Text multilingual?
• What is NULL and how is it used in MDM?
• What does "Required" stand for?
• Can LDAP property "MDMERoles" be changed to something else?
• Can an attribute have additional parameters associated with it, like status or userID timestamp?

Import Manager
• Do validations work during the import process?
• Can I import my data in several steps?
• Does the Import Manager have to import all records of the source, or can I skip some of them?
• Can data be imported from two different sources concurrently?
• When I import a qualified lookup table it duplicates records. What's the right way to avoid it?
• The Import Manager treats an XML file containing only one element and an XML file containing a collection of the same elements differently. How can I use the same map for both cases?
• What is the difference between Update and Replace in the Import Manager?
• Is there a way to have a field displayed (in a lookup combo-box) but not participating in matching destination fields in the import process?
• Can I run two Import Managers at the same time?
• Where are import maps stored, and can maps be edited externally?
• Can a source field partake in mapping twice?
• What is the most efficient format for source data when using Import Manager?
• Is it possible to handle exceptions during import automatically (continue if one record fails, stop for a specific error but continue for another, or notify a specific user if a record update fails)? How are failed records reprocessed, both in MDM and in the business?
• An archiving process needs to be in place to remove and store old files that have already been processed. How can I handle files that are partially processed?

Syndicator
• How do I handle a situation where the Syndicator produces one XML file with many elements or many XML files with one?
• Can repository data be syndicated as it is, or is special output processing possible?

MDM Certification Books

Course: MDM100 (version 2006/Q2)
Prerequisites:
• Essential: none
• Recommended: basic knowledge of SAP NetWeaver Exchange Infrastructure
Duration: 5 days
Goals:
• Understand the features and options needed to support a customer implementation employing the current version of MDM 5.5 SP04.
• The focus of this session is clearly on the IT scenarios.
Audience:
• MDM Solution Consultants
• Project team members
Software: SAP NetWeaver MDM 5.5 SP04
Content:
• The MDM modules: Console, Data Manager (including Taxonomy, Validations, Workflow, and Matching), Import Manager, Syndicator
• Master Data Harmonization
• Rich Product Content Management (RPCM) (overview)
• Global Data Synchronization (overview)
• Integration scenarios within SAP NetWeaver: Portal integration, BI integration, XI integration, R/3 communication, OCI integration
• Security and User Management
• Performance

Course: MDM101 (version 2006/Q2)
Prerequisites:
• Essential: MDM100 (Master Data Management)
• Recommended: knowledge of SAP Exchange Infrastructure (SAP XI)
Duration: 4 hours
Goals:
• Understand the features and options of Global Data Synchronisation (GDS) and be able to support a customer implementation employing it.
Audience:
• MDM Solution Consultants
• Project team members
Software: SAP NetWeaver MDM 5.5 SP04
Content:
• Overview of the Global Data Synchronisation (GDS) process
• Installation procedure for the GDS Console
• Architecture of GDS
• Data import to GDS
• Data maintenance on GDS
• System configuration
Notes: Course length: 4 hours

Course: MDM300 (version 2006/Q2)
Prerequisites:
• Essential: completed MDM training (MDM100, MDM200) or equivalent work experience
• Recommended: none
Duration: 3 days
Goals:
• Prepare for MDM implementations with print-publishing requirements, with knowledge of the methodologies, concepts and tools used in this type of business scenario.
• Review the print- and publish-related functions and features within MDM 5.5 SP04, including the Data Manager, Publisher and Indexer components, with hands-on exercises.
• Interface with DTP (desktop publishing)
Audience:
• MDM Solution Consultants who will implement the MDM 5.5 Print Publishing components
Software: SAP NetWeaver MDM 5.5 SP04
Content:
• Print-publishing concepts and general philosophy
• Impact of print publishing on the data model
• Paper-publishing considerations
• MDM print-publish process flow
• MDM print-publish features and functions
• Architectural considerations

Course: MDM400 (version 2006/Q2)
Prerequisites:
• Essential: sound knowledge of the material covered in MDM100 (Master Data Management Overview)
• Recommended: none
Duration: 3 days
Goals:
• Model MDM repository tables and fields
• Model MDM repository taxonomies
• Model MDM hierarchies
• Model MDM qualifiers
• Model MDM relationships
Audience:
• Project team members with extensive knowledge of MDM Console and MDM Data Manager
Software:

MDM XI R3 Integration

This blog shows how to integrate Master Data Management (MDM) and SAP R/3 using SAP PI, in a short time frame, with SAP NetWeaver as the framework. MDM and R/3 integration is required in the two scenarios listed below.

Prerequisites
Integration of these scenarios requires the following software to be installed:
• SAP R/3 (4.6, 4.7, or any release)
• SAP MDM 5.5 SP3 Server
• SAP MDM Syndicator
• SAP PI (Process Integration), formerly XI (Exchange Infrastructure), version 3.0 SP16

The scenarios are:
• Outbound scenario: MDM to R/3 (inbound to R/3)
• Inbound scenario: R/3 to MDM (outbound to R/3)

Part I (Inbound Scenario)
We will discuss the inbound process, i.e. inbound to SAP R/3 and outbound from SAP MDM.
Figure 1. Inbound to R/3

MDM Console-Side Configuration

Step One: Create a client system.
How? Log on to the SAP MDM Console and choose the repository you are going to syndicate. Navigate to the selected repository (for example, BP3_PoC_Customer) --> Admin --> Client Systems. On the right-hand side you will find the Client Systems context area. Right-click in the context area to open the context menu, choose Add Client System, and enter the name of the client system (for example, Siebel), its code, and its type. See Figure 2 below.
Figure 2. Client System

Step Two: Create a port for the client system.
How? Select Ports by navigating MDM Console --> Repository (BP3_PoC_Customer) --> Admin --> Ports. Clicking on Ports displays the context area on the right-hand side. Right-click in the context area to open the context menu and choose Add Port. You need to define TWO ports, one for outbound and one for inbound; in this step we need the outbound port. The port name is SIEBEL_OB_CUS_SIEBELCUS01, the code is SIEBEL_OB_CUS_SIEBELCUS01, and the type is Outbound. See Figure 3 below.
Figure 3. Port

MDM Server-Side Configuration
On the SAP MDM Server, look at the server folder structure for the repository. Here you can see the client system and outbound port details as folder names in the SAP MDM Server folder; these folders are created when you define them in the MDM Console, as shown in Figure 4 below.
Figure 4. SAP MDM Server Folder

NOTE: No customising is required on the MDM Server, but you need to ensure that the folders explained here are created in the inbound and outbound folders after the ports are added in the Console.

FTP Server-Side Configuration
Step Three: On the FTP server.
How? Configure/specify the file folder path for pulling the file on the FTP server. This screen refers to the FTP server configuration (WS_FTP Server).
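The folder check described in the note above can be sketched in a few lines of Python. This is only an illustration under the assumptions of this walkthrough: the distribution root, repository folder, client system and port code are the example values used here, and the folder chain shown is the one the Console is expected to create.

```python
import os

# Repository distribution folder from this walkthrough (illustrative values):
# <Distributions root>/bp1bocap080.bp.co_MSQL/BP3_PoC_Customer/...
REPOSITORY_PATH = os.path.join("bp1bocap080.bp.co_MSQL", "BP3_PoC_Customer")

def expected_port_folders(direction, client_system, port_code):
    """Folder chain the Console creates for a port, relative to Distributions."""
    base = os.path.join(REPOSITORY_PATH, direction, client_system, port_code)
    # The port folder holds the Ready folder that the Syndicator writes to.
    return [os.path.join(REPOSITORY_PATH, direction),
            os.path.join(REPOSITORY_PATH, direction, client_system),
            base,
            os.path.join(base, "Ready")]

def missing_folders(distributions_root, direction, client_system, port_code):
    """Return the expected folders that do not yet exist under the root."""
    return [p
            for p in expected_port_folders(direction, client_system, port_code)
            if not os.path.isdir(os.path.join(distributions_root, p))]
```

After adding the port, `missing_folders(<your Distributions root>, "Outbound", "Siebel", "SIEBEL_OB_CUS_SIEBELCUS01")` should return an empty list; any names it returns were not created and point back to a Console-side problem.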
Figure 5. FTP Server

SAP XI/PI-Side Configuration
Step Four: Create the sender communication channel in the PI Integration Directory.
Figure 6. Sender Communication Channel

Step Five: Create the receiver communication channel.
Figure 7. Receiver Communication Channel

Inbound Process Flow
Figure 8. Process Flow

Here I will discuss the inbound process; the figure above shows the entire flow.

Process step in MDM: Choose the destination structure, i.e. the type of the destination file, such as XML, XLS, or CSV. Then do the mapping in the MDM Syndicator for the specified repository and syndicate it. The syndicated repository data is saved in the predefined format in the Ready folder of the SAP MDM Server (SAP MDM 5.5 --> Server --> Distributions --> bp1bocap080.bp.co_MSQL --> BP3_PoC_Customer --> Outbound --> Siebel --> SIEBEL_OB_CUS_SIEBELCUS01 --> Ready).

Process step in PI: Specify the file communication channel path as the folder created on the MDM server (Outbound --> Siebel --> SIEBEL_OB_CUS_SIEBELCUS01 --> Ready), as per the FTP server configuration. The file adapter picks up the file and the PI process starts. In PI, transformation and conversion take place, and the IDoc is then sent to the R/3 inbound port.

Process step in R/3: The IDoc is received at the inbound port and is then available in R/3.
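The pickup step of this flow can be sketched as follows. To be clear, this is not the PI file adapter itself, just a simplified Python illustration of the pattern it implements: poll the Ready folder, process each syndicated file, then move it to an Archive folder so it is not picked up twice. The folder names follow the port above, and the `process` callback is a placeholder for the PI mapping and IDoc conversion.

```python
import shutil
from pathlib import Path

def drain_ready_folder(ready, archive, process=lambda f: None):
    """Process every syndicated XML file in `ready`, then move it to `archive`.

    Returns the archived file names in pickup order."""
    ready, archive = Path(ready), Path(archive)
    archive.mkdir(parents=True, exist_ok=True)
    archived = []
    for f in sorted(ready.glob("*.xml")):
        process(f)                          # in PI: transformation + IDoc conversion
        shutil.move(str(f), str(archive / f.name))
        archived.append(f.name)
    return archived
```

For the port above the call would look like `drain_ready_folder("Outbound/Siebel/SIEBEL_OB_CUS_SIEBELCUS01/Ready", "Outbound/Siebel/SIEBEL_OB_CUS_SIEBELCUS01/Archive")`.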

Roles and Responsibilities of MDM Consultant

Introduction:
Yesterday I was going through the forums on SDN and found that many people are trying to gather information about what distinguishes the roles and responsibilities of an MDM Consultant. We have also seen that in almost every organisation today, if you apply for the position of MDM Consultant, this is the key question you are asked. Every person, whether on the technical side, a business analyst, or a quality specialist, has certain roles and responsibilities.
So what are the specific roles and responsibilities that distinguish an MDM Consultant from the others?
If a person can work with the various SAP MDM GUI applications, can we call him or her an SAP MDM consultant? Such a person can be designated an SAP MDM developer. From a solution-architecting perspective, however, an SAP MDM consultant also needs a very good understanding of all the technologies that connect to MDM.
Expectations from SAP MDM Consultant:
The role of an SAP MDM Consultant is different from that of a technical consultant. An SAP MDM Consultant should have a good view of both sides, the business as well as the technical.
An MDM Consultant should not only be technically strong but should also have a wider view of the business he or she is working for.
An MDM Consultant should not only know the MDM technologies but should also widen his or her view to the different platforms used alongside MDM, such as SAP NetWeaver, XI, TIBCO, and ABAP.
Knowledge of all these platforms not only deepens consultants' expertise but also gives them an edge over their colleagues.


Role and Responsibility of SAP MDM Consultant:
From the consultant's perspective, MDM does not require any coding; it does not expect the consultant to sit for hours hunting bugs in code. Master Data Management expects a consultant who is not only good with his or her technical skills but is also able to convince the client of the benefits of MDM. The various roles and responsibilities of an MDM Consultant are:
Roles
Solution Architecting
Should understand business scenarios and be able to convert them into MDM business logic.
Should have a crystal clear understanding of the various scenarios that are being used in the project.
Should be a good Business Analyst
Should have skills to convert Business Requirements into Technical Specifications.
Responsibilities

Design solution using SAP MDM
Build MDM repository
Facilitate Data load and syndication
Develop MDM solutions for workflows, de-duplication, validations, etc.
Perform Unit tests
Complete MDM Project documentation
Aid integration tests
As we have seen, an organisation has various expectations of its MDM Consultants, and various roles and responsibilities are defined for them. If a consultant satisfies all the above criteria, we can say he or she has truly emerged as an MDM Consultant.

5 Keys For Maximising Your ROI Through Optimal ERP Performance: Key 4 - 6 Critical failures, classic mistakes by Peter Clarke

Key No. 4 -- ERP Implementations: Critical Failure Factors, Classic Mistakes And Best Practices

The complexity and wide encompassing nature of ERP solutions means that there are inherent challenges in any ERP software implementation. The issue is to ensure that these challenges enhance the project and final outcome rather than become problems or disasters that undermine the project's viability.

According to Carol Ptak, failure is "an implementation that does not achieve a sufficient return on investment identified in the project approval phase. Using this definition, it has been found that failure rates are in the range of 60-90 per cent."

This is a fairly uncompromising definition of failure. The industry and the media are rife with stories of more dramatic IT project failures, and sometimes even disasters, and these are occasionally even backed up with reliable information and data. The Standish Group's oft-quoted and on-going CHAOS study suggests that two out of every three IT projects fail -- ie succumb to total failure and cancellation, or suffer cost overruns, time overruns, or a rollout with fewer features or functions than promised. Sometimes these failures have disastrous consequences beyond time and budget, and can seriously impact on the continued existence of the organisation itself.

But every IT implementation need not end in disaster. In fact, by studying the nature of past failures and finding common elements, mistakes and problems can be avoided.

R. Ryan Nelson (MIS Quarterly Executive, June 2007) investigated a number of 'infamous' IT project failures that in some instances involved sums in the billions of dollars. A post-mortem of these projects revealed that "While some of the projects experienced contractor failure, others cite poor requirements determination, ineffective stakeholder management, [over-extended] research-oriented development, poor estimation, insufficient risk management and a host of other issues."

And what is our reaction when something does go wrong? Nelson says that "We tend to make some mistakes more often than others. In some cases, these mistakes have a seductive appeal. Faced with a project that is behind schedule? Add more people! Want to speed up development? Cut testing! A new version of the operating system becomes available during the project? Time for an upgrade! Is one of your key contributors aggravating the rest of the team? Wait until the end of the project to fire him!"

Nelson cites an on-going study at the University of Virginia into the reasons for project failure. During 2006, the students in the Master of Science degree in the Management of IT program studied 99 projects to elicit any common lessons, regardless of whether or not the project was ultimately considered a success.

"The first major finding," he reports, "was that the vast majority of the classic mistakes were categorised as either process mistakes (45 per cent) or people mistakes (43 per cent). The remaining 12 per cent were categorised as either product mistakes (8 per cent) or technology mistakes (4 per cent). None of the top 10 mistakes was a technology mistake, which confirms that technology is seldom the chief cause of project failure. Therefore, technical expertise will rarely be enough to bring a project in on-schedule, while meeting requirements. Instead, the finding suggests that project managers should be, first and foremost, experts in managing processes and people."

He goes on to add that, while scope creep did not make the top 10 mistakes, "the fact that roughly one out of four projects experienced scope creep suggests that project managers should pay attention to it, along with its closely connected problems of requirements and developer 'gold plating'".

"Two other surprising findings were contractor failure, which was lower than expected at #13 but has been climbing in frequency in recent years, and adding people to a late project, which was #22, also lower than expected.

"The third interesting finding is that the top three mistakes occurred in approximately one-half of the projects examined. This finding clearly shows that if the project managers in the studied projects had focused their attention on better estimation and scheduling, stakeholder management and risk management, they could have significantly improved the success of the majority of the projects studied."

Recognising problems and potential problems is one thing; doing something about them, preferably before they occur or incur great harm, is another.

Below is a summary of "six fatal mistakes" in ERP implementations, along with methods that can be employed to avoid or, at worst, rectify them.

The failures and methods to avoid them are:

1. Ineffective project leadership

There are many different aspects to this and they are by no means all controlled by the Project Sponsor and the Project Manager.

Leaders in all areas affected by the project need to have a clear understanding and commitment to the reasons for the project and its end goals. Without their support, the project team will often be side tracked on insignificant issues by end users with their own personal agenda. Commitment starts with the Project Charter which should clearly articulate key aspects of the project. Project Charter approval should not be taken lightly in an effort to achieve an early milestone. Many Project Managers have been frustrated by people who have signed off a Project Charter without fully understanding what they have committed to. This always manifests itself during the tough times when it is least helpful.

Leadership also embraces the management of risks both from a project perspective and the management of the on-going business during the implementation. The company cannot afford for either to fail, yet it is often key resources who are forced to make priority decisions instead of the company leaders who should understand the overall picture.

Modifications to the standard system are at the forefront of potential mistakes. The leadership has an important role to play. Any modification that is proposed should be endorsed by the business leader most affected by the modification. This endorsement should incorporate clear reasons why the standard solution cannot be used and what benefits will be achieved. Where possible the benefits should be built into operational budgets to ensure they are realised.

2. Lack of frequent and realistic milestones throughout the implementation project.

In developing your project plan, you should always have the ability, at any time, to answer three critical questions -- where are we? are we there yet? and how do we confidently know we are there?

By setting frequent milestones at key points along the project timeframe, you will be able to quickly measure your progress and more importantly celebrate achievements with the team. Of course, this is also the time to make adjustments if, for whatever reason, the project is not going to plan.

The important thing is to ensure that any milestones set are simple and realistic.

3. Having no dedicated, high quality people in your implementation team and no compensation scheme in place for them.

The reality is that the people you really need in your implementation team are undoubtedly your best people, and it is almost guaranteed that they are also the busiest and least able to find additional time for the project in hand.

The best thing you can do is to offload some of their daily workload onto junior staff. And by giving junior staff the chance to prove themselves at a higher level, you also gain a wider spread of skills in the business and identify potential promotions at the same time.

The many different ways of rewarding project staff for their achievements range from revised job descriptions and salary scales to higher-duty payments. The most effective is a double bonus scheme: a financial bonus for achieving major milestones, plus public recognition or even a celebration at each relevant stage. Extended leave at the end of the project may also be effective.

In the overall scope and cost of your project, the additional bonus and public recognition will pay dividends well past the life of the project. Bonuses should be significant enough so that recipients feel proud and respected rather than cheated.

4. Lack of adequate budget for training users on the new system.

Almost every organisation approaching a systems implementation fails to budget sufficient money and time for training. The end result is that uptake of the new systems, processes and policies is slow, and the immediate effect is a longer time to benefit.

It is normally true that whatever figure you have budgeted for training, you should double.

5. Making modifications to the standard system without carefully weighing benefits against risks.

There is a tendency in many organisations to quickly modify the system in areas where it does not match present business processes. The end result is a system where future upgrades become extremely difficult to apply, and any help desk support is always compromised by the need to know the modifications as well as the standard system before any help can be offered.

The approach is to apply three "whys":

* Why are we considering this request for a modification and what is the proven measurable benefit?
* Why haven't we looked at all the alternatives and their risk/benefit first before choosing to modify?
* Why don't we see what other companies have done in this area? Unless we are the first, there must be lessons out there that we can learn from.

6. Failing to protect and insure the most critical parts of your business.

One example is of a managing director of a large pharmaceutical firm who wanted three guarantees before signing a contract for a new system:

* That his system would never, never put him in a position where he couldn't take orders from customers,
* That his new system would never, never prevent him from dispatching customer orders from his warehouse, and
* That his new system would never, never put him in a position where he couldn't accept his customers' payments and put their money in his bank account.

The lesson of this is to take a hard look at your business and identify the critical areas that you need to have available 24/7 and then talk to your hardware and software vendors to make sure they can provide adequate backup/recovery options to keep you operational when the unexpected happens.

Even with so many catastrophic examples of companies going bankrupt due to failed software implementations, many companies still don't pay enough attention to the risks involved. Adequate planning and preparation is essential to help you identify and manage potential risks.

Previewing is just as important as reviewing, certainly when it comes to avoiding potential disasters. By previewing your current business practices, goals and risks, and articulating a solid implementation plan, you can go some way (at least) to making the life of your ERP system project that much less risky.

References:

* Nelson, R. Ryan, "IT project management: Infamous failures, classic mistakes, and best practices", MIS Quarterly Executive, June 2007
* Ptak, C., "ERP: Tools, techniques and applications for integrating the supply chain", 2000, St Lucie Press (as cited in Wong et al)
* Wong, A., Scarbrough, H., Chau, P.Y.K., and Davidson, R., "Critical failure factors in ERP implementation".

To subscribe to the entire article series visit Supply Chain Secrets.


About the Author

Peter Clarke, Chief Technology Officer IBS Asia Pacific has over 20 years experience in ERP Software, ERP Systems, Supply Chain Management Software and EAI.

5 Keys For Maximising Your ROI Through Optimal ERP Performance: Key No. 5 - Maximise Your Benefits by Peter Clarke

Key No. 5 - Maximising The Business Benefits And Return On Your IT Investments

It goes without saying that, despite the best planning and implementation processes, the proof of an ERP System project is in the business benefits and the return on investment achieved. A well-executed project is less than successful if there are no benefits or returns. This should be obvious to all, and this should be the primary focus of every project, in any field.

But judging by the number of 'failed' projects - whether this means never being completed or simply not living up to expectations - one would be forgiven for wondering if this overriding priority is forgotten in many cases. The attitude that "the operation was successful but the patient died" must be avoided at all costs, and it is an attitude that must be avoided throughout the project lifecycle.

Realising the benefits of your ERP implementation is determined by the actions taken in the initial planning stages of a project, as well as the result of how well the system is used once it is up and running. Poor planning can mean limited or even zero benefits or, at worst, a highly negative impact on the organisation. There are cases where organisations have suffered terminal effects of a poor implementation.

Companies focus on the process of selecting and installing ERP software systems but once the initial project is complete, it often happens that both the original team and the business focus move on, whether or not all the original goals have been achieved. The end result is something of an instant legacy system, with no budget or plan for further training or realisation of business objectives.

It cannot be emphasised enough that business benefits come after the project is complete. This is often a difficult issue to manage during the implementation phase, as those involved require faith in the project that the benefits will be achieved, but often see little evidence of this during the implementation itself.

A good example of this is in the consideration of modifications. It is often easier for project teams to request a modification to maintain the status quo and appease business users rather than try and introduce an improved process. For this reason any modification must be endorsed by the business leader most affected by the modification and reviewed by his peers.

Where possible, the benefits should be built into operational plans and budgets to ensure they are realised. This will encourage business managers not directly involved with the project to maintain an interest, as they know they will be measured against it once it is in operation.

Joe Peppard et al referred to this in an article in MIS Quarterly Executive (called MIS1 from here on): "With the information technology investments, most organisations focus on implementing the technology rather than on realising the expected business benefits. Consequently, benefits are not forthcoming, despite a project's technical success."

They go on to say: "When considering return on investment calculations, organisations are so pre-occupied with manipulating the denominator - reducing spend - that they do not focus on the numerator - how IT can generate significant benefits. Equally worrying is the traditional investment appraisal process, which is often seen as a ritual that must be overcome before a project can begin. Many benefits are overstated to get the project through this process.

"No wonder few companies engage in post-implementation reviews. They already know that many of the benefits described in the business case are unlikely to be achieved."

As cynical as this last judgement is, there is an element of truth in it. But it needn't be so. There are ways to ensure - or, at least, to maximise - the business benefits that your ERP implementation can achieve.

Peppard et al, in another article (MIS Quarterly Executive, March 2008 - called MIS2 from here on), say "There is an important difference between investment objectives and benefits. Objectives are overall goals or aims on the investment, which are agreed on by all relevant stakeholders. In contrast, benefits are advantages provided to specific groups or individuals as a result of meeting the overall objectives."

They suggest (in MIS1) that five principles should underlie an ERP project proposal in order to realise value through IT.

Principle #1 is that IT has no inherent value. "Just having technology does not confer any benefit or create value. The value of technology is not in its possession. In fact, IT spending only incurs costs. Benefits result from effective use of IT assets."

Principle #2 is that benefits arise when IT enables people to do things differently. "Benefits emerge only when individuals or groups within an organisation, or its customers or suppliers, perform their roles in more efficient or effective ways. Generally, these new ways of working require improving how information is used."

Principle #3 says that only business managers and users can release business benefits. "Benefits result from changes and innovations in ways of working, so only business managers, users, and possibly customers and suppliers, can make these changes. Therefore, IT and project staff cannot be held accountable for realising the business benefits of IT investments. Business staff must take on this responsibility. Getting business staff to acknowledge this principle is a key way to ensure that they become involved in so-called IT projects."

Principle #4 reminds us that all IT projects have outcomes, but not all outcomes are benefits. "Many IT projects produce negative outcomes, sometimes even affecting the very survival of the organisation. The challenges for management are to avoid such negative outcomes and to ensure that the positive outcomes deliver explicit business benefits."

Principle #5 says that benefits must be actively managed to be obtained. "Benefits are not outcomes that automatically occur. Furthermore, the accumulation of benefits lags implementation; there can be a time gap between initial investment and payoff. Therefore, managing for the benefits does not stop when the technical implementation is completed. Benefits management needs to continue until all the expected benefits have either been achieved, or it is clear they will not materialise."

To implement an ERP project based on their five principles, Peppard et al recommend that seven key questions should be asked, the answers to which "are used to develop both a robust business case for the investment and a viable change management plan to deliver the benefits".

These questions are:

* Why must we improve?
* What improvements are necessary or possible?
* What benefits will be realised by each stakeholder if the investment objectives are achieved?
* Who owns each benefit and will be accountable for its delivery?
* What changes are needed to achieve each benefit?
* Who will be responsible for ensuring that each change is successfully made?
* How and when can the identified changes be made?

They do warn (MIS2) that "Some benefits can only be measured by opinion or judgement. … Quantifiable benefits are ones where an existing measure is in place or can be put in place relatively easily. Since quantifying benefits inevitably involves forecasting the future, the challenge is to find ways of doing this as accurately and robustly as possible."

But an ERP system does not stand alone, and realising benefits often relies on other, external inputs. ERP software systems are largely data dependent and data driven and the key to realising the full benefits is integration. If information is incomplete or inconsistent it becomes very difficult to integrate reliably and the results can lead to a loss of business confidence in the system and increased non-productive work load for support personnel.

Often day-to-day operation of systems falls to staff who weren't involved during the initial implementation and who have not received the same levels of training and handover as the original team. As a result effective system usage tends to deteriorate over time.

Your approach, then, to realising benefits of an ERP implementation (or any IT project, for that matter) relies on taking an holistic approach, which means not just all participants and stakeholders, but also all inputs. And this also means across the entire spread of a project - from the very first inkling of a suggestion or realisation of a need, to the end project and, importantly, beyond - to the ultimate users and how they will react to and live with the system that has, hopefully, been fully implemented. An implementation is not successful until anticipated benefits are achieved - largely after implementation - or a good reason established for why they won't be. Responsibility, therefore, for doing this rests with many players, at many stages.

Complex? Yes.

Difficult? Maybe.

Essential? Absolutely.

References

* Clarke, P., "How to maximise your investment in ERP technology", June 2008, IBS Australia
* Peppard, J., Ward, J., and Daniel, E., "Managing the realisation of business benefits from IT investments", March 2007, MIS Quarterly Executive
* Ward, J., Daniel, E., and Peppard, J., "Building better business cases for IT investments", March 2008, MIS Quarterly Executive


About the Author

Peter Clarke, Chief Technology Officer IBS Asia Pacific has over 20 years experience in ERP Software, ERP Systems, Supply Chain Management Software and EAI.

Capture Spend: New Search Engine from IBX now available by IBX

(United Kingdom) 10 November 2008 - IBX, a provider of efficient purchasing solutions, today officially launched the IBX Search Engine. The new application supports established procurement systems from world-leading ERP vendors and channels all purchasing transactions into one system. With its state-of-the-art usability, high performance and outstanding category support, IBX Search Engine users will be able to find and purchase the goods and services they need more easily and quickly.

An easy-to-use search engine is the very first step in sustainable e-procurement success, enabling users to make every purchasing transaction through one channel. Therefore the IBX Search Engine provides a completely redesigned and more flexible user interface where end-users can search all products and services simultaneously from a single source, without having to consider whether it is a contract, a product catalogue or a supplier web shop that would fulfil their procurement needs.

Besides the look and feel, the underlying architecture of the new version of the IBX Search Engine was completely updated. "IBX are committed to becoming the global frontrunner in efficient purchasing and our overarching goal with the redesign is to be the industry leader for user performance and increase support for a larger share of purchasing categories," states Gustav Hasselskog, SVP Product Management and Marketing at IBX Group. "Our customers have demanding implementations. Our new search engine will cover more than 86 countries and 200 000 users."

"When it comes to performance, the only thing that counts is the time that it takes from hitting the search button until all the search results have been displayed in the browser window," adds Stefan Brönner, Chief System Architect at IBX Group. "In IBX Search Engine this is done in less than one second and this is what end users expect - real-time speed!"

To achieve this ambitious speed goal, IBX combined state-of-the-art Web 2.0 technologies such as AJAX, JSON and REST with a high-performance search kernel. The search kernel is based on Apache Solr, an open-source search server, combined with proprietary IBX technology, and allows the content to be searched in all major languages, from Chinese to Russian. The user interface is currently available in nine languages.

"Originally we wanted to buy an existing procurement search application, but none of the available engines were suitable in terms of speed, search logic, availability and scalability," says Hasselskog. "For that reason we decided to develop our own application on top of Solr which proved to be the winning solution."

The performance goals were verified during an intensive test phase, including an endurance test conducted with 50 million line items. Even at this size, the new search engine showed no signs of slowing down. The search engine currently contains 6 million line items, with plans to add another 10 million over the next four months.

"Adding line items without affecting the end user experience was also high on our list and 100% availability is a must," adds Stefan Brönner. "Solr really shines when it comes to scalability and availability. You can feel that it was designed to do this."

Customer Viewpoints

The first rollouts of the search engine were conducted in May at IKEA, Deutsche Post and Skanska. The improvements are illustrated below:

"Our employees expect an intuitive user experience and the IBX Search Engine fulfils their needs. The smart search logic in combination with quick uploads of supplier catalogs gives us faster user adoption and higher usage," says Lars Henriksson, IKEA IMS eProcurement Roll-out Manager.

"We've had positive end-user feedback from across our organisation that the IBX Search Engine meets and exceeds expectations," said Otto André Winterstad, Purchasing Manager, Skanska Norge and Project Manager for eFFECT. "Users can quickly and easily find what they are looking for and have access to detailed summaries that meet their search specifications. As such, I expect increased usage of the system now that users can experience seamless end-to-end transactions which will result in very happy and productive internal clients."

Harry Longwitz, Senior Expert at Deutsche Post AG, comments: "We have received only positive feedback from our users following the roll out of the IBX Search Engine. The intuitive navigation as well as the speed and the search reliability are the best features of this tool."

Technical Background - IBX Search Engine:
- In addition to the regular one-line entry mask for product searches, organizations can add features such as "most searched items" and/or "latest frame agreements" to their home page to improve the overall usability of the tool.
- The IBX Search Engine is compatible with all established procurement systems based on the Open Catalog Interface (OCI) or Oracle Punchout.
- The IBX Search Engine can load up to two million line items per hour without any slowdown in search performance and without any downtime at all.
- Availability for the first four months of operation exceeded 99.9%.
- The IBX Search Engine is delivered with a new content workbench to create and maintain catalogue content. Existing catalogues can easily be migrated to the new framework.

About the Author

IBX: http://www.ibxgroup.com/

Warehouse Control System Leader QC Software Talks to Right Order Picking by Thomas R. Cutler

According to Rich Hite, founder of QC Software, "There isn't a 'best' order picking system for a specific distribution scenario… using multiple approaches is increasingly the smart choice. Order picking strategies must take into account many different variables, such as storage and retrieval mode, picking method, order release method, and picking communication (RF, voice, PTL, etc.). With that premise, companies need something to tie all of these methods and technologies together, and that is a WCS."

Cliff Holste, SCDigest's Material Handling Systems Editor, recently argued that order picking strategies must consider these different variables:
• Unit of measure (pallet, cases, eaches, etc.)
• Storage mode to be used (floor stacking, selective rack, pallet flow, carton flow, etc.)
• Potential use of automation (pick-to-belt, carousels, ASRS, etc.)
• Order release method (discrete order release, waves, etc.)
• Picking method (discrete order pick, cluster picking, batch picking, etc.); this is usually directly connected to the order release method
• Order picking communication and validation approach/technology (RF, pick by label, voice, RFID, etc.)

The WCS solutions provided by QC Software enable companies to streamline their warehouse operations with the lowest total cost of ownership in the industry, increasing corporate profitability. Traditionally, a warehouse control system (WCS) executes instructions provided by an upper-level host system, such as an ERP or WMS system. QC Enterprise, on the other hand, is a true Tier 1 WCS that provides advanced management capabilities, including inventory control, resource scheduling and order management.

QC Enterprise comprises four tightly integrated modules that provide state-of-the-art warehousing capabilities, plus the QC Toolkit for easily configuring QC Enterprise and its modules to fit any warehouse environment.

QC Software (www.qcsoftware.com) is the leading provider of Tier 1 warehouse control systems to the warehousing and distribution industries. Since 1996, QC Software, using state-of-the-art technology combined with extensive research, development and rigorous testing, has developed the QC Enterprise suite of products. Designed to be modular in nature, easily configurable and platform-independent, this highly scalable solution satisfies the needs of any size warehouse.

QC Software, Inc. www.qcsoftware.com Jerry List JerryList@qcsoftware.com (513) 469-1424

About the Author

Thomas R. Cutler runs a professional marketing firm for the manufacturing community and is a manufacturing journalist contributing to most manufacturing magazines.

ERP isn't about IT in the SME market by Ken Eybel

ERP is about streamlining processes and integrating a tool into your business effectively and efficiently. The people charged with selecting and implementing the software should not be IT personnel but the people who are going to be using the system. Don't get me wrong: in some cases the IT person(s) know the company's processes intimately and understand how the end users need to work with the system. The key word in that statement is "some".

While it is important for IT to be involved in the process, their role should be very minor. In smaller organizations this person is usually a one-man show or an outside contractor, so charging them with this task is a stretch from the get-go. The people who need to be involved, besides the ultimate decision makers, are the key users of the system. What does the software do for these people? How will it improve their departments and the overall performance of the business?

In smaller companies these "key users" or power users are generally the busiest people, so there is no doubt their time is precious. Having said that, to have them waste it when the wrong solution is chosen is more of a travesty. Get them involved early and often throughout the process. Ensure that their needs are being met and, if not, establish what if anything can be done to meet them. This also helps greatly with buy-in: if they are part of the decision, they will be more likely to be involved and lead the way.

SMEs can't afford to make mistakes or waste a lot of time implementing software. I am sure everyone knows someone who has purchased the wrong system and spent way too much money and time trying to make it work, only to be left with nothing at all or a completely different system. Do yourself a favour and make sure the people who will use and benefit most from a system are the ones who decide which system you will use and which functionality you will use.

ERP is not about IT; it is about the continuous improvement of your administration, tracking and execution of day-to-day activities. No one knows more about improving and maintaining those things than the people who do them every day.

About the Author

Ken is in the marketing department of IntegrateIT, which provides affordable ERP software to small businesses. Contact Ken by email at keybel@integrateit.ca to request a demo, or for ERP advice or definitions.

Wednesday, November 12, 2008

SAP Business One EDI Custom Integration - Overview for Programmer by Andrew Karasev

The SAP B1 ERP and MRP application can be easily integrated with your Electronic Data Interchange (EDI) channel, either outbound (when you are ordering products from your vendors) or inbound (when your customers place their orders in EDI format). This publication is written in a technical manner and should help you with your SAP Business One VAR, consultant, developer and reseller selection. A similar approach can be tried in other SB1 integration scenarios. We will not describe SAP BO SDK programming techniques here; instead we give highlights of a solution that does not require additional software licensing costs.

1. EDI as a fixed-length field format. This is traditional EDI, where you have a document header, lines and a trailer. All fields have a predetermined fixed length and position.

2. EDI as XML. This is the newer trend, where XML does the same job as the fixed-length format mentioned above. If your EDI channel requires XML, you are probably more flexible in your choice of tools and coding methods.

3. Outbound EDI. You can either research the SB1 table structure, which is described in the SAP Business One SDK, or enable system information in SB1: View -> System Information. With it enabled, open your Purchase Order and place the cursor over the field you want to export in EDI format - in the bottom left corner you will see the table and field names. To format your records with fixed field lengths, use the CAST or CONVERT SQL functions. Save your EDI export to a text file - consider deploying a DTS (Data Transformation Services) package to extract the records and save the file. For XML you have robust support in MS SQL Server 2005 or 2000.

4. Inbound EDI. Here we recommend DTW, the SAP Business One Data Transfer Workbench. However, you do not use CSV files here (also often referred to as Excel templates); instead you deploy an ODBC source pointing to MS SQL Server. On the MS SQL side you should prepare views or stored procedures. Again, deploy a DTS package that first moves your EDI file into SQL staging tables; then, from those tables, use a view to prepare the results for the Workbench. To avoid linking and configuration problems, give the fields in your SQL view the same names as in the Workbench Excel template for the object you intend to integrate.
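The two SQL-side techniques above - CAST/CONVERT for fixed-length outbound records (step 3) and a template-shaped view for inbound DTW import (step 4) - can be sketched roughly as follows. OPOR/POR1 are the documented SB1 purchase order tables; the record layout, field widths and the EdiStaging table are assumptions for illustration only:

```sql
-- Outbound (step 3): format PO headers as fixed-length EDI records.
-- CAST to CHAR(n) left-aligns and pads with spaces; CONVERT(..., 112)
-- renders the date as YYYYMMDD. Field widths are hypothetical.
SELECT 'H'
     + CAST(h.DocNum   AS CHAR(10))      -- PO number, 10 characters
     + CAST(h.CardCode AS CHAR(15))      -- vendor code, 15 characters
     + CONVERT(CHAR(8), h.DocDate, 112)  -- document date, YYYYMMDD
       AS EdiRecord
FROM OPOR h;

-- Inbound (step 4): stage the parsed EDI file, then expose it to the
-- Workbench through a view whose column aliases match the DTW template
-- headings for the target object.
CREATE TABLE EdiStaging (
    Id       INT IDENTITY(1,1) PRIMARY KEY,  -- row uniqueness
    CustCode VARCHAR(15),
    Item     VARCHAR(20),
    Qty      NUMERIC(19,6)
);
GO
CREATE VIEW dbo.DTW_OrderLines AS
SELECT CustCode AS CardCode,   -- alias = template column name
       Item     AS ItemCode,
       Qty      AS Quantity
FROM EdiStaging;
```

The only contract that matters to DTW is the column names, which is why the view's aliases, not the staging table's own names, must mirror the template.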

About the Author

Andrew Karasev, Alba Spectrum LLC, help@albaspectrum.com, http://www.albaspectrum.com, 1-866-528-0577, subdivision of M2-D2, SAP Business One VAR and Reseller in Illinois, Georgia, South Carolina, California, Texas. Please visit our info portal Pegas Planet: http://www.pegasplanet.com. Local service in Chicago, Atlanta, San Diego, Los Angeles, Orange County, Houston.

SAP Business One VAR newsflash: Remote Support by Andrew Karasev

If your business is located in the USA or Canada countryside, where local SAP B1 consulting practices are limited and not easy to find, modern remote access technologies - web sessions, Skype, remote desktop connections, VPN - allow you to get instant and very efficient support from an SB1 nationwide support call center and software development factory. In this publication we review remote support scenarios. Ideally this paper should help you with your SAP BO partner, reseller and consultant selection.

1. User training. Here the best technology is a web session, where our consultant shares the same computer monitor with your trainees, and you can see his or her face, captured by the laptop's video camera, if you are using Skype VOIP telephony.

2. Integration and customization projects. In this scenario the best approach is to install the SAP Business One Data Transfer Workbench with SQL Server Management Studio on one of the test servers in your office with an open remote desktop connection (over a Virtual Private Network, or VPN). On the same machine you should enable our SB1 SDK programmers to upload their customization versions for deployment and testing.

3. Initial data conversion. In SAP B1 this is traditionally a Data Transfer Workbench routine, where you prepare CSV files based on the DTW templates.

4. New module implementation. Here it is often desirable to come onsite to shake hands and meet people face-to-face, see your facility and discuss implementation goals and tactics.

5. New license sales. Nationwide support partners can sell SB1 user licenses nationwide and, in most cases, internationally.


SAP Business One Integration: Data Transfer Workbench advanced topics by Andrew Karasev

The SAP B1 ERP and MRP platform can be seamlessly integrated with your legacy applications: Oracle and Microsoft SQL Server custom databases, and ODBC-compliant sources such as Excel, MS Access, CSV and tab-delimited text files. If you have had initial training in the SAP BO Data Transfer Workbench, you are already familiar with Microsoft Excel and the template concept. In this small publication we will cross the Excel limits and give you the highlights on SQL queries, which means a virtually unlimited ability to integrate any database platform - not only for initial data migration and conversion, but even for ongoing data integration:

1. ODBC query anatomy. Since the Workbench is really tuned for Excel CSV templates, where field names are defined according to the SB1 importing object's rules, you should follow the same rules when creating your SQL view: define the same column names as in the Excel template for the intended object. If you follow this rule, the object integration will recognize your column mapping automatically, on the fly, and you will not need to change and save the DB schema mapping - a topic we would like to avoid in this executive- and IT-programmer-level publication.

2. SQL view technology. If you move query design from Excel to a SQL SELECT statement, you virtually break all the boundaries. We recommend the following path: import your source master records or transactions into an MS SQL Server staging table (where you may consider adding an identity column for uniqueness). Please consult your SQL DBA to understand the limitations of Excel and the advantages of SQL Server views and stored procedures.

3. The 64-bit versus 32-bit Windows 2003 Server dilemma. If you are reading your integration data from tab- or comma-separated text files, you should be aware of the 64-bit platform's limitations. You can't use the popular Microsoft.Jet.OLEDB.4.0 driver; instead you will have to import the text file through Microsoft SQL Server DTS or the data import wizard directly. If you are still on good old 32-bit Windows 2003 Server, enjoy the ability to run a SELECT statement against a Microsoft.Jet.OLEDB.4.0-compliant text file. On Windows x64 this OLEDB provider is not available, or at least there are technical challenges in using OPENROWSET to read from a text file.
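The 32-bit text-file query in step 3 can be sketched with OPENROWSET over the Jet provider; the folder, file name and column layout are assumptions, and the server must have Ad Hoc Distributed Queries enabled. A comparable route that works on x64 as well is BULK INSERT into a staging table:

```sql
-- 32-bit only: SELECT directly from a delimited file via the Jet text
-- driver (no 64-bit build of Microsoft.Jet.OLEDB.4.0 exists).
SELECT *
FROM OPENROWSET(
    'Microsoft.Jet.OLEDB.4.0',
    'Text;Database=C:\EdiFiles\;HDR=YES',   -- folder holding the file
    'SELECT * FROM orders.csv'              -- file acts as the "table"
);

-- x86 and x64 alike: load the same file into an assumed staging table
-- whose columns match the file layout.
BULK INSERT EdiStaging
    FROM 'C:\EdiFiles\orders.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
```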


SAP Business One Integration Consultant: Workbench and SQL data conversion by Andrew Karasev

SAP B1 initial data migration and conversion is typically done through the Data Transfer Workbench, where you basically take the CSV Excel templates, fill them in with the required fields, leave everything optional blank, build the integration on the fly and run it with a series of tests and rollbacks until you are satisfied with the data massage and migration quality. It is also a good idea to copy your production company database into a test company and probe the data import there first. Let's try to break through the MS Excel restrictions and see how you could place something more complex into production - ongoing integration, for example. In this publication we will just mention that real-time integration can be programmed with the SAP Business One SDK; however, that is outside our scope.

1. ODBC integration. At this time, ODBC source integration cannot be scheduled for automatic runs via a batch command; SAP doesn't support this and it doesn't work. The way you do advanced ODBC integration is to create a SQL view that has exactly the same column names as you see in the CSV template for the intended objects. You will also have to provide an enumerated column in your view for RecordKey.

2. Scheduled runs. The workaround to schedule ongoing integration is to create a Data Transformation Services package that imports the text file (try the Microsoft.Jet.OLEDB.4.0 construction instead of creating a linked server), transforms the data via a SQL stored procedure (feel free to create temporary staging tables, SQL cursors, etc.), and then exports your data from SQL to a CSV file ready for integration. You can build and save the DTW integration and schedule it in a batch file, assuming it works off the CSV templates.

3. DTW limitations. The current SB1 version 2007A doesn't cover all the SAP Business One objects - or at least we found some objects that we had to move into SAP BO through small SDK add-ons. Open Inventory, Item Management, Alternative Items: there is no template for Alternatives, and you have to push them into SBO through the SDK to avoid problems with Early Watch Alert.
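The enumerated RecordKey column required in step 1 can be produced directly inside the view. A minimal sketch, assuming a hypothetical EdiStaging table and SQL Server 2005's ranking functions; the column aliases mirror the DTW template headings:

```sql
-- Sketch: ODBC-source view for the Workbench with an enumerated RecordKey.
-- DENSE_RANK gives every line of one source document the same key, so the
-- lines import as a single SB1 document; LineNum numbers rows within it.
CREATE VIEW dbo.DTW_Orders AS
SELECT DENSE_RANK() OVER (ORDER BY EdiDocNo)                        AS RecordKey,
       ROW_NUMBER() OVER (PARTITION BY EdiDocNo ORDER BY Item) - 1  AS LineNum,
       CustCode AS CardCode,
       Item     AS ItemCode,
       Qty      AS Quantity
FROM EdiStaging;
```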


Tuesday, November 11, 2008

SAP Business One Consultants Atlanta, Chicago Newsflash: ODBC integrations by Andrew Karasev

If you plan to do ongoing data integration in SAP B1 without deploying SB1 SDK programming, research the SAP BO Data Transfer Workbench ODBC source option. This small publication should help you with SAP B1 partner and reseller selection - at the least, you should be able to ask the right questions about your SAP Business One VAR's expertise.

1. Initial data conversion. The Data Transfer Workbench has a set of CSV templates, and all you need to do is fill them in with Excel, paying attention to required fields and, where necessary, adding values to optional fields. The good news is that most SAP B1 consultants are familiar and reasonably good with initial data conversion and migration and will help you fill in the CSV templates.

2. Ongoing data integration. Here you should find people who are familiar with advanced Workbench techniques. For example, if a consultant tells you that he or she will help you prepare your ongoing integration files in Excel, that is probably not the answer you should accept.

3. ODBC source. Here you use the following technique: create a SQL view that prepares your data for import in exactly the same format as the CSV template discussed above. An ODBC source can work with a text file, but the limitations of the Text driver may lead you to import the source files into MS SQL Server staging tables instead. One limitation of ODBC-source-based integration is that SAP doesn't support scheduled runs for it.

4. Workbench integration scheduled runs. You can create a batch file that calls your saved XML integration file with the -s parameter; please read the Workbench help. If you would like scheduled runs for a really complex integration, where you would otherwise have to deploy ODBC sources, consider a SQL data import, massaging via a SQL stored procedure or view, and then exporting to CSV files - the best tool for the job is likely a Data Transformation Services (DTS) package.
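The import-massage-export pipeline in step 4 can be sketched as a single stored procedure. BULK INSERT and bcp via xp_cmdshell are standard SQL Server facilities, but the file paths, table, view and procedure names here are assumptions, and xp_cmdshell must be explicitly enabled on the server:

```sql
-- Sketch: pull the source file into a staging table, massage it, then
-- write a DTW-ready CSV that the scheduled Workbench batch can consume.
CREATE PROCEDURE dbo.PrepareDtwFile
AS
BEGIN
    TRUNCATE TABLE EdiRaw;  -- assumed staging table matching the file layout

    -- 1. Import the raw delimited file (works on x86 and x64 alike)
    BULK INSERT EdiRaw
        FROM 'C:\Integration\in\orders.txt'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

    -- 2. Massage: e.g. drop rows for unknown items (OITM is the SB1 item master)
    DELETE s FROM EdiRaw s
    WHERE NOT EXISTS (SELECT 1 FROM OITM i WHERE i.ItemCode = s.Item);

    -- 3. Export a DTW-template-shaped view (assumed) to the CSV the
    --    scheduled Workbench run will pick up
    EXEC master..xp_cmdshell
        'bcp "SELECT * FROM MyDb.dbo.DTW_Orders" queryout C:\Integration\out\orders.csv -c -t, -T';
END;
```

Scheduling this procedure via SQL Server Agent just before the Workbench batch file runs gives the same effect as the DTS package described above.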

