DronaBlog


Friday, April 9, 2021

How to fix ORA-00604 - Error occurred at recursive SQL level 1

Are you working on a project where the Oracle database is used for implementation? Are you also facing an ORA-00604 error and looking for a way to fix it? If so, then you have reached the right place. In this article, we are going to see how to fix the error 'ORA-00604: error occurred at recursive SQL level 1'.





What is ORA-00604?

The error message 'ORA-00604: error occurred at recursive SQL level 1' is commonly noticed in application logs or reported by an Oracle user. This error message is a little complex and has its own challenges to fix. Let's understand its root cause.

The root cause of ORA-00604 

There are several causes of this error; however, the main cause is the processing of a recursive SQL statement. You might ask: what is a recursive SQL statement? A recursive SQL statement is a statement that Oracle applies to its internal dictionary tables.

Example of ORA-00604

Assume that you are using Oracle 11g or 12c and you are getting 'ORA-00604: error occurred at recursive SQL level 1' together with 'table or view does not exist'. One of the causes for this is a trigger. The trigger might be

1) to insert records into an audit log table or

2) to fire DDL statements or

3) to drop the audit log table.

A sketch of such a trigger is shown below, followed by the possible options to fix this error.
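For illustration only, here is a minimal sketch of a DDL trigger that writes to an audit log table; the table and trigger names are hypothetical. If the audit log table is dropped while the trigger still fires, the recursive INSERT fails and surfaces as ORA-00604 wrapping ORA-00942 (table or view does not exist).

-- Hypothetical audit table and DDL trigger (names are illustrative)
CREATE TABLE audit_log (
    event_ts    TIMESTAMP,
    event_type  VARCHAR2(30),
    object_name VARCHAR2(128)
);

CREATE OR REPLACE TRIGGER trg_ddl_audit
AFTER DDL ON SCHEMA
BEGIN
    -- This recursive INSERT raises ORA-00604/ORA-00942
    -- if audit_log has been dropped.
    INSERT INTO audit_log (event_ts, event_type, object_name)
    VALUES (SYSTIMESTAMP, ora_sysevent, ora_dict_obj_name);
END;
/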





How to fix ORA-00604

Option 1: Analyze and fix the trigger error. In order to determine whether the error is related to a database trigger, execute the statement below:

ALTER SYSTEM SET "_system_trig_enabled"=FALSE; 

Find the trigger which causes the issue and disable it, as sketched below.
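As a sketch (the trigger name is illustrative), you can list candidate schema- and database-level triggers from the data dictionary and then disable the offending one:

-- List schema- and database-level triggers that might fire recursively
SELECT owner, trigger_name, triggering_event, status
FROM   dba_triggers
WHERE  base_object_type IN ('SCHEMA', 'DATABASE');

-- Disable the offending trigger (the name is illustrative)
ALTER TRIGGER scott.trg_ddl_audit DISABLE;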

Option 2: Fix the syntax error in the SQL generated by the application. If the ORA error is caused by an error or a bug in the SQL code, then reach out to the developers of the code and fix it.

Option 3: Oracle Support. If the error still persists, then reach out to the Oracle Support team.


Learn more about Oracle in detail here -












Sunday, March 21, 2021

Understanding Timeline in Informatica MDM

Timeline is one of the most useful features in Informatica Master Data Management. It enables us to manage various versions of business records. The timeline is totally different from the history of record updates and inserts. In this article, we will focus on the timeline granularity and timeline action features. So let's dive in.


Timeline Granularity

In Informatica MDM, we use time measurements such as years, months, days, etc. to define effective periods for versions of records. This time measurement is nothing but timeline granularity. We can define timeline granularity as year, month, day, hour, minute, or second.




Timeline Action

The timeline action is nothing but the action performed on entities for which you track data change events. We can add or edit a record and edit its effective period.

a. Timeline action = 1 --> This value is set if we update the data
b. Timeline action = 2 --> This value is set if we update Effective Period
c. Timeline action = 4 --> This value is set if we Add Effective Period

Versioning Sequence:

The stage table column VERSION_SEQ is set to 1 by default and changes as new updates arrive. A sketch of how to inspect record versions is shown below.
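As a sketch (the base object name C_PARTY and the ROWID_OBJECT value are assumptions), record versions can be inspected in the cross-reference table of a timeline-enabled base object, which carries effective-period columns:

-- Inspect versions of a record in a timeline-enabled XREF table
-- (C_PARTY_XREF and the ROWID_OBJECT value are illustrative)
SELECT rowid_xref,
       pkey_src_object,
       period_start_date,
       period_end_date,
       version_seq
FROM   c_party_xref
WHERE  rowid_object = '1234'
ORDER  BY period_start_date, version_seq;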


Learn more interesting facts about Informatica MDM Timeline



Monday, February 15, 2021

What are the different phases of Informatica MDM implementation?


Are you planning to implement an Informatica Master Data Management project? Are you looking for the details you need to capture? If so, then you have reached the right place. In this article, we will understand the phases of Informatica MDM implementation. We will also see what the outcome of each phase will be.


Introduction:

There are 5 phases in Informatica Master Data Management implementation.




1. Discovery Phase
2. Design Phase
3. Development Phase
4. Testing Phase
5. Deployment Phase



Let's understand each of these phases one by one.


A. Discovery Phase:

We need to perform the activities below as part of the discovery phase -
  • Perform data profiling
  • Conduct interviews and workshops 
  • Document solution requirements 
  • Define conceptual architecture 
  • Define the logical data model 

What will be the output?
The output of this phase will be 1. Functional Requirements Specification 2. Testing Plan


B. Design Phase:

We need to perform the activities below as part of the Design phase -
  • Define detailed business rules 
  • Develop system design specifications for all technology components 
  • Develop Traceability Matrix 
  • Establish DEV Environment 

What will be the output?
The output of this phase will be 1. System Design Specification 2. Traceability Matrix

C. Development Phase:

We need to perform the activities below as part of the Development phase -
  • Configure and develop MDM components
  • Perform unit testing
  • Develop Test Scripts 
  • Develop Test Schedule 
  • Update Traceability Matrix 
  • Establish QA and PROD Environment 

What will be the output?
The output of this phase will be 1. Configured and developed MDM components 2. Test Scripts 3. Traceability Matrix





D. Testing Phase:

We need to perform the activities below as part of the Testing phase -
  • Execute System Testing 
  • Execute Rules Testing 
  • Execute Integration Testing 
  • Execute User Acceptance Testing 
  • Develop Test Summary Report 

What will be the output?
The output of this phase will be 1. Tested, production-ready MDM solution 2. Test Script Results 3. Test Summary Report 

E. Deployment Phase:

We need to perform the activities below as part of the Deployment phase -
  • Deploy the solution into the Production environment 
  • Execute the initial data load (IDL) 
  • Initiate Gray Area Reconciliation 

What will be the output?
The output of this phase will be 1. Deployed solution 2. Initial data loaded and ready for reconciliation





You can learn more about Informatica Master Data Management here -



Thursday, January 21, 2021

Informatica MDM Installation Checklist


Are you looking for an article about how to prepare a checklist for an Informatica MDM installation? You might have gone through the Informatica MDM Installation Guide and be facing the question of where to start. The best way to start is to prepare a checklist. In this article, we will see the main topics to consider, and we will also look at a sample checklist file.





Introduction

After going through more than 100 pages in the Installation Guide, I realized that we need to prepare a checklist for each component of Informatica MDM. Informatica MDM comes with the Hub Server, Process Server, Informatica Data Director Configuration Manager, Provisioning Tool, Business Process Management (i.e., ActiveVOS), Elastic Search, an application server (such as JBoss, WebSphere, or WebLogic), and a database (such as SQL Server, DB2, or Oracle). Each of these components has a separate set of instructions. Sometimes we get lost or overwhelmed with all these instructions. So take a deep breath and start documenting the main sections as mentioned in the next section.

Components to consider for installation

The number of components that are needed for installation is based on the business needs. However, a few components, e.g., the Process Server and Hub Server, are commonly required irrespective of business need. So first document all the components which you are planning to install. Here is a sample list of components.


Informatica MDM Installation checklist

The checklist contains details about each of the components captured in the previous section. For example, the 'Create the MDM Hub Master Database' section will have details about the database name, server name, port, and credentials. You can access the complete checklist here.


For reference, a checklist entry will look like the sample below.
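This is a sketch only; all values are placeholders (CMX_SYSTEM is the default name of the MDM Hub Master Database):

  Component  : Hub Server
  Task       : Create the MDM Hub Master Database
  Database   : CMX_SYSTEM
  Server     : <database host>
  Port       : 1521
  Credentials: cmx_system / <password>
  Status     : Pending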





If you are looking for more details about Informatica MDM then here is a video for Informatica MDM.







Saturday, December 12, 2020

How to prepare for Informatica MDM Developer Certification?

Are you planning to get Informatica MDM Developer certified and want to know how to prepare for the Informatica MDM Developer Certification? If so, then you have reached the right place. In this article, we will discuss all the topics on which the Informatica Master Data Management certification is based. We will also discuss the Informatica MDM certification cost. With that, I would like to congratulate you on taking the very first step toward your MDM Developer certification, and good luck with your exam.






Introduction

In this article, we will discuss all the topics on which questions will be asked in the certification examination. We will provide the necessary video tutorials and other material. With the help of this material and proper training, you will be ready for your certification; you do not have to look for Informatica MDM certification dumps, and you can avoid online fraud. You can find Informatica MDM training online either on Udemy or other platforms. You can also take training from Informatica, but these courses are a little expensive. The Informatica certification exam includes questions in multiple-choice, multiple-response, and true/false formats. The exam has 70 questions with a passing score of 70 percent. The duration of the exam is 90 minutes. The exam access will expire 90 days after purchase, with no refund.


Informatica MDM Certification cost

The Informatica MDM certification costs USD 340, which includes a free second attempt. After that, if you would like to attempt the exam again, the original cost will apply. This cost varies from time to time, so it cannot be guaranteed.

Informatica MDM Developer Certification

In this section, we will discuss the Informatica MDM syllabus. I would strongly recommend going through these topics before taking any Informatica MDM training online or attending any Informatica certification program. If you are already working on an MDM project and have good experience with real-time Informatica MDM projects, then it will be easy to get certified.





A. Introduction to MDM Multidomain Edition

E. Overview of Match and Merge Processes

F. Fuzzy matching

G. Configuration of queries and packages

H. User Objects

I. Introduction to Entity 360

J. MDM Entity 360 Architecture





Tuesday, December 1, 2020

Informatica MDM - MDM Installation Topology

Are you planning to install the Informatica MDM hub in a Development or Production environment? Are you looking for details about the best possible way to make use of your infrastructure? If so, then you have reached the right place. In this article, we will explore the different Informatica MDM installation topologies.






Introduction

Basically, there are three types of topologies recommended by Informatica. We can use one of them while installing the Informatica MDM hub, based on the project needs and the benefits we are looking for. Here is the list of recommended topologies:

a. MDM Topology for Clusters

b. No Cluster - No High Availability

c. No Cluster - High Availability


A. MDM Topology for Clusters

In this type of topology, the Hub Server and Process Server reside on different machines, and these machines are clustered together.


Characteristics:




B. No Cluster - No High Availability

In this topology, the Hub Server and Process Servers are not clustered; hence we will not achieve high availability.



Characteristics:



C. No Cluster - High Availability

In this type of topology, the Hub Server and Process Server are not clustered; however, an external load balancer can be used to make the MDM system highly available.




Characteristics:







Detailed information about the types of MDM implementation styles is provided here -















Friday, November 13, 2020

Informatica MDM - How to fix an error - ORA-01555: snapshot too old?

While working on Informatica MDM jobs, I came across an issue: ORA-01555: snapshot too old. This error message was reported while running the tokenization job. If you are noticing a similar issue, then this article will help. It provides details about the error message and a solution to fix it.






Error Message:

The detailed error message is as below -

java.sql.BatchUpdateException: ORA-01555: snapshot too old: rollback segment number 11 with name "_SYSSMU11_2399779032$" too small

SIP-16084: Error occurred while verifying the need to tokenize records. Return code 12801, 

Error SQLException During VerifyNeedToStrip :ORA-12801: error signaled in parallel query server P000,
ORA-01555: snapshot too old: rollback segment number 30 with name "_SYSSMU30_2998435469$" too small.
 at com.siperian.common.SipRuntimeException.createNotExternalized(SipRuntimeException.java:74)
 at com.delos.cmx.server.interact.caller.InteractCleanseClient.executeGenerateMatchTokens(InteractCleanseClient.java:460)


Solution:

To fix the issue perform the steps below -

A. Database Issue
First, analyze whether there is any database issue going on. If the database looks good, then perform the steps below.

Step 1: Stop any job running as 'INCOMPLETE'
Step 2: Stop the Application server
Step 3: Verify the undo_retention value by running the query below on the database side
           
             show parameter undo_retention;

Step 4: If the value is low, then increase it to 4000 by executing the command below

            ALTER SYSTEM SET UNDO_RETENTION = 4000;

Step 5: Set the undo tablespace datafiles to autoextend on (a sketch is shown after these steps).

Step 6: Restart the database servers with the clear cache

Step 7: Drop any T$ tables if present in the database

Step 8: Start the MDM servers with a clear cache.
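As a sketch for Step 5 (the tablespace name UNDOTBS1 and the datafile path are assumptions for your environment), you can check which undo datafiles are not autoextensible and enable autoextend on them:

-- Check the autoextend status of undo tablespace datafiles
SELECT file_name, autoextensible, bytes / 1024 / 1024 AS size_mb
FROM   dba_data_files
WHERE  tablespace_name = 'UNDOTBS1';

-- Enable autoextend (the file path is illustrative)
ALTER DATABASE DATAFILE '/u01/oradata/ORCL/undotbs01.dbf'
    AUTOEXTEND ON NEXT 100M MAXSIZE UNLIMITED;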






This will fix the issue. I hope this is helpful. You can learn more about job tuning here -





Wednesday, November 11, 2020

How to fix Error - SIP-52054: Failed to create collection name for orsId


Are you looking for how to fix the error 'SIP-52054: Failed to create collection name for orsId' in the MDM hub? Are you also interested in knowing the root cause of this error? If so, then you have reached the right place. In this article, we will focus on this Elastic Search error in Informatica Master Data Management (MDM).






Error Message:

If you run a SOAP request against Business Entity Services, which internally use Elastic Search, then you may encounter the error below:

SIP-52054: Failed to create collection name for orsId


Detailed Error stack:


[ERROR] com.informatica.mdm.cs.server.CompositeServiceInvoker: SIP-52054: Failed to create collection name for orsId [mdmsbx-CMX_ORS] because of error: Connection refused.
com.informatica.mdm.spi.cs.StepException: SIP-52054: Failed to create collection name for orsId [CMX_ORS] because of error: Connection refused.
 at com.informatica.mdm.cs.steps.SearchCO.invoke(SearchCO.java:337)
 at com.informatica.mdm.cs.server.CompositeServiceInvoker.executeStep(CompositeServiceInvoker.java:426)
 at com.informatica.mdm.cs.server.CompositeServiceInvoker.processService(CompositeServiceInvoker.java:308)
 at com.informatica.mdm.cs.server.CompositeServiceInvoker.executeService(CompositeServiceInvoker.java:385)
 at com.informatica.mdm.cs.server.CompositeServiceInvoker.processService(CompositeServiceInvoker.java:312)
 at com.informatica.mdm.cs.server.CompositeServiceInvoker.process(CompositeServiceInvoker.java:187)
 at com.informatica.mdm.cs.server.CompositeServiceInvoker.invoke(CompositeServiceInvoker.java:118)
 at com.informatica.mdm.cs.server.ejb.CompositeServiceEjbBean.doProcess(CompositeServiceEjbBean.java:53)
 at com.informatica.mdm.cs.server.ejb.CompositeServiceEjbBean.process(CompositeServiceEjbBean.java:37)


How to fix this issue?

In order to fix this issue, perform the steps below -

1. Verify the MDM hub is accessible. Also, verify the connection to the Process Server from MDM hub -> Utilities -> Process Server.

2. Verify the Elastic Search server is working fine (a quick check is sketched after these steps).

3. If the above two steps look good, then make sure Elastic Search is properly configured in the Provisioning tool.

The location is: Provisioning Tool -> Configuration -> Infrastructure Settings -> ESCluster

Here, make sure the server name is properly configured for Elastic Search.
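As a quick sketch for step 2 (host and port are placeholders; 9200 is the Elastic Search default port), you can verify from the MDM server that Elastic Search is reachable and healthy:

curl -XGET 'http://<Elastic Search Server host>:<Port>/_cluster/health?pretty'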






Root cause:

The error 'SIP-52054: Failed to create collection name for orsId' normally occurs when the MDM hub tries to make a connection to the Elastic Search server. If there is a mismatch in the server configuration in the Provisioning tool, then we get this error.


Learn more about the provisioning tool here -



Wednesday, July 29, 2020

Informatica MDM - How to create an Elastic Search certificate to access Elastic Search in a secure way

Are you trying to access the Elastic Search API through the browser? Are you also planning to execute Elastic Search APIs using Postman or SoapUI? If yes, then you need to create a certificate in order to access the Elastic Search API in a secure way. In this article, we will discuss the steps which need to be executed in order to generate the certificates.



Step 1: Location of steps execution
We need to execute the certificate generation commands from the location below, so go to this location:

 <MDM hub install directory>/hub/server/resources/certificates

Step 2: Execute the command below to convert the Java KeyStore (JKS) file to a .p12 file. A .p12 file contains a digital certificate with Public Key Cryptography Standard #12 (PKCS#12) encryption and is a portable format for transferring personal private keys and other sensitive information. This file will be used to access the Elastic Search API (GET, POST, PUT, etc.).

keytool -importkeystore -srckeystore MDM_ESCLIENT_FILE_JKS.keystore -srcstoretype jks -destkeystore MDM_ESCLIENT_FILE_JKS.keystore.p12 -deststoretype pkcs12 -alias esclient -destkeypass changeit

Here, changeit is the password.

Step 3: We need a client key to access Elastic Search. In order to extract it, we use the .p12 file created in Step 2. Note that the command below (with -nocerts -nodes) extracts the private key, which the client presents along with its certificate when establishing the secure connection. Execute the command below to generate the key file.

openssl pkcs12 -in MDM_ESCLIENT_FILE_JKS.keystore.p12 -out file.key.pem -nocerts -nodes

Step 4: Certificate creation is another important step. Before understanding why we need a .crt file, we need to know a little about the .pfx file. A .pfx file includes both the public and private keys for a given certificate and is normally used for TLS/SSL on a website. A .cer file has only the public key and is used for verifying tokens or client authentication requests. To generate the certificate, run the command below:

openssl pkcs12 -in MDM_ESCLIENT_FILE_JKS.keystore.p12 -out file.crt.pem -clcerts -nokeys

Step 5: Execute the command below to check that Elastic Search is accessible in a secure way. The command will list all the indices present on the Elastic Search server.

curl -k -E ./file.crt.pem --key ./file.key.pem https://<Elastic Search Server host>:<Port>/_cat/indices

Step 6: This step is optional, but it will be helpful if you are looking for how to make a POST or PUT call on the Elastic Search server using the curl command.

First, prepare the request body and save it in a file, e.g., create a file Sample.txt and add the request body (a JSON message). A sample one is provided below:

{
   "index.search.slowlog.threshold.query.debug": "-1ms",
   "index.search.slowlog.level":"info"
}

Execute the command below using the Sample.txt file. Here we need to use the name of the index on which the PUT or POST request will be executed, e.g., 43456-customer is an index name which you can get from Step 5.

curl -d "@Sample.txt" -H "Content-Type: application/json" -X PUT  -k -E ./file.crt.pem --key ./file.key.pem https://<Elastic Search Server host>:<Port>/43456-customer/_settings  


Step 7: If you are using a clustered environment and would like to check the status of the cluster, then execute the command below -

curl -k -E ./file.crt.pem --key ./file.key.pem -XGET 'https://localhost:9200/_cluster/health?pretty'



Wednesday, July 22, 2020

Informatica MDM - Validation after install or upgrade

Are you looking for details about the validation of Informatica MDM components after an installation, upgrade, or patch fix? Are you also looking for which functionalities we need to validate in IDD or the Informatica MDM hub? If yes, then you have reached the right place. In this article, we will understand the component validation details. So let's start.



Components to validate

Here is the list of components we need to validate after installing, upgrading, or applying a patch to Informatica MDM:
1. Informatica MDM Hub Validation
2. Informatica Data Director Validation
3. Provisioning Tool Validation
4. Active VOS Validations

1. Informatica MDM Hub Validation

We need to perform the validations mentioned below in the Informatica Hub Console after a new install or upgrade:


    A)   Validation of MDM hub access-
          i) Launch the Hub Console using the URL and try to log in with the user name and password.

    B)   Validation of MDM hub tool-
          i) Verify all users are correctly created or migrated by using the Users tool in the Configuration workbench. Verify that the properties of the users are intact.
          ii) Verify the data model by selecting the Schema Viewer tool in the Model workbench, and then connect to an Operational Reference Store.
          iii) Verify that the cleanse functions are working fine. You can select the Cleanse Functions tool in the Model workbench, execute any cleanse function, and make sure it is working properly.
          iv) Verify Base Object tables, Staging tables, relationships among the tables, Validation Rules (if they exist), and the Match/Merge setup for a base object.
          v) Validate that record creation is working as expected by creating a record using the Data Manager tool.
          vi) Use the Merge Manager and merge some sample records to make sure merge processing is working as expected.
          vii) Verify that jobs are running fine by running a sample batch job, such as a Stage job, and make sure it executes successfully.
          viii) Verify the connectivity to Process Servers from the MDM hub by selecting the Process Server tool in the Utilities workbench and clicking Test the connection.
          ix) Verify that queries and packages are showing data in the view page.



2. Informatica MDM- Data Director Validation

The validations below need to be performed if you are using the Data Director with subject areas. You need to deploy the application before you begin the tests. Perform the following tests that apply to your environment:



    A)   Validation of Data Director access-
           i) Use the Data Director Configuration Manager URL and try to access it. Then access the Informatica Data Director application using the username and password.

    B)   Validation of Informatica Data Director-
           i) Create a search query using fields from a Subject Area and Subject Area child fields, and make sure you are able to create, edit, and delete queries.
          ii) Run the queries to perform searches. Perform multiple searches to verify the search functionality.
          iii) Open a searched record and perform an update operation.
          iv) Verify the record creation process by creating a new record.
          v) Verify the History and Timeline sections are working fine.
          vi) Validate the Matches section: try to add a merge candidate and merge the record.

    C)   Validation of Tasks in Informatica Data Director -
         i) Open the task manager in IDD and verify all the tasks are listed.
         ii) Verify that opening tasks works fine.
         iii) Claim a task to make sure the claim action is working as expected.
         iv) If it is an update task, then update the record and make sure the task completes successfully.
         v) If it is a merge task, then merge the record and verify the task gets cleared from the task list.




3. Provisioning Tool Validation

We need to perform the validation below for the Provisioning tool.

A)   Validation of Provisioning Tool access-
           i) Login to the Provisioning Tool using username and password.

B)   Business Entity, Transformation, View verification, and Task Configuration
          i) Verify that all the Business Entities are present in the Provisioning tool.
          ii) Verify all the transformations: View to Business Entity, Business Entity to View, as well as Business Entity to Business Entity.
          iii) Verify all the views.
          iv) Verify the task configuration, such as Task Types, Task Triggers, etc.

C)   Verify Elastic Search configuration
        i) Verify the Elastic Search server configuration under Infrastructure Settings.
        ii) Verify all the layout manager and application configurations.



4. Active VOS Validation

Validate ActiveVOS for the items below:
        i) Verify the status of ActiveVOS in the ActiveVOS console.
        ii) Verify the Identity Service connection from the AVOS console.
        iii) Verify all the workflows are in a deployed state.
        iv) Verify all the tasks are in a running state.







Friday, July 17, 2020

What is Build Match Group (BMG) in Informatica MDM?

Are you looking for details about the Build Match Group (BMG) process used in Informatica MDM? Would you also like to know when the BMG process gets executed? Would you be interested in knowing how to control this behavior? If so, then you have reached the right place. In this article, we will discuss the BMG process in detail.



What is the Build Match Group (BMG) Process?

The process by which redundant matching records are removed from the match set prior to the consolidation process is called the Build Match Group (BMG) process. It is a very important part of the matching process and plays a vital role in Informatica MDM jobs.

How does the Build Match Group process remove records?

Let's assume that the BMG indicator is on. In such a case, if we run a match job, it will remove one of the symmetric matches from the match pairs.
e.g.
Let's consider the records below
Pair 1: 'Bob Paul' is matched with 'Robert Paul' with match rule number 3
Pair 2: 'Robert Paul' is matched with 'Bob Paul' with match rule number 5

As we know, AUTOMERGE_IND is set to 1 for a matching pair when the records matched through an auto-merge rule. If all the records matched through manual match rules, the BMG process does not take effect. However, if a few records matched with auto-merge rules and a few with manual merge rules, then one of the symmetric match entries will be removed from the match table. A sketch of how to spot such symmetric entries is shown below.
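For illustration (C_PARTY_MTCH is a hypothetical match table name), a self-join can reveal symmetric match pairs like Pair 1 and Pair 2 above:

-- Find symmetric entries (A matched to B and B matched to A)
-- in a base object match table; the table name is illustrative
SELECT a.rowid_object,
       a.rowid_object_matched,
       a.automerge_ind
FROM   c_party_mtch a
JOIN   c_party_mtch b
       ON  a.rowid_object = b.rowid_object_matched
       AND a.rowid_object_matched = b.rowid_object;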

When does the BMG process get executed?

There are two jobs during which the BMG process is executed.
1. During the Match job: the BMG process gets triggered during the match process if we enable the 'BMG on match indicator' property.
2. During the Merge job: the BMG process always gets executed during the merge job. There is no option to turn it ON or OFF for the merge job.

What is the impact of the BMG process on manually matched records?

There is no impact due to the BMG process on manually matched records. The BMG process is only applicable to auto-merge pairs, i.e., AUTOMERGE_IND is 1 in the <BASE_OBJECT>_MTCH table, and we also need to enable the Base Object for the BMG process.




How to enable the Base Object for the BMG process?

In order to enable the Base Object for the BMG process, we need to update the BMG_ON_MATCH_IND field in the C_REPOS_TABLE table. If the value of BMG_ON_MATCH_IND is 1, then BMG is ON; if the value is 0, then BMG is OFF for the given table.

Here is a sample SQL statement to update this field -
update C_REPOS_TABLE set BMG_ON_MATCH_IND = 1 where table_name = '<TABLE_NAME>';
commit;

Important note: Restart the application server and clear the cache after making the above change.






Thursday, July 9, 2020

Best Practices for Elastic Search in Informatica MDM

Elastic Search, a search engine based on the Lucene library, is used in Informatica MDM to achieve free-text search (like Google) as well as fuzzy search (like the match engine). In this article, we will look at the best practices to follow in order to implement Elastic Search with Informatica MDM successfully.



Introduction

It is vital to follow best practices while integrating Elastic Search with Informatica MDM. A minor misconfiguration may come at an expensive performance cost. The best practices provided here help not only to achieve better performance but also to get better search results.

Elastic Search Best Practices

Here are the details about the Best Practices

1. Indexing Job Execution
If we enable searchable properties for Base Object tables, including lookup tables, then we need to run the indexing job for the lookup tables first, followed by the indexing jobs on the remaining Base Object tables.

2. Indexing Job execution for all tables
If we have configured the Searchable property for parent and child tables, e.g., the Party table, Party Phone table, etc., then we need to run an indexing job for all the tables. First run the indexing job for the Party table, and then run the jobs for the child tables.

3. Facets configuration
Facets are used for pre-emptive grouping of records. We need to use a limited number of facet fields, as facets have an adverse impact on the performance of the search functionality. We also have to make sure the fields for which we configure facets have low entropy, i.e., a small set of unique values.

4. Unused Business Entities
If there are unused Business Entities with searchable properties, then delete them, as they will cause performance issues for indexing and load jobs.

5. Index Auto commit property
We need to increase the value of the index auto-commit property and keep it at an optimum based on your environment configuration. The property es.index.refresh.interval can be used to set it.

6. Indexing jobs in parallel
We should try to avoid running indexing jobs in parallel as that may cause resource exhaustion. 

7. Running load jobs in parallel
If we have configured searchable properties on multiple tables, such as the Party and Address tables, then do not run load jobs for these tables in parallel. This is because the indexing job gets executed during the load job, which may lead to resource exhaustion, and the jobs will fail.

8. Deleting indexes
The CleanTable API will not delete the indexes; we need to delete them manually if required. To do so, use the curl command to execute the Elastic Search delete API, as sketched below. As of now, there is no Informatica API to handle this use case.
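As a sketch (host, port, and index name are illustrative; the certificate files follow the earlier certificate article), an index can be deleted directly through the Elastic Search REST API:

curl -k -E ./file.crt.pem --key ./file.key.pem -XDELETE 'https://<Elastic Search Server host>:<Port>/43456-customer'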

9. Limiting the number of searchable fields for Business Entities
There are limits on the number of searchable fields we should use for an Elastic Search document. By default, 50 nested fields are allowed in Elastic Search. Apart from that, there is a limit on the amount of data allowed in Elastic Search REST calls: 104857600 bytes (100 MB). So make sure a small number of searchable columns is configured for the Business Entities.



Learn more about Informatica MDM here -




Monday, June 22, 2020

What is the future of Master Data Management?


At present, Master Data Management (MDM) has become a core project for many organizations. Industries such as banking, healthcare, insurance, telecommunications, manufacturing, and logistics have realized that with the implementation of MDM, businesses can achieve better growth in a competitive market. In this article, we will explore the future of Master Data Management. So let's start.



A. MDM with Cloud Solution

MDM vendors such as Informatica, Reltio, and IBM provide cloud solutions. However, the companies using these solutions criticize the growing cost of the cloud and the loss of control that comes with it. The initial cost of a cloud solution implementation is less compared to an in-house MDM implementation, but data is a growing asset and leads to more usage over time. Cloud solution cost is directly proportional to usage, and hence the cost of a cloud MDM solution increases drastically over the years. The infrastructure is owned and managed by the product vendor, and we need to rely on the vendor for infrastructure issues. These issues are not limited to quarterly or monthly upgrades, server maintenance, emergency bug fixes, server crashes, major product releases, etc.

Even with these concerns, companies are still moving forward with cloud MDM solutions, and the reason is that the cloud provides more sustainability. The recent pandemic proved that businesses with cloud implementations survive better than those with in-house solutions. There is no doubt that cloud solutions will be used by most applications in the near future.

B. Artificial Intelligence and MDM

Artificial Intelligence (AI) is a buzzword in the current market. An MDM solution which has AI components will survive better than one which does not. With recent releases, Informatica MDM has used AI features for small components in the data steward user interfaces. This tells us that MDM solution vendors have started taking the AI aspect more seriously. Many business intelligence applications are used to capture, store, access, and analyze data to assist business users in making better decisions. AI combined with business intelligence will create another world, and MDM will be part of it.

There is great scope for improvement in MDM solutions. AI can be used in extracting and transporting data from source to landing area and from landing area to the MDM system. This will reduce development, testing, performance tuning, and deployment time. Cleansing and standardization rely heavily on manual configuration; if AI is leveraged, this manual effort can be reduced to a great extent. Another aspect where AI can be used is customer matching. Currently, many vendors use their proprietary match engines to identify and match customer records. Identifying and matching is an iterative process that takes a long time, spanning from a few months to a few years; in some cases, it is a never-ending process. If AI is used to identify and match the records, then it will help business users as well as stakeholders achieve their business goals.



C. Smart MDM and User Interface

The user interfaces (UIs) used with MDM solutions are developed with technologies chosen for stability. The new features and smartness that come with newer versions are hardly ever replicated in these interfaces; in many cases, we have noticed that decade-old source code in an MDM user interface has never been touched. Most programming languages and frameworks, such as HTML5, JavaScript, Spring, Java, Python, and R, are evolving at a great pace. The future is not far off when these technologies will be self-improving through better infrastructure and intelligence. If these user interfaces are to survive in the global market, they need to build smartness into the applications. End users are capable of handling these advanced features in their daily routine work.

The end goal of these smart features is to make the end user's experience not just better but the best. The main challenge in the current environment is that these user interfaces are not self-explanatory; we have to spend much of our time training business users. The UI can be improved to accept voice and touch commands, and in some cases, the UI should be smart enough to make its own decisions. This will improve productivity and ultimately profitability.

D. Quicker and Simpler

With the development of data processing technologies, we are able to achieve better improvements in data processing. However, we still see that it takes a day to a month to perform an initial data load from the source system to the MDM system, depending on the volume of the data. This is the situation while dealing with gigabytes or terabytes of data. What will happen if we need to handle exabytes, zettabytes, or yottabytes of data in the future? We need to think now about handling the future growth of data within a stipulated time. Thirty days of processing is going to cost heavily, as the value of time is growing at a faster pace: the value of one hour in the future will be higher in comparison with the value of one hour now.

Most of the underlying technologies, such as databases and JVMs, are not improving processing speed as fast as expected, and MDM is heavily dependent on these technologies. If the underlying technologies improve over time, then MDM solutions will improve automatically; otherwise, MDM vendors need to come up with their own underlying technologies in order to sustain themselves in the future.




E. Increase in Cost - Increase in Value

Due to advancements in technologies such as AI and cloud computing, the cost of MDM solutions will go up, as they will use extensive data and time for research and solutions. Having said that, the increased cost can be explained by the increase in the value of a smart MDM solution.

With the smart MDM approach, we will be creating sustainable, profitable, and future-proof solutions that will benefit end customers as well as businesses. Smart MDM is not far away!






Saturday, June 6, 2020

Top 10 new features in the Informatica MDM 10.4

Are you looking for an article that provides detailed information about the new features in Informatica MDM 10.4? Would you also like to know which components changed in MDM 10.4? If so, then you have reached the right place. In this article, we will discuss the new features introduced in the Informatica MDM hub, Provisioning Tool, and Customer 360/Entity 360.

Categories

The new features in Informatica MDM 10.4 are broadly categorized as:
1. MDM Hub features
2. Provisioning Tool features
3. E360/Customer 360 features

Here are the top 10 new features in Informatica MDM 10.4 -
1. MDM hub login screen
2. Match Rule Sets in Provisioning Tool
3. ElasticSearch customization
4. Hyperlink configuration
5. New Hierarchy views
6. Chart Components
7. Multiple Task handling
8. Find and Replace
9. Bulk Data Import
10. Ad hoc Matching



A. MDM Hub features


1. MDM hub login screen
When a user accesses the MDM hub URL https://<server name>:<port>/cmx, a .jar file is downloaded instead of a JNLP file. Once the user double-clicks the .jar file, it opens the login page. From MDM 10.4 onwards, you can provide the connect URL on the login page, so you do not have to download a .jar for each environment.



B. Provisioning Tool features


2. Match Rule Sets in Provisioning Tool
The new Match Rule Sets feature is introduced in the Provisioning Tool. Using this feature, we can perform match tuning activities with the help of business users.






3. ElasticSearch customization

We can customize ElasticSearch properties such as Tokenizers, Token Filters, Character Filters, and Analyzers using the Provisioning Tool.



4. Hyperlink configuration

Prior to MDM 10.4, there was no provision to configure hyperlinks in Entity 360 or Customer 360 applications. With this latest upgrade, we can configure hyperlinks for fields such as Email, Web, etc.




C. E360/Customer 360 features


5. New Hierarchy views
With MDM 10.4, the hierarchy configuration has changed, and the look and feel of hierarchies differs from earlier versions.


6. Chart Components
More controls are provided in MDM 10.4 for the configuration of charts in Entity and Customer 360 applications.



7. Multiple Task handling
With the newer version of Informatica MDM, you can assign, claim, or edit multiple tasks in a single request.


8. Find and Replace
The update operation in Entity 360 and Customer 360 applications became easier: we can update multiple records in a single request using the Find and Replace functionality.

9. Bulk Data Import
The bulk import functionality is improved by adding artificial intelligence: the mapping of source and target fields is done automatically.






10. Ad hoc Matching
Using ad hoc matching, we can match records dynamically and make a golden copy of them.






Learn more about these features in detail here -


Monday, June 1, 2020

Informatica MDM - Sample requests using Business Entity Services

In this article, we will look at various sample requests using Informatica Business Entity Services (BES). The sample requests are prepared using PKEY_SRC_OBJECT and ROWID_OBJECT values. The server name, port, and other parameters need to be updated for your project.


1: Get request using PKEY_SRC_OBJECT
https://localhost:8080/cmx/cs/orcl-CMX_ORS/Customer/SFA:PKEY_000001?systemName=SFA&depth=5&suppressLinks=true
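As a sketch (the credentials are placeholders; BES REST calls are authenticated with an MDM user), the same request can be issued with curl:

curl -u <username>:<password> -H "Accept: application/json" "https://localhost:8080/cmx/cs/orcl-CMX_ORS/Customer/SFA:PKEY_000001?systemName=SFA&depth=5&suppressLinks=true"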

2: Get request using 'query' parameter
https://localhost:8080/cmx/cs/orcl-CMX_ORS/Customer?systemName=SFA&depth=2&suppressLinks=true&action=query&filter=CustomerAdditionalInformation.xslpTaskId={{SFATaskNumber}}



3: Create a new customer
Endpoint: https://localhost:8080/cmx/cs/orcl-CMX_ORS/Customer?systemName=SFA&depth=2&suppressLinks=true

Request Body:

{
"key":{ "sourceKey": "PKEY_00000021" },
"fullNm": "Sample Customer 123",
"fstNm": "KC First Name",
"lstNm": "KC Last Name",
"partyType":"",

"CustomerAltIdSMPLCustId":
  { "item": [
   {
"key":{ "sourceKey": "PKEY_0000002~SMPLID" },
"SMPLCustId":"123"
   }
   ]
},

"CustomerAltIdSMPLCustUniqId":
  { "item": [
   {
"key":{ "sourceKey": "PKEY_0000002~SMPLUniqId" },
"SMPLCustUniqId":"123"
   }
   ]
},

"CustomerAltIdContractId":
  { "item": [
   {
"key":{ "sourceKey": "PKEY_0000002~CNTRID" },
"ContractId":"123"
   }
   ]
},

"PartyAddress":
{ "item": [
   {
"key":{ "sourceKey": "PKEY_0000002-SHIP" },
"xaddrtyp":"Shipping",
"xaddrln1": "123 ST",
"xaddrln2": "",
"xcity": "New York",
"xcntrycd": "US",
"xstprvnc": "NY",
"xpstlcd": "10101"
   }
  ]
  },
 
"CustomerAlternateNames":
{ "item": [
{ "key":{ "sourceKey": "PKEY_0000002~FORMAL" },
"altNm":"FORMAL_1234",
"xprtyAltNameType":"Does Business As (FORMAL)"
}
]
},

"CustomerPhone":
{ "item": [
{ "key":{ "sourceKey": "PKEY_0000002~PHONE" },
"phnNum":"123456890",
"phnTypeCd":"Telephone"
}
]
},


"CustomerPartyParentRel":
{ "item": [
{ "key":{ "sourceKey": "PKEY_000001~PKEY_0000002" },

"CustomerPartyParent" : {
"key":{ "sourceKey": "PKEY_000001" }
}
}
]
}
}
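As a sketch (credentials and the file name CreateCustomer.json are placeholders), the create request above can be posted with curl, assuming the request body is saved to a file:

curl -u <username>:<password> -H "Content-Type: application/json" -X POST -d @CreateCustomer.json "https://localhost:8080/cmx/cs/orcl-CMX_ORS/Customer?systemName=SFA&depth=2&suppressLinks=true"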




4. Update customer using rowid_object
Endpoint: https://localhost:8080/cmx/cs/orcl-CMX_ORS/Customer/rowId?systemName=SFA&suppressLinks=true&depth=2

Request Body:

{
    "rowidObject":{{SFDCMDMID}},
    "fullNm": "Integration Update Test 890",
    "fstNm": "Samle first Name 890",
    "lstNm": "Samle last Name 890",
    "AccntStsCode": {
        "accntstscd": "CUST"
    },
    "$original": {
        "fullNm": "fullNm",
        "fstNm": "fstNm",
        "lstNm": "lstNm",
        "AccntStsCode": {
            "accntstscd": "CUST"
        }
    },
    "CustomerAltIdTaxId": {
"$original":{"item":[null]},
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~TAXID"
                },
                "taxId": "tax01"
            }
        ]
    },
    "CustomerAltIdABC_CORPOfficeNumber": {
    "$original":{"item":[null]},
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~ABC_CORPRTLSNUM"
                },
                "ABC_CORPOfficeNum": "ABC_CORPOffice01",
                "xaltIdType": "ABC_CORP  Office Number"
            }
        ]
    },
   
    "CustomerAltIdGovLicNum": {
    "$original":{"item":[null]},
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~GOVLICID"
                },
                "GovLicNum": "tc001",
                "xaltIdType": "Gov License Number"
            }
        ]
    },

    "CustomerPartyParentRel": {
    "$original":{"item":[null]},
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~HeadQuarter"
                },
               "xprtyPrnt":8
            }
        ]
    }
}



5: Update customer using PKEY_SRC_OBJECT
Endpoint URL: https://localhost:8080/cmx/cs/orcl-CMX_ORS/Customer?systemName=SFA&suppressLinks=true&depth=2

Request Body:

{
    "key": {
        "sourceKey": "PKEY_0000003"
    },
    "fullNm": "Integration Update Test 6",
    "fstNm": "Samle first Name 6",
    "lstNm": "Samle last Name 6",
    "xhastbcclcnsflg": "1",
    "xnsleffctvdt": "2015-08-19T00:05:31.630+05:30",
    "xnslpblshddt": "2015-08-19T00:05:31.630+05:30",
    "AccntStsCode": {
        "accntstscd": "CUST"
    },
    "$original": {
        "fullNm": "fullNm",
        "fstNm": "fstNm",
        "lstNm": "lstNm",
        "AccntStsCode": {
            "accntstscd": "CUST"
        }
    },
    "CustomerAltIdTaxId": {
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~TAXID"
                },
                "taxId": "tax01"
            }
        ]
    },
    "CustomerAltIdABC_CORPOfficeNumber": {
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~ABC_CORPRTLSNUM"
                },
                "ABC_CORPRtlSoreNum": "ABC_CORPOffice01"
            }
        ]
    },
    "CustomerAltIdGovLicNum": {
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~GOVLICID"
                },
                "GovLicNum": "tc001"
            }
        ]
    },

    "CustomerPartyParentRel": {
        "item": [
            {
                "key": {
                    "sourceKey": "PKEY_0000003~HeadQuarter"
                },
                "CustomerPartyParent": {
                    "key": {
                        "sourceKey": "PKEY_0000005"
                    }
                }
            }
        ]
    }
}

