Hello Experts,
We have a requirement to combine data from various systems, including flat files.
The problem is due to differing data standards, e.g. for MATNR.
It is an 18-character field.
In some systems I have values like
000000006400750200
6V90045100
Z24387
AIMS04-16-002
While in another system the values are
6400750200
6V90045100
Z24387
AIMS04-16-002
As can be seen, I have a problem if I join directly on material numbers, so I use LTRIM to remove all leading zeroes.
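To make the mismatch concrete, here is a minimal Python sketch using the sample values above; `strip_leading_zeros` is just an illustrative stand-in for what LTRIM does on the MATNR column:

```python
# System A stores MATNR zero-padded to 18 characters; system B stores it raw.
system_a = ["000000006400750200", "6V90045100", "Z24387", "AIMS04-16-002"]
system_b = ["6400750200", "6V90045100", "Z24387", "AIMS04-16-002"]

# A direct join only matches the keys that happen to be byte-identical.
direct_matches = set(system_a) & set(system_b)

def strip_leading_zeros(matnr: str) -> str:
    """Stand-in for LTRIM(MATNR, '0'): remove leading '0' characters."""
    return matnr.lstrip("0")

# After normalizing both sides, every material matches.
normalized_matches = {strip_leading_zeros(m) for m in system_a} & \
                     {strip_leading_zeros(m) for m in system_b}

print(len(direct_matches))      # 3 of the 4 materials match directly
print(len(normalized_matches))  # all 4 match once leading zeros are stripped
```

This is why the join only works after normalization; the zero-padded value `000000006400750200` never equals the raw `6400750200` without it.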
However, we are facing severe performance issues when using LTRIM in our scripted views; sometimes we even run out of memory. After analyzing the plan, we found that the intermediate record count explodes into the billions once LTRIM is applied, consuming a huge amount of memory.
Could you please look at the two views below? One is created using attribute views with LTRIM in a calculated column MATNR, and the other uses the same MATNR without LTRIM.
As per the plan, the number of records without LTRIM is 159,718; with LTRIM the count climbs into the billions, as noted above.
The only difference is in the attribute views built on top of the SLT layer.
The first one directly uses MATNR from source tables such as MARA, MARC, etc.
The second one has a calculated field defined as LTRIM("MATNR", '0').
Additionally, I have another observation.
As of SP9, SAP recommends using calculation views as dimensions instead of attribute views.
I tried that as well, and I am afraid to say the performance is even worse:
Code with attribute views without LTRIM takes 50 seconds.
Code with attribute views with LTRIM takes 2 minutes, but with a huge memory footprint.
Code with calculation views as dimensions runs for 6 minutes and then throws an out-of-memory exception.
I'll really appreciate your pointers, as this is a high-priority issue for us.
Regards
Krishan