XML Unnesting and In-Memory Issue

Former Member
0 Kudos

I have a source table in Oracle that contains an NCLOB column holding XML messages, and my target is a BW DataSource.

I built my job like this:

Source Table -> Query1 (convert the NCLOB to varchar) -> Query2 (unnest the XML using extract_xml) -> BW Target
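
For clarity, here is a rough Python sketch of what that unnest step is meant to do. This is only a conceptual illustration, not Data Services syntax, and the <order>/<item> element names are made-up placeholders for whatever the real messages contain:

import xml.etree.ElementTree as ET

def unnest_xml(xml_text):
    """Flatten one XML message (already converted from NCLOB to a string) into flat rows."""
    root = ET.fromstring(xml_text)           # what Query1 delivers: the message as plain text
    for item in root.findall(".//item"):     # what Query2 does: unnest the repeating node
        yield {
            "order_id": root.findtext("order_id"),
            "item_id": item.findtext("id"),
            "quantity": item.findtext("quantity"),
        }

# one made-up message, just to show the flattening
sample = "<order><order_id>42</order_id><item><id>1</id><quantity>5</quantity></item></order>"
for row in unnest_xml(sample):
    print(row)   # {'order_id': '42', 'item_id': '1', 'quantity': '5'}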

The job runs successfully with a small test data set in the source (about 100,000 records) and takes about 30 minutes.

But when I run it against the actual table, which contains about 140,000,000 records, it takes far too long (more than 12 hours without finishing) and it affects the job server, so I decided to kill the job.

I checked the log and found that the job runs in in-memory cache mode instead of pageable mode because of the nested schema.

I need help tuning this job so that it runs to completion.

Accepted Solutions (1)

former_member187605
Active Contributor
0 Kudos

Include the XML_Pipeline transform in your data flow. It processes large XML inputs in small batches.

Check out the DS Reference Guide for all details.
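
To illustrate the general idea behind "small batches" (stream the document and release each element once it is handled, instead of building the whole nested structure in memory), here is a plain-Python sketch. It is not the XML_Pipeline transform itself, and the tag name "item" is a placeholder:

import xml.etree.ElementTree as ET

def stream_items(path, tag="item"):
    """Yield one flat record per <item> element without loading the whole file."""
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == tag:
            yield {child.tag: child.text for child in elem}
            elem.clear()   # drop the processed element so memory use stays flat

# usage: for row in stream_items("big_export.xml"): load_into_target(row)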

Former Member
0 Kudos

Do you mean that I need to remove the Query transform that does the extract_xml from the column and use XML_Pipeline instead?

I tried, but I found that XML_Pipeline only accepts files as input, not database tables.

former_member187605
Active Contributor
0 Kudos

Yes, you're right; you will have to extract your database content to a file first.
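
As a rough sketch of that conversion step, assuming the cx_Oracle driver and made-up table/column names (source_table, xml_message), the following Python unloads the NCLOB column into batched XML files that a file-based XML source can then read. The connection string and the <messages> wrapper element are also just placeholders:

import os
import cx_Oracle

BATCH = 50000                                # messages per output file (tune as needed)

os.makedirs("xml_out", exist_ok=True)
conn = cx_Oracle.connect("user/password@host:1521/service")   # placeholder credentials
cur = conn.cursor()
cur.arraysize = 10000                        # fetch in chunks, never all 140M rows at once
cur.execute("SELECT xml_message FROM source_table")

file_no, written, out = 0, 0, None
for (xml_lob,) in cur:
    if out is None:
        out = open("xml_out/messages_%05d.xml" % file_no, "w", encoding="utf-8")
        out.write("<messages>\n")            # wrapper root so each file is well-formed XML
    out.write(xml_lob.read())                # the NCLOB arrives as a LOB object; read() returns str
    out.write("\n")
    written += 1
    if written == BATCH:                     # roll over to the next output file
        out.write("</messages>\n")
        out.close()
        out, written, file_no = None, 0, file_no + 1

if out is not None:
    out.write("</messages>\n")
    out.close()
cur.close()
conn.close()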

Are the figures in your post correct? 140,000,000 records will take 1,400 * 30 minutes, or 700 hours, to process!

Answers (0)