on 01-16-2015 2:01 PM
I have a source table in Oracle that contains a column (NCLOB) holding XML messages, and my target is a BW DataSource.
I built my job like this:
Source Table ------ Query1 (convert to varchar) ------ Query2 (unnest the XML using extract_xml) ------ BW Target
The job runs successfully with small test data in the source (about 100,000 records) and takes about 30 minutes.
But when I run it against the actual table, which contains about 140,000,000 records, it takes far too long (more than 12 hours) and never finishes, and it affects the job server, so I decided to kill the job.
I checked the log and found that the job runs in in-memory mode instead of pageable mode because of the nested schema.
I need help tuning this job so it runs successfully.
Include the XML_Pipeline transform in your data flow; it processes large XML inputs in small batches instead of loading each whole document into memory.
See the DS Reference Guide for the details.
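To see why batching helps: the problem above comes from materializing the whole nested XML structure in memory at once. This is illustrative only, not SAP Data Services code — a minimal Python sketch of the same streaming principle XML_Pipeline applies, using the standard library's incremental parser on a made-up `<msg>` record layout:

```python
# NOT Data Services code: a sketch of streaming XML in small batches,
# the same principle the XML_Pipeline transform applies.
import xml.etree.ElementTree as ET
from io import BytesIO

def stream_rows(xml_bytes, record_tag, batch_size=1000):
    """Yield batches (lists of dicts) parsed from repeating <record_tag> elements."""
    batch = []
    # iterparse walks the document incrementally instead of building one big tree
    for event, elem in ET.iterparse(BytesIO(xml_bytes), events=("end",)):
        if elem.tag == record_tag:
            batch.append({child.tag: child.text for child in elem})
            elem.clear()  # release the parsed element so memory use stays flat
            if len(batch) >= batch_size:
                yield batch
                batch = []
    if batch:
        yield batch

# Hypothetical message with 2,500 repeating <msg> records
doc = b"<root>" + b"".join(
    b"<msg><id>%d</id><text>hello</text></msg>" % i for i in range(2500)
) + b"</root>"

batches = list(stream_rows(doc, "msg", batch_size=1000))
print([len(b) for b in batches])  # -> [1000, 1000, 500]
```

Peak memory here depends only on the batch size, not on the document size — which is exactly the behavior you want instead of the in-memory mode your log reported.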