Loading Large Datasets into Custom Tables
We will need to convert tables from our legacy systems that contain tens of millions of rows. Our initial tests loading that data through the ECC application show that it will take weeks to complete.
We have done similar data loads into other ERP applications (competitors to SAP) by using SQL*Loader to load the data directly into the custom tables. This method completes the load in a fraction of the time that a "full-bodied" application can manage.
Of course, we fully realize that loading seeded SAP tables in this manner would be a big no-no, but we are talking about our own custom SAP tables here.
The advantages of the SQLLOAD approach include:
1) No application overhead
2) Database tools are quicker
3) We can disable indexes
4) We can turn off archive logs
5) It gets done in a fraction of the time.
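To make the approach concrete, here is a rough sketch of what such a direct-path load looks like on Oracle. Everything here is illustrative, not from an actual system: the table name ZCUSTOMER, the index name, the file names, and the column layout are all hypothetical placeholders, and your DBA would adapt the details.

```sql
-- 1. Hypothetical prep (SQL*Plus): cut redo generation and suspend
--    index maintenance on the custom table before the bulk load.
ALTER TABLE zcustomer NOLOGGING;
ALTER INDEX zcustomer_idx1 UNUSABLE;

-- 2. zcustomer.ctl - an illustrative SQL*Loader control file:
--      LOAD DATA
--        INFILE 'legacy_customers.dat'
--        APPEND INTO TABLE zcustomer
--        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
--        (customer_id, name, city, created_on DATE "YYYYMMDD")

-- 3. Run the load with the direct path, bypassing the SQL engine:
--      sqlldr userid=... control=zcustomer.ctl direct=true \
--             skip_index_maintenance=true

-- 4. Rebuild the indexes and restore logging once the load is done.
ALTER INDEX zcustomer_idx1 REBUILD NOLOGGING;
ALTER TABLE zcustomer LOGGING;
```

The combination of `direct=true` and deferred index rebuilds is where most of the time savings come from; note that a NOLOGGING direct-path load is not recoverable from the redo stream, so you would take a backup immediately afterwards.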
So then, what is it that I am getting at?
Well, SAP Notes say that loading tables in this manner could violate licensing (of course, they don't distinguish between seeded SAP tables and our custom tables).
Have you seen companies load large datasets through methods other than the application itself? What methods? Or have you seen large datasets loaded through the application quickly and efficiently?