Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Loading Large Datasets into Custom Tables

Former Member
0 Kudos

Hello

We will need to convert tables from our legacy systems that contain tens of millions of rows. Our initial tests, loading that data through the ECC application, show that it will take weeks to complete.

We have done similar data loads into other ERP applications (competitors to SAP) by using SQL*Loader to load the data directly into the custom tables. This method allows the data load to be completed in a fraction of the time that a "full-bodied" application can manage.

Of course, we fully realize that loading seeded SAP tables in this manner would be a big no-no, but we are talking about our own custom SAP tables here.

The advantages of the SQL*Loader approach include:

1) No application overhead

2) Database tools are quicker

3) We can disable indexes

4) We can turn off archive logs

5) It gets done in a fraction of the time.
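To make the approach concrete, a direct-path load of the kind described above might use a control file along these lines. This is only a sketch: the table name ZLEGACY_ORDERS, the field list, and the file name are hypothetical, and the OPTIONS/UNRECOVERABLE settings correspond to points 1, 4, and 5 in the list (bypassing the application layer and skipping redo logging for the load).

```
-- Hypothetical SQL*Loader control file (zload.ctl) for a custom Z table.
-- DIRECT=TRUE bypasses conventional SQL inserts; UNRECOVERABLE skips
-- redo generation for the load (direct path only).
OPTIONS (DIRECT=TRUE, ROWS=50000)
UNRECOVERABLE
LOAD DATA
INFILE 'legacy_orders.dat'
APPEND INTO TABLE ZLEGACY_ORDERS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( MANDT      CONSTANT '100',
  ORDER_ID,
  ORDER_DATE DATE 'YYYYMMDD',
  AMOUNT )
```

Invoked as something like `sqlldr userid=user/pwd control=zload.ctl`. Note that an UNRECOVERABLE load leaves the table unrecoverable from redo until the next backup, which is usually acceptable for a one-time conversion.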

So then, what is it that I am getting at?

Well, SAP notes say that loading tables in this manner could violate licensing (though they don't distinguish between seeded SAP tables and our custom tables).

Have you seen companies load large datasets through methods other than the application itself? What methods? Or have you seen large datasets loaded through the application quickly and efficiently?

Thank you,

John Klaassen

Haworth, Inc.

3 REPLIES

Former Member
0 Kudos

So long as they are custom Z tables, there won't be any licensing issues. We all have these sorts of tables that we update using update report programs.

I should add that we do this with ABAP, not database utilities.

Rob

Message was edited by:

Rob Burbank

Former Member
0 Kudos

John,

Could you elaborate on what you mean by "fraction of the time"? (Even 9/10 is a fraction.)

I have never seen anybody using DB tools to load data into custom tables in SAP.

Tens of millions of rows shouldn't take weeks to load. If the data file is on the application server, and the application server and the database server are on the same network, it shouldn't even take days if the ABAP program is written by an experienced programmer. But you mention the ECC application... I don't know what that is.

BTW, you can disable indexes and recreate them even when loading via SAP.

In conclusion, a simple upload program written in ABAP with efficient logic shouldn't take twice the time taken by SQL*Loader, assuming the data file is on the application server and there is not too much network overhead.
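A minimal ABAP upload along the lines described in this thread might look like the sketch below. The table ZLEGACY_ORDERS, its fields, the file path, and the packet size are all hypothetical, and error handling is reduced to the bare minimum; the point is the pattern of reading from a server-side file and inserting in packets with periodic commits.

```
REPORT zload_legacy.

* Hypothetical custom table ZLEGACY_ORDERS; file resides on the
* application server so no frontend transfer is involved.
CONSTANTS gc_file TYPE string VALUE '/usr/sap/trans/data/legacy_orders.dat'.

DATA: gt_orders TYPE STANDARD TABLE OF zlegacy_orders,
      gs_order  TYPE zlegacy_orders,
      gv_line   TYPE string.

OPEN DATASET gc_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  MESSAGE 'Cannot open input file' TYPE 'E'.
ENDIF.

DO.
  READ DATASET gc_file INTO gv_line.
  IF sy-subrc <> 0.
    EXIT.                              " end of file
  ENDIF.
  SPLIT gv_line AT ',' INTO gs_order-order_id
                            gs_order-order_date
                            gs_order-amount.
  APPEND gs_order TO gt_orders.

* Insert in packets to keep memory and rollback segments small
  IF lines( gt_orders ) >= 50000.
    INSERT zlegacy_orders FROM TABLE gt_orders.
    COMMIT WORK.
    CLEAR gt_orders.
  ENDIF.
ENDDO.

INSERT zlegacy_orders FROM TABLE gt_orders.
COMMIT WORK.
CLOSE DATASET gc_file.
```

Dropping the table's secondary indexes before the run and recreating them afterwards, as mentioned above, works with this pattern as well.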

0 Kudos

Thanks for taking an interest in the topic. I really appreciate it.

As a point of clarification: this is a conversion load (more of a one-time event, not a periodic occurrence).

Regarding "fraction of the time": in the past we have loaded large datasets directly into the database after turning off archive logs, triggers, and indexes, and we've seen load times drop from (as an example) 4 hours when using the application to 20 minutes when loading directly into the database.

That experience was not in an SAP ERP system (it was Oracle Apps, actually), but it has led our development staff to want to avoid loading large conversion datasets through the application in SAP. That is why they are pushing to use the same method (SQL*Loader).

ECC application = SAP ERP 2005.

We will encourage the dev staff to use the ABAP method for their next load test.

Thank you again!

John Klaassen

Haworth, Inc.

www.haworth.com