10-16-2008 10:26 AM
Hi Experts,
The requirement is to insert 15 million records into a Z-table from the application server in one foreground run. Insertion works successfully for a limited number of records, but with the entire 15 million records at a time it fails with a memory / buffer size error.
Regards
Shashi
10-16-2008 11:16 AM
Hi
I would suggest reading the values into an internal table,
and inserting the records into the Z-table in blocks of 10,000, with a COMMIT WORK after each block.
Regards
Madhan
10-16-2008 12:06 PM
Yes, I am taking it into an internal table, but at most about 1.5 lakh records can be stored at a time; beyond that SAP gives a short dump.
Regards
Shashi
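Since the source file sits on the application server, one way to avoid ever holding all 15 million rows in memory is to read the file in packets with OPEN DATASET / READ DATASET and insert each packet with a COMMIT WORK, so the internal table never grows beyond the block size. A minimal sketch, assuming the file path, the name ZTABLE, and a character-only row structure that matches the file layout (none of these are from the thread):

```abap
REPORT z_packet_insert.

CONSTANTS: c_file  TYPE string VALUE '/usr/sap/trans/data/upload.txt', " assumed path
           c_block TYPE i      VALUE 10000.

DATA: lt_block TYPE STANDARD TABLE OF ztable,  " ztable = your Z-table (assumed name)
      ls_row   TYPE ztable.

OPEN DATASET c_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  MESSAGE 'Cannot open file' TYPE 'E'.
ENDIF.

DO.
  READ DATASET c_file INTO ls_row.   " assumes a char-only structure
  IF sy-subrc <> 0.                  " end of file reached
    EXIT.
  ENDIF.
  APPEND ls_row TO lt_block.
  IF lines( lt_block ) >= c_block.   " flush a full block to the DB
    INSERT ztable FROM TABLE lt_block.
    COMMIT WORK.
    CLEAR lt_block.
  ENDIF.
ENDDO.

IF lt_block IS NOT INITIAL.          " flush the final partial block
  INSERT ztable FROM TABLE lt_block.
  COMMIT WORK.
ENDIF.

CLOSE DATASET c_file.
```

Note that INSERT ... FROM TABLE will short-dump on duplicate keys unless ACCEPTING DUPLICATE KEYS is added, so the input data should be key-clean or deduplicated first.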
10-16-2008 12:16 PM
Hi ,
If you want to upload more than 1.5 lakh records, try the following:
1. Create the table maintenance generator.
2. Create a transaction code for it.
3. Write a BDC program to do the upload.
Hope this will work. Try it.
Regards,
bharani
10-16-2008 12:22 PM
Hi,
You need to insert batch by batch and commit after each batch. Trying to insert all 15 million records in one stretch will surely dump.
Try this code:
data: low       type i,
      high      type i,
      blocksize type i value 50000.   " any optimal value
low  = 1.
high = blocksize.
do.
  clear lt_data_block.
  append lines of lt_huge_data from low to high to lt_data_block.
  if lt_data_block is initial.
    exit.
  endif.
* put your insert statement here, e.g.:
  insert ztable from table lt_data_block.
  COMMIT WORK.
  low  = low + blocksize.
  high = high + blocksize.
enddo.
This should work for you,
Poornima
10-21-2008 6:00 AM
Hi,
Was your issue resolved with the solution provided? Please mention it, to bring the thread to its logical conclusion.
Regards, Poornima
07-09-2009 12:59 PM
Hi,
Can you please elaborate on the solution, as I am also facing a similar requirement?