Client copy of a database with some large tables
I am currently performing a client copy as part of an R/3 4.6C test system refresh.
We used parallel processes, which does speed up the client copy, but the largest tables always take the longest. It appears to be still processing our last table, which is 90 GB in size. It is currently performing a sequential read on that table, and the only indication I can see that it is doing anything is the CPU and I/O activity shown in WRKACTJOB. The client copy has now been running for 2.5 days, and we have a fast machine (a 570 with 5 CPUs, 70 GB RAM, and 4 TB DASD with 50% free).
Is anyone familiar with this process, or does anyone know a way to work out how much longer it will take, or to check whether it is actually doing something? Any advice would be much appreciated.
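One rough way to estimate remaining time, if you can take two row-count samples of the target table (e.g. via interactive SQL) a few minutes apart, is to extrapolate from the observed copy rate. This is only a sketch with made-up sample numbers; it assumes the copy proceeds at a roughly constant rate and that you know (or can estimate) the total row count of the source table:

```python
from datetime import timedelta

def estimate_remaining(rows_t1: int, rows_t2: int,
                       interval_s: float, rows_total: int) -> timedelta:
    """Extrapolate time left from two row-count samples taken
    interval_s seconds apart (assumes a roughly constant copy rate)."""
    rate = (rows_t2 - rows_t1) / interval_s  # rows copied per second
    if rate <= 0:
        raise ValueError("no progress observed between samples")
    return timedelta(seconds=(rows_total - rows_t2) / rate)

# Hypothetical figures: two SELECT COUNT(*) samples taken 10 minutes
# apart on the target table, against a 90-million-row source table.
eta = estimate_remaining(12_000_000, 12_600_000, 600, 90_000_000)
print(eta)  # prints 21:30:00 at the observed 1000 rows/sec
```

Even a crude estimate like this at least confirms the job is making progress, which WRKACTJOB's CPU and I/O counters alone don't show.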
Any other experiences / tips would be welcomed. We have been looking at third-party software such as Gold Client, as this process will only get longer as the database grows, and funding for archiving projects is being withdrawn.