
Minimize downtime during UNICODE conversion

Former Member
0 Kudos

Hello forum !

I am looking at the Unicode conversion of a large SAP ECC Oracle database. The database itself is about 10 TB. Converting it in one single run would take too long to fit inside the company-defined tolerable downtime window.

When looking at the data I find that the 50 largest tables make up about 75% of the total database size. Most of these tables hold data going back several years, and I am able to partition them with one partition per year.

The good thing about this is that I am able to make the partitions holding old data (more than one year old) READ-ONLY.

My assumption is that I will be able to Unicode-convert 60-70% of the data and create the indexes for these partitions without taking down the database that is being converted.

Then, during the downtime phase, I will be able to exclude the old partitions of the largest tables from the export, since I will already have created a new Unicode database containing these objects, thereby drastically reducing the required downtime.
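The split described above can be sketched with some back-of-the-envelope arithmetic. The partition sizes and the cutoff below are made-up illustration values, not from a real system:

```python
# Hypothetical partition inventory for one large table: (year, size_gb).
# These sizes and the cutoff year are invented for illustration.
partitions = [(2004, 800), (2005, 900), (2006, 1100), (2007, 1200), (2008, 600)]

cutoff_year = 2007  # assuming "now" is 2008: partitions older than last year are READ-ONLY

pre_convertible = sum(size for year, size in partitions if year < cutoff_year)
downtime_only = sum(size for year, size in partitions if year >= cutoff_year)
total = pre_convertible + downtime_only

print(f"convertible ahead of downtime: {pre_convertible} GB "
      f"({100 * pre_convertible / total:.0f}% of {total} GB)")
print(f"left for the downtime window:  {downtime_only} GB")
```

With numbers like these, roughly 60% of the table could be converted ahead of time, which matches the 60-70% estimate above.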

Is there anyone who has experience with an approach like the one I have sketched out here?

Kjell Erik Furnes

Accepted Solutions (0)

Answers (5)


Former Member
0 Kudos

Hello again !

Yes, I am aware of the SAP IMIG service. I was curious whether anyone had done this kind of work without having to buy the IMIG service from SAP.

Kjell Erik Furnes

markus_doehr2
Active Contributor
0 Kudos

> Yes, I am aware of the SAP IMIG service. I was curious whether anyone had done this kind of work without having to buy the IMIG service from SAP.

I have no experience with that - I just read about it some time ago... I think they pulled it off the Service Marketplace because the procedure is pretty complex.

Markus

Former Member
0 Kudos

Can you please be a bit more specific about your installation? How long is your possible downtime (export time should be about 2/3 of that)? What is your current export estimate? Do you have suitable hardware (CPU/disk)?

What you absolutely need to have:

[1043380 - Efficient Table Splitting for Oracle Databases|https://service.sap.com/sap/support/notes/1043380] -> export tables with ROWID splitting

[855772 - Distribution Monitor|https://service.sap.com/sap/support/notes/855772] -> enables distributed and parallel export/import
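The idea behind ROWID splitting (note 1043380) is to carve one big table into contiguous chunks that several export processes can work on in parallel. A minimal sketch of the chunking arithmetic, with invented row counts (the real note splits on Oracle ROWID ranges, not row numbers):

```python
def split_ranges(total_rows: int, pieces: int):
    """Divide a table into `pieces` contiguous row ranges of near-equal
    size, one range per parallel export job."""
    base, extra = divmod(total_rows, pieces)
    ranges, start = [], 0
    for i in range(pieces):
        size = base + (1 if i < extra else 0)
        ranges.append((start, start + size - 1))
        start += size
    return ranges

# A 100-million-row table split into 8 parallel export packages:
for lo, hi in split_ranges(100_000_000, 8):
    print(f"export rows {lo:>11,} .. {hi:>11,}")
```

The Distribution Monitor (note 855772) then schedules packages like these across several application servers so export and import run in parallel.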

On the hardware side you will need a storage subsystem that is as fast as possible, with two copies of the disks (one for exporting and one for importing). A bunch of application servers with fast CPUs helps a lot, everything connected with at least Gigabit Ethernet.

If you knew all this already, sorry for the spam. But I achieved throughputs of about 400 GB/hour on medium-sized hardware, and I think throughputs of 1 TB/hour should be possible (not talking about cluster tables).
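Plugging those throughput figures and the 2/3 rule of thumb into the 10 TB database from the question gives a rough feel for the required window (assuming 1 TB = 1024 GB; all values are estimates, not measurements from this system):

```python
db_size_gb = 10 * 1024        # the ~10 TB database from the original question
export_fraction = 2 / 3       # rule of thumb: export is about 2/3 of the downtime

for rate_gb_h in (400, 1000): # achieved vs. hoped-for throughput
    export_h = db_size_gb / rate_gb_h
    window_h = export_h / export_fraction
    print(f"{rate_gb_h:>5} GB/h: export ~{export_h:.1f} h, "
          f"downtime window needed ~{window_h:.1f} h")
```

So even at 1 TB/hour a single-run conversion of the full 10 TB needs a window of well over half a day, which is why excluding the read-only partitions is attractive.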

Regards, Michael

stefan_koehler
Active Contributor
0 Kudos

Hello Kjell,

The performance of the Unicode conversion depends mostly on the following factors:

- CPU and I/O performance, which limit the achievable parallelism

- Table splitting (and the access path used)

- Parallelism when unloading / loading the data
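The interplay of those factors can be sketched with a worker pool: past the point where CPU or I/O saturates, adding jobs buys nothing. A minimal sketch, with threads standing in for export processes and an invented package list:

```python
from concurrent.futures import ThreadPoolExecutor
import os

def unload(package):
    """Stand-in for one export job; a real run would dump one table
    (or one table split) to the export directory."""
    name, rows = package
    return name, rows  # pretend the rows were written out

# Invented package list: 16 table splits of equal size.
packages = [(f"SPLIT_{i:02d}", 1_000_000) for i in range(16)]

# More workers than CPUs (or than the I/O subsystem can feed) will not
# unload any faster -- the factors above are the real limits.
workers = min(len(packages), os.cpu_count() or 4)
with ThreadPoolExecutor(max_workers=workers) as pool:
    done = list(pool.map(unload, packages))

print(f"unloaded {len(done)} packages using {workers} parallel workers")
```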

I have already answered a similar question and pointed to some tools in this thread .. please check it.

That thread is not about Unicode .. but you use the same tools (I also mentioned this in the thread).

Regards

Stefan

markus_doehr2
Active Contributor
0 Kudos

I read an article some time ago about how to do a Unicode conversion with very large databases; unfortunately, since SAP changed all the service.sap.com/unicode@sap links to SDN, I can't find that presentation.

The article was about an IMIG approach to the conversion:

693168 - Minimized Downtime Service and Incremental Migration (IMIG)

Markus

Former Member
0 Kudos

Hello !

Yes, we are running archiving on the database. We have been archiving on the system for the last 10 years, and quite a large amount of data has been taken out of the system.

Archiving is an ongoing task and we archive out masses of data every month. But we still have a high growth rate and a large database to handle for a Unicode conversion.

Kjell Erik Furnes

Former Member
0 Kudos
> When looking at the data I find that the 50 largest tables make up about 75% of the total database size. Most of these tables hold data going back several years, and I am able to partition them with one partition per year.

Did you check the possibility of archiving/reorganization before the conversion?