Minimizing Downtime During a Large Database Migration with Oracle Data Guard
We have to migrate a large database to a new datacenter. My approach to avoiding downtime would be to set up a complete Oracle Data Guard standby at the target site and ship the redo logs of the current live production system to this target system.
We are aware that SAP provides the following propositions (but they are not an option for us):
Would you recommend this method, and is there documentation for this Data Guard approach?
Many thanks for your help!
Stefan Koehler replied
> Would you recommend this method, and is there documentation for this Data Guard approach?
Absolutely. I have used this method several times (at client sites) for moving critical / large Oracle databases (> 7 TB) from one site to another, and for OS migrations. You can also use heterogeneous Data Guard for various OS combinations (e.g. from Windows to Linux). The downtime itself depends on the DG configuration, but in most cases the business allows a downtime of just a few minutes, which is no problem at all with DG.
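As a rough sketch of what the cutover looks like with the Data Guard Broker (the database names 'prim' and 'stby' are placeholders for your own DB_UNIQUE_NAMEs), the downtime window is essentially one switchover command once the standby is in sync:

```
DGMGRL> CONNECT sys@prim
DGMGRL> SHOW CONFIGURATION;     -- verify the configuration reports SUCCESS first
DGMGRL> SWITCHOVER TO 'stby';   -- role transition; the downtime is this step only
```

Everything before the switchover (building the standby, shipping and applying redo) runs while production stays fully online.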
Long distances (with minimal performance impact) can be bridged with cascading standby databases, or with Far Sync in 12c (12c is currently not supported for SAP, but you can use the "manual" cascading approach). In combination with redo compression (ACO option) it becomes even smoother.
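A hedged sketch of such a cascading setup (service and DB_UNIQUE_NAME values 'stby1' / 'stby2' are placeholders; COMPRESSION=ENABLE requires the Advanced Compression Option license mentioned above):

```
-- On the primary: ship redo to the intermediate standby, compressed
ALTER SYSTEM SET LOG_ARCHIVE_DEST_2=
  'SERVICE=stby1 ASYNC COMPRESSION=ENABLE
   VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=stby1';

-- On the intermediate standby: cascade the received redo onward
ALTER SYSTEM SET LOG_ARCHIVE_DEST_2=
  'SERVICE=stby2 ASYNC
   VALID_FOR=(STANDBY_LOGFILES,STANDBY_ROLE) DB_UNIQUE_NAME=stby2';
```

VALID_FOR=(STANDBY_LOGFILES,STANDBY_ROLE) on the intermediate site is what makes it forward redo while acting as a standby, so the long-haul link never touches the primary.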
Another benefit of using physical standby databases is that you do not change any "physical" data structure, because the changes are applied physically rather than logically. All the statistics and "physical" database objects stay the same, so the risk of Oracle-related performance issues after the migration is also minimized. The standby site (including SAP) itself can also be tested and verified before go-live with a feature called "snapshot standby database", without rebuilding the environment at all.
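With the broker, the snapshot standby test cycle is just two commands (again, 'stby' is a placeholder name); while converted, the standby still receives redo but does not apply it:

```
DGMGRL> CONVERT DATABASE 'stby' TO SNAPSHOT STANDBY;   -- open read/write for SAP tests
-- ... run the SAP verification tests against the standby ...
DGMGRL> CONVERT DATABASE 'stby' TO PHYSICAL STANDBY;   -- discard test changes, resume redo apply
```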
> Where can I find documentation about Oracle DG?
I recommend using the Data Guard Broker (except when you use a cascading standby solution, as that is only possible with the DG Broker from 12c onwards).
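For illustration, a minimal broker setup looks roughly like this (names 'migr', 'prim' and 'stby' are placeholders; both instances need DG_BROKER_START=TRUE beforehand):

```
DGMGRL> CONNECT sys@prim
DGMGRL> CREATE CONFIGURATION 'migr' AS
          PRIMARY DATABASE IS 'prim' CONNECT IDENTIFIER IS prim;
DGMGRL> ADD DATABASE 'stby' AS CONNECT IDENTIFIER IS stby
          MAINTAINED AS PHYSICAL;
DGMGRL> ENABLE CONFIGURATION;
DGMGRL> SHOW CONFIGURATION;   -- should report SUCCESS once redo transport is up
```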
Go for it