
Migration of SAP applications running on SQL Server 2008 DB

Former Member

Hi Team,

We have the SAP applications ECC, BW, GTS, Portal, and HCM running on Windows Server 2008 R2 with MS SQL Server 2008 databases. Our client is planning to move all SAP applications to the cloud, upgrade all databases from SQL Server 2008 to SQL Server 2014, and upgrade all SAP applications to the latest releases (for example, ECC 6.0 EHP5 to ECC 6.0 EHP7). I have to suggest the best approach for migrating all SAP applications to the cloud environment (the target environment is also Windows and MS SQL Server).

What is the best order for the upgrade and the migration to the cloud?

1) First migrate all SAP applications to the cloud, then upgrade the databases from SQL Server 2008 to SQL Server 2014.

2) First upgrade the databases from MS SQL Server 2008 to MS SQL Server 2014 for all SAP applications in the current environment, then migrate all applications from the physical environment to the cloud.

Also, which is the best approach to migrate all SAP applications to the cloud if the production DB size is 5 TB?

1) MSSQL-specific detach/attach method.

2) Backup and restore.

3) SAP-specific export/import method.

Note: SQL Server 2014 is supported on Windows Server 2012 and higher. SAP releases prior to SAP NetWeaver 7.0 are not supported on SQL Server 2014. For more information, see SAP Note 1966701.

All of our SAP applications run on releases later than SAP NetWeaver 7.0.

Please share your suggestions.

Regards

Chandrasekhar

Accepted Solutions (0)

Answers (1)


ImtiazKaredia
Active Contributor

Hi,

It depends on various factors; there is no straightforward answer here.

If you can answer the questions below, it will give more clarity.

1. Is your current SQL database compressed, e.g. with page compression applied?

2. Is your production DB 5 TB on its own, or is that the combined size of all systems?

3. How many outages can you take, and how long can each be?

Given the DB size, and assuming the DB is page compressed, I would perform the in-place DB and EHP upgrades before moving to the cloud. This reduces unknown risk: issues related to the DB upgrade and the EHP get resolved beforehand, and you know that any issues occurring after the move are related to the hardware change.

Doing everything after moving to the cloud may mean a long outage and too many variables.

Another option is to move as-is to the cloud, come to a steady state, and then do the DB and EHP upgrades as separate projects.

Thanks

Imtiaz

Former Member

Hi Imtiaz,


Thank you for the prompt response. We are currently in the planning stage only.


1) My current SQL database is not compressed. Can we compress the SQL DB before moving to the cloud environment?

2) Only the ECC DB is 5 TB; BW is 2 TB, Portal is 100 GB, etc. Which is the best option to move the data to the cloud environment?

1) MSSQL-specific detach/attach method.

2) Backup and restore.

3) SAP-specific export/import method.

3) How many and how long outages can you take? --> We need to check with the client.

Regards

Chandrasekhar

luisdarui
Advisor

Hello Maruthi,

1) Database compression is the default in new SAP installations. Check out SAP Note 1488135; I'd suggest running the compression and afterwards following SAP Knowledge Base Article 1721843.
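To illustrate, this is roughly what enabling page compression looks like at the table level. This is only a sketch for a single, hypothetical table; for an SAP system you would use the supported report from SAP Note 1488135 rather than compressing tables by hand, and ONLINE rebuilds require Enterprise Edition.

```sql
-- Sketch only: 'VBAP' is a placeholder table name.
-- Estimate the space savings first:
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'VBAP',
     @index_id         = NULL,   -- all indexes
     @partition_number = NULL,   -- all partitions
     @data_compression = 'PAGE';

-- Rebuild the table with page compression (ONLINE needs Enterprise Edition):
ALTER TABLE dbo.VBAP
    REBUILD WITH (DATA_COMPRESSION = PAGE, ONLINE = ON);
```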

2) The heterogeneous system copy method (R3load) offers advantages (such as releasing unused DB space to the OS, as mentioned in KBA 1721843), but it is generally time-consuming and requires a longer downtime window than backup/restore. I'd stick with backup/restore or detach/attach.

3) Which cloud service are you going to: SAP Cloud, MS Azure, or another?

Best Regards,

Luis Darui

Former Member

Hi Luis Darui,

3) We are going to Amazon Cloud service.

Regards

Chandrasekhar

luisdarui
Advisor

Hi Maruthi,

There are no SQL Server-specific SAP Notes for AWS, but SAP has released an important note for SAP on AWS: 1656099. It contains a very important remark about running SQL Server in a public cloud environment!

Another important remark: you are currently running SQL Server 2008 (R2) and want to run SQL Server 2014. Remember to meet the minimum SAP NetWeaver SPS requirements described in SAP Note 1966681. If you apply these minimum requirements first, you can migrate to SQL Server 2014 in one step, avoiding another downtime.

Best Regards,

Luis Darui

Former Member

Hi Luis Darui,


We have reviewed the minimum SAP NetWeaver SPS requirements described in SAP Note 1966681. I just want to know which is the best approach to migrate all SAP applications to the cloud if one production DB is 5 TB:

1) MSSQL-specific detach/attach method.

2) Backup and restore.

Regards

Chandrasekhar

ImtiazKaredia
Active Contributor

Hi,

DB compression is recommended in your case. You can first upgrade SQL Server and then perform an online DB compression. I have seen DB space reductions of over 70%, so your ECC DB could drop to about 1 TB. You will have less data to move, which saves time during the migration.

Given the size of the DB, the MSSQL detach/attach method should be used.
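For reference, the detach/attach flow is roughly the following. This is a sketch only: the database name `PRD` and the file paths are placeholders, and the database is offline from the moment it is detached until it is attached on the target.

```sql
-- On the source server: detach the database cleanly.
USE master;
GO
EXEC sp_detach_db @dbname = 'PRD', @skipchecks = 'true';
GO

-- Copy the .mdf/.ndf/.ldf files to the cloud host, then on the target server:
CREATE DATABASE PRD
ON (FILENAME = 'E:\SQLDATA\PRDDATA1.mdf'),   -- primary data file
   (FILENAME = 'E:\SQLDATA\PRDDATA2.ndf'),
   (FILENAME = 'F:\SQLLOG\PRDLOG1.ldf')
FOR ATTACH;
GO
```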

Thanks

Imtiaz

luisdarui
Advisor

Hi  Chandrasekhar,

It depends! Let's say you have a 5 TB database spread over 16 data files.

Launching 16 parallel file copies (assuming you have enough reliable bandwidth) can be faster than copying a single backup file to the target destination.

If you have a solid backup strategy for this system, you can send your latest FULL backup now, restore it WITH NORECOVERY on the target, and then keep shipping and applying T-log backups. You can do this every day until the go-live day in the cloud.

On that day you take a tail-log backup of the source (on-premise) database, transfer it, and apply it after the latest regular T-log backup has been applied (the tail-log backup you take from the source system is the one that brings the database up).

See the following link: https://msdn.microsoft.com/en-us/library/ms179314.aspx

You can even run a PoC to check whether this works for you, e.g.:

1. Create a database locally:


CREATE DATABASE TEST
GO

USE TEST
GO

CREATE TABLE TB1
(
    Col1 INT IDENTITY NOT NULL PRIMARY KEY,
    Col2 INT NOT NULL DEFAULT 0
)
GO

Make sure that the database was created in the FULL recovery model (mine was by default).
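If you want to verify this rather than assume it, a small sketch:

```sql
-- Check the current recovery model of the PoC database:
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'TEST';

-- Switch to FULL if needed (required for T-log backups):
ALTER DATABASE TEST SET RECOVERY FULL;
```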

2. Make a full backup.

3. Insert a few rows:


INSERT INTO TB1 (Col2) VALUES (1);

INSERT INTO TB1 (Col2) VALUES (2);

4. Take a T-Log Backup.

5. On the target database server, restore this full backup with the NORECOVERY option. This leaves the database in the 'RESTORING' state.

6. On the target database server, restore the T-log backup taken in step 4, again using RESTORE WITH NORECOVERY.

7. In the source database, insert a few more rows into table TB1:


INSERT INTO TB1 (Col2) VALUES (3);

INSERT INTO TB1 (Col2) VALUES (4);

8. Take a new T-log backup, but this time use the NORECOVERY option.

9. Afterwards your source database will be in the 'RESTORING' state.

10. On the target database server, restore this tail-log backup WITH RECOVERY. This brings your database online.

11. Run a query against TB1 and check whether the entries you have inserted into TB1 are there.

12. Run DBCC CHECKDB WITH NO_INFOMSGS to ensure that the database is free of corruption.
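The backup and restore commands behind steps 2-10 can be sketched as follows (the disk paths are placeholders):

```sql
-- Step 2 (source): full backup.
BACKUP DATABASE TEST TO DISK = 'C:\Backup\TEST_full.bak';

-- Step 4 (source): regular T-log backup.
BACKUP LOG TEST TO DISK = 'C:\Backup\TEST_log1.trn';

-- Steps 5-6 (target): restore, keeping the database in the RESTORING state.
RESTORE DATABASE TEST FROM DISK = 'C:\Backup\TEST_full.bak' WITH NORECOVERY;
RESTORE LOG TEST FROM DISK = 'C:\Backup\TEST_log1.trn' WITH NORECOVERY;

-- Step 8 (source): tail-log backup; NORECOVERY leaves the source in RESTORING state.
BACKUP LOG TEST TO DISK = 'C:\Backup\TEST_tail.trn' WITH NORECOVERY;

-- Step 10 (target): apply the tail-log and bring the database online.
RESTORE LOG TEST FROM DISK = 'C:\Backup\TEST_tail.trn' WITH RECOVERY;
```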

With the above steps you can apply the same approach to your ECC database. What advantages do you have?

  1. You send the biggest file in advance to the target server in the cloud.
  2. You can test the backup and the T-logs you are sending before starting the real restore. Just keep in mind that you can't bring the database online (restore WITH RECOVERY), otherwise you cannot apply further T-logs.
  3. You can start the restore of the FULL backup before the migration begins and leave the database in the "RESTORING..." state, awaiting the last few T-log backups and the tail-log. This can drastically reduce your downtime window if you were planning to include the 5 TB transfer in it.

Best regards,

Luis Darui