
How to create a backup

Former Member

Hi experts,

I installed my first system on ASE... and I have a big problem: how can I do a backup?

In DB13 only two actions appear:

- SQL Script

- Central Calendar Log Collector

How can I create a backup with these actions?

My system is SAP ERP 6.0 with EHP5

Thanks in advance,

Regards,

Accepted Solutions (1)



Hi Victor,

Please check this thread:

http://scn.sap.com/message/13431383

A backup is created with the dump database command:

dump database <DBSID> to "E:\DBBACKUP\NAME_OF_YOUR_DUMP_FILE.dmp"

Please check the very good documentation at

http://scn.sap.com/docs/DOC-29524#comment-321365

Also check these SAP Notes:

Note 1585981 - SYB: Ensuring Recoverability for Sybase ASE

Note 1588316 - SYB: Configure automatic database and log backups

best regards,

Peter

Getting Started with Sybase ASE and the SAP System

Former Member

I tried to configure the backup following Note 1588316, but I don't know how to create the procedures...

Is there a management tool like the one for SQL Server? Where can I administer the DB?

Regards,

JanStallkamp
Employee

Hi.

If you are on Windows I would start dbisql and log in as user sapsa. You can then just copy and paste the stored procedure into the input area and run it. This will create the stored procedure in the database. Or you can copy the procedures to a file and run that on the command line using isql.

As a graphical UI we recommend the DBA Cockpit for monitoring, as it is designed for SAP-specific database usage. There is also a growing number of administrative tasks that can be handled with the DBA Cockpit. Besides that there is e.g. Sybase Central.

Regards,

Jan

Former Member

Jan,

I'm new to SAP; I understood that most, if not all, database admin tasks could be completed via the DBA Cockpit. From your post it looks like SAP is in the process of implementing all of the admin tasks (i.e. backups) in the Cockpit for ASE. Is that correct?

I can see that what has been implemented is VERY good - I've read the document on 'Automatic Table Maintenance', which details some really useful and well-thought-out functionality.

Is it possible to get a list of:

- What the DBA Cockpit for ASE can do at the moment

- What is going to be rolled into the DBA Cockpit for ASE in the future?

cheers,

Chris


Hi Victor,

All the stored procedures are attached to SAP Note 1588316; you have to do the following: >>

Attached to this note you will find an SQL text for a Sybase ASE stored procedure 'sp_dumptrans'. This generates a unique, timestamp-based name for a dump device file name. It then calls the 'dump transaction' command using this device name.

A second stored procedure 'sp_dumpdb' generates a unique, timestamp based dump device file and calls the 'dump database' command using this device name.

Review the SQL text for these procedures and adapt it as needed. Change the code that generates the path for the dump device as required.

It is recommended that you create these stored procedures in the 'saptools' database.

executes stored procedure sp_dumptrans in database saptools

<<

Then create an SQL action in DBACOCKPIT:

EXEC saptools..sp_dumpdb


In sp_dumpdb, check the variable curr_logdev and adapt it to your backup location.

Best regards,

Peter
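For illustration, the timestamp-based file naming that sp_dumpdb performs (converting the start time to an ISO-style string, then replacing ':' with '_') can be sketched in Python. This is just a sketch: dump_file_name is a hypothetical helper, and F:\backup is only the example location used in the procedure - it would be whatever you set curr_logdev to.

```python
from datetime import datetime

def dump_file_name(db_name: str, backup_dir: str = "F:\\backup") -> str:
    # Mirror sp_dumpdb: build a timestamp like 2012-10-04T14_30_00 so the
    # colons of the time portion become legal file-name characters.
    ts = datetime.now().strftime("%Y-%m-%dT%H:%M:%S").replace(":", "_")
    return backup_dir + "\\" + db_name + "_" + ts + ".dmp"

print(dump_file_name("DES"))
```

Re-running the job therefore always produces a unique dump file name, which is why the procedure never overwrites an earlier dump.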

Former Member

But... how can I create the stored procedure?

I have the file, and I think I know how to create the SQL action, but... how can I create the stored procedure from the files attached to the note?

Thanks in advance,

Regards,

Former Member

Hi,

I don't know that tool, dbisql.

Are there any other tools for managing ASE, like the management tool for SQL Server?

Any other interesting tools?

Regards,

Former Member

I do not see why you need to create a stored procedure to complete a backup.

Just run the backup SQL you require.

Former Member

OK, how?


Hi Victor,

I will describe what I did on my system to make it clear:

Download the SQL script file sp_dumpdb.sql attached to the SAP Note and copy it onto the OS of your DB server, e.g. to /backup.

Modify and adapt it to your needs, e.g. curr_logdev.

From the command line, run the SQL script:

isql -S<YOURDBSID> -Usapsa -i /backup/sp_dumpdb.sql

The system will ask for a password - it should be the master password from the installation, I guess.

/backup is for Linux; on Windows, provide the appropriate path.

It will create the SP in saptools, as you can see in the script:

use saptools
go
if not exists (select name from sysobjects where name = 'dump_history' and type = 'U')
begin
  exec('create table dump_history (DBName VARCHAR(30), Type VARCHAR(10), Command VARCHAR(512), StartTime DATETIME, EndTime DATETIME, RC INTEGER, Status CHAR(1))')
end
go
if exists (select name from sysobjects where name = 'sp_dumpdb' and type = 'P')
begin
  drop procedure sp_dumpdb
end
go
create procedure sp_dumpdb (@sapdb_name varchar (256))
<

So that means if you already have one, it will drop it and then create a new one - so if you decide to change your backup location, just modify the SQL and re-run it.

So now you should have your SP, called sp_dumpdb, in saptools.

About the DBA Cockpit - I am not sure what needs to be done there, as I always run it from the OS level.

I hope it is clearer now.

If I run it from the OS level, I run these commands:

sp_dumpdb <DBSID>

go

Best regards,

Peter

JanStallkamp
Employee

Hi.

dbisql / isql is the command-line interface to ASE. If you are on Windows, this should have been installed during the SAP installation at \sybase\<SID>\DBISQL\bin\dbisql.exe.

Regarding other tools: I already mentioned Sybase Central; feel free to have a look at it.

Regards,

Jan

JanStallkamp
Employee

Hi Chris.

The reason we recommend having a stored procedure and using the DBA Cockpit is that in a typical SAP scenario you want to run a full backup e.g. daily and a transaction log dump every hour (or even more often). To set this up, the DBA Cockpit offers a wizard-like interface to the job scheduler.

In the stored procedure we can encapsulate all the necessary logic and some additional tasks (if required). Every customer is free to set up their own backup solution (or to use a 3rd-party solution). The stored procedures attached to the SAP Note are just our template for setting up a basic backup strategy.

Regards,

Jan

Former Member

Jan,

Does that mean the DBA Cockpit is 'hardcoded' to call one stored procedure, sp_dumpdb, which you customise as required?

Cheers,

Chris

JanStallkamp
Employee

Hi Chris.

No, you can schedule any SQL script you want. You could also run scripts for other tasks there at regular intervals.

/Jan


Hi Chris,

DBACOCKPIT is not hardwired to call sp_dumpdb.

You can enter whatever SQL script you like when creating a backup job in DBACOCKPIT.

sp_dumpdb is just an example stored procedure that could be used to dump your databases to a file on disk.

See SAP note 1588316.

Rgds

Tilman

Former Member

Hi,

I can run the backup but... it started 8 hours ago and has not finished; the dmp file keeps growing.

Is this amount of time normal? It is very slow...

My system is Windows 2008 and the database is 150GB.

Regards,


The time it takes to dump your database depends on the size and on your hardware, of course.

However, if you are unable to dump a database of 150GB in, say, max 2 hours, then I suggest something is going on.

If you dump to a file on disk, Windows is perhaps swapping like hell.

Can you check the settings in Control Panel\System and Security\System

--> Advanced System Settings --> System Properties, tab Advanced --> 'Performance':

set the checkbox 'Adjust for best performance'.

(The path to System Properties might be slightly different on your Windows version.)

HTH

Tilman

Former Member

I changed it but I still have the same problem, very slow... I calculate more or less:

500MB of dmp file in 10 minutes.

If the dmp file should be 30GB more or less, the final time will be 600 minutes = 10 hours.

Why is it so slow? The server is new, with 12GB RAM and 16 CPUs.

Regards,
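As a quick sanity check, the arithmetic above (using the reported ~500MB per 10 minutes and the assumed 30GB final dump size) works out like this:

```python
# Observed rate: ~500 MB of dump file written in 10 minutes.
rate_mb_per_min = 500 / 10                 # 50 MB/min
dump_size_mb = 30 * 1000                   # assumed final dump size of ~30 GB
eta_minutes = dump_size_mb / rate_mb_per_min
print(f"{rate_mb_per_min:.0f} MB/min -> about {eta_minutes:.0f} minutes "
      f"({eta_minutes / 60:.0f} hours)")
```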


50MB per minute is not really an excellent throughput rate; to be precise, it is very bad!

I'd expect to see some 50MB/s.

What rate do you get when you just copy the data file of the database to the target location manually, using the copy command at a DOS prompt (or cut and paste in Explorer)?

Can you try to monitor, with the Windows Performance Monitor, what happens while the dump is being executed?

Rgds

Tilman

Former Member

If I copy and paste a file, the rate is around 80MB/s.

While the dump is executing, almost nothing appears in the Windows monitor; the system looks idle, and only occasionally does 1% of CPU appear for the process sqlsrvr.exe.

Could it be a problem with the backup script?

Regards,


So we know that the system is capable of doing some 80MB/s.

The issue is very likely NOT with the backup script.

ASE only initiates a backup and notifies the backup server process to carry it out.

The backup server process mainly reads the database as it is from disk and copies it to the backup medium, in this case a dump file on disk (actually it is a bit more complicated, but that is not important here).

I still suspect that your Windows server is swapping.

When you open taskmgr --> Performance tab, there is a button 'Resource Monitor'.

Open it and go to the 'Disk' tab; there is a panel 'Disk Activity'.

If you order by reads/s or writes/s, I'd expect to see pagefile.sys on top while you do the dump.

Rgds

Tilman

Former Member

Hi Tilman,

Many thanks, but... pagefile.sys does not appear... instead, 3 sybmbuf.exe processes appear.

What is this process? Could it be the problem?

Thanks in advance,

Regards,


sybmbuf.exe is yet another executable, started by the backup server. This executable is the one that actually reads database pages from disk and stores them to the backup device.

(As I said above, it is really a bit more complicated with the backup server.)

You are looking at the window 'Processes with Disk Activity' in the Resource Monitor.

Below that window is a second one, called 'Disk Activity', which is hidden by default. Click on the little triangle on the right to open it. There you should see the executables along with the files they use and how many reads and writes go to each file.

regds

Tilman

Former Member

Hi Tilman,

I'm checking the Disk Activity... I attach an image:

Anything strange?

Regards,

Former Member

From this it looks like:

- You are dumping to one stripe (two sybmultbufs)

- Backups and data device(s) are on the same volume (F:) - is this correct?

Former Member

I do not understand the first point... what do you mean?

The second point is correct. Everything is on the same volume.

Regards,

Former Member

Could you post the SQL you are using to complete the backup?

Also, what hardware sits under the F: volume?

Cheers,

Chris

Former Member

Hi Chris,

I'm using the SP that appears in SAP Note 1588316.

The F: disk is a RAID 5 of several disks.

Regards,

Former Member

Could you post the SQL run to complete the dump?

Also try completing a read/write (file copy) outside of  Sybase on the F: drive (I often complete a 'dd' in UNIX to evaluate read performance).

Former Member

I attach the script:

 

--script to set up stored procedure sp_dumpdb in database saptools
--version 1.0.2
--date 04 Oct 2012
--changes:
--1.0.1 fix issue that cause sp to not run in jobscheduler
--1.0.2 no change
use saptools
go
if not exists (select name from sysobjects where name = 'dump_history' and type = 'U')
begin
  exec('create table dump_history (DBName VARCHAR(30), Type VARCHAR(10), Command VARCHAR(512), StartTime DATETIME, EndTime DATETIME, RC INTEGER, Status CHAR(1))')
end
go
if exists (select name from sysobjects where name = 'sp_dumpdb' and type = 'P')
begin
  drop procedure sp_dumpdb
end
go
create procedure sp_dumpdb (@sapdb_name varchar (256))
as
begin
  declare @sqlstatus smallint
  declare @sqlerror int
  declare @curr_logdev varchar(1024)
  declare @current_ts char(19)
  declare @starttime datetime
  declare @dumpcmd varchar(1024)
  declare @sjobid int
  declare @sec_past int
  declare @backup_active int
  declare @dbid smallint
  commit
  set chained on
  set quoted_identifier on
  set @sqlerror = 0
  set @sqlstatus = 0
  -- is it a valid DB name?
  select @dbid = db_id(@sapdb_name)
  if @dbid IS NULL
  begin
    print '%1! is not a valid database name', @sapdb_name
    select @sqlerror=-100
    commit
    goto exit_sp
  end
  -- work-around JS timing issue
  -- 1. determine own sched_job_id
  SELECT @sjobid = jsh_sjobid FROM sybmgmtdb..js_history WHERE jsh_spid = @@spid AND jsh_jobend IS NULL AND jsh_state='R2' AND jsh_exit_code = 0
  -- 2. get seconds since last execution
  SELECT @sec_past = datediff(ss,MAX(jsh_jobstart),current_bigdatetime()) FROM sybmgmtdb..js_history WHERE jsh_sjobid = @sjobid AND jsh_jobend IS NOT NULL
  -- 3. skip dump trans if last execution time is within the past 120sec
  if @sjobid IS NULL or @sjobid < 1 or @sec_past > 120 or @sec_past IS NULL
  begin
    -- ensure no dump database is running
    select @backup_active = BackupInProgress from master..monOpenDatabases where DBName = @sapdb_name
    if @backup_active = 0
    begin
      -- execute dump db
      select @starttime = getdate()
      select @current_ts = CONVERT(varchar,@starttime,23)
      select @current_ts = str_replace(@current_ts,':','_')
      select @curr_logdev = 'F:\backup\'+@sapdb_name+'_'+@current_ts+'.dmp'
      select @dumpcmd = 'dump database ' || @sapdb_name || ' to "' || @curr_logdev || '" with compression = 101'
      insert into saptools..dump_history(DBName,Type,Command,StartTime,EndTime,RC,Status) VALUES (@sapdb_name,'DB',@dumpcmd,@starttime,NULL,NULL,'R')
      commit
      exec(@dumpcmd)
      select @sqlstatus = @@sqlstatus, @sqlerror = @@error
      if @sqlerror <> 0
      begin
        update saptools..dump_history set EndTime = getdate(), Status = 'E', RC = @sqlerror where StartTime = @starttime
      end
      else
      begin
        update saptools..dump_history set EndTime = getdate(), Status = 'F', RC = 0 where StartTime = @starttime
      end
      commit
    end
    else
    begin
      insert into saptools..dump_history(DBName,Type,Command,StartTime,EndTime,RC,Status) VALUES (@sapdb_name,'DB','',getdate(),getdate(),0,'S')
      print 'skipping database dump due to an active database backup'
      select @sqlerror=-100
      commit
    end
  end
exit_sp:
  if (@sqlerror<>0)
    return @sqlerror
  else
    return 0
end
go


You ordered the output by reads (B/second). The Sybase database device file seems to be on top, which is just what you would expect during a dump database.

I suggest filtering the output by just sybmbuf.exe and bcksrvr.exe and then sorting by writes.

Also adjust the file output window so you can see the top entries.

And yes, the sybmbuf.exe processes appear to have a read speed of just below 1MB/s and a write speed of some 600KB/s. That's slow, and it results in some 48 hours for a 100GB database.

Rgds

Tilman
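The 48-hour figure above is consistent with that write rate; a rough back-of-the-envelope check, assuming the ~600KB/s seen in the Resource Monitor is sustained for the whole dump:

```python
write_rate_kb_per_s = 600                 # observed sybmbuf.exe write speed
db_size_kb = 100 * 1024 * 1024            # 100 GB expressed in KB
hours = db_size_kb / write_rate_kb_per_s / 3600
print(f"~{hours:.1f} hours for a 100GB database")
```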

Former Member

OK. It looks like you're using the new lightweight dump compression (101). I've not used this yet. It's meant to deliver a compressed dump with less CPU overhead.

You are also dumping to one stripe - dumps can be sent to multiple files at the same time, which can speed them up. I.e.:

dump database foo to 'F:\dumps.foo.1.dmp' stripe on 'F:\dumps.foo.2.dmp'

Try turning off the 'new' compression (new in 15.5) and try the old compression type.

I think this is dump database foo to "compression:2:{file location}" ...

OR try without compression and see what you get.

Chris

Former Member

Hi,

I attach the Disk Activity ordered by writes, but I don't see the process bcksrvr.exe.

What could be the problem?


P.S. I just finished a dump database to a disk on a Windows VM - same disk as the database device itself, database not really tuned.

A 50GB database took some 30 minutes.

P.P.S.:

You do know that RAID 5 is not optimal for write performance?

And, I assume that goes without saying, but of course in a production system you don't back up to the same disk(s) as the database, do you?

Rgds

Tilman


The write speed is slooooooow. Go ask your Windows system administrator and/or HW support.

I suggest also checking ST06 -> Memory: are there lots of page-ins? There shouldn't be any.

HTH

Tilman  

Former Member

There are no page-ins or page-outs in read/write... it shows 0.

What compression does your script use? The new one (101) or the old one (compression:1)?

Of course, in production I have an external RDX drive for the backup... this environment is a test system.

Regards,

Former Member

If I copy a file from F:\ to F:\ the speed is good. So I think the problem is the script.

Regards,

Former Member

Try a backup without compression. 

Former Member

The problem is the same without compression and with a simple

dump database DES to "c:\backup.dmp"

Any other idea?

If I copy a big file on the disk, the write rate is 80MB/second... If I do a backup, the write rate is 50MB/minute.

Former Member

I just had a look at the image of the Windows monitoring you posted.

You can see that the sybmultbufs are completing 1MB of read (from ASE devices) and 600KB of write (to the dump device) per second. There also seems to be 15MB of read from F: by sqlserver. This is not part of the backup - something else is going on in ASE.

Now, I'm not sure this should cause such slow backups (in fact I do not think it should - an ASE backup is a very efficient straight read through each device). But it might be worth trying a dump when nothing is going on in ASE, just to exclude this.

Out of interest, how many devices (files) are utilized in ASE?

It also might be worth completing an I/O check on the F: volume. Microsoft supplies a good utility for this (SQLIO).

Former Member

My partner checked the hardware and everything is correct... It seems to be a problem with the dmp file or the database.

Any idea?

Former Member

I replicated the database with the same size under worse conditions - only 1 disk without RAID, on a virtual system, with 5 machines on the same disk, etc. - and the backup works fine, very fast.

What could be the reason that on a physical machine, with disks in RAID, in good condition and with little load on them, the backup is very, very slow?

Regards,

Former Member

I TRIED WITHOUT COMPRESSION AND IT WORKS FINE.

Why is it so slow with compression? The machine has 16 cores (2 Opteron 2.00GHz processors).

Any idea about this last test?

Former Member

How many stripes are you using? To use more cores (for compression) you will need to use more stripes

Former Member

Stripes? What is a stripe? I do not understand you...

Former Member

Apologies - my last reply was rather short:

dump database foo to 'C:\dumps\foo.dmp.1'

stripe on 'C:\dumps\foo.dmp.2'

stripe on 'C:\dumps\foo.dmp.3'

The above example splits the dump across 3 files.

This can:

- Maximize I/O throughput

- Engage 3 CPUs for compression - one per stripe
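To see why stripes matter for compressed dumps: each stripe gets its own sybmultbuf process and hence its own CPU for compression, so to a first approximation the elapsed time divides by the stripe count. A rough model only - it assumes compression is the bottleneck and scales linearly, and the 10MB/s per-stripe rate is just an assumed figure:

```python
def dump_hours(db_gb: float, per_stripe_mb_s: float, stripes: int) -> float:
    # Total data divided by the aggregate throughput of all stripes.
    return db_gb * 1024 / (per_stripe_mb_s * stripes) / 3600

# e.g. a 150 GB database at an assumed 10 MB/s per compressing stripe:
for n in (1, 3):
    print(f"{n} stripe(s): ~{dump_hours(150, 10, n):.1f} hours")
```

In practice the speed-up levels off once the disks rather than the CPUs become the bottleneck.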

Former Member

I think the problem is that I run everything on the same disk partition...

4 disks in RAID 5.

I did a compressed backup to an external disk (RDX) and it works more or less fine... so I think it is a problem with the disk configuration or something similar.

I think the problem is solved.

Regards and thanks to everybody

Former Member

Hi Jan,

This is Faheem Ahmed Shaikh from Pakistan, freshly SAP Basis certified. I have a little confusion about taking a Sybase backup.

According to your comment on the link attached below:

scn.sap.com/thread/3245201

Do you mean to say that after entering the database by typing

isql -S<YOURDBSID> -Usapsa -i /backup/sp_dumpdb.sql

we just have to paste the stored procedure code, adapted to our needs, and then use the Job Scheduler in the DBA Cockpit?

Please help me in this regard; it is very important for me, as I have to take a backup of my client's server.

Answers (2)


Former Member

Take a look at a blog that I wrote. I believe it will be helpful:

http://scn.sap.com/community/sybase-ase/blog/2013/05/31/backup-challenges-on-sybase

BR  

Fred

Former Member

20 hours to finish the backup... how is that possible?

Regards,

Former Member

Victor,

Sounds like you've got a problem. Consider the following:

- Monitor what's happening. Backups in ASE are executed by processes separate from the dataserver processes (actually not by the backup server itself, but by sybmultbuf processes). So you cannot use ASE counters to look at the backup process. Instead, have a look at OS metrics for disk, memory and CPU. Are you hitting limits with any of these?

Note that backups can be CPU-intensive if you're going for compressed backups.

In my experience ASE backups 'just work' fine. I've tuned them in the past (for large databases) by striping the backups across multiple volumes (i.e. dumping to stripes which exist on multiple file systems). However, before you go down this route, I'd have a look at your OS metrics first.

BTW, the backup server is configurable, but in my experience you get very little out of tweaking it.