BODS: Datastore options for SAP R/3 - need clarity on their use

Hi All.

Another request to understand the datastore options on our existing BOXI 3.1 BODS installation.

We are trying to pull a new table from SAP R/3 into BODS. When we tested a simple workflow, we found that the ABAP program is not generated as expected and the job terminates.

While creating a new datastore for an SAP R/3 source:

When we look at the datastore, we find the following options for the R/3 source:

ABAP execution option:

Execute preloaded

Generate and execute

Under data transfer method, we find:

Shared directory

Direct download

FTP

Custom transfer

Then we have working directory on SAP server

Local directory

Generated ABAP directory

I am testing a simple workflow that pulls data from SAP R/3 on a dev machine.

But I am not understanding which combination of ABAP execution option, data transfer method, and path settings would work in coordination.

Because when I choose direct download together with generate and execute, it throws an error.

Can anyone help me with the combinations of options to choose for an R/3 source, and the implications of each?

I had created a folder D:\Bodi on the local BODS server and given that path for data transfer.

But the files are not getting generated, for whatever reason.

Any advice on this would be helpful.

I also found it a bit unusual that there is no button to check whether the connection is correct; a TEST connection button is not there, which I felt could be included.

Note: on the existing production system, we have chosen execute preloaded with a shared directory on the SAP server, shared that folder path for the user, and given full rights. But when we try the same on the dev machine, as a test before transporting to production, a simple workflow does not work.

I would like to know which settings on the SAP server really affect the datastore options in BODS.



Edited by: Indumathy Narayanan on Jul 19, 2011 4:14 PM


Indeed, BODS <> SAP connectivity can be tricky.

For a development environment, I suggest you select the option "Generate and execute" for your "ABAP Execution Option." What this means is that DS will create, on the fly, small-ish ABAP programs. These ABAP programs are written, in plain text, to a local directory on the DS job server, in the folder specified in "Generated ABAP Directory". You can see them in there after an attempted job execution, assuming the job involves the creation of an ABAP program. The ABAP program name is specified in the properties of an ABAP data flow, under Options > ABAP Program Name. If you can't see them, perhaps the DS job server process doesn't have full rights to that folder? After being generated on the fly, they're transferred to SAP to execute. The SAP user you use to connect to SAP must have sufficient rights to upload and execute these ABAP programs, and that's a fairly substantial set of rights; what's required is documented in the BODS supplement for SAP. Often, to get things running, your friendly local Basis admin will grant SAP_ALL to the DS user, to see what rights are being invoked.
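
If you want to quickly rule out a rights problem on that folder, a small check like the one below can help. This is just a minimal Python sketch under my assumptions (D:\Bodi is the folder from your post; substitute whatever you entered as "Generated ABAP Directory"), and it only proves anything if you run it under the same Windows account the DS job server service runs as:

```python
import os
import tempfile

# Example path only -- use the folder you entered as "Generated ABAP Directory".
ABAP_DIR = r"D:\Bodi"

def check_writable(path):
    """Create and delete a scratch file, which is roughly what DS has to do
    when it writes the generated ABAP program to disk."""
    if not os.path.isdir(path):
        print("Directory does not exist: " + path)
        return False
    try:
        fd, tmp = tempfile.mkstemp(dir=path)
        os.close(fd)
        os.remove(tmp)
        print("OK: can create and delete files in " + path)
        return True
    except OSError as exc:
        print("Cannot write to {}: {}".format(path, exc))
        return False

if __name__ == "__main__":
    check_writable(ABAP_DIR)
```

If that fails under the job server's account, fix the folder permissions before worrying about the SAP side.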

Once all that jazz is working, you need to get the data back. There are a number of ways to do this. The method of data transfer is specified in "Data transfer method," where, ignoring "Custom transfer," you have three choices:

1) Direct Download: easiest and slowest. This method tells SAP to attempt to stick data in the client-side folder specified in "Local directory." Try this first.

2) Shared folder: This is recommended when SAP is hosted on a Windows box. Basically, you set "Working directory on SAP Server" and "Data Services path to the shared directory" to point to the same folder: SAP uses "Working directory on SAP Server" to find this folder, and DS uses the other setting. So, for instance, if you were going to use the shared folder method, you could set "Working directory on SAP Server" to "E:\BODS_Transfer", and, assuming E:\BODS_Transfer was shared out as "BODS_Transfer", you could set "Data Services path to the shared directory" to "\\&lt;SAP server hostname&gt;\BODS_Transfer". Then you'd need to set up all the relevant security, as both SAP and DS need rights to read and write files in this folder.
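
To confirm the share actually works from the DS side, you can try a quick round trip from the job server. A minimal sketch, assuming the example share name above (replace the UNC path with your real one) and run under the DS job server's account:

```python
import os

# Placeholder UNC path -- point this at your real share.
SHARE_UNC = r"\\SAPSERVER\BODS_Transfer"

def roundtrip(path):
    """Write a marker file and read it back -- DS needs both directions,
    and SAP needs the same rights on its local view of the folder."""
    marker = os.path.join(path, "bods_share_test.txt")
    with open(marker, "w") as f:
        f.write("shared-folder test")
    with open(marker) as f:
        print("Read back from {}: {!r}".format(path, f.read()))
    os.remove(marker)

if __name__ == "__main__":
    roundtrip(SHARE_UNC)
```

Remember this only exercises the DS half; the SAP side still needs read/write rights on the folder as it sees it locally.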

3) FTP (this is the method I usually use): SAP writes the "transport files" you're after (i.e., the data) to the folder specified in "Working directory on SAP Server". Then you need to establish ftp connectivity to that folder from the DS job server's perspective, which you do by entering the ftp host name and the path to that folder in "FTP host name" and "FTP relative path to the SAP working directory". In my opinion, the "relative" business is a little confusing, and I typically just enter the full ftp path, beginning with a forward slash, like "/usr/sap/tmp/BOBJ" or something like that. You also need to obtain a separate username and password for the ftp connectivity. Note that this name and password have NOTHING to do with the SAP username and password; you're just setting up DS to act as an ftp client. I strongly encourage you to test ftp connectivity with a regular ftp client from the DS job server: connect to your ftp host using the username and password you were given and try to fetch a sample test file. If you can't do this manually, then DS won't be able to do it, either.
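
If you'd rather script that ftp test than click through a client, here is a minimal sketch using Python's ftplib. The host, path, and credentials below are placeholders; use the separate ftp account you were given, not the SAP logon:

```python
from ftplib import FTP, error_perm

# All placeholders -- substitute your own host, ftp credentials, and path.
FTP_HOST = "sapserver.example.com"
FTP_USER = "ftpuser"
FTP_PASS = "secret"
FTP_PATH = "/usr/sap/tmp/BOBJ"   # full path, starting with a forward slash

def test_ftp():
    """Log in, change to the SAP working directory, and list it --
    roughly what DS does when it fetches the transport files."""
    ftp = FTP(FTP_HOST, timeout=30)
    try:
        ftp.login(FTP_USER, FTP_PASS)
        ftp.cwd(FTP_PATH)
        print("Login and cwd OK; directory listing:")
        for name in ftp.nlst():
            print("  " + name)
    except error_perm as exc:
        print("FTP permission problem: " + str(exc))
    finally:
        ftp.quit()

if __name__ == "__main__":
    test_ftp()
```

If this script can't log in or can't see the working directory, DS won't be able to either, so sort that out before touching the datastore settings.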

Best wishes,

Jeff Prenevost
