JDBC adapter - Select statement

Former Member
0 Kudos

Hi,

I have an XI scenario with a JDBC adapter that selects rows from an Oracle DB. The problem is that the table is huge: a single select of approx. 1000 rows produces an XML message of approx. 100 MB, which sometimes causes me problems. I tried to solve this with a select like this one:

select * from db_table where status = '-1' and rownum <= 100

update db_table set status = '0' where status = '-1' and rownum <= 100

So my question is: is this correct, and can I be sure that the update statement will update exactly the selected rows and nothing else? Otherwise I would not get a correct result. Is there anything I should be afraid of, or will this work correctly?

br

mario

Accepted Solutions (0)

Answers (2)


Former Member
0 Kudos

Hi,

Maybe you could accumulate the data and send it in batches. In one of my scenarios I had the same problem, but after accumulating the data the message size went down by around 70%.

Former Member
0 Kudos

What do you mean by accumulate?

Former Member
0 Kudos

According to SAP Note 831162:

8. Transaction Handling (Sender)

Q: If I have the following configured in a JDBC Sender:

Select Query:

SELECT column FROM TABLENAME WHERE FLAG = "TRUE"

Update Query:

UPDATE TABLENAME SET FLAG = "FALSE" WHERE FLAG = "TRUE"

How do I know that the JDBC adapter will not update newly added rows (rows that were added between the time that the SELECT and UPDATE queries were executed) that were not read in the initial SELECT query?

A: The SELECT and the UPDATE are run in the same DB transaction, i.e. both statements have the same view on the database.

Make sure that both statements use the same WHERE clause. An additional requirement for the correct operation of this scenario is the configuration of an appropriate transaction isolation level on the database (i.e., repeatable_read or serializable). You might also consider using a "SELECT FOR UPDATE" statement instead of a plain SELECT statement to ensure proper locking on the database.
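To make the note's same-transaction pattern concrete, here is a minimal sketch. It is not the XI adapter itself: SQLite stands in for Oracle (its locking model differs), and the table name db_table and status values are taken from the question above. It shows SELECT and UPDATE running in one transaction, with the update keyed to exactly the rows the select returned.

```python
import sqlite3

# Illustration only -- SQLite in place of Oracle, table/columns from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE db_table (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO db_table (status) VALUES (?)", [("-1",)] * 250)
conn.commit()

# SELECT the batch first (same WHERE clause as the update) ...
rows = conn.execute(
    "SELECT id FROM db_table WHERE status = '-1' LIMIT 100").fetchall()
ids = [(r[0],) for r in rows]

# ... then UPDATE by the primary keys just read, inside the same transaction,
# so exactly the selected rows are flagged; rows added by another session in
# between are untouched.
conn.executemany("UPDATE db_table SET status = '0' WHERE id = ?", ids)
conn.commit()

remaining = conn.execute(
    "SELECT COUNT(*) FROM db_table WHERE status = '-1'").fetchone()[0]
print(remaining)  # 150 rows still waiting for the next poll
```

Updating by the keys that were actually read is the safest variant of "same view on the database": it holds even without repeatable_read, which is why the note also suggests SELECT FOR UPDATE when you stay with a plain WHERE-clause update.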

Shabarish_Nair
Active Contributor
0 Kudos

One of the best practices in such cases is to use a staging table, i.e. keep a copy of the table you are accessing that holds all the contents of the original table.

When selecting, always select the first 100 rows (or any number you find suitable in terms of performance), and afterwards run a delete statement to remove exactly those rows.
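The staging-table idea above can be sketched like this. Again a hypothetical illustration, with SQLite standing in for Oracle: db_table_stg is an assumed copy of the original table, each poll takes the first batch and then deletes exactly those rows, so the next poll only sees what is left.

```python
import sqlite3

# Hypothetical staging table db_table_stg (a copy of the original table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE db_table_stg (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO db_table_stg (payload) VALUES (?)",
                 [("row-%d" % i,) for i in range(500)])
conn.commit()

BATCH = 100  # tune to whatever message size performs well
batch = conn.execute(
    "SELECT id, payload FROM db_table_stg ORDER BY id LIMIT ?",
    (BATCH,)).fetchall()
# ... build and send the (much smaller) XML message from `batch` here ...

# Delete by the keys just read so only the processed rows disappear.
conn.executemany("DELETE FROM db_table_stg WHERE id = ?",
                 [(r[0],) for r in batch])
conn.commit()

left = conn.execute("SELECT COUNT(*) FROM db_table_stg").fetchone()[0]
print(left)  # 400 rows remain for subsequent polls
```

Because deletes only ever touch the staging copy, the original table stays complete, and the batch size caps the XML message size per poll.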