
Business Logic Editor with Complex Transactions

Former Member

I'm starting to have some serious problems using the Business Logic Editor with reasonably large (but by no means huge) transactions. These transactions also have nested transaction calls down a few layers. The main symptoms are:

1) When opening the Link Editor on the first action it is somewhat sluggish. Opening the Link Editor on the last action can take over 60 seconds.

2) If adding a new Transaction Call action, selecting the called transaction in the Configure window doesn't work. The transaction can be selected, but the browse window will not close; it has to be cancelled. The Link Editor does not show the transaction's inputs, even though the Configure window now shows the selected transaction's path correctly.

3) If the transaction is Saved after 2), that transaction will not open again in the Logic Editor, even after a reboot.

Memory seems to be part of the formula. The above is on a notebook with 1 GB of RAM. On a dual-core machine with 2 GB, the problems are reduced, but not by much. On a server with 8 GB, the symptoms disappear.

Does anyone know if there is a recommended minimum system config for using the Logic Editor?

What is actually happening when a transaction with nested Transaction Call actions is loaded in the Editor and prepared for execution? I have many transactions that don't have the above problems, but they take a long time to open in the Logic Editor, and up to 1.5 seconds to load when the transaction executes (as per the F5 timings).

Thanks,

Bill

Accepted Solutions (0)

Answers (5)


Former Member

The Link windows that take forever are the ones at the end of the transaction. E.g. the second-last action is just a conditional; it takes just as long to open as the last action, which is a transaction call. The first actions are quick. It gets progressively slower as I move from the beginning of the transaction to the end.

Definitely no recursive logic. We tried that early in our xMII lives and decided to stay right away from it. Just lots of calls to complex transactions, which in turn call other complex transactions, etc. It goes down about six levels. The largest .trx is 109 KB, and most are much smaller, so they aren't that big.

I had considered that maybe the Logic Editor and execution engine do some sort of "early binding" exercise right down the tree; that would certainly slow things down. But on the third level the transaction call nesting is broken: because we need to call the next transaction dynamically, we form a URL and use the XML Loader instead of the Transaction Call action.
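For context, the dynamic call amounts to building a Runner-style URL by hand and fetching it with the XML Loader. A minimal sketch of the URL construction; the server name, servlet path, transaction path and parameter name below are illustrative assumptions (they vary by xMII version and install), not taken from our actual config:

```python
from urllib.parse import urlencode

def build_runner_url(server, trx_path, params):
    """Build a Runner-style URL for a dynamic transaction call.

    The "/Lighthammer/Runner" servlet path is an assumption; check
    the path your xMII install actually exposes.
    """
    query = urlencode({"Transaction": trx_path, **params})
    return f"http://{server}/Lighthammer/Runner?{query}"

# Hypothetical transaction path and input parameter:
url = build_runner_url("xmiiserver", "Plant/Common/GetOrderStatus",
                       {"OrderNumber": "12345"})
print(url)
```

The resulting URL is then handed to the XML Loader action, which sidesteps whatever pre-loading the Transaction Call action does, at the cost of losing design-time linkage.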

Former Member

Rick,

These days I'm very careful about large embedded reference documents in my transactions. I actually go through the .trx files in a text editor looking for any large chunks of that sort of XML and make sure they are eliminated.
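That manual scan can be automated. A small sketch that flags any oversized element in a .trx file's XML; the 10 KB threshold is an arbitrary choice, and no particular element names are assumed:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def find_large_nodes(xml_text, threshold=10_000):
    """Return (tag, byte_size) for every element whose serialized
    form exceeds `threshold` bytes (threshold chosen arbitrarily)."""
    hits = []
    for elem in ET.fromstring(xml_text).iter():
        size = len(ET.tostring(elem))
        if size > threshold:
            hits.append((elem.tag, size))
    return hits

# Scan all .trx files under the current folder (path is illustrative):
for trx in Path(".").glob("**/*.trx"):
    for tag, size in find_large_nodes(trx.read_text()):
        print(f"{trx}: <{tag}> is {size} bytes")
```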

In the case of this transaction, by the time I get to the actual SQL Query action (and remember, it's a few layers of nested transaction calls down), it can't run the query anyway. The Query Template name in the Configure window is blank. That and any parameters are set dynamically in Links at run time (the Query Template name actually comes from a config file). If I select Limit Rowcount (or Yes) when I close the Configure window, it just gives an error because it has no idea which query template to run.

Given this, the problem just doesn't seem related to fetching data. And how would this cause the Links window to take so long to open in the Logic Editor?

Bill

Former Member

If I remember correctly, the Links window will add nodes for every row/column in any resultant datasets, which could add up pretty quickly.
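If that is the mechanism, the arithmetic gets ugly quickly, and it would also explain the progressive slowdown toward the end of a transaction. A back-of-the-envelope model; the one-node-per-cell behaviour is my assumption about BLS internals, not documented:

```python
def link_editor_nodes(action_outputs):
    """Estimate nodes the Link Editor builds when opening each action,
    assuming one node per cell of every *preceding* action's dataset
    (an assumption about BLS internals, not documented behaviour)."""
    visible, running = [], 0
    for rows, cols in action_outputs:
        visible.append(running)   # upstream nodes shown for this action
        running += rows * cols    # this action's output joins the tree
    return visible

# Ten actions, each producing a 200-row x 15-column dataset:
counts = link_editor_nodes([(200, 15)] * 10)
print(counts[0], counts[-1])  # 0 for the first action, 27000 for the last
```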

Here's another thought: do you by any chance have any recursive transaction calls, directly or indirectly? That could definitely cause some weirdness. If you have transaction A calling transaction B, and somewhere B also calls A (or some other combination that loops back in a similar way), you'd have issues.
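An indirect loop of that kind is easy to miss by eye once calls go several levels deep, but it can be caught mechanically if you extract each transaction's call list (say, from the .trx XML). A minimal depth-first cycle check, with a hand-built call graph standing in for the real extraction:

```python
def find_cycle(call_graph):
    """Return one call cycle as a list of transaction names, or None.

    call_graph maps a transaction to the transactions it calls."""
    def dfs(node, stack, on_stack):
        stack.append(node)
        on_stack.add(node)
        for callee in call_graph.get(node, ()):
            if callee in on_stack:
                # Cycle found: slice the path from the repeated node.
                return stack[stack.index(callee):] + [callee]
            cycle = dfs(callee, stack, on_stack)
            if cycle:
                return cycle
        stack.pop()
        on_stack.discard(node)
        return None

    for start in call_graph:
        cycle = dfs(start, [], set())
        if cycle:
            return cycle
    return None

# C calls A again two levels down -- an indirect recursion:
print(find_cycle({"A": ["B"], "B": ["C"], "C": ["A"]}))  # ['A', 'B', 'C', 'A']
```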

- Rick

Former Member

Hi,

This is mainly due to the large number of records you are trying to access.

You can restrict the amount of data either by setting an appropriate row count limit or by using a sort filter to return only the required data.

Regards,

Shyam

Former Member

Bill:

I suspect that you have a few queries with VERY large datasets referenced in your transaction. Open the query actions, and when you press OK to close the dialog for each, be certain to select "Limit Rowcount" when BLS tries to acquire a "sample" of the dataset structure. This will make a huge difference in load/save time, the Link Editor, and other areas.

Rick

Former Member

Is the 1 GB machine generally slower with other applications as well?

Memory is definitely a factor, but in my experience a client with 512 MB behaves pretty well.

My transactions do not exceed 100 KB; I'm not sure how large yours are.

Maybe a disk defragmentation followed by a cleanup would be the answer to your issues.

All the best!