Transaction Catalog Limit

Former Member

Hi,

When you have either multi-level transactions (i.e., transaction 1 calls transaction 2, which calls transaction 3, and so on) or heavy transaction re-use (even if it is conditional at runtime), is there a limit? I believe I have found one.

I have an application that has quite a few transaction calls embedded at various stages within various transactions.

However, when you open any transaction (or a call to a transaction) whose pending "catalog" (the one that will be built) is fairly large, I see a blank grey box for a few seconds and then nothing happens. No error is returned; the transaction simply will NOT load at all, and I am left with whatever transaction I was looking at rather than the one I wanted to open.

I see multiple entries in the xMII log as follows:

INFO SessionHandler - Requested URL is http://localhost/Lighthammer/Catalog?Mode=Load&Class=Transaction&Object=<myTransaction>;

which makes sense as I can see that the catalog is being built so I can use it within subsequent links etc.

But these entries simply stop at a certain point in the log. That point appears to be about 10 seconds after I triggered the request to open the transaction.

During development it appears that the catalog-building process gives up ("spits the dummy") at a certain point.

Any ideas what such a certain point might be?

I have scoured the config and the JSPs, but I cannot locate any parameter that could be causing this issue by imposing a limit and/or a timeout.

Does anybody know of any such limits?

Limit by count?

Limit by memory?

Limit by time?

Thanks

Kevin.

Accepted Solutions (1)

jcgood25
Active Contributor

More than likely you are hitting the upper memory limit for the editor. Set your Java Web Start preferences to always show the console, and check whether you get out-of-memory errors.

Regards,

Jeremy

Former Member

Hi Jeremy,

According to the Java console, free memory seems to be range-bound, and the value does not change much as I open and access various transactions and calls.

However, I can still easily replicate transactions NOT opening, simply by creating a transaction with multiple calls to other large transactions.

I have a log dump which is showing that the catalog just stops building!

The real issue is this: when you open a transaction that has a large dependency chain, it does not "appear" to open in the GUI; you are clearly still looking at the previous transaction. However, if you immediately SAVE the transaction you "think" you are looking at, it overwrites the transaction file you just tried to open! So the transaction you wanted was in fact opened, but the editor GUI had not updated to actually display it.

Very dangerous behaviour!

When this happens, the only way to get at a transaction is to manually edit the master transaction XML file: open it and remove the references to sub-transactions in the relevant transaction call actions. This obviously breaks the transaction dependency chain, after which the master transaction can be opened.
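As a rough illustration of that manual workaround, the sketch below blanks the sub-transaction reference out of a transaction call action in a saved transaction file. The element and attribute names (`Action`, `Type="Transaction_Call"`, `Transaction`) are assumptions for illustration only; check the structure of your actual xMII transaction XML before editing it, and keep a backup.

```python
# Hedged sketch only: element/attribute names are guesses, not the
# documented xMII transaction schema. Always back up the file first.
import xml.etree.ElementTree as ET

xml_text = """<Transaction>
  <Action Type="Transaction_Call" Transaction="SubTransactionA" />
  <Action Type="Assignment" />
</Transaction>"""

root = ET.fromstring(xml_text)

# Blank the reference rather than deleting the action, so the step
# survives and can be re-pointed once the transaction opens again.
for action in root.findall(".//Action[@Type='Transaction_Call']"):
    action.set("Transaction", "")

result = ET.tostring(root, encoding="unicode")
print(result)
```

Blanking the attribute (rather than deleting the whole action) keeps the transaction structurally intact while still breaking the dependency chain.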

I can replicate this on multiple PCs and multiple OSs, with different versions of Java and different memory settings.

Regards

Kevin.

Former Member

Kevin - do you have any recursive calls in there? If so, you'd have problems. Big problems.

Former Member

Hi Rick,

No recursive transactions; I double-checked that early in the piece. Just lots of calls to common transactions for logging and event logging.

I can confirm, though, that there are several levels of transactions and LOTS of calls, even though many of them will not execute at runtime. For example, there is a lot of error handling and associated event logging that would never trigger during "normal" execution.

The issue I see is purely a development GUI / memory issue which has me stumped.

It appears that when this "catalog building" process happens, all transaction levels are traversed, from the point you access down to the lowest level. For example: suppose a "master" transaction (at the top of the tree) calls 3 other transactions; one of those three calls another 4 transactions; each of those 4 has 14 transaction calls; and 10 of those 14 reference a common transaction which itself has an embedded transaction call. The catalog-building process will then attempt something on the order of 1 + 3 + (4 x (14 + 10)) = 100 transaction loads.
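The blow-up described above can be sketched in a few lines of plain Python (this is not xMII code; it just models the assumption that the loader fetches every referenced transaction again at each call site rather than caching shared ones):

```python
# Minimal model of an uncached catalog traversal: every call site
# triggers a fresh load, even for shared transactions.
def count_catalog_loads(name, calls):
    """Count loads for `name` plus everything it references, no caching."""
    return 1 + sum(count_catalog_loads(child, calls)
                   for child in calls.get(name, []))

# Tiny illustrative call graph: Master calls Common twice, and Common
# itself embeds one further call to Logger.
calls = {
    "Master": ["Common", "Common"],
    "Common": ["Logger"],
}

total = count_catalog_loads("Master", calls)
print(total)  # 5: Master, then (Common + Logger) fetched twice
```

Because shared transactions are re-traversed at every call site, the load count grows multiplicatively with depth and fan-out, which matches the back-of-envelope tally in the example.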

jamie_cawley
Advisor

Hi Kevin,

See note 960362; it outlines the steps necessary to increase the memory the logic editor will allocate.
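For general context: raising the heap available to a Java Web Start client is typically done either via runtime parameters in the Java Control Panel (e.g. `-Xmx256m`) or via the `max-heap-size` attribute in the application's JNLP file. The fragment below is a generic illustration of the JNLP approach only; it is not the literal content of note 960362, and the version string and heap size are placeholder values. Follow the note for the xMII-specific steps.

```xml
<!-- Generic JNLP resources section; values here are illustrative,
     not taken from note 960362. -->
<resources>
  <j2se version="1.4+" max-heap-size="256m"/>
</resources>
```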

Regards,

Jamie

Former Member

Hi Jamie,

That appears to have worked great, thanks.

Former Member

Yes, that's how it works at present. There is an opportunity to tweak the loader to do more "metadata caching" so that sub-transactions don't need to be fully loaded (it would never need to go more than one level deep); basically, similar to the way you can pre-load a sample of the XML when you use a query action.
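The caching idea can be sketched as follows (purely illustrative Python, not xMII's actual loader): track which transactions have already been catalogued and skip repeat fetches, so each shared transaction is loaded once regardless of how many call sites reference it.

```python
# Model of a loader that caches transaction metadata: each transaction
# is fetched at most once, no matter how many call sites reference it.
def count_loads_cached(name, calls, seen=None):
    """Count loads when each transaction's metadata is fetched only once."""
    if seen is None:
        seen = set()
    if name in seen:
        return 0  # metadata already cached; no new fetch needed
    seen.add(name)
    return 1 + sum(count_loads_cached(child, calls, seen)
                   for child in calls.get(name, []))

# Illustrative call graph: Master calls Common twice, and Common
# itself embeds one further call to Logger.
calls = {
    "Master": ["Common", "Common"],
    "Common": ["Logger"],
}

cached_total = count_loads_cached("Master", calls)
print(cached_total)  # 3: Master, Common, Logger each fetched once
```

With caching the load count is bounded by the number of distinct transactions rather than the number of call sites, which is why it would tame the multiplicative growth seen with deep, shared dependency chains.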

Put it in as a feature request and see what the Exton gang can do for you...

Answers (0)