Adapter custom - Multiple Nodes Issue

Former Member

Hi All,

I developed a custom adapter and lately I am facing a problem related to the multiple-nodes mechanism: the sender adapter reads each message once per node, because there is no locking that keeps one node out while another node is reading.

I tried to use the LogicalLocking API, following the SAP Help guide and the Javadoc, but when I invoke the lock method of the LogicalLocking instance an exception is raised. Below are the operations called:

LogicalLockingFactory lf;
LogicalLocking logicalLocking;

// javax.naming.InitialContext / NamingException plus the SAP applocking
// classes are imported as provided by the server environment
InitialContext ctx = new InitialContext();

ClassLoader oldClassLoader = Thread.currentThread().getContextClassLoader();
Thread.currentThread().setContextClassLoader(this.getClass().getClassLoader());

try {
    lf = (LogicalLockingFactory) ctx.lookup(LogicalLockingFactory.JNDI_NAME);
    logicalLocking = lf.createLogicalLocking("xyz", "Adapter xyz");
    logicalLocking.lock(LogicalLocking.LIFETIME_TRANSACTION, "xyz_12345",
            "arg1", LogicalLocking.MODE_EXCLUSIVE_CUMULATIVE);
} catch (NamingException ne) {
    // JNDI lookup error
} catch (TechnicalLockException tle) {
    // the namespace is already reserved for another description
} catch (IllegalArgumentException iae) {
    // invalid lock arguments
} finally {
    Thread.currentThread().setContextClassLoader(oldClassLoader);
}

Below is the exception raised by the lock call:


com.sap.engine.services.applocking.exception.AppLockingTechnicalLockException: The lifetime can not be the transaction, because currently no transaction is open.

Has anyone faced the same issue?

More generally, in your opinion, is this the right way to guarantee that only one node reads a message?
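For context, the behaviour I am after can be sketched with a plain JVM file lock (purely illustrative, not the SAP applocking API; the class name and lock-file key are made up): the first node to acquire the lock reads the message, and the others skip it.

```java
import java.io.File;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

// Illustration only: "first node wins" exclusive read via an OS file lock.
public class NodeLockSketch {

    // Try to claim the message key; returns the lock if this node won,
    // or null if another process (node) already holds it.
    public static FileLock tryClaim(File lockFile) throws Exception {
        FileChannel ch = new RandomAccessFile(lockFile, "rw").getChannel();
        FileLock lock = ch.tryLock();  // non-blocking across processes
        if (lock == null) {
            ch.close();                // another node owns it: skip message
        }
        return lock;
    }

    public static void main(String[] args) throws Exception {
        File f = File.createTempFile("xyz_12345", ".lock"); // hypothetical key
        FileLock lock = tryClaim(f);
        if (lock != null) {
            // ... read and process the message exclusively here ...
            lock.release();            // done: the next cycle may run anywhere
            lock.channel().close();
        }
        f.delete();
    }
}
```

Within a single JVM a second tryLock on the same file throws OverlappingFileLockException rather than returning null, so the null path only shows up between separate processes, i.e. separate server nodes.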

Thank you in advance.

Andrea

Accepted Solutions (0)

Answers (1)


npurohit
Product and Topic Expert

Hi Andrea,

I also faced this issue. Is your sender side something like a File or JDBC adapter, where you do the scheduling?

If it is scheduled, it is best to schedule the job based on the available free node, so that it faces no conflict.
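In plain Java the idea looks roughly like this (illustrative sketch only: the AtomicBoolean stands in for whatever cluster-wide "job is free" check your landscape provides; in a real cluster a JVM-local flag would not work):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

// Sketch: each node's scheduler only runs the poll when it can claim the
// job for this cycle; nodes that lose the claim simply skip the cycle.
public class NodeScheduledPoll {

    // Stand-in for a shared, cluster-wide claim (hypothetical).
    static final AtomicBoolean jobClaimed = new AtomicBoolean(false);

    // Returns true if this node ran the poll, false if it skipped.
    static boolean pollOnce(Runnable work) {
        if (!jobClaimed.compareAndSet(false, true)) {
            return false;            // another node took this cycle
        }
        try {
            work.run();              // exclusive: only one node reads
            return true;
        } finally {
            jobClaimed.set(false);   // free the job for the next cycle
        }
    }

    public static void main(String[] args) throws Exception {
        ScheduledExecutorService ses = Executors.newScheduledThreadPool(1);
        ses.scheduleWithFixedDelay(
                () -> pollOnce(() -> System.out.println("polled")),
                0, 100, TimeUnit.MILLISECONDS);
        Thread.sleep(250);
        ses.shutdownNow();
    }
}
```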

Best regards,

Naresh