Agentry - help to reduce time taken in client exchange step

Former Member

Hi,

I have an object that holds 1500-2500 records (depending on the user) and the fetch time is causing a lot of headaches. I am not concerned about the initial load of the object onto the client device; it is the time taken by the subsequent fetches that I would like to improve.

I am using the Maximo Inventory Manager Agentry application on 3.0.8.something, and I believe I am using the standard approach: a client exchange step updates the backend (Maximo database) exchange table with the record IDs (and lastUpdate date) from the tablet, then the server exchange steps update the exchange table to work out which records to add, remove, and update/replace on the tablet (and then the read step on the object reads the new/updated record details).

When I watch the logs, most of the time is spent on the client exchange step, which inserts the records into the exchange table one at a time. Are there other ways to maintain this table from the client that are faster? I was thinking of something like a "Run One Time" client exchange step that builds an XML message containing all the record IDs on the client, does a single insert into a table on the backend database, and then runs a script to process this into the exchange table.

Any ideas?

Thanks,

Cameron

Accepted Solutions (1)

Former Member

If you are using an SQL step for the client exchange, you can try "Run One Time", but change the script to include a foreach. This way, before the script is sent to the server, it loops through the collection and builds one really long SQL script, so one whole script is sent to the DB instead of 1500 small scripts.

How to use a foreach function inside a transaction sql step in Agentry - SAP Mobility - SCN Wiki
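For illustration only, here is a rough sketch of the kind of single statement the foreach could assemble. The table and column names (EXCHANGE_TBL, USERID, RECORDID, LASTUPDATE) are made up, and the literal SELECTs are written in SQL Server style; Oracle or DB2 would need FROM DUAL or SYSIBM.SYSDUMMY1 on each line:

    -- one INSERT fed by many SELECTs (one per client record)
    -- instead of 1500 separate INSERT statements
    INSERT INTO EXCHANGE_TBL (USERID, RECORDID, LASTUPDATE)
    SELECT 'CAMERON', '1001', '2015-06-01 10:00:00' UNION ALL
    SELECT 'CAMERON', '1002', '2015-06-01 10:05:00' UNION ALL
    SELECT 'CAMERON', '1003', '2015-06-01 10:07:00'

Each SELECT line would be emitted by one pass of the foreach over the client collection.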

Former Member

Finally getting the time to implement this - the query is created very quickly, but the last pass through the loop adds a UNION ALL, so the SQL is not valid.

How do you detect the last record in the object and not put the UNION ALL?

Former Member

After the foreach loop you can add a query that will not update anything, like
Update X where 1 = 2

(Sorry, I don't have the application that I did the foreach in with me, so I can't show the script I did.)
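Picking up the WHERE 1 = 2 idea - since the dangling piece here is a UNION ALL, one way to apply it (a sketch only, not the original script, reusing the hypothetical names from the sketch above) is to let the foreach always append UNION ALL and then close the chain with a final SELECT that can never return a row:

    INSERT INTO EXCHANGE_TBL (USERID, RECORDID, LASTUPDATE)
    SELECT 'CAMERON', '1001', '2015-06-01 10:00:00' UNION ALL
    SELECT 'CAMERON', '1002', '2015-06-01 10:05:00' UNION ALL
    -- appended once, after the foreach: it gives the final UNION ALL an operand,
    -- while WHERE 1 = 2 guarantees it never contributes a row
    SELECT NULL, NULL, NULL FROM EXCHANGE_TBL WHERE 1 = 2

WHERE 1 = 2 can never be true, so the closing query does nothing except make the script parse.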

Former Member

Works perfectly, thanks Stephen.

Answers (4)

Former Member

Hi,

Firstly, sorry for the delay in replying to the help you are giving - I just moved house and had a few issues getting the internet working.

The situation is that on the Android and iOS devices I really like the quick search functionality in a tile list of objects (I cannot get it working on the WinCE device, which is a pain). We were using the out-of-the-box Item Search functionality (it was a module in the older versions), which connects to the back end to perform the search and returns those objects. But it is just not quick enough if someone wants to use this mobile solution to, say, find where an item is located to go get it or return it to the shelf. One user said he was searching for "a" in the description, bringing back all the items in his store, and then using the quick search on the tile list to do his actual search. After the download it worked brilliantly, so I thought I would copy the functionality but include a fetch to load all the items in the storeroom (which is a lot).

I could use pushes to keep the list up to date rather than a main fetch, but the item balances will be out of date when the user issues something until the next push, so a "main" fetch seemed like the way to go - but it takes too long to do the sync.

If I could use the tile list quick search functionality over a complex table, that would have been ideal (I am not wanting to perform any transactions against this data set, just have view access to it).

It sounds like the foreach approach will be the best thing for me to try to improve the fetch time, and the JVM optimisation info will be great in general too.

Thanks heaps,

Cameron.

mark_pe
Active Contributor

Cameron,

We have a special program called Customer Outreach where you can learn about fetching and how to improve it. If you are interested, let me know. It is part of the support maintenance.

Best Regards,

Mark Pe
SAP Platinum Support Engineer

Former Member

Hi Mark,

That sounds very helpful - I would like to get involved.

Thanks,

Cameron

mark_pe
Active Contributor

Cameron,

Hi. You may follow me and send me a direct message and we can set one up. It will require time for a web session.

Regards,

Mark Pe

SAP Platinum Support Engineer

mark_pe
Active Contributor

Cameron,

Hi. I agree with the suggestions from my fellow SAP support engineering, consulting, and SAP system integration partner colleagues (from Kevin to Bill). The out-of-the-box SAP Inventory Manager for Maximo takes only the generic Inventory Manager objects. This is just the standard approach; SAP Inventory Manager has no knowledge of how your own company plans to use it.

The typical enhancement an SAP consultant would do is to study the business flow of your Inventory Manager and modify the out-of-the-box application to best suit your needs. Let us say you have an inventory stock room (storing all your parts) and about 2 employees who do the inventory count. If the same employees do the count for the entire stock room, then I would agree that you may have 2500 objects in it.

But what you can also do is sub-divide the fetch of items into the multiple rows of your stock room. A stock room may have row 1, row 2, row 3, and each of those rows may have fewer than 2500 objects. So you may start enhancing the fetch for the inventory parts into a row 1 fetch, row 2 fetch, and row 3 fetch. This also helps the user start counting in the first row of the stock room right away.
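To make the row-by-row idea concrete, here is a sketch of what a restricted fetch query could look like against Maximo's INVBALANCES table. The LOCATION value and the ROW1% bin pattern are placeholders for whatever the user would select, not values from the thread:

    -- fetch only the balances for the storeroom row the user chose at login
    SELECT ITEMNUM, BINNUM, CURBAL
    FROM INVBALANCES
    WHERE LOCATION = 'CENTRAL'
      AND BINNUM LIKE 'ROW1%'

The same query with ROW2% or ROW3% becomes the second and third fetch, so each sync moves only a fraction of the 2500 items.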

Part of the enhancement can be a kiosk-type design. When the technician logs in, they specify what area of the stock room they would like to count, then press sync to download only what is needed.

This is also true if you are a big company with multiple stores around the world. Your kiosk design can specify the type of location and modify the fetch accordingly.

If you are just trying to do a modification where one database or one cache does all the processing, then you may be limited.

Part of the beauty of designing in Agentry is how quickly you can follow a template and create rules and modifications to tweak the out-of-the-box SAP Inventory Manager for Maximo (IBM) to work with your flow.

So in summary, try to enhance the mobility application based on the real workflow, and try not to do all of the above in one shot. The customer will be happier that you simplified the approach for them, and at the same time you actually speed up the transmit.

Note: Our SAP Services team normally does this type of engagement on a project-by-project basis. You may need to talk to your SAP account executive if you need to.

But this is just a suggestion.

Note: Device processing and backend processing can both affect the transmission. Not only does your backend do work, your client does work as well. Optimizing the query for both the client and the backend can improve it. Based on my suggestion above, if you optimize the workflow to work on fewer objects per fetch, you will get a better return on your investment and better scalability of your solution.

Have a great day. Thanks for posting your question in SAP Community Network.

Best Regards,

Mark Pe
SAP Senior Support Engineer

bill_froelich
Product and Topic Expert

Cameron,

That is really too many objects.  A technician doesn't need 1500+ work orders on their device.  They will never know what WO to act upon.  My recommendation is to filter the data down so you have a more manageable number.

--Bill

kevin_xu4
Active Participant

Hi Cameron,

Did you try increasing the memory setting in SMP to check whether it improves things?

http://scn.sap.com/community/developer-center/mobility-platform/blog/2014/09/19/smp-30-analyze-and-a...
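For reference, a minimal sketch of where that memory setting lives, assuming the props.ini layout described in the linked blog: the SMP 3.0 server's JVM heap is controlled by the -Xms/-Xmx lines in props.ini, and the values below are placeholders rather than a recommendation:

    -Xms2048m
    -Xmx4096m

Raising -Xmx gives the Agentry server more headroom during large exchanges, at the cost of memory on the host.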

And as mentioned in your description, you want to improve the exchange time during the subsequent fetches - is it necessary to update the mass data in a single sync? Have you tried separating it into different tables and updating them over several syncs?

Hopefully this is helpful.

Best Regards,

Kevin.