
Parallel CPU usage with SNP Optimizer

Former Member

Hello, I would like to know whether it is now possible to execute SNP optimization runs that utilize more than one CPU on the optimizer server. Previously, an optimization run could only be solved on a single CPU, but I have read that recent ILOG CPLEX releases support solving an optimization run on multiple CPUs, thereby significantly reducing runtimes. I have not been able to find any information from SAP on whether this capability has been enabled in the SNP optimizer. It would greatly improve solve times for discrete problems.

Question

For what class of problems can parallel CPLEX improve performance?

Answer

Parallel CPLEX can significantly improve performance for certain classes of problems. It works particularly well with:

  • MIPs, especially with CPLEX 12.2 or later;
  • barrier method for LPs, QPs and QCPs;
  • concurrent optimization for LPs and QPs.

With MIPs, CPLEX processes nodes of the branch-and-bound tree on separate threads. Adding processors or cores can cut the run time in half or better on some models. CPLEX 12.2 and later also perform parallel processing at the root node. For earlier versions of CPLEX, however, the majority of the MIP runtime needs to be spent processing nodes for parallel CPLEX to improve performance.
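To illustrate the idea of solving branch-and-bound subtrees on separate threads, here is a toy 0/1 knapsack solver in Python. This is only a sketch of the concept, not CPLEX's actual implementation: the two root subtrees (take vs. skip the first item) are handed to separate threads, each pruning with an LP-relaxation bound. Note that CPython's GIL prevents a real speedup for pure-Python CPU-bound work; CPLEX uses native threads.

```python
from concurrent.futures import ThreadPoolExecutor

def upper_bound(values, weights, capacity, i, current):
    """LP-relaxation bound: greedily fill remaining capacity,
    allowing one fractional item (items must be sorted by value density)."""
    bound, cap = current, capacity
    for v, w in zip(values[i:], weights[i:]):
        if w <= cap:
            bound, cap = bound + v, cap - w
        else:
            bound += v * cap / w
            break
    return bound

def subtree_best(values, weights, capacity, i, current, incumbent=0):
    """Depth-first branch and bound over items i..n; returns the best value found."""
    incumbent = max(incumbent, current)
    if i == len(values) or upper_bound(values, weights, capacity, i, current) <= incumbent:
        return incumbent  # leaf reached, or subtree pruned by its bound
    if weights[i] <= capacity:  # branch: take item i
        incumbent = subtree_best(values, weights, capacity - weights[i],
                                 i + 1, current + values[i], incumbent)
    return subtree_best(values, weights, capacity, i + 1, current, incumbent)  # skip item i

def parallel_knapsack(values, weights, capacity):
    # Sort items by value density so the relaxation bound above is valid.
    order = sorted(range(len(values)), key=lambda j: values[j] / weights[j], reverse=True)
    v = [values[j] for j in order]
    w = [weights[j] for j in order]
    # Branch at the root on item 0 and explore each subtree on its own thread.
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(subtree_best, v, w, capacity, 1, 0)]  # skip item 0
        if w[0] <= capacity:                                         # take item 0
            futures.append(pool.submit(subtree_best, v, w, capacity - w[0], 1, v[0]))
        return max(f.result() for f in futures)
```

In a real parallel MIP solver the threads also share the incumbent so that a bound found in one subtree prunes the others; this sketch omits that for brevity.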

For LPs, QPs and QCPs, the barrier optimizer of CPLEX runs in parallel. It parallelizes well because most of its computation is concentrated in the solution of a single linear system of equations at each iteration.

For LPs and QPs, CPLEX also offers a concurrent optimization procedure: it solves your model with a different LP algorithm on each available processor and stops as soon as the first one has solved the problem. This feature is not available for QCPs, since CPLEX currently uses only the barrier method to solve such models. Note that CPLEX 12.2 and later use concurrent optimization for the root-node solve by default, potentially improving the run time for MIPs where the root-node solve comprises a significant share of the total solve time.
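The race pattern behind concurrent optimization can be sketched in a few lines of Python. This is not CPLEX's internals: the two "methods" below are hypothetical stand-ins that simulate work with sleeps, and the point is only the structure of running competing algorithms and taking the first finisher.

```python
import time
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

def fast_method(problem):
    time.sleep(0.01)   # pretend this algorithm suits the problem well
    return ("fast", min(problem))

def slow_method(problem):
    time.sleep(0.5)    # pretend this algorithm struggles on it
    return ("slow", min(problem))

def concurrent_solve(problem):
    """Run both methods at once and return the first result available."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = {pool.submit(m, problem) for m in (fast_method, slow_method)}
        done, not_done = wait(futures, return_when=FIRST_COMPLETED)
        for f in not_done:
            f.cancel()  # best effort; an already-running thread keeps going
        return next(iter(done)).result()
```

Because both methods compute the same answer, the caller gets a correct result no matter which algorithm wins; the win is purely in elapsed time.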

Accepted Solutions (0)

Answers (2)

thomas_engelmann
Explorer

Hi John!

Sorry, I found your question only two years late, but better now than never:

  • It is possible to plug any of the three large commercial solvers, in their latest versions, underneath the SNP Optimizer. The drawback is that you need your own license for this, but you can then use the full feature set of that solver. You can find details at https://css.wdf.sap.corp/sap/support/notes/1756554
  • In most performance-critical scenarios, product decomposition is used. At the end of 2013 we enhanced it so that it can solve several sub-scenarios in parallel; working with 6 threads, we observed speed-ups of up to 5x. Exploiting knowledge of the problem structure showed better performance than the solvers' very general parallelization approaches. To activate the parallelization, you specify the number of allowed threads in the SNP optimization profile.
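The shape of that decomposition approach can be sketched as follows. Everything here is hypothetical (the function name, the toy "plan", the scenario data): the real activation is simply the thread setting in the SNP optimization profile, and the sub-runs are full optimizations, not the trivial computation shown.

```python
from concurrent.futures import ThreadPoolExecutor

def solve_subscenario(scenario):
    """Hypothetical stand-in for one decomposed SNP sub-run:
    here it simply doubles each product's demand as its 'plan'."""
    return {product: demand * 2 for product, demand in scenario.items()}

# Product decomposition splits the model into independent sub-scenarios,
# each covering a disjoint set of products.
subscenarios = [{"A": 10, "B": 5}, {"C": 7}, {"D": 3, "E": 9}]

with ThreadPoolExecutor(max_workers=6) as pool:   # 6 threads, as in the profile
    plans = list(pool.map(solve_subscenario, subscenarios))

# Because the sub-scenarios are disjoint, their plans merge trivially.
result = {p: q for plan in plans for p, q in plan.items()}
```

The reason this can beat the solver's own parallelism is that independence between sub-scenarios is known in advance, so no synchronization between the parallel solves is needed.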

Best regards,

Thomas

former_member584840
Participant

Hi, not really answering your question, but we are currently looking at putting our optimizer onto the Amazon cloud, which has an expandable model.

This wiki points to how it is done.

http://wiki.sdn.sap.com/wiki/display/SCM/SCM+Optimizers+in+the+Cloud

And I have just been talking to the Amazon guy this morning, who sent me these links on instance types and pricing.

http://aws.amazon.com/ec2/instance-types/

http://aws.amazon.com/ec2/pricing/

So this may be an option. We are still in test mode, so we will likely set this up, though it seems that with the pure on-demand model you need to wait a few extra minutes each time for your instance to be copied to a server before you run the job.

With our current basic setup we can only run one SNP optimizer at a time (but we can run an SNP optimizer at the same time as the PPDS optimizer).

Good luck.

j