on 11-04-2013 10:56 AM
Hello everyone,
we receive an error message when executing an R script embedded inside a SQLScript procedure. With smaller data sets the procedure works fine; with our larger data set the call fails with:
Could not execute 'CALL PAL.XXX.XXXX()' in 1:13.629 minutes .
SAP DBTech JDBC: [2048]: column store error: [2048] PAL.XXX.XXXX: line 5 col 1 (at pos 126): GenericFailure exception: column store error: search table error: [34082] Execution of R script failed.;Error: cannot allocate vector of size 280.5 Mb
stack trace:
No traceback available
Does anybody know how to increase the maximum amount of memory allocated to our R script? Our R server is installed on a separate SLES 11.3 32-bit environment with 4 GB of RAM.
Thanks for helping and best regards,
Florian
Hi Florian,
Maybe there is a memory leak; please share the code you are using here.
Also, as @JohnAppleby mentioned, being on the supported versions is the best place to start, and as noted you can try calling gc() in your script to free memory held by objects that are no longer needed.
Also, please share your session info and the RAM capacity of your system.
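For reference, here is a minimal sketch of the diagnostics worth posting along with the script (run in the same R session / Rserve instance that executes the procedure; the free call assumes a Linux host):

# R version, platform and loaded packages
sessionInfo()
# current memory usage; also triggers a garbage collection
gc()
# overall memory on the host (Linux only)
system("free -m", intern = TRUE)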
Regards,
Krishna Tangudu
Googling for "R Error: cannot allocate vector of size" shows that this is a generic R message.
I found a couple of links (check the first one, on Stack Overflow), but not a lot of useful information.
One recurring theme is that 32-bit environments handle memory poorly. I'd suggest, at a minimum:
- replacing your R environment with 64-bit SLES and 64-bit R;
- calling gc() to "flush" memory before you run your script (see the sketch below).
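On the gc() point, here is a minimal sketch (not your actual procedure, just an illustration) of dropping large intermediates inside the embedded script so the allocator can reuse the space:

# stand-in for a large intermediate result
tmp <- matrix(rnorm(1e6), ncol = 10)
# keep only the small derived result
agg <- colMeans(tmp)
# release the big object and return the memory to the allocator
rm(tmp)
gc()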
This link in particular has some nice insights into memory management in R.
http://stat.ethz.ch/R-manual/R-devel/library/base/html/Memory-limits.html
It's basically due to the underlying OS memory management.
Make sure your R running environment has the proper settings.
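As a quick check against the limits described on that page, you can verify which build you are actually running (a hedged sketch; the ulimit call assumes a Linux shell):

# 4 = 32-bit build (roughly 2-3 GB of address space), 8 = 64-bit build
.Machine$sizeof.pointer
# virtual memory limit imposed by the OS, if any (Linux)
system("ulimit -v", intern = TRUE)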
Another approach would be to change your R code so that it adapts to the memory constraints.
This package looks promising:
http://cran.r-project.org/web/packages/bigmemory/index.html
http://cran.r-project.org/web/packages/bigmemory/bigmemory.pdf
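A hedged sketch of the bigmemory idea (parameter names taken from the package documentation linked above; do check them against the version you install):

library(bigmemory)
# a file-backed matrix lives on disk, so it is not bound by the
# 2-3 GB address space of a 32-bit R process
x <- filebacked.big.matrix(nrow = 1e6, ncol = 10, type = "double",
                           backingfile = "data.bin",
                           descriptorfile = "data.desc")
# indexed like an ordinary matrix, but only the touched blocks are loaded
x[1, ] <- rnorm(10)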
Here is a bit more on large-memory data sets in R:
Taking R to the Limit (High Performance Computing in R), Part 2 -- ...
Are you using R 2.15, Rserve 0.6-8 and HANA Rev. 70? That is the currently supported combination and a good place to start.
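To confirm the R side of that stack (the HANA revision has to be checked on the database itself), something like this should be enough:

# should report an R 2.15.x build
R.version.string
# should report 0.6-8 for the supported setup
packageVersion("Rserve")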
Hello,
it looks like I have a unique issue with the R integration. Is there really nobody who can assist us with this?
Thanks and regards,
Florian
Hi Florian,
Looking at your issue, my impression is that it's not a matter of the RAM available per process, but of the contiguous address space available in your system. Can you try the execution with smaller data sets, gradually increasing the volume until you reach your target data set, as in the sketch below?
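A hedged sketch of that incremental test (the data and the run_step function are purely illustrative and stand in for your actual script body):

run_step <- function(dat) {
  # stand-in for the real work done by the embedded R script
  coef(lm(y ~ ., data = dat))
}

dat <- data.frame(y = rnorm(1e5), x1 = rnorm(1e5), x2 = rnorm(1e5))
for (frac in c(0.1, 0.25, 0.5, 1)) {
  slice <- dat[seq_len(round(frac * nrow(dat))), ]
  cat("rows:", nrow(slice), "\n")
  print(run_step(slice))
  # memory footprint after each step; the size at which it fails shows where the limit is
  print(gc())
}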
One more thing to consider: is there any point in your script where you might have a memory leak?
Regards,
Vasanth