
Out of Memory Issue while Selecting a dataset

Former Member

Hello Experts,

I'm trying to perform a join between two tables, say Table A and Table B, in a SQL Script as shown below:

it_tab = SELECT A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR

FROM <schema_name>.A,
     <schema_name>.B

WHERE A.VKORG = B.VKORG AND A.VTWEG = B.VTWEG AND A.MVGR1 = B.MVGR1

GROUP BY A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR;

and later I want to insert this dataset into a Physical Table as shown below:

INSERT into <schema_name>.C select * from :it_tab; 

During the execution of the Procedure, it throws the following error:

GenericFailure exception: column store error: search table error: [9] Memory allocation failed. Please check lines:

The result set is definitely huge, and hence it's not possible to hold it in memory.

Is there any way to handle this?

It would be really great if you could let me know of any approach.

Thanks & regards,

Jomy

Accepted Solutions (0)

Answers (3)


justin_molenaur2
Contributor

Jomy, I went ahead and posted something about iterative looping for building tables in HANA similar to an ETL process. I used this when facing OOM errors before and it worked well.

Happy HANA,

Justin

justin_molenaur2
Contributor

Hi Jomy, I just recently hit some similar issues. From your code I assume you are trying to build a large dimension through a stored procedure.

Ideally, this would be solved through an ETL tool such as Data Services. In absence of that, which I assume to be your case, you need to manage smaller executions of the same procedure. For example, by looping through a distinct set of logically grouped data you can get it to a manageable memory size.

To check this, see if you can get the SELECT part of the SQL to execute in full: enter the code into the SQL console, highlight it, right-click and choose Visualize Plan. From there, execute the plan and see what happens. If it still fails with the same error, try adding constraints in the WHERE clause.

Once you have established it can run with a logical subset, form a LOOP and execute the same logic multiple times.
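A minimal SQLScript sketch of that loop idea, using the tables and join columns from this thread; the procedure name, the NVARCHAR(4) type for VKORG, and looping per sales organization are my assumptions, not code from the original post, and `<schema_name>` must be replaced with a real schema:

```sql
-- Hypothetical sketch: load <schema_name>.C one sales organization at a
-- time instead of materializing the full join result in one pass.
CREATE PROCEDURE LOAD_C_BY_VKORG()
LANGUAGE SQLSCRIPT AS
BEGIN
    DECLARE lv_vkorg NVARCHAR(4);
    -- One loop pass per distinct sales organization
    DECLARE CURSOR c_vkorg FOR
        SELECT DISTINCT VKORG FROM <schema_name>.A;

    FOR r AS c_vkorg DO
        lv_vkorg = r.VKORG;
        INSERT INTO <schema_name>.C
            SELECT A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR
            FROM <schema_name>.A AS A
            INNER JOIN <schema_name>.B AS B
               ON A.VKORG = B.VKORG
              AND A.VTWEG = B.VTWEG
              AND A.MVGR1 = B.MVGR1
            WHERE A.VKORG = :lv_vkorg
            GROUP BY A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR;
    END FOR;
END;
```

Each pass only materializes one sales organization's share of the join, so peak memory is bounded by the largest single VKORG rather than the full 836 million rows.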

Regards,

Justin

Former Member

Hello Justin,

I've tried executing the SQL statement directly in the SQL console and it at least doesn't return an error, but the result set is around 836 million records.

Now, how do I modify the current procedure so that it processes logically grouped sets?

it_tab = SELECT A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR

FROM <schema_name>.A,
     <schema_name>.B

WHERE A.VKORG = B.VKORG AND A.VTWEG = B.VTWEG AND A.MVGR1 = B.MVGR1

GROUP BY A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR;

INSERT into <schema_name>.C select * from :it_tab;

I can think of Material No. ranges, but how can we incorporate that?
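One hedged way to incorporate material ranges: split the distinct materials into fixed buckets and load one bucket per pass. This is only a sketch; the bucket count of 10, the procedure name, and the NTILE-based split are assumptions, not code from this thread:

```sql
-- Hypothetical sketch: partition distinct MATNRs into 10 buckets and
-- insert one bucket's worth of the join result per loop pass.
CREATE PROCEDURE LOAD_C_BY_MATNR_BUCKET()
LANGUAGE SQLSCRIPT AS
BEGIN
    DECLARE i INTEGER;
    FOR i IN 1..10 DO
        INSERT INTO <schema_name>.C
            SELECT A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR
            FROM <schema_name>.A AS A
            INNER JOIN <schema_name>.B AS B
               ON A.VKORG = B.VKORG
              AND A.VTWEG = B.VTWEG
              AND A.MVGR1 = B.MVGR1
            -- Assign each distinct material to one of 10 buckets
            INNER JOIN (SELECT MATNR,
                               NTILE(10) OVER (ORDER BY MATNR) AS BUCKET
                        FROM (SELECT DISTINCT MATNR
                              FROM <schema_name>.B) AS D) AS M
               ON B.MATNR = M.MATNR
            WHERE M.BUCKET = :i
            GROUP BY A.KUNNR, A.VKORG, A.VTWEG, A.PARVW, A.PERNR, B.MATNR;
    END FOR;
END;
```

With 10 buckets each pass materializes roughly a tenth of the result; increase the bucket count if a single pass still runs out of memory.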

Regards,

Jomy

justin_molenaur2
Contributor

This seems like it may be too big to be that useful. You are essentially exploding all materials in the system for every entry in the Z table you mention based on your joins.

In effect the query says: show me all MATNRs for a given Sales Org/Dist Chan/Sourcing Group for every entry in the Z table, at a grain of Sales Org/Dist Chan/Sourcing Group/Partner Function/Person.

You could loop at, say, Sales Org or any of the other keys in your Z table, but I would take a good look at what you are trying to achieve functionally first.

Regards,

Justin

Former Member

Justin - spot on. I can't believe it is necessary to explode the data, but without understanding what you're trying to achieve it's impossible to comment.

You need to redesign the requirement so you're processing sensible result sets. As a rule of thumb, if you are materializing more than 100m rows, you will have problems.

Former Member

Hello John,

Basically, I'm trying to find the Sales Employee at the granularity of Customer (KUNNR), Sales Org (VKORG), Distribution Channel (VTWEG) and Product Group (MVGR1).

I have an Analytic View on COPA transactional data which has KUNNR, VKORG and VTWEG.

The Sales Employee is maintained in table KNVP in ECC, but to reach it I need to go via ZTable1 in ECC, which has MVGR1 and VKORG as key and contains PARVW (Partner Function).

So I join KNVP and ZTable1 on VKORG and PARVW to populate ZTable2 in HANA, which gives the Sales Employee at the granularity of KUNNR, VKORG, VTWEG and MVGR1.

Now I can't join an attribute view created on this ZTable2 with the COPA Analytic View, as we would have an M:N relationship. So I was extending this table to include MATNR as well, which I can get only by joining ZTable2 with MVKE on VKORG, VTWEG and MVGR1.

Let me know if anything is still unclear.

Regards,

Jomy

Former Member

Have you tried building a calc view with all your joins in it?

Materializing this large table is not the best idea, because it will become out of date. You could fix the problem in Data Services.

John

Former Member

Hi John,

First of all, isn't it correct that doing a join in a calculation view will kill performance?

Anyhow, I tried it, and as expected it really did.

Regards,

Jomy

former_member185165
Active Participant

Hi Jomy,

You can find the system limits by querying the M_SYSTEM_LIMITS monitoring view (documented in the System Views Reference):

SELECT * FROM M_SYSTEM_LIMITS;
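For example, to narrow the output to memory-related limits (assuming the NAME and VALUE columns documented for this monitoring view):

```sql
-- List only memory-related system limits
SELECT NAME, VALUE
FROM M_SYSTEM_LIMITS
WHERE NAME LIKE '%MEMORY%';
```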

Hope you might find this useful.

Regards,

Vijay

Former Member

Can you explain what you are trying to achieve?

As a rule of thumb you should return small result sets. Materializing huge result sets is bad.

Former Member

Hello John,

I'm trying to achieve the following:

I have a scenario where I'm joining a Z table (containing KUNNR, VKORG, VTWEG, PARVW, MVGR1 and PERNR) with MVKE on VKORG, VTWEG and MVGR1, to populate a new table containing KUNNR, VKORG, VTWEG, MVGR1, MATNR and PERNR.

This I'm doing within a HANA SQL Script.

Later on I'm creating an Attribute View on this new table and joining it with the COPA Analytic View on KUNNR, VKORG, VTWEG and MATNR.

Can you think of any other way?

Regards,

Jomy