
SAP HANA NLS

Former Member
0 Kudos

Can you please provide me with the information below, based on your past experience with other clients:

• Should the NLS implementation be performed before or after the HANA migration activities? What are the pros and cons of each approach?

• Oracle is a row-based technology, while HANA and Sybase IQ are column-based. Can NLS be connected to an already migrated HANA environment afterwards?

• Can we integrate another Sybase database with the Sybase IQ NLS database after the NLS implementation, and what are the pros and cons?

It would be appreciated if you could send me the above information as soon as possible. Thanks in advance.

Accepted Solutions (0)

Answers (2)


Former Member
0 Kudos

Hi Siva,

I would say it depends on your current DB size and on the migration procedure.

However, you should be aware of one important fact, which SAP did not tell us at the end of 2013 when they wrote an NLS concept for us...

Just yesterday my frustration exploded here: http://scn.sap.com/message/16337625

From my point of view, NLS is not a mature, well-thought-out solution!

You have to be aware of many topics which could lead to issues in your daily application management...

Three problematic examples - besides the issue in my posting above:

1) If you extract ERP data and, for whatever reason, the DataSource delivers records with a date that has already been used for archiving, your DSO activation will fail, and you have to reload all archived data up to that date before you can activate.

(The only alternative is to implement a deletion routine, but then of course that data is not updated in the DSO; a sketch of such a routine follows after this list.)

2) If you load data records (e.g. from an external source) in which key fields such as document number and item are repeated (because the number range started over), you can end up with the old data in the archive and the same key values again in the active DSO table. Now you want to reload from NLS for the reason described in 1)? Forget about it! The reload fails with a "duplicate key" error, and there is no proper standard option to, for example, skip or overwrite the DSO data (a rough sketch of a custom skip routine follows at the end of this post). This is embarrassing!

3) If you have archived your write-optimized DSOs, for example so that you can delete the upper layers and reload everything with new transformation (TRFN) logic, you would assume you can simply trigger a DTP that also reads the archive. Forget about it! First you have to reload ALL records from the archive; otherwise the upload to the next layer will not be in the correct sort order. The write-optimized DSO can be loaded request by request in the DTP, but NLS cannot.
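
For point 1), the deletion-routine workaround boils down to dropping every incoming record whose date falls into the already archived time slice before activation. The following is only a minimal, self-contained sketch of that idea; the report name, field names and cutoff date are invented for illustration, and in a real system the single DELETE statement would sit in the transformation's start routine against SOURCE_PACKAGE.

REPORT z_nls_cutoff_filter_demo.
* Hypothetical sketch of the "deletion routine" workaround for point 1):
* drop every record whose date lies inside the already archived time
* slice, so that DSO activation does not collide with the NLS time slice.
* All names and values below are invented for this demo.
TYPES: BEGIN OF ty_rec,
         doc_no     TYPE c LENGTH 10,
         pstng_date TYPE d,
       END OF ty_rec.

DATA lt_source_package TYPE STANDARD TABLE OF ty_rec WITH DEFAULT KEY.

* Assumed cutoff: the last day that has already been archived to NLS.
CONSTANTS lc_archive_cutoff TYPE d VALUE '20131231'.

* Two demo records: one inside, one outside the archived time slice.
lt_source_package = VALUE #( ( doc_no = '0000000001' pstng_date = '20130601' )
                             ( doc_no = '0000000002' pstng_date = '20140601' ) ).

* The actual workaround is this single statement; in a real start routine
* it would be DELETE SOURCE_PACKAGE WHERE ... instead.
DELETE lt_source_package WHERE pstng_date <= lc_archive_cutoff.

LOOP AT lt_source_package INTO DATA(ls_rec).
  WRITE: / ls_rec-doc_no, ls_rec-pstng_date.
ENDLOOP.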

These are only the "highlights" that you should consider in your application management.
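
For point 2), since there is no standard skip/overwrite option, the only way I see is a custom routine that drops reloaded records whose keys already exist in the active DSO table. Again only a rough, hedged sketch with invented names and values; in a real routine the "existing" keys would be selected from the DSO's active table before the reload, not hard-coded.

REPORT z_nls_duplicate_skip_demo.
* Hypothetical sketch for point 2): BW offers no standard "skip or
* overwrite" choice when an NLS reload hits keys that already exist in
* the active DSO table, so a custom routine has to drop the duplicates
* itself. All names and values below are invented for this demo.
TYPES: BEGIN OF ty_key,
         doc_number TYPE c LENGTH 10,
         item       TYPE n LENGTH 6,
       END OF ty_key.

DATA: lt_active TYPE SORTED TABLE OF ty_key WITH UNIQUE KEY doc_number item,
      lt_reload TYPE STANDARD TABLE OF ty_key WITH DEFAULT KEY.

* Keys already sitting in the active DSO table (in a real routine these
* would be read from the DSO's active table before the reload).
lt_active = VALUE #( ( doc_number = '0000000001' item = '000010' ) ).

* Records coming back from the NLS archive; the first one is a duplicate.
lt_reload = VALUE #( ( doc_number = '0000000001' item = '000010' )
                     ( doc_number = '0000000002' item = '000010' ) ).

* Skip every reloaded record whose key is already active, instead of
* letting the reload fail with a "duplicate key" error.
LOOP AT lt_reload INTO DATA(ls_rec).
  READ TABLE lt_active TRANSPORTING NO FIELDS
       WITH TABLE KEY doc_number = ls_rec-doc_number item = ls_rec-item.
  IF sy-subrc = 0.
    DELETE lt_reload.
  ENDIF.
ENDLOOP.

* Only the non-duplicate record survives.
LOOP AT lt_reload INTO ls_rec.
  WRITE: / ls_rec-doc_number, ls_rec-item.
ENDLOOP.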

Best regards,

Martin

surendra_p
Active Participant
0 Kudos

Hi Martin,

I went through the above post, which is about NLS.

We are implementing a data archiving process, and when I try to archive data from a write-optimized DSO I get the error "Data area archived empty". I am archiving on a request basis only, and it still throws this error.

Please help if you know anything about this issue.

Thanks in advance.

Regards

Surendra

Former Member
0 Kudos

Hi Surendra,

Are you trying to archive data from a classical DSO or from an ADSO?

As far as I know, ADSOs can only be archived as of version 7.50.

BR, Martin

surendra_p
Active Participant
0 Kudos

Hi Martin,

When I archive data from a classical DSO, everything works fine, but when I try to archive data from a write-optimized DSO I get the error "Data area archived empty".

The DSO is not of type ADSO.

Please help me if you have any solution. We are currently on SAP BW on HANA version 7.40 only.

Thanks in Advance

Regards

Surendra

Former Member
0 Kudos

Hi.

Sorry, I don't know more than what is stated on this page either: https://wiki.scn.sap.com/wiki/display/BI/Archiving+Business+data+in+BW+-+Troubleshooting

Try Google or open an SAP note.

BR, Martin

0 Kudos

Hello Surendra,

For a write-optimized DSO (WDSO) it is request-based archiving, not time-slice archiving.

Also, you cannot partially archive a DTP request here.

Check the characteristic for the time slice in the data archiving process.

You can specify the time characteristic (for example, data records older than one day).

That kind of error is thrown when there is no data to archive for the restriction you specified.

Hope this helps.

Thanks & Regards,

Priya

surendra_p
Active Participant
0 Kudos

Hi Priya,

Thanks for your reply.

I am archiving data based on the time-slice characteristic "request created date" only; please refer to the screenshot below.

I am able to archive data up to a certain request, but after that I am unable to do so; it throws an error.

Regards

Surendra

0 Kudos

Hello Surendra,

You can try the following archiving request restriction, if possible.

Thanks & Regards,

Priya

surendra_p
Active Participant
0 Kudos

Hi Priya,

Thanks for your information.

Is there any restriction on how many requests we can archive at a time? As per my understanding, there is none.

I passed the selection exactly as you said; please refer to the screenshot below.

When I pass a selection condition up to the highlighted request, the data is archived successfully, but after that, if we pass 1 day, 1 month, or 1 year, the job is automatically cancelled.

Could you please share your contact details if you don't mind (Surendrap595@gmail.com)? I will share my screen.

Thanks & Regards

Surendra

surendra_p
Active Participant
0 Kudos

Hi Priya,

Thanks a lot for your information; the issue is finally resolved.

Regards

Surendra

former_member200967
Active Participant
0 Kudos

Hi Surendra,

Can you please share the final solution here and mark the thread closed so that it can help anyone who has the same problem as yours?

Regards

Manpreet

surendra_p
Active Participant
0 Kudos

Hi All,

The issue while archiving data via NLS from a write-optimized DSO (WDSO) was "Data area archived empty".

Please go through the following link

former_member186053
Active Contributor
0 Kudos

Hi Siva,

The docs below might be helpful to you.

Also have a look at the docs submitted by

Regards,

Vengal.