
TDMS Data Transfer Problem

Former Member
0 Kudos

Hello, I have a problem in TDMS in the Data Transfer phase.

This phase had been running for two weeks and concluded with an RFC error on 12 tables. The RFC connection itself is fine, so I thought the problem might be the duplicate-key issue.

First I read Note 1366729 (TDMS data transfer error: duplicate keys for many objects), created the report program, and executed it. But after searching the SDN forums, I came across this link

so I decided to kill the process and changed the write behaviour from Array Insert to Modify as a global parameter for the remaining tables. I executed the Data Transfer phase again, and now it shows a total of 1400 tables that have not been created or processed.

What happened? I don't understand how, with only 12 of 14120 tables remaining, this situation arose. The result is the same whether I change the write behaviour to Array Insert, Modify, or Single Insert. And what about the write behaviours Dynamic Single Insert and Sum-up, could they be useful?

Any thoughts or advice would be appreciated.

Anatolii

Accepted Solutions (1)

suman_pr
Active Participant
0 Kudos

Hi,

Please check the RFC_NO_AUTHORITY error; it should mention for which function group the authorization check is failing. If this is specified, please add the appropriate authorization to the concerned object (usually S_RFC) using the role maintenance transaction PFCG.

Regards,

Suman

Former Member
0 Kudos

Hello Suman,

the RFC is OK. One question: I just executed the program from the note mentioned in my first message again, to generate the table, and then ran the next program for the calculation. What type of write behaviour do I have to set so that the data is not transferred again and I don't lose another week on it 😃?

Right now I have it set to Modify.

Regards,

Anatolii

Answers (1)

Former Member
0 Kudos

Hi Anatolii,

Can you please let us know more about the errors?

Additionally, check for more details in the extended monitor log. Please also check whether any processes are running in the receiver system and whether there are any dumps (ST22) in the receiver system.

Regards, Santosh

Former Member
0 Kudos

Hi Santosh,

It gives an RFC_NO_AUTHORITY dump. I just changed the profile for the TDMS receiver user (SAP_ALL) 😃

It also gives the same dump in the sender system.

Will this help?

Anatolii

sunny_pahuja2
Active Contributor
0 Kudos

Hi,

Please check that the user you are using for communication between the central, sender, and receiver systems contains the roles below:

SAP_DMIS_EXT_DD_ALL

SAP_DMIS_EXT_DD_M_ALL

SAP_DMIS_EXT_DD_RFC

SAP_TDMS_USER

SAP_TDMS_USER_EXT

Thanks

Sunny

Former Member
0 Kudos

Hello Sunny,

They have those roles. The problem now is that when I execute the Data Transfer phase again, it aborts within 30 seconds, with no logs and no dumps =(

It only says that the process was cancelled.

Regards,

Anatolii

sunny_pahuja2
Active Contributor
0 Kudos

Hi,

What is the status of the RFCs in the first phase right now?

It should be green.

Thanks

Sunny

Former Member
0 Kudos

Sunny, it is green 😃

Regards,

Anatolii

Former Member
0 Kudos

Hi Sunny, can you tell me what you did, or which write behaviour you selected, after applying Note 1366729 (TDMS data transfer error: duplicate keys for many objects)?

It's the same problem that you had in this link =))

I would appreciate it if you could help me!

With best regards,

Anatolii

Former Member
0 Kudos

Hi Anatolii,

Change the write behaviour to "Single Insert".

Single Insert: The data is written separately to the database. If a duplicate key exists, a warning message is written to the log and the data transfer continues.
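To illustrate the difference between the write behaviours, here is a minimal sketch (plain Python, not actual TDMS or ABAP code; the function names and record layout are my own invention purely to show the semantics):

```python
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("transfer")

def array_insert(table, records):
    """All-or-nothing bulk insert: a single duplicate key aborts the whole block."""
    if any(r["key"] in table for r in records):
        raise KeyError("duplicate key -> whole block fails")
    for r in records:
        table[r["key"]] = r["data"]

def modify(table, records):
    """Upsert: existing rows are overwritten, new rows are inserted."""
    for r in records:
        table[r["key"]] = r["data"]

def single_insert(table, records):
    """Row-by-row insert: duplicates are logged as warnings and skipped."""
    for r in records:
        if r["key"] in table:
            log.warning("duplicate key %s skipped", r["key"])
            continue
        table[r["key"]] = r["data"]
```

So with Single Insert, a duplicate key only costs you a warning in the log, and the rest of the records are still transferred; with Array Insert the same duplicate fails the whole block.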

Regards, Santosh

sunny_pahuja2
Active Contributor
0 Kudos

Hi,

> Hi Sunny, can you tell me what you did or which write behaviour you selected after using this note 1366729 - TDMS data transfer error: duplicate keys for many objects

You made the same mistake I did. The note you implemented is not the correct one: if you read its prerequisites, it clearly says it only applies when your project is very old, more than one year. Only then should you implement this note.

I made the same mistake and implemented the note; after implementing it, all programs related to TDMS were reset and the data transfer started again from the beginning. Then I raised this with SAP and they told me the same thing. SAP then changed some control tables related to TDMS, ran the data transfer, and it finished in 10 minutes. I don't know which tables SAP changed, as they declined to give the details to the customer.

So I suggest raising it with SAP; they will correct it, as you are now in the same situation I was in a few days ago.

Thanks

Sunny

Former Member
0 Kudos

Thank you very much for the help, Sunny, and sorry for the late reply on my part 😃

We corrected the problem with the SAProuter so that SAP could connect to our systems. They told us that everything is OK on their side and that we have to correct the RFC problems on our part.

The RFC is OK, but there are still problems with the same tables. I see that every time we launch the activity "Start Data Transfer", the tables are filled with records for about 4 hours after the launch; then the errors occur and the process stops.

My question: is there a way to calculate the size of the header tables that control this conversion for each of the remaining tables, from the productive system to the receiver system? I see that they are counted in blocks, not records, and I would like to understand how much is left to be filled.

The tables are: X_MKPF, X_S001, X_S033, X_S120, X_S121, X_S124, X_601, X_602, X_WKBK.

With best regards, Anatolii

0 Kudos

You can check the number of records selected for a particular conversion object via transaction DTLMON.

Enter the mass transfer ID and the access plan ID. On the subsequent screen, select the "Runtime Information" tab.

You can get the mass transfer ID from the activity "Define Mass Transfer ID". The access plan ID you can get from the logs of any conversion object of the activity "Start Data Selection".

Isha