performance for huge table
I need to read data from an Excel file, apply the logic below to every field, and finally update a custom table.
The table has more than 35 fields, one of which is STATUS.
After picking up the data from the file, I need to check the fields one by one, starting from the 2nd field of the table: if a field has a value it contributes '1', otherwise '0'. These digits are concatenated and the result is passed to the STATUS field.
E.g. if table TEST had 5 fields:
1 A = 'ttt'
2 B = '' (empty)
3 C = '33'
4 D = '343'
5 STATUS
In this case the STATUS value should be '1011'.
Now my question: what would be the most optimized, best-performing way to write this logic, given that there will be multiple records and 35 fields to read per record?
Your doubt regarding performance is valid, but I think 35 IF ... ENDIF blocks and a DO loop of 35 iterations will not differ much. The DO loop may carry a very small overhead compared to plain IFs, but both do the same work. The real advantage of the loop is maintainability: if you had 100 fields, imagine how many lines of IF ... ENDIF you would have to write.
As MATTHEW suggested, it is better to keep the SY-SUBRC check and determine the number of iterations for the DO statement dynamically.
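A minimal sketch of that DO-loop approach, assuming a work area LS_DATA whose last component is STATUS (the structure and variable names here are my assumptions, since the actual table definition is not shown in the thread; adjust the start index if your first field is a key that should be skipped):

```abap
DATA: lo_descr  TYPE REF TO cl_abap_structdescr,
      lv_count  TYPE i,
      lv_status TYPE string.
FIELD-SYMBOLS: <lv_field> TYPE any.

* Determine the number of components dynamically via RTTS,
* so the code does not change when fields are added.
lo_descr ?= cl_abap_typedescr=>describe_by_data( ls_data ).
DESCRIBE TABLE lo_descr->components LINES lv_count.

CLEAR lv_status.
* Loop over all components before STATUS (assumed to be the last one)
DO lv_count - 1 TIMES.
  ASSIGN COMPONENT sy-index OF STRUCTURE ls_data TO <lv_field>.
  IF sy-subrc <> 0.
    EXIT.  " no such component - stop the loop
  ENDIF.
  IF <lv_field> IS INITIAL.
    CONCATENATE lv_status '0' INTO lv_status.
  ELSE.
    CONCATENATE lv_status '1' INTO lv_status.
  ENDIF.
ENDDO.

ls_data-status = lv_status.
```

For the 5-field example above (A = 'ttt', B empty, C = '33', D = '343') this builds '1011'. Since ASSIGN COMPONENT works by index on any structure, the same code handles 35 or 100 fields without modification.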
I think you can close the thread; this will be the best-optimized and cleanest code for your requirement.
Thanks & Regards,