Table partitioning for 2.5 Billion rows
We are planning to do volume and performance testing in our native HANA project (data loaded through BODS). Expected volume: approximately 2.5 billion rows for the initial load, covering 2 years of retail history plus YTD.
Has anybody done table partitioning at this scale? What are the issues, concerns, and best practices?
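For context, the kind of partitioning we are weighing looks roughly like the sketch below. This is illustrative only: the table, column names, and partition counts are made up, not from our actual data model.

```sql
-- Hypothetical sales-history table, hash-partitioned on a
-- high-cardinality key so rows spread evenly across partitions.
CREATE COLUMN TABLE SALES_HISTORY (
    SALES_DOC_ID  BIGINT        NOT NULL,
    POSTING_DATE  DATE          NOT NULL,
    STORE_ID      INTEGER,
    AMOUNT        DECIMAL(15,2)
)
PARTITION BY HASH (SALES_DOC_ID) PARTITIONS 16;

-- Alternative we are considering: range partitioning on the date,
-- which would keep the 2-year history and YTD data in separate
-- partitions (yearly ranges shown as an example):
-- PARTITION BY RANGE (POSTING_DATE)
--   (PARTITION '2012-01-01' <= VALUES < '2013-01-01',
--    PARTITION '2013-01-01' <= VALUES < '2014-01-01',
--    PARTITION OTHERS)
```

In particular, we are unsure whether hash or range (or a hash-range combination) is the better fit at this row count.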
Please help; I really appreciate your contributions.