
How to keep up report performance as data volume grows?

Former Member
0 Kudos

Hi,

I have a question. My report's data comes from a table, and every month the data in that table grows.

I am worried that one day my report's performance will become unacceptable.

What is your solution for keeping the report performing well?

Regards,

Tony

Accepted Solutions (1)


0 Kudos

Hi Tony,

One solution is to use aggregate tables in your DWH. You can then use the @AggregateAware operator in your universe to let users transparently access either the aggregated or the detailed data.

Regards,

Stratos
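To illustrate the idea, here is a minimal sketch of such an aggregate table. All table and column names (sales_fact, sales_month_agg, etc.) are hypothetical, and the date-truncation syntax varies by DBMS:

```sql
-- Hypothetical detail (fact) table that grows every month:
--   sales_fact (sale_date DATE, product_id INTEGER, amount DECIMAL(12,2))

-- Pre-aggregated table, refreshed by a batch job after each load.
-- Queries that only need monthly totals hit this much smaller table.
CREATE TABLE sales_month_agg AS
SELECT
    TRUNC(sale_date, 'MM') AS sale_month,   -- Oracle syntax; adjust per DBMS
    product_id,
    SUM(amount)            AS total_amount
FROM sales_fact
GROUP BY TRUNC(sale_date, 'MM'), product_id;
```

In the universe, a measure defined along the lines of @Aggregate_Aware(SUM(sales_month_agg.total_amount), SUM(sales_fact.amount)) would then resolve to the aggregate table whenever the query's dimensions allow it; the exact designer syntax may differ by BusinessObjects version.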

Former Member
0 Kudos

Hi,

We have no DWH...

Besides, even aggregate tables grow over time, right?

I am wondering: is it feasible to move the historical data to another location (for example, tape) every, let's say, 10 years?

Regards,

Former Member
0 Kudos

Tony,

I am wondering: is it feasible to move the historical data to another location (for example, tape) every, let's say, 10 years?

Moving data to tape is a local call on your part. Another technique is database partitioning; both Oracle and SQL Server (SQL Server 2005 and later) offer it. Basically, the database server stores the historical rows in separate physical areas based on a key column in the datamart (normally the date/time stamp), so the process is transparent to the SQL that is generated: based on the WHERE clause, the server reads only the relevant partition(s).

Thanks,

John
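As a sketch of what John describes, here is hedged SQL Server 2005+ DDL for range partitioning by year. All object names (pf_by_year, ps_by_year, sales_fact) are hypothetical, and Oracle's syntax differs:

```sql
-- 1. Define the boundary points for the partition key.
CREATE PARTITION FUNCTION pf_by_year (DATETIME)
    AS RANGE RIGHT FOR VALUES ('2006-01-01', '2007-01-01', '2008-01-01');

-- 2. Map the partitions to filegroups (ALL puts them on PRIMARY here;
--    in practice, old partitions could live on cheaper storage).
CREATE PARTITION SCHEME ps_by_year
    AS PARTITION pf_by_year ALL TO ([PRIMARY]);

-- 3. Create the fact table on the partition scheme.
CREATE TABLE sales_fact (
    sale_date  DATETIME NOT NULL,
    product_id INT,
    amount     DECIMAL(12,2)
) ON ps_by_year (sale_date);
```

Reports need no changes: a query such as SELECT SUM(amount) FROM sales_fact WHERE sale_date >= '2008-01-01' is routed by the server to the matching partition only.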

0 Kudos

Hi Tony,

you can certainly design and implement an archiving/backup strategy for your data, depending on the type of database server you are using. Some projects require a consolidated view of the historical data even after the detailed data have been archived to an external storage system (e.g. tape). In that case, the @AggregateAware operator can help you implement such a requirement.

Regards,

Stratos
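One hedged way to sketch that idea (all names hypothetical, date syntax per Oracle): summarize each period before its detail rows are archived, so consolidated reporting keeps working after the details go offline.

```sql
-- 1. Summarize the rows that are about to be archived.
INSERT INTO sales_history_agg (sale_year, product_id, total_amount)
SELECT EXTRACT(YEAR FROM sale_date), product_id, SUM(amount)
FROM sales_fact
WHERE sale_date < DATE '1999-01-01'
GROUP BY EXTRACT(YEAR FROM sale_date), product_id;

-- 2. Export the detail rows to external storage (tape, flat file, ...)
--    with your DBMS's export tool, then remove them from the live table.
DELETE FROM sales_fact
WHERE sale_date < DATE '1999-01-01';
```

An @AggregateAware measure can then fall back to sales_history_agg for years whose detail rows are no longer online.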

Former Member
0 Kudos

You are right, thank you for your advice.

Former Member
0 Kudos

Hi,

If users want to see detail data, and they need to go back in time to look at past detail data, the data volume may be very large.

Apart from partitioned tables, are there any other solutions you usually use?

Regards,

Tony

Former Member
0 Kudos

Tony,

Apart from partitioned tables, are there any other solutions you usually use?

Not really solutions, but rather principles to follow for good management of a database server:

- avoid any unnecessary activity on the database server that competes for I/O with the database application during the "online" session (such as archiving files to other storage areas, or performing DB backups/restores)

- avoid loading data during the "online" session

- before loading data, drop the associated indexes on that table; after loading, recreate the required indexes

- measure performance and keep tabs on "normal" runs for benchmark purposes; keep measuring performance to ensure "normal" runs still complete on time, and investigate any anomaly for root cause and resolve it as soon as practical

- limit the number of concurrent logins to the database application (a Web Intelligence Server limits it to 5 concurrent; however, adding more Web Intelligence Servers, either on the same BOE application server or via a cluster, increases the concurrent pool by 5 each time, which will definitely affect performance)

- if your database server storage runs on a SAN or NAS, ensure your administrators apply the best practices outlined in the following PDF (though it is written primarily for SQL Server management, several of the best practices specified would benefit any SAN/NAS approach): [http://www.computationpress.com/images/sql_disk_001_10a.pdf]
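The drop-indexes-then-reload principle above can be sketched as follows (table and index names are hypothetical; exact syntax varies by DBMS):

```sql
-- Nightly load window sketch.
DROP INDEX ix_sales_fact_date;           -- drop before the bulk load ...

-- ... bulk-load the new month of rows here, using the DBMS's loader ...

CREATE INDEX ix_sales_fact_date
    ON sales_fact (sale_date);           -- ... then rebuild afterwards

-- Finally, refresh optimizer statistics so the new rows are costed
-- correctly (e.g. ANALYZE or UPDATE STATISTICS, depending on the DBMS).
```

Loading into an unindexed table avoids maintaining the index row by row, which is usually much slower than one rebuild at the end.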

Just some thoughts, and my two cents' worth. I'm sure that other contributors here could chime in with more along these lines.

Thanks,

John

Answers (0)