I am working on the database design for an application. The design is already in place: tables, views, stored procedures, etc. There are 7 tables in total, 8-10 views, 4-5 stored procedures, and some triggers may follow. Of the 7 tables, 3 are transactional tables whose data will keep growing. Depending on the specific implementation of the application, the data may grow exponentially, and performance would surely suffer. Because of reporting requirements, if I continue to use this single database, even the reports are prone to crash.

What would be the best approaches to handle performance and archival/staging together? Performance in terms of both reporting and the application.

I was thinking of writing a few scripts to transfer the data (keeping its current table structure) from the current database to a new one, perhaps consolidating it into a single table or fewer tables than in the original database. That way, even if I purge that data (beyond a designated retention period) from the original database, it would still be available in the new database, in case a customer later raises a new reporting requirement against the existing data.
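The copy-then-purge idea above can be sketched as follows. This is only an illustration of the pattern using SQLite in Python; in SQL Server it would typically be an `INSERT ... SELECT` plus `DELETE` inside one transaction, run by a scheduled job or an SSIS package. The table and column names here are hypothetical.

```python
import sqlite3

# In-memory database standing in for the live transactional database.
live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, placed_on TEXT, amount REAL)")
live.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "2014-01-15", 10.0), (2, "2015-06-01", 20.0), (3, "2015-07-30", 30.0)],
)

# Archive table keeps the same structure as the live table, so any
# future report written against the live schema also works here.
live.execute("CREATE TABLE orders_archive (id INTEGER PRIMARY KEY, placed_on TEXT, amount REAL)")

cutoff = "2015-01-01"  # the designated retention boundary

with live:  # one transaction: copy old rows to the archive, then purge them
    live.execute("INSERT INTO orders_archive SELECT * FROM orders WHERE placed_on < ?", (cutoff,))
    live.execute("DELETE FROM orders WHERE placed_on < ?", (cutoff,))

print(live.execute("SELECT COUNT(*) FROM orders").fetchone()[0])          # rows kept live: 2
print(live.execute("SELECT COUNT(*) FROM orders_archive").fetchone()[0])  # rows archived: 1
```

Doing the copy and the delete in a single transaction is the important part: if the purge fails halfway, you never end up with rows missing from both databases.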

Could you all please shed some more light on this?
Comments
PIEBALDconsult 31-Jul-15 12:12pm    
Seek thee "Replication"?
jgakenhe 31-Jul-15 15:18pm    
I'd do a little data mart and use SSIS to do the initial and then subsequent loads. You could even do it in the same database instance and database; just use a different schema so the tables are easy to pick out. This would be perfect for non-trending reporting purposes.
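The separate-schema data mart suggested in the comment above can be sketched like this. Again this is only illustrative, using an attached SQLite database to stand in for a SQL Server schema (e.g. a `mart` schema alongside `dbo`); the table names are hypothetical, and in practice SSIS would handle the initial and incremental loads.

```python
import sqlite3

# In-memory database standing in for the application database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(1, "East", 100.0), (2, "West", 50.0), (3, "East", 25.0)],
)

# Attach a second database to play the role of the reporting schema,
# keeping the mart tables clearly separated from the live tables.
con.execute("ATTACH DATABASE ':memory:' AS mart")

# Initial load: a pre-aggregated table for non-trending reports.
con.execute(
    """CREATE TABLE mart.sales_by_region AS
       SELECT region, SUM(amount) AS total
       FROM sales
       GROUP BY region"""
)

for row in con.execute("SELECT region, total FROM mart.sales_by_region ORDER BY region"):
    print(row)  # ('East', 125.0) then ('West', 50.0)
```

Reports then query only the mart tables, so heavy aggregation never touches the growing transactional tables.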

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
