I'm using a PostgreSQL database, and managing it is becoming harder and harder due to its sheer size.
The largest table contains about 2.5–3 billion records, and several other tables reach about 10 million records; the database is about 1.2 TB on disk. I believe the large table (and the tables around it) is the most problematic, since it receives (very roughly) 10 million write statements a day.
I have had issues with transaction IDs wrapping around. Luckily I now know the cure, but I would like to avoid this issue (and any other issue!) if possible, especially since the database is expected to grow even further. VACUUM takes forever on that large table.
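For context, this is roughly how I keep an eye on wraparound now (a sketch using the standard `pg_database`/`pg_class` catalogs and the `age()` function; thresholds are just my assumption, the `autovacuum_freeze_max_age` default is 200 million):

```sql
-- How close each database is to transaction ID wraparound
SELECT datname, age(datfrozenxid) AS xid_age
FROM pg_database
ORDER BY xid_age DESC;

-- The tables with the oldest relfrozenxid (the ones VACUUM must freeze soonest)
SELECT relname, age(relfrozenxid) AS xid_age
FROM pg_class
WHERE relkind = 'r'
ORDER BY xid_age DESC
LIMIT 10;
```

But monitoring only tells me when I'm in trouble; it doesn't make VACUUM on the big table any faster.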
I searched for advice on managing large databases, but most results are for other database systems; I only found a not-so-helpful presentation on SlideShare.
If anyone has good (practical) tips for keeping such a large database in good shape, please let me know.
(I hope I explained things well enough)