From: Hardik Belani on 2 Aug 2010 03:20

We are using Postgres as the RDBMS for our product. A requirement is coming up for a feature that will need me to store data about various data points (mainly numbers) on a time scale. A measurement is taken every few seconds or minutes and needs to be stored for statistical analysis.

So this requires numbers (integers/floats) to be stored every minute. For this I can create a table with a number and a time (maybe a time offset instead of a timestamp) as columns. But it will still require me to store a huge number of rows, on the order of a few million. The data is read-only: only inserts happen. But I need to perform all kinds of aggregation to get various statistics, for example daily averages, monthly averages, etc.

We already use Postgres for our product, so using it adds no additional installation requirement and is therefore a bit easier. Would you recommend Postgres for this kind of requirement, and will it provide the performance? Or do you recommend another database meant for such requirements? I am also searching for a good historian database in case Postgres doesn't fit.

Thanks,
Hardik
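The table-plus-aggregation design described above can be sketched as follows. This is a minimal illustration only: SQLite (via Python's sqlite3 module) stands in for PostgreSQL, and the table and column names (`measurements`, `ts`, `value`) are hypothetical, not taken from the post. In PostgreSQL the same daily-average query would typically use `date_trunc('day', ts)` instead of `date(ts)`.

```python
import sqlite3

# One row per data point: a timestamp and a numeric value, as proposed in
# the post. SQLite is used here only as a stand-in for PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE measurements (
        ts    TEXT NOT NULL,   -- measurement timestamp
        value REAL NOT NULL    -- the measured number
    )
""")
conn.executemany(
    "INSERT INTO measurements (ts, value) VALUES (?, ?)",
    [
        ("2010-08-01 00:00:00", 1.0),
        ("2010-08-01 12:00:00", 3.0),
        ("2010-08-02 06:00:00", 5.0),
    ],
)

# Daily average: group rows by the date part of the timestamp.
rows = conn.execute(
    "SELECT date(ts) AS day, AVG(value) FROM measurements "
    "GROUP BY day ORDER BY day"
).fetchall()
print(rows)  # [('2010-08-01', 2.0), ('2010-08-02', 5.0)]
```

For an insert-only workload like this, an index on the timestamp column (in Postgres, `CREATE INDEX ON measurements (ts)`) is usually what makes range-restricted aggregates over millions of rows practical.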