A question I have been considering recently is whether innovation is good for performance. If I were writing about business performance the answer would hopefully be yes – but I am considering IT systems performance.
It seems that each new version of a product is claimed by its manufacturer to perform better than the last, to be more scalable, and so forth. It is also true, however, that the products around at the moment could not have run on the hardware available even a few years ago.
The cycle of upgrades definitely serves manufacturers well, as the need to stay supported fuels software purchases, which fuel hardware purchases, and so forth. On the back of this, manufacturers also add new software features that increase the power and flexibility of the tool set. It is common, however, for upgrades to be performed purely to stay on a supported release – an indication that customers often upgrade because they have to rather than because they see a benefit in doing so.
Do we really need all this to do the job – or are we just buying into the products the manufacturers want to sell us?
An interesting example of technical innovation came to my attention recently. Data federation was recommended as a way forward for an information management system. This is an innovative way to build a data warehouse without copying and moving data into a central store. It comes from the Service Oriented Architecture stable and has been mainstream for a while. It should offer significant benefits, if you believe it really works. One of these should be that good performance is achieved by having a caching and management layer above the production databases. The fundamental question was whether it could perform well enough in the environment we needed it for. Was acceptable performance achievable using this innovative technology?
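To make the federation idea concrete, here is a minimal sketch in Python of the general pattern: a single layer answers queries by delegating to the underlying source databases and caching the results, rather than copying everything into a central warehouse first. The class and method names are illustrative only, and the in-memory SQLite databases are hypothetical stand-ins for production systems – this is not any particular vendor's product or API.

```python
import sqlite3
import time

class FederatedQueryLayer:
    """Illustrative federation layer: delegates queries to source databases
    and caches results, instead of copying data into a central store."""

    def __init__(self, sources, cache_ttl_seconds=60):
        self.sources = sources          # name -> sqlite3.Connection
        self.cache = {}                 # (source, sql) -> (timestamp, rows)
        self.cache_ttl = cache_ttl_seconds

    def query(self, source_name, sql):
        # Serve from the cache if the entry is still fresh.
        key = (source_name, sql)
        cached = self.cache.get(key)
        if cached and time.time() - cached[0] < self.cache_ttl:
            return cached[1]
        rows = self.sources[source_name].execute(sql).fetchall()
        self.cache[key] = (time.time(), rows)
        return rows

    def query_all(self, sql):
        # Fan the same query out to every source and combine the results.
        combined = []
        for name in self.sources:
            combined.extend(self.query(name, sql))
        return combined

# Two toy "production" databases standing in for real source systems.
orders_db = sqlite3.connect(":memory:")
orders_db.execute("CREATE TABLE facts (region TEXT, amount REAL)")
orders_db.executemany("INSERT INTO facts VALUES (?, ?)",
                      [("EMEA", 100.0), ("APAC", 250.0)])

returns_db = sqlite3.connect(":memory:")
returns_db.execute("CREATE TABLE facts (region TEXT, amount REAL)")
returns_db.executemany("INSERT INTO facts VALUES (?, ?)",
                       [("EMEA", -20.0)])

federation = FederatedQueryLayer({"orders": orders_db, "returns": returns_db})
print(federation.query_all("SELECT region, amount FROM facts"))
```

Whether a layer like this performs well enough under real workloads – with large result sets, slow source systems, and stale caches – is exactly the question we had to answer.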