I was looking at some results of a benchmark test MicroStrategy ran on its product. The benchmark was about supporting large numbers of users. MicroStrategy is pushing into mobile business intelligence, and it has identified (correctly, in my opinion) user scalability as one of the key challenges in that market.
Anyway, the benchmark tested how MicroStrategy's product would do with up to 100K users or so. Impressively (if not surprisingly, considering they ran the tests themselves), the product did well. They ran the test on both Linux and Windows.
There were two things I found intriguing about the test: the hardware configuration of the application server (144 GB of RAM), and the fact that they allowed the system four hours to load the data into cache, which is basically an in-memory cube.
In other words, the Oracle database that was holding the data probably wasn't working very hard during the actual test. It worked during the four hours of loading and was probably done for the day. Most of the heavy lifting was going on in MicroStrategy's in-memory database solution.
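To make that architecture concrete, here is a minimal sketch of the warm-then-serve pattern: one expensive pass pulls aggregates out of the database into RAM, and every user query after that is answered from memory. This is my own illustration, not MicroStrategy's implementation; sqlite3 stands in for Oracle, and the sales schema and cube structure are hypothetical.

```python
import sqlite3
from collections import defaultdict

def warm_cache(conn):
    """One expensive pass over the database: aggregate sales by
    (region, product) into a dict held in RAM -- a crude stand-in
    for an in-memory cube. This is the "four hours" step."""
    cube = defaultdict(float)
    for region, product, amount in conn.execute(
        "SELECT region, product, amount FROM sales"
    ):
        cube[(region, product)] += amount
    return cube

def query_cube(cube, region, product):
    """After warm-up, user queries never touch the database;
    they are answered straight from memory."""
    return cube.get((region, product), 0.0)

if __name__ == "__main__":
    # Hypothetical data source; in the benchmark this would be Oracle.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?, ?)",
        [("EMEA", "widgets", 120.0),
         ("EMEA", "widgets", 80.0),
         ("APAC", "gadgets", 50.0)],
    )
    cube = warm_cache(conn)  # load happens once, up front
    print(query_cube(cube, "EMEA", "widgets"))  # 200.0, served from RAM
```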
I think this is what we should expect more and more in the future: large-scale database products of this type will take advantage of falling RAM prices to squeeze out better performance. Mobile applications with large numbers of users will need in-memory technology.