There is a lot of buzz around big data these days, and I am excited by the opportunity to apply poly-analytics to large amounts of in-memory data. One major benefit that goes unheralded is the ability to span the time and space gap between operational processes and analytic data.
In the 1980s a sharp divide emerged between operational transaction systems and reporting environments. IBM started this trend and pushed the separation because of the potential impact on performance. The divide also separated inexperienced business professionals from the experienced technical wizards. This artificial wall has stood for a long time; while we broke the large data caches down into smaller, more manageable data marts, converging operational and analytical data sources was not practical.
Until recently, that gap was considered impossible to bridge; big data now bridges it. This is the real story for organizations, beyond big data's well-known application to deep-thinking problem sets. Combined with poly-analytics, big data yields tactical benefits quickly. That is great, but the big win, in my mind, is bridging this gap, which allows organizations to increase the speed of their processes.
This bridging of operational time demands and background analytical reporting allows processes to become smarter and quicker. Add near-real-time and predictive capabilities, and organizations are no longer driving by the rear-view mirror; they can opportunistically steer through emerging events and patterns.
The "big deal" in big data is the ability for organizations to get ahead by building new smarts and speed into and around their processes. Thank God this divide, for processes in particular, is finally disappearing :)