Making big data into small data

Complexity in big data can be reduced

It isn't every day you get to write about Immanuel Kant and Big Data in the same blog post, but last week Gartner analyst Will Cappelli did just that. As you'll see in his post, "AI and IAM: Will Two-Tier Analytics Become the Norm for IAM?", context is the key. Cappelli draws on Kant's conclusion that human reasoning is a two-tier process: we first establish what is, the contextual lens through which we view our existence and how all the pieces relate to each other. From this standpoint, we reason and make decisions.

In technology, our view of something like Big Data is shaped by the context in which we approach it. Do we focus on the "big" part and the technologies required to store and retrieve it, or the "data" part, which concerns our ability to make sense of information and act on it? In our minds we tend to separate the two, but K&C (Kant and Cappelli) both seem to suggest it's impossible to do so.

Cappelli writes, “I am inclined to think that Kant and the cognitive scientists have hit on something which is not just true of the processes that govern human cognition but rather reflects the deep structure of any process that seeks to turn volumes of raw, noisy data into information capable of grounding action taken by human beings or machines.”

If we look at this from an application performance monitoring perspective, the number of alerts generated annually by event and performance systems has increased, on average, by 300% among Global 2000 enterprises, according to Gartner. One customer we spoke to told us their monitoring systems generate millions of alerts per day, considerably more than they can handle. Clearly, "Big Data" spells "Big Problems".
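To put that volume in perspective, here is a quick back-of-the-envelope calculation. The two-million figure is an illustrative assumption, chosen only to be consistent with "millions of alerts per day":

```python
# Rough scale of a "millions of alerts per day" stream.
# The 2,000,000 figure is an illustrative assumption, not a customer number.
alerts_per_day = 2_000_000
seconds_per_day = 24 * 60 * 60
print(f"{alerts_per_day / seconds_per_day:.1f} alerts/second")  # ~23.1 alerts/second
```

At even 30 seconds of human attention per alert, a sustained 23 alerts per second would take roughly 700 people triaging around the clock, which is why most of these alerts simply go unread.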

Several things immediately jumped out at me when I read this:

  1. The problem we face as an industry, termed "Big Data," occurs on many levels. The very systems we create to monitor applications become contributors to the problem by generating their own alert big data.
  2. There is a cultural context to this issue as well. IT workers are pack rats by nature. They capture all the data they can, saving it "till the end of time," just in case it might someday be important. How much of this data is actually read? Very little. How much is understood? Unknowable.
  3. Keeping true to the basic premise of Moore's Law, we've accelerated the rate at which we can collect data. The more we automate, the more we will create; this is not a short-term problem. And the more the data grows, the less able we are to make sense of it. This process is breeding complexity and volatility in our business environments that we have never faced before.

We're constantly reminding ourselves of this at Nastel, working with our clients and with our application performance monitoring solutions so that we can best address the market challenges shaping our future. Ever-increasing complexity is a constant. In the application monitoring space, customers and prospects often have multiple monitoring systems in place for their middleware. We can help make better sense of the acquired data by acting as a funnel, a single point of actionable analysis. And since we can monitor all common middleware, we help our users consolidate their tools and reduce complexity. Our low-latency complex event processing engine can handle the volume of big data and, in real time, separate the noise from the information. This is a big step toward making Big Data into Small Data…small, meaningful data. All of this acts as an anti-entropic force to reduce complexity, manage costs, and let IT focus on delivering optimal service to the business.
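To make the funneling idea concrete, here is a minimal sketch of windowed alert de-duplication, the kind of noise-versus-information separation described above. Everything in it (the Alert and AlertFunnel names, the 60-second window) is an illustrative assumption, not Nastel's actual engine or API:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Alert:
    source: str        # e.g. the middleware broker that raised the alert
    code: str          # e.g. "QUEUE_DEPTH_HIGH"
    timestamp: float   # seconds since epoch

class AlertFunnel:
    """Collapse duplicate alerts that repeat within a time window."""

    def __init__(self, window_seconds=60.0):
        self.window_seconds = window_seconds
        self._last_emitted = {}              # (source, code) -> timestamp of last emitted alert
        self._suppressed = defaultdict(int)  # (source, code) -> count of duplicates dropped

    def offer(self, alert):
        """Return the alert if it carries new information, else None."""
        key = (alert.source, alert.code)
        last = self._last_emitted.get(key)
        if last is not None and alert.timestamp - last < self.window_seconds:
            self._suppressed[key] += 1       # noise: same problem, same window
            return None
        self._last_emitted[key] = alert.timestamp
        return alert                         # signal: first sighting in this window

if __name__ == "__main__":
    funnel = AlertFunnel(window_seconds=60.0)
    raw = [Alert("mq-broker-1", "QUEUE_DEPTH_HIGH", t) for t in (0, 5, 10, 70)]
    meaningful = [a for a in raw if funnel.offer(a) is not None]
    print(f"{len(raw)} raw alerts -> {len(meaningful)} meaningful alerts")
    # prints: 4 raw alerts -> 2 meaningful alerts
```

In a real engine the windows, correlation keys, and suppression accounting would all be far richer, but even this toy version turns four raw alerts into two meaningful ones: Big Data into Small Data.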
