We all know that when you add lanes to a highway, the improvement in journey times is short-lived: more vehicles start to use the new road, creating more traffic, more chances for accidents, and ultimately slower journeys.

The same seems to happen with technology. Moore’s law is the observation that the number of transistors in a dense integrated circuit doubles about every two years, and this drives performance, which is quickly consumed by operating systems and software. Every year processing power increases dramatically, allowing more complex ideas to be expressed digitally; that added complexity, in turn, creates more chances for something to break.

In the 1980s an algorithm was developed that allowed very weak signals to be used to transmit and store data accurately. This algorithm, called Partial Response Maximum Likelihood (PRML), enabled massive increases in throughput and storage, was a driving force in digital wireless and wired communication, and was possibly the most important technological improvement of its decade. Every time your spouse complains that you are not listening and you respond with “I heard exactly what you said”, that is the biological equivalent of PRML in action. But I digress…

Today’s IT environments are so complex that it is practically impossible for a single person to understand them all. It often takes hundreds of people just to run these systems, with thousands more developing them.

The technology of even a decade ago is no longer practical for monitoring modern systems, and yet the cost of changing these systems has proven too great, so almost every enterprise is still using technology with its roots in systems delivered in the 1960s.

Think about it: Unix and C were first delivered in the early 1970s (Linux, their direct descendant, followed in 1991), while many of the ideas behind the mainframe date from the 1950s and 60s. Hypervisors and virtual machines – 1960s! ARPANET dates from the late 1960s, while TCP/IP, the 7-layer model and the concept of the “internet” took shape in the early 1980s, and many of the issues around security, micro-payments and performance were identified very early on but are still issues today.

This leaves us with massive complexity, where millions of metrics must be observed, monitored and analyzed continually, and from that data information can be derived and used to make decisions.

But with so much to monitor and analyze, decision-making becomes bogged down in trying to see what’s important. It doesn’t matter how pretty the graphs and gauges are: if there are too many to watch, it’s always going to be too easy to miss critical information.

There are two areas of technology that are critical to simplifying how you monitor complex environments.

Business Abstraction

This is where an understanding of how technology impacts the business is automatically stitched together from the underlying data. Instead of looking at all the parameters of every IT sub-system, you see the impact IT has on users. This is often referred to as transaction monitoring, transaction tracking or business flows. The result is that you can identify the subtle early signs of performance issues, before they become critical, and take steps to resolve them.
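
To make that concrete, here is a minimal sketch in Python of what stitching looks like. Everything in it is invented for illustration (it is not any product’s API): low-level events from different sub-systems carry a shared transaction ID, and grouping on that ID turns raw sub-system data into a user-level view.

```python
from collections import defaultdict

# Hypothetical low-level events from different IT sub-systems.
# All names and timings are invented for illustration.
events = [
    {"txn_id": "T1", "system": "web",      "start": 0.000, "end": 0.040},
    {"txn_id": "T1", "system": "queue",    "start": 0.041, "end": 0.050},
    {"txn_id": "T1", "system": "database", "start": 0.051, "end": 0.320},
    {"txn_id": "T2", "system": "web",      "start": 0.010, "end": 0.045},
    {"txn_id": "T2", "system": "database", "start": 0.046, "end": 0.210},
]

def stitch(events):
    """Group sub-system events into business transactions by shared ID."""
    txns = defaultdict(list)
    for event in events:
        txns[event["txn_id"]].append(event)
    return txns

for txn_id, steps in stitch(events).items():
    duration = max(s["end"] for s in steps) - min(s["start"] for s in steps)
    slowest = max(steps, key=lambda s: s["end"] - s["start"])
    # The user experiences one response time; the slowest hop is where to look.
    print(f"{txn_id}: {duration:.3f}s end to end, slowest step: {slowest['system']}")
```

The point is not these few lines of code; it is that the grouping is generic. Nothing here is specific to any one integration point, which is exactly what manually scripted monitoring cannot avoid.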

Without abstraction you are left having to describe every integration point and consider every scenario that could possibly ever happen. In IT terms this means writing and maintaining millions of lines of script (code) to describe how things fit together, which is too cumbersome and too expensive for modern systems. Any monitoring system that relies on custom coding will impose long-term costs and limit your ability to grow.

Abstraction is fundamental to all management.

Machine Learning

Using historical data to predict future events is where monitoring is headed. Knowing the probability of an event happening allows you to focus and prioritize. Machine learning (ML) algorithms are often referred to by the marketing term “artificial intelligence” (AI), but ML is not AI; it’s just math, albeit complex, state-of-the-art math, that allows smart people to be predictive. The trick is to deliver ML technology that business and technology people can use to be predictive without needing PhDs in data science to build code. Real-world, real-time ML systems create new ways for people to understand information.
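
As a toy illustration of the “just math” point, here is about the simplest possible version of the idea: learn what normal looks like from historical data, then rank new measurements by how improbable they are. The numbers and the z-score model are invented for this sketch; real systems use far richer models, but the principle of turning data into a probability-ranked priority list is the same.

```python
import statistics

# Hypothetical historical response times (seconds) for one transaction type.
history = [0.21, 0.19, 0.22, 0.20, 0.23, 0.18, 0.21, 0.22, 0.20, 0.19]

mu = statistics.mean(history)      # what "normal" looks like
sigma = statistics.stdev(history)  # how much normal varies

def anomaly_score(sample):
    """Standard deviations from normal; higher means less probable."""
    return abs(sample - mu) / sigma

# New measurements arrive; look at the least probable ones first.
incoming = [0.20, 0.24, 0.95]
for sample in sorted(incoming, key=anomaly_score, reverse=True):
    print(f"{sample:.2f}s -> anomaly score {anomaly_score(sample):.1f}")
```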

Putting Business Abstraction together with Machine Learning within the framework of enterprise monitoring delivers a new way to control the availability and performance of complex environments.
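
Reusing the two sketches above (including their invented stitch() and anomaly_score() definitions), the combination might look like this: the abstraction layer reduces millions of raw events to a handful of business transactions, and the model decides which of those deserve a human’s attention.

```python
# Reuses events, stitch() and anomaly_score() from the sketches above.
for txn_id, steps in stitch(events).items():
    duration = max(s["end"] for s in steps) - min(s["start"] for s in steps)
    score = anomaly_score(duration)
    status = "INVESTIGATE" if score > 3 else "ok"   # threshold chosen arbitrarily
    print(f"{txn_id}: {duration:.3f}s (anomaly score {score:.1f}) {status}")
```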


Want to find out more? Visit www.nastel.com