Why Should You Bother With Information Technology Operations Analytics?

Your organization’s IT system is a complex network of intercommunicating devices that can provide you with an abundance of useful data, if you apply the right practices to gather and filter it. This information comes from many sources, such as clients, applications, the network, and even business events, and there are different ways of gathering it (which we will cover a little later). However, to understand how each of these sources interacts and interconnects with the others, you will need to master the art of Information Technology Operations Analytics.

WHAT IS INFORMATION TECHNOLOGY OPERATIONS ANALYTICS?

Information Technology Operations Analytics (ITOA) is used to discover patterns that would otherwise be very difficult to pinpoint in the sheer volume of performance data a complex IT system produces. Simply put, ITOA collects large amounts of data and uses it to uncover patterns in the traffic flowing through the IT chain.

WHY IS ITOA IMPORTANT?

Your IT environment comprises a number of different technologies, and each one provides its own specific data: how much traffic is flowing through it, whether database systems are working properly, and what the end-user experience looks like. Without IT operations analytics, gathering and correlating all of this data would be painstakingly slow and inefficient.

By implementing an ITOA solution into your organization, you can comprehend the big picture and keep track of the entire IT chain. This insight allows you to:

  1. Pinpoint the cause of issues that occur
  2. Address the problem in the shortest amount of time
  3. Reduce downtime
  4. Avoid overspending or overprovisioning
  5. Make informed, data-driven decisions

In short, an ITOA architecture can be thought of as operational data analytics across the entire network. Not only does it give you an overview of your whole network, allowing more accurate and efficient analysis, but it also gives you an understanding of the larger context, which helps you respond and act quickly.

ITOA AND BIG DATA

To put it as plainly as possible, ITOA is the Big Data of and for IT. The essence of IT operations analytics is allowing the organization to make data-driven decisions and operate in a more data-driven manner – just like the objectives for Big Data.

IT operations analytics uses the same three-step principle as Big Data analysis (sketched in the example after this list):

  1. Extract
  2. Index and store
  3. Analyze and visualize
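
To make those three steps concrete, here is a minimal Python sketch of the pipeline applied to a single plain-text log file. The log format, file name, and in-memory index are assumptions made purely for illustration; a real ITOA platform ingests from many sources at once and stores into a dedicated search or analytics backend.

```python
import json
import re
from collections import Counter, defaultdict

# 1. Extract: pull raw events out of a log file. The path and line format
#    are hypothetical; real deployments ingest from many sources at once.
LOG_PATTERN = re.compile(r"(?P<ts>\S+) (?P<host>\S+) (?P<level>\w+) (?P<msg>.*)")

def extract(path):
    with open(path) as fh:
        for line in fh:
            match = LOG_PATTERN.match(line)
            if match:
                yield match.groupdict()

# 2. Index and store: here just an in-memory index keyed by host and level;
#    a real deployment would write to a search/analytics backend instead.
def index(events):
    store = defaultdict(list)
    for event in events:
        store[(event["host"], event["level"])].append(event)
    return store

# 3. Analyze and visualize: summarize error counts per host as a crude
#    stand-in for dashboards and anomaly detection.
def analyze(store):
    errors = Counter()
    for (host, level), events in store.items():
        if level == "ERROR":
            errors[host] += len(events)
    return errors

if __name__ == "__main__":
    store = index(extract("app.log"))   # hypothetical log file
    print(json.dumps(analyze(store), indent=2))
```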

Of course, to improve your ITOA, you should add new data sources whenever possible. Each new source adds flexibility and makes your decision-making more reliable.

Finally, keep in mind that even though ITOA tools give you an overall view of your entire network, they lack something only an experienced network administrator has: expert knowledge of your operating environment and infrastructure components. The tools can alert you that there is a problem (or a set of conditions that may lead to one), but they can’t pinpoint its root cause.

THE SOURCES OF VISIBILITY

As far as ITOA is concerned, there are four primary sources of visibility, plus a handful of supplementary ones, all coming from your organization’s IT system.

  1. Machine data

Machine data is self-reported information gathered through clients, servers, network devices, applications, remote sensors, or security appliances. Essentially, it is data gathered by the machine itself.

While this gives you an abundance of information and a clear picture of the state of your IT system, you are still relying on the machine to gather the information. There is certain data that machines cannot provide (for example, information found in transaction payloads), and, as always, there is the possibility that the machine itself malfunctions or misreports.
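
As a small illustration of machine data, the sketch below collects a few metrics the host reports about itself using the third-party psutil package; the package choice and field names are assumptions of this example, not something the article prescribes.

```python
# A minimal sketch of collecting machine data: metrics the host reports
# about itself. Uses the third-party psutil package (pip install psutil).
import json
import time

import psutil

def sample_machine_data():
    """Return one self-reported snapshot of this machine's state."""
    return {
        "timestamp": time.time(),
        "cpu_percent": psutil.cpu_percent(interval=1),    # 1-second CPU sample
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

if __name__ == "__main__":
    # In practice this snapshot would be shipped to a central collector;
    # printing it stands in for that step.
    print(json.dumps(sample_machine_data()))
```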

  2. Wire data

Wire data is information that results from the communications between networked systems. With the increase in internet speeds, communication between servers, for example, is a plentiful source of data. If the data is properly extracted and analyzed, it can prove to be a rich source of business and IT intelligence.

Keep in mind, however, that not all information that comes through the wire is of value to performance, security, or business analysis. There is a lot of noise out there.
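
For illustration only, here is a sketch of tapping wire data with the third-party scapy library; scapy is an assumption of this example rather than anything the article recommends, packet capture usually requires elevated privileges, and the filter used here is just one crude way of separating signal from noise.

```python
# A minimal sketch of sampling wire data with scapy (pip install scapy).
# Capturing packets typically needs root/administrator privileges.
from collections import Counter

from scapy.all import IP, TCP, sniff

talkers = Counter()

def record(packet):
    # Keep only the signal we care about (here: TCP conversations) and
    # drop the rest of the noise on the wire.
    if packet.haslayer(IP) and packet.haslayer(TCP):
        talkers[(packet[IP].src, packet[IP].dst, packet[TCP].dport)] += 1

if __name__ == "__main__":
    sniff(prn=record, count=200, store=False)   # sample 200 packets
    for (src, dst, port), hits in talkers.most_common(5):
        print(f"{src} -> {dst}:{port}  {hits} packets")
```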

  3. Agent data

Agent data is gathered through call-stack sampling and bytecode instrumentation. In custom applications, agents generate method performance profiles and, by using tags or keys, trace transactions through the various tiers of the application.

Agent-based monitoring provides precise analysis within an ITOA architecture. So, if you determine that an application needs attention at the code level, this is the way to go. Be aware, however, that you are trading the application’s performance for a data-gathering feature: in order to gather the necessary information, agents will slow the application down.
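
The sketch below imitates in plain Python what an agent does: it wraps methods to build a performance profile and tags every call with a transaction ID so the call chain can be traced across tiers. All names are hypothetical, and the bookkeeping inside the wrapper is exactly the overhead described above.

```python
# Illustrative agent-style instrumentation; not a real APM agent's API.
import functools
import time
import uuid

PROFILE = []   # in a real agent this would be shipped to a collector

def traced(func):
    @functools.wraps(func)
    def wrapper(*args, trace_id=None, **kwargs):
        trace_id = trace_id or str(uuid.uuid4())   # tag/key for the transaction
        start = time.perf_counter()
        try:
            return func(*args, trace_id=trace_id, **kwargs)
        finally:
            # This bookkeeping is the overhead the article warns about.
            PROFILE.append({
                "trace_id": trace_id,
                "method": func.__qualname__,
                "duration_ms": (time.perf_counter() - start) * 1000,
            })
    return wrapper

@traced
def fetch_order(order_id, trace_id=None):
    time.sleep(0.01)   # stands in for a database call in another tier
    return render_order(order_id, trace_id=trace_id)

@traced
def render_order(order_id, trace_id=None):
    return f"order {order_id}"

if __name__ == "__main__":
    fetch_order(42)
    for record in PROFILE:
        print(record)
```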

  4. Synthetic data

Synthetic data comes through active service checks or hosted monitoring tools, and can be defined as “scheduled and scripted time-series transaction level data.” Synthetic data relies on service checks or “Pingers” that answer the question “Can we build a connection?” These answers provide insight into users’ experiences depending on their geographic location.

Synthetic data helps the IT team test transactions from around the globe as well as within the data center, enabling them to pinpoint failures. Pingers are cheap and easy to implement, but they don’t tell you why a transaction is failing; they only tell you that it is failing.
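
A synthetic check can be as small as the following sketch: a scripted “Pinger” that periodically asks “Can we build a connection?” against a placeholder endpoint and records the answer as time-series data. The URL, interval, and timeout are illustrative values.

```python
# A minimal "Pinger": a scheduled, scripted connectivity check.
import time
import urllib.request

TARGET = "https://example.com/health"   # hypothetical endpoint
INTERVAL_SECONDS = 60

def check(url):
    """Return (ok, latency_ms) for one connection attempt."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=5) as response:
            ok = 200 <= response.status < 300
    except OSError:
        ok = False
    return ok, (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    while True:
        ok, latency_ms = check(TARGET)
        # Note the limitation from the article: this tells you *that* the
        # transaction failed, not *why* it failed.
        print(f"{time.strftime('%H:%M:%S')} ok={ok} latency_ms={latency_ms:.0f}")
        time.sleep(INTERVAL_SECONDS)
```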

  5. Other sources

There are other sources that can improve visibility, and they mostly consist of human-generated data. These sources include text documentation, social media posts, engineering diagrams, and messages from internal collaboration software.

By analyzing this data, you can get better insight into the end-user’s context, information that is otherwise unavailable to the operations team. Keep in mind, however, that actionable intelligence based on this data is available in only a limited number of tools and is still regarded as somewhat experimental.

CONCLUSION

Unlike some other techniques, which only let you react after a problem in your network has already occurred, Information Technology Operations Analytics provides you with a timely warning that something might go wrong. Keep in mind that ITOA is not enough on its own: you need an experienced team of network administrators to decipher and understand the data and to make proper decisions based on it. The system can be powerful, but only in the right hands.
