There’s so much jargon being thrown around these days. Some of it refers to new or emerging technologies that are actually quite independent of one another. But because so many of these names and terms are mentioned in the same breath, non-technical business leaders find it all the harder to understand what these technologies are and where they might be of use to their companies.

“Big data analytics” is one such term. Alternatively, one also hears of “analytics and big data”. And then there’s “marketing analytics”, “social media analytics”, “web analytics”, “audiostream analytics”, “text analytics”, and so on.

It seems that analytics is often clubbed with other words, many of which describe types of data associated with large volumes of inflow and storage.

Analytics, of course, is the application of statistical methods to draw business or functional meaning and insight out of data. It is about aggregating, grouping, examining distributions, and computing means, standard deviations and other metrics. It can also be used predictively.
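To make that concrete, here is a minimal sketch of descriptive analytics in Python with pandas, grouping a handful of records and computing counts, means and standard deviations. The dataset, column names and figures are purely illustrative:

```python
# A minimal sketch of descriptive analytics on hypothetical sales data.
# All values and column names are illustrative, not from any real system.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "South"],
    "revenue": [120_000, 95_000, 87_000, 110_000, 101_000],
})

# Aggregating and grouping: summary metrics per region.
summary = sales.groupby("region")["revenue"].agg(["count", "mean", "std"])
print(summary)

# Overall mean and standard deviation across all records.
print(sales["revenue"].mean(), sales["revenue"].std())
```

The same few operations, aggregation, grouping and summary statistics, underpin most of the analytics variants listed above; only the data and the business question change.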

When analytics is applied in a business, it is often referred to in conjunction with the functional purpose it serves: for example, security analytics, marketing analytics, risk analytics, HR analytics or operational analytics.

Analytics may also be referred to in conjunction with the source and form of the data being analysed. Examples are text analytics, email analytics, clickstream analytics, audiostream analytics, video analytics, and so on.

Putting the functional purpose and the type of data together, we could be performing analytics on email communication for security purposes, on financial transaction data for risk purposes, or on employee training data for HR purposes. It could be a combination of many types of data for marketing or customer relationship purposes. It could be almost any combination of data types for any functional purpose, as long as there’s a plan aimed at getting a meaningful output.

So does analytics need big data? The answer is “it depends”. For a statistical analysis to be meaningful, it should be done on the right types of data. For the analysis to be reliable, it should be done on an adequate quantum of data, so that the sample adequately represents the overall population. The age of the data also matters if the analysis is to support predictive decision making, as does the rate at which new data points are generated. And finally, of course, the data has to be of reliable quality; otherwise it’s just another case of garbage in, garbage out.
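To put a number on “adequate quantum”, here is a quick sketch of the standard sample-size formula for estimating a population proportion, n = z² × p(1 − p) / e². The confidence level and margin of error below are assumptions chosen for illustration:

```python
# A sketch of a standard sample-size calculation for estimating a proportion.
# The inputs (95% confidence, 5% margin of error) are illustrative assumptions.
import math

z = 1.96   # z-score for 95% confidence
p = 0.5    # assumed proportion; 0.5 gives the most conservative (largest) n
e = 0.05   # desired margin of error

n = (z ** 2) * p * (1 - p) / (e ** 2)
print(math.ceil(n))  # 385 records would be needed for a reliable estimate
```

Note that a few hundred clean, representative records can be enough for a statistically valid estimate, which is precisely why analytics does not automatically demand big data.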

If the volumes of data to be analysed are higher than what existing systems can cater to (in the order of several dozens of terabytes and upwards), if the data comes in a diversity of forms that traditional storage infrastructure cannot handle, if its veracity must be verified before it can be relied upon statistically, and if it streams in at a higher velocity than existing data capture and transaction processing systems can absorb, then a big data perspective has to come into play.

Otherwise, if the volume, variety, veracity and velocity are all adequately handled by existing enterprise systems, and the data available in them offers samples large enough to yield statistically valid output, then analytics can be executed without resorting to big data.
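Expressed as code purely for illustration, the check described in the last two paragraphs might look like the sketch below. Every threshold is a hypothetical stand-in for what a given organisation’s existing systems can actually handle, and veracity (data quality) is left out of the function because it has to be assured regardless of scale:

```python
# An illustrative sketch (not a formal rule) of the volume/variety/velocity
# check. All thresholds are hypothetical stand-ins for real system limits.

EXISTING_CAPACITY_TB = 50      # e.g. several dozens of terabytes
EXISTING_FORMAT_LIMIT = 5      # data formats current storage can handle
EXISTING_INGEST_RATE = 10_000  # events/sec current pipelines can capture

def needs_big_data(volume_tb: float, distinct_formats: int,
                   events_per_sec: float) -> bool:
    """True if any dimension exceeds what existing systems cater to."""
    return (volume_tb > EXISTING_CAPACITY_TB
            or distinct_formats > EXISTING_FORMAT_LIMIT
            or events_per_sec > EXISTING_INGEST_RATE)

print(needs_big_data(volume_tb=80, distinct_formats=3, events_per_sec=2_000))
# True: volume alone exceeds what the existing systems can handle
```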

The bottom line is that analytics is about applying statistics and quantitative techniques to data. Whether it needs big data is a separate and secondary question, and the answer is not an automatic “yes”.