Time aggregation case study
Whenever possible, we will use case studies and experiments to illustrate both general concepts and specific features of our framework. We will try to build such examples on different types of data. In some cases we will use data streams within the context of a related application domain (e.g. healthcare or security); in other cases we will treat them as generic data samples, without considering their sources or possible goals of analysis. This post belongs to the second group: we will use a stream of stock prices to illustrate some general challenges related to the nature of the time dimension and the characteristics of time-oriented data.
In this post we look at time aggregation, a transformation of data from a higher-frequency time granularity into lower-frequency statistics. This is a very common and practical scenario, applicable whenever we need to calculate aggregated statistics over intervals, for example a sum of weekly sales or a count of server responses per minute. The stock price data we use are themselves a product of time aggregation: they are the last values for daily intervals (specifically the adjusted close). The input stock price stream before aggregation is presented in Figure 1 (with horizontal lines marking the maximum, average and minimum values across the stream).
Time aggregation is a simple transformation that can be executed with a single command or a few intuitive gestures. It is usually parametrized with an aggregation interval and a summary function (e.g. count, average or standard deviation). The chart in Figure 2 shows our original data stream against a background of new data series created by aggregating the stream by MONTH intervals with the MAX/AVG/MIN functions. It is worth noting that the data series in Figure 2 use both point (stock prices, in navy) and interval (aggregated values, in orange) time models. Since the base time granularity is determined by the stock prices (a day), we use a stepped line for the monthly summaries.
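The aggregation described above can be sketched in a few lines of pandas. This is a minimal illustration with synthetic daily prices standing in for the stream from Figure 1; the dates, seed and starting price are arbitrary assumptions, not values from our data.

```python
import numpy as np
import pandas as pd

# Synthetic daily adjusted-close prices (a stand-in for the stream in Figure 1)
rng = np.random.default_rng(42)
days = pd.date_range("2023-01-01", "2023-06-30", freq="D")
prices = pd.Series(100 + rng.normal(0, 1, len(days)).cumsum(), index=days)

# Aggregate by MONTH intervals with the MAX/AVG/MIN summary functions,
# producing one row per monthly interval (as in Figure 2)
monthly = prices.resample("MS").agg(["max", "mean", "min"])
print(monthly.round(2))
```

The `resample` call groups the point-based daily series into interval-based monthly bins, which is exactly the point-to-interval shift between the two time models mentioned above.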
The first visible challenge with such a transformation is the different number of days in a month, and thus the variable duration of aggregation intervals. The mapping between different time granularities can be problematic on other levels as well, for example in business scenarios where analysis is focused on workweeks, which are affected by irregular holidays. Analysis scenarios become more complex when we deal not only with the time dimension but also with locations: aggregating multiple data streams from different geographical points (e.g. sales across a country) can be affected by time zones, especially if short intervals are selected (e.g. daily statistics).
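The time zone effect is easy to demonstrate: the same timestamps can land in different daily intervals depending on the zone in which the intervals are anchored. The events and timestamps below are hypothetical, chosen only to show the boundary shift.

```python
import pandas as pd

# Three hypothetical sale events recorded in UTC
events = pd.Series(1, index=pd.to_datetime(
    ["2023-03-01 02:30", "2023-03-01 03:00", "2023-03-01 14:00"]
).tz_localize("UTC"))

# Daily counts computed in UTC vs. in a local US time zone
utc_daily = events.resample("D").sum()
ny_daily = events.tz_convert("America/New_York").resample("D").sum()

print(utc_daily)  # all three events fall on March 1 in UTC
print(ny_daily)   # the first two fall on February 28 in New York
```

A daily sales report for the US East Coast would therefore attribute two of these transactions to a different day than a report computed in UTC.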
When using time aggregation, we need to pay attention not only to the time dimension but also to the other data elements, especially when selecting summary functions. The applicability of these functions depends on the characteristics of the analyzed data; for example, calculating an average will not work with values on nominal or ordinal scales. But it also depends on the context of the data. In our case it wouldn't make sense to calculate a sum of prices per interval, since our data points describe a state (the price of an asset). However, if exactly the same values (including currency) were related to events, such as sale transactions, calculating a sum per interval (e.g. total sales revenue) could be very useful.
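The state-versus-event distinction can be made concrete with a small sketch: the same numbers, aggregated with functions that match (or don't match) their semantics. The values and dates below are invented for illustration.

```python
import pandas as pd

values = [10.0, 12.0, 11.0, 13.0]
idx = pd.to_datetime(["2023-01-02", "2023-01-03", "2023-01-09", "2023-01-10"])

# Identical numbers, two different interpretations of what a data point means
state = pd.Series(values, index=idx)   # asset price: a sampled state
events = pd.Series(values, index=idx)  # sale transactions: discrete events

# For a state, summing prices per week is meaningless;
# taking the last observed value (or an average) is a sensible summary
weekly_price = state.resample("W").last()

# For events, a sum per week is natural: it is the total revenue
weekly_revenue = events.resample("W").sum()
```

The transformation is identical in both cases; only the choice of summary function changes, driven by what the values actually represent.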
We will discuss more detailed challenges related to the analysis of time-oriented data in future posts. We will also show how technology can help to solve these problems.