The Big Data concept is defined around four key characteristics: data volume, data velocity, data veracity and data value. While the volume and velocity characteristics are concerned with the data generation process and with how the data is captured and stored, the veracity and value characteristics primarily deal with the quality and usefulness of the data. Leading experts have identified an imbalance, where excessive “useless data” is inhibiting our ability to make use of “useful data”.
If we continue to focus on the volume and velocity aspects, we create a big data problem – essentially data comprising too much ‘noise’. However, if we shift our focus to smart data (veracity and value) we can filter out the noise and retain the valuable data, which the enterprise can use effectively to solve problems.
The issue then comes from deciding what is quality and what is noise. However, by analysing data qualitatively rather than quantitatively we not only become data-driven but also create opportunities to become creatively driven. By analysing data more qualitatively we can turn big data into smart data.
This is not a simple exercise; you have to let the data tell the story, removing as much bias as possible. At the same time, data interpretation should not be a random exercise; it should increasingly lead us towards understandable solutions and actionable tasks. Data collection and analysis should have the objective of improving the performance of existing processes, or developing capabilities to predict future outcomes.
This means your focal point should not simply be to collect masses of data, but also to place each element of data in its own specific context. In a smart building, for example, the productive creativity of data generation and analysis comes from bringing together video, access, sensor, geospatial and even social media data, then identifying patterns and correlations that can give insight and bring about solutions.
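To make the idea of correlating previously separate data streams concrete, here is a minimal sketch. The sensor names, readings, time window and threshold are all hypothetical illustrations, not part of any real building system: it pairs time-aligned readings from two streams (occupancy and HVAC load) and flags intervals where the two are out of proportion.

```python
from datetime import datetime, timedelta

# Hypothetical, simplified data streams a smart building might produce.
# Each reading is (timestamp, value); values and thresholds are illustrative only.
occupancy = [
    (datetime(2015, 6, 1, 9, 0), 25),
    (datetime(2015, 6, 1, 9, 5), 45),
    (datetime(2015, 6, 1, 9, 10), 47),
]
hvac_load_kw = [
    (datetime(2015, 6, 1, 9, 0), 30.0),
    (datetime(2015, 6, 1, 9, 5), 31.0),
    (datetime(2015, 6, 1, 9, 10), 80.0),
]

def correlate(stream_a, stream_b, window=timedelta(minutes=1)):
    """Pair readings from two streams whose timestamps fall within `window`."""
    pairs = []
    for ts_a, val_a in stream_a:
        for ts_b, val_b in stream_b:
            if abs(ts_a - ts_b) <= window:
                pairs.append((ts_a, val_a, val_b))
    return pairs

pairs = correlate(occupancy, hvac_load_kw)
# Flag intervals where HVAC load is high relative to the number of occupants
# (an arbitrary ratio chosen purely for the sketch):
anomalies = [ts for ts, occ, load in pairs if occ and load / occ > 1.5]
```

In this toy run, only the 09:10 interval is flagged – the kind of cross-stream pattern that no single data source would reveal on its own.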
According to our report Big Data in Smart Buildings, “we estimate that the market for Big Data and Cloud Based Software and Services in Smart Buildings will grow to over $9.17 Billion in 2015 and at a rate of 33.2% CAGR to nearly $30 Billion by 2020”. As the market continues to grow, the case for quality over quantity also grows.
“The key to actionable intelligence is data correlation. Smart buildings will need to combine multiple data streams to provide a more holistic evaluation of a building’s environment. Formally integrating previously disparate functions of physical and IT sensors allows for the correlation of data in a more efficient way by drawing trends and metrics from a digital perspective to increase utilisation at a higher and more efficient level”, states James Chong, Founder and CEO, Vidsys.
Chong sees three major smart building data correlation trends taking place in the coming years. In security he sees the convergence of physical security and cyber security. These two branches of security have developed independently in the smart era, but the exponential increase of big data allows risk to be viewed in a more converged manner, leading both sides of the security market toward one common denominator – analytics. The more information that is collected and analysed from both physical and logical sensors, the better security can be across the board.
On the occupant experience, Chong believes that we can drive better insights and awareness around lifestyles and sustainability practices. As IoT expands into the consumer market, more and more data will be collected around each individual’s needs, habits and patterns, including energy usage – this can lead to increased customer satisfaction, sustainability and energy efficiency.
Chong also sees the interconnectivity between building technologies as a key driving force towards a more efficient and automated approach to building management. “Assets and key systems within every building will ultimately begin to communicate and share data with different IoT applications to run more efficiently and optimise themselves, and do so more in real-time”, he said.
The ability to collect data from various sensors alone is not sufficient to bring about such monumental change. The challenge for organisations is not just to gather more data, but to use that data to make more informed decisions, effectively creating “actionable intelligence”.
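One way to picture “actionable intelligence” is a set of rules that map conditions on combined building data to concrete tasks. The sketch below is purely illustrative – the sensor fields, thresholds and actions are hypothetical, not drawn from any real building management system:

```python
# Each rule pairs a condition on a combined reading with a concrete action.
# Field names, thresholds and action strings are hypothetical examples.
rules = [
    (lambda r: r["co2_ppm"] > 1000, "Increase ventilation in zone"),
    (lambda r: r["occupancy"] == 0 and r["lights_on"], "Switch off lighting"),
]

def actions_for(reading):
    """Return the list of actions triggered by a single combined reading."""
    return [action for condition, action in rules if condition(reading)]

reading = {"co2_ppm": 1150, "occupancy": 0, "lights_on": True}
print(actions_for(reading))
```

The point is the shape of the pipeline, not the rules themselves: correlated data only becomes intelligence once it is mapped onto tasks someone (or some system) can act on.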
This concept is increasingly being understood and put into action. Nowadays, most Big Data comes from large companies that have preened, cleaned, standardised and processed it so that others can understand and use it. This is why data scientists are so highly valued and in demand: they deliver clean data that is actionable. Still, there is more we can do to tackle the imbalance surrounding “big data” as we charge full speed into a “smart” world flooded with masses of noisy data.
We are not talking about quality versus quantity; data quality and quantity are not mutually exclusive. You can have a very large, reasonably accurate data set. However, you cannot reach this point without investment in quality, and not without the realisation that simply increasing quantity does not solve problems or bring about improvements.