I was at the CoreNet Global Summit held in Singapore a few months back, and thought it would be good to share some insights from the sessions discussing data. The general consensus is that data will definitely be relevant: studies have shown that 98 percent of firms expect to use some form of data analytics, whether large or small, by 2018.
Exponential growth of data in the digital universe
Across the different sessions on data that I joined, the message was clear: the amount of data being captured in the urban environment is increasing by leaps and bounds. During the session "Big data in the smart city: toward an urban Internet of things," Professor Andrew Hudson-Smith showed us the data that he and his team are capturing across London, and how they are using it to map transport information, the emotional responses of people walking down the street, and more. Professor Hudson-Smith's research mainly focuses on self-monitoring analysis reporting technologies (SMART), which, essentially, show us that data is everywhere and in every form one can think of.
Here are some interesting facts to help you better understand and relate to the speed at which data is growing: Research has shown that the amount of data in the digital universe is doubling in size every two years. The same research also expects that by the year 2020, the amount of data that we create and copy annually will reach 44 trillion gigabytes, a 900 percent increase in just seven years, considering that the digital universe held only 4.4 trillion gigabytes of data in 2013.
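The figures above can be sanity-checked with a little arithmetic. A minimal sketch (the 4.4 and 44 trillion gigabyte figures come from the research cited above; the doubling projection is a rough back-of-the-envelope check, not part of the study itself):

```python
# Back-of-the-envelope check of the digital-universe growth figures.

DATA_2013 = 4.4   # trillion gigabytes in 2013 (cited figure)
DATA_2020 = 44.0  # trillion gigabytes projected for 2020 (cited figure)

# Growth from 4.4 to 44 is a tenfold increase, i.e. 900 percent.
pct_increase = (DATA_2020 - DATA_2013) / DATA_2013 * 100
print(round(pct_increase))  # 900

# Doubling every two years over the 7 years from 2013 to 2020
# gives 2^(7/2) ~ 11.3x, i.e. roughly the same order as the
# projected tenfold growth.
projected = DATA_2013 * 2 ** (7 / 2)
print(round(projected, 1))  # ~49.8 trillion gigabytes, in the same ballpark as 44
```

The two estimates do not match exactly, which is expected: the "doubling every two years" rule of thumb and the 44-trillion-gigabyte projection are separate findings from the same research.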
Accurate data forms the foundation of meaningful insights
Rubbish in, rubbish out: we have all established the importance of creating actionable insights, but we must also remember that such insights can only be gained from accurate data. If your data is not accurate, chances are the insights that you pull out will not be meaningful. And because this whole process is a cycle in itself, inaccurate data will naturally lead you to take unnecessary actions, which could ultimately be a waste of your effort, time and money.
One way that we can ensure the accuracy of our data, as suggested by Professor Andrew Hudson-Smith in one of his talks, is by testing the data. An example he gave was the London census: many people lie on the census, but such data can actually be tested by checking it against social media data available on the network. Doing so may sound very simple and almost silly, but it helps considerably in ensuring that we are not going in circles when it comes to making business decisions. We need to find effective ways to test and validate our data.
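The idea of testing one dataset against an independent source can be sketched very simply. The figures, ward names and 10 percent tolerance below are entirely hypothetical, and this is an illustration of the general cross-checking idea, not Professor Hudson-Smith's actual method:

```python
# Toy sketch: cross-check declared figures against an independent estimate
# and flag entries that disagree by more than a tolerance.
# All data below is made up for illustration.

census = {"ward_a": 12000, "ward_b": 8500, "ward_c": 4300}   # declared population
signal = {"ward_a": 11800, "ward_b": 12100, "ward_c": 4200}  # independent estimate,
                                                             # e.g. from social media activity

def flag_discrepancies(declared, observed, tolerance=0.10):
    """Return keys where the two sources disagree by more than `tolerance`."""
    flagged = []
    for key, d in declared.items():
        o = observed.get(key)
        if o is not None and abs(o - d) / d > tolerance:
            flagged.append(key)
    return flagged

print(flag_discrepancies(census, signal))  # ['ward_b'] stands out for manual review
```

The point is not the specific threshold but the habit: any figure that two independent sources cannot roughly agree on deserves a second look before it feeds a business decision.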
Virtual reality is the next big thing
Virtual reality (VR) has been around for a while, but it is set to have a much stronger presence in 2016 than ever before, with activity in the VR space ramping up and technology companies launching their latest VR technologies (e.g., HTC Vive, Oculus Rift, Sony PlayStation VR and Samsung Gear VR). But what does this mean for big data? With the amount of data expected to explode over the next few years, today's data visualization tools will no longer be sufficient. Such traditional tools fail to present information in a way that the human mind can readily comprehend: they tend to show only what we already know about the data rather than reveal the unknown.
VR technologies enable users to navigate and interact with vast amounts of business data, in infinite space and in real time, through constantly adapting presentation styles that avoid brain overload. This allows data to be visualized in a more "empathic" way, giving a deeper and more meaningful sense of the numbers.
There are many exciting things coming up in the data analytics space. Are you ready for what’s ahead?
Interested in finding out more about the Future of Work? Learn more about our outlook on the changing world of work here.