The Future of Big Data
Since the advent of cloud computing, data storage requirements have grown exponentially. At the same time, the cost of storage continues to fall, so we are no longer confined by tight storage constraints.
Combined with the ability to leverage outsourced computational resources, such as Hadoop running in the cloud on instances spun up on demand, the cost of entry to highly intensive processing tasks has fallen dramatically: such workloads are no longer confined to large-scale institutions and are now within reach of the general public.
Big Data analytics comes into its stride when applied to data sets larger than most traditional analytics approaches can handle. Key examples include the Internet of Things, eScience, mobile devices, infographics, and CCTV and security systems, where data is collected and archived in real time, 24 hours a day, 365 days a year.
How we process big data in future will also need to change in order to improve efficiency and further reduce cost. For example, would it really be necessary to back up all of a business's security camera footage to the cloud continuously, or would we only be interested in the frames where the reference frame has changed or which include a person or point of interest? What knock-on effect would this have on reviewing footage, and would the timescales and manpower required to investigate an incident also be reduced?
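A minimal sketch of the frame-differencing idea mentioned above, assuming grayscale frames represented as 2-D lists of pixel intensities; the function names and threshold values here are illustrative, not part of any particular system:

```python
def frame_changed(prev, curr, threshold=0.05):
    """Return True if the fraction of pixels differing between two
    grayscale frames exceeds `threshold` (an illustrative cut-off)."""
    total = len(prev) * len(prev[0])
    changed = sum(
        1
        for row_a, row_b in zip(prev, curr)
        for a, b in zip(row_a, row_b)
        if abs(a - b) > 10  # per-pixel intensity tolerance on a 0-255 scale
    )
    return changed / total > threshold


def frames_worth_keeping(frames, threshold=0.05):
    """Keep the first frame, plus any frame that differs noticeably
    from the previously kept (reference) frame, discarding the rest."""
    kept = [frames[0]]
    for frame in frames[1:]:
        if frame_changed(kept[-1], frame, threshold):
            kept.append(frame)
    return kept
```

Uploading only the frames this filter keeps, rather than the full stream, is the kind of pre-processing at the edge that could cut both storage costs and the volume of footage an investigator has to review.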
Interoperability and open source will eventually trump closed technology ecosystems: the more sources of raw data feeding into an ecosystem, the stronger the analytical correlations, and therefore the stronger the incentives to innovate and compete. In future, autonomous vehicles and robotics would benefit from such standardisation; traffic and accident reporting, vehicle statuses, and the locations of electric-vehicle charging stations and their associated sensors reap limited benefit if the technology is proprietary and manufacturer-specific.
As we become more acquainted with the fundamental aspects of big data, it will become possible to build predictive analytic models that forecast events, such as the conditions under which a user chooses to turn their central heating on, or earlier storm warnings. These in turn enable us to act before the event happens: turning the heating on when an ambient thermostat falls below a certain temperature, or issuing a storm warning in advance and adjusting air or shipping logistics routes on the fly.
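The heating example can be sketched as a simple learned trigger. This is a deliberately minimal illustration, assuming only a history of (ambient temperature, heating switched on) observations; the function names and data layout are hypothetical:

```python
def learn_trigger_temperature(history):
    """Estimate the temperature below which this user tends to turn
    the heating on, as the mean of their past switch-on temperatures.

    `history` is a list of (ambient_temp_celsius, heating_turned_on) pairs.
    """
    switch_on_temps = [temp for temp, turned_on in history if turned_on]
    return sum(switch_on_temps) / len(switch_on_temps)


def should_preheat(current_temp, trigger_temp):
    """Act before the user does: pre-heat once the ambient
    temperature falls below the learned trigger point."""
    return current_temp < trigger_temp
```

A real system would use a richer model (time of day, occupancy, weather forecasts), but even this crude rule captures the shift from reacting to an event to anticipating it.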
To this extent we can quantify big data by applying the "3 Vs" principle: Volume, Variety and Velocity. If all three are constantly expanding, it stands to reason that so are the computational resources required to process the underlying raw and unstructured data for analytical insight, and thereby make informed and meaningful decisions.
Deep Learning applications can tailor bespoke services and feedback to their target audience as the user interacts with a given service. At present we see this most often in the form of targeted advertisements based on individual browsing habits, but in future it may involve content delivery networks adjusting the very applications we use according to our habits and tendencies, producing alternative content such as dynamic websites or applications. Speech recognition can likewise identify the user and control the devices we interact with on a day-to-day basis, even notifying the user when the groceries in their refrigerator need replacing because they are either out of date or used up.