THE BLOG

Featuring fresh takes and real-time analysis from HuffPost's signature lineup of contributors

Mark Whitby

2015 in Focus: Data Deluge, Privacy and Hybrid Cloud


Making predictions in today's fast-paced technology world can be tricky. However, several clear-cut trends are developing as we approach 2015 that deserve close attention from the industry at large: improved privacy policies, a shift towards hybrid cloud models, an impending shortage of data storage, and smarter infrastructure for big data applications.

Polished Privacy Policies
An inevitable consequence of the data explosion of recent years has been growing consumer concern about how personal data is used. Until very recently, consumers had little or no control over what data was kept by the organisations whose products and services they used. In 2014, the European Court of Justice and the European Parliament overhauled Europe's antiquated data and privacy regulations. Individuals now have a 'right to be forgotten' by search engines, for example, a controversial development which has divided opinion across the digital landscape.

Such rulings have so far done little to clarify the complexities of consumer data privacy, but there are other signs that the business world is starting to take customer privacy more seriously. Some enlightened brands are making substantial efforts to explain exactly what data they hold and how they are using it to customers' benefit - often strengthening their customer relationships and improving long-term engagement in the process. As the role of data as a marketing tool grows in 2015, expect to see many more organisations become increasingly transparent about their data assets.

Storage: a dwindling resource
We are running out of disk space, a harsh reality in a world where data is no longer just text files and spreadsheets but countless HD videos and high-resolution images. In 2013, the world generated around 3.5 zettabytes of data, roughly the equivalent of 120 billion 32-gigabyte smartphones, or 600 billion DVDs (or, put another way, more than 3 billion years of video). By 2020, we will be creating more than ten times that amount of data. Unfortunately for us, we're going to run into trouble long before then; some estimates suggest that by 2016 we will be producing data faster than we can store it.
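These equivalences are order-of-magnitude figures rather than exact conversions. A quick sanity check, assuming decimal units, 32 GB per smartphone and a 4.7 GB single-layer DVD, shows they land in the right ballpark:

```python
ZB = 10**21  # one zettabyte, in bytes (decimal units)

# Convert the article's two equivalences back into zettabytes
phones_zb = 120e9 * 32e9 / ZB   # 120 billion 32-GB smartphones
dvds_zb = 600e9 * 4.7e9 / ZB    # 600 billion 4.7-GB DVDs

print(f"{phones_zb:.2f} ZB")  # 3.84 ZB
print(f"{dvds_zb:.2f} ZB")    # 2.82 ZB
```

Both work out to within roughly 20 per cent of the 3.5-zettabyte figure, which is as close as such comparisons get.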

One problem facing the storage industry is that the technology underlying today's hard drives is showing its age. While existing technology can already squeeze enormous amounts of information into tiny areas of disk space, it is approaching the limit of how densely that data can be packed. The good news is that several innovative technologies are in development to help solve this problem. For example, Seagate is investing in heat-assisted magnetic recording (HAMR), which is expected to raise the limits of magnetic recording by as much as a factor of 100.

However, commercial HAMR drives are still some way off and are unlikely to appear before 2016, according to most industry experts. In 2015 then, the focus will be on interim measures designed to improve data storage efficiency. Smarter data policies in the workplace, better de-duplication methodologies and more enlightened backup strategies will all have their part to play.
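De-duplication, one of the interim measures mentioned above, works by storing each unique chunk of data only once and keeping a list of references to rebuild the original. A minimal sketch of content-hash de-duplication (the function and block sizes here are illustrative, not any vendor's implementation):

```python
import hashlib

def dedupe(blocks):
    """Store each unique block once, keyed by its SHA-256 digest."""
    store = {}   # digest -> block bytes (each unique block kept once)
    recipe = []  # ordered digests needed to rebuild the original data
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        recipe.append(digest)
    return store, recipe

# Four blocks, but only two distinct ones need to be stored
blocks = [b"header", b"payload", b"payload", b"header"]
store, recipe = dedupe(blocks)
print(len(blocks), "blocks stored as", len(store))  # 4 blocks stored as 2

# The recipe rebuilds the original byte stream exactly
rebuilt = b"".join(store[d] for d in recipe)
```

Real backup systems apply the same idea at much larger scale, often across variable-sized chunks, which is why better de-duplication can noticeably stretch existing capacity.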

Big data: tiers before downtime
For consumers, the predicted data capacity deficit is likely to be a nuisance rather than a catastrophe: less space for TV shows and photos, for example. But for organisations increasingly dependent on so-called big data to understand their business and their customers, it represents a much greater threat. Today, big data typically resides on traditional storage built around standard hardware and software components, often working in a very inefficient and poorly synchronised manner. But as data volumes grow and processing demands increase with them, IT leaders will have to think much more strategically about how to manage data in the longer term.

One approach that will gather steam while IT departments wait for new storage technology is a much more efficient tiered model based on existing technology. By intelligently layering conventional hard disk drives, SSHD (solid state hybrid) drives and SSD (solid state) drives, IT will be able to organise data much more effectively, allowing quick and easy access to the most critical data from all devices while ensuring that less frequently accessed data is still available and secure on the slower HDDs. We'll see more data centres built around this approach in 2015, as everyone from IT directors to data analysts and marketing departments struggles to capture, store and make sense of the flood of data from the explosion of devices on the world's networks.
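The tiering idea above boils down to a placement policy: hot data on fast, expensive media; cold data on slow, cheap media. A toy sketch, with hypothetical access-frequency thresholds chosen purely for illustration:

```python
def assign_tier(accesses_per_day):
    """Toy placement policy: hotter data goes on faster media.
    The thresholds (100 and 10) are illustrative assumptions."""
    if accesses_per_day >= 100:
        return "SSD"    # critical, latency-sensitive data
    if accesses_per_day >= 10:
        return "SSHD"   # warm data on hybrid drives
    return "HDD"        # cold archives and backups

# Hypothetical datasets with their daily access counts
datasets = {"orders_db": 5000, "web_logs": 40, "2013_backups": 0}
placement = {name: assign_tier(freq) for name, freq in datasets.items()}
print(placement)
# {'orders_db': 'SSD', 'web_logs': 'SSHD', '2013_backups': 'HDD'}
```

Production tiering systems make this decision continuously and migrate data between layers automatically, but the trade-off they optimise is the same one sketched here.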

Private and public cloud unite
Cloud computing may appear to be mainstream today, but we are still in the very early stages of its evolution into a fully-fledged business platform. Many organisations are now discovering that their early private cloud implementations are reaching their capacity limits, and are being forced to extend them into public cloud resources (so-called cloud-bursting). Most businesses are starting to recognise that it makes little sense to devote costly, resource-intensive private clouds to business applications that can be run as efficiently from more cost-effective public cloud platforms. As a result, more businesses will use private cloud environments only for mission-critical applications.

These so-called hybrid approaches require a leap of faith, but as cloud technology and security evolve, so too does the level of trust that businesses are prepared to place in them. 2015 will see these private/public cloud implementations and business models emerging more strongly, driven by improvements in cloud storage and bandwidth as well as more robust security measures.
