
What is Big Data and why is it important?

Matt Mills, Head of IT

Big Data is the latest industry buzzword to describe large volumes of structured and unstructured data that can be difficult to process and analyse but could potentially be used by organisations to improve their efficiency and make more informed decisions.

That leads us to the next question – what is structured and unstructured data? Structured data is easily organised into a database: it generally fits neatly into set fields. Unstructured data, however, is information that is not organised or easily interpreted by traditional data models and processes. It is usually text-heavy or spans a variety of formats (e.g. images, text, video) and is much harder to analyse.
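To make the distinction concrete, here is a minimal sketch (the table, fields and review text are invented for illustration). Structured data maps straight onto a database schema and can be queried directly; unstructured data has no fixed shape, so even extracting one fact requires ad-hoc text analysis:

```python
import sqlite3

# Structured data: fixed fields that map directly onto a database schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, total REAL)")
conn.execute("INSERT INTO orders VALUES (?, ?, ?)", (1001, "A. Smith", 49.99))

# Unstructured data: free text with no schema; extracting even a single
# fact means deciding in advance what to look for and parsing for it.
review = "Arrived late, but the quality was great. Would order again!"
mentions_late = "late" in review.lower()

row = conn.execute("SELECT total FROM orders WHERE order_id = 1001").fetchone()
print(row[0])         # structured: a simple query answers the question
print(mentions_late)  # unstructured: we had to hand-craft the check
```

The asymmetry in effort between the two halves of this sketch is precisely why unstructured data needs the "new forms of processing" Gartner describes.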

Gartner describes Big Data as “high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimization”.

Gartner’s ‘3Vs’ are described as:

  • Volume: Big Data doesn’t sample. It just observes and tracks what happens, creating huge volumes of data
  • Velocity: Big Data is often available in real-time
  • Variety: Big Data draws from text, images, audio, video; plus it completes missing pieces through data fusion

The analysis of Big Data can lead to relationships being recognised between data sets that had previously been missed, such as correlations between different functions, processes and areas within the business.
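A tiny sketch of the kind of cross-function relationship described above (the weekly figures are invented purely for illustration) – computing the Pearson correlation between, say, support-ticket volume and customer churn:

```python
# Illustrative weekly figures (invented): support tickets vs. customer churn %.
tickets = [12, 18, 25, 31, 40, 52]
churn   = [0.8, 1.1, 1.4, 1.9, 2.3, 3.0]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(tickets, churn)
print(round(r, 3))  # a value near 1 suggests the two series move together
```

In practice the datasets are vastly larger and the relationships far less obvious, which is exactly why they had previously been missed – but the underlying statistical idea is this simple.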

What has all this got to do with business and why is it a big deal?

We are now able to collate huge amounts of data from all kinds of sources – websites, social media, customers, staff, financial and sales reports and more. But this information isn't always easy to analyse or use to our advantage. Big Data is about using modern analysis techniques to combine and contrast datasets in different ways, extracting meaningful information that supports decision-making, quickly.

For example, Netflix uses Big Data analysis in its recommendation engine to suggest programmes and films of potential interest based on a customer’s viewing habits and stated preferences. Similarly, Amazon uses it to suggest other products of interest based on previous purchases and website browsing analysis, while Lloyds Banking Group worked with the Advanced Skills Institute and Google to use Big Data to develop customer-focused propositions.
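To give a flavour of how a recommendation engine works, here is a minimal item-based collaborative-filtering sketch. This is not Netflix's or Amazon's actual algorithm – their systems are vastly more sophisticated – and the users and titles are invented; the principle is simply that people with overlapping histories are treated as having similar taste:

```python
# Minimal item-based collaborative filtering (illustrative only).
# Users and titles are invented; real engines are far more sophisticated.
viewing = {
    "alice": {"Drama A", "Thriller B", "Comedy C"},
    "bob":   {"Thriller B", "Comedy C", "Sci-Fi D"},
    "carol": {"Drama A", "Sci-Fi D"},
}

def recommend(user, history):
    """Suggest the unseen title most watched by users with similar taste."""
    scores = {}
    for other, seen in history.items():
        if other == user:
            continue
        overlap = len(history[user] & seen)  # shared viewing = similar taste
        for title in seen - history[user]:
            scores[title] = scores.get(title, 0) + overlap
    return max(scores, key=scores.get) if scores else None

print(recommend("alice", viewing))  # → Sci-Fi D
```

Alice shares two titles with Bob and one with Carol, and both have watched "Sci-Fi D" – so that becomes the suggestion.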

What has any of this got to do with ISPs?

Quite simply, for those organisations that adopt Big Data and advanced business analytics, it is going to drive bandwidth requirements up and make connectivity even more vital. Big Data is very capacity-hungry: unstructured data comprising large volumes of text, high-resolution images, maps and other rich content will put enormous strain on both networks and storage resources.

Making Big Data work requires quite specific, well-developed skills, and this has made it difficult to cost-justify, limiting adoption to a large degree until now.

But many organisations are now looking at taking Big Data pilot projects live. According to an article in Information Age, a recent survey of 2,200 Big Data customers found that 76% of respondents who already used open source Big Data solutions planned on doing more within the next three months.

As the perceived risk falls, the cost is also coming down, and more cloud- and appliance-based services are now available. According to a recent report by Ovum, a rising tide of IT spending will lift investment in big data analytics, with appliance and cloud driving the next wave of uptake to mainstream enterprises that lack the same depth of IT skills as the early adopters.

Organisations that adopt cloud-based Big Data in particular are going to become much more heavily dependent on bandwidth. This will especially be the case if, as some analysts predict, 'Fast Data' becomes even more important: adoption of technologies such as the Apache Spark processing engine, which speed up the delivery of analytics from Big Data engines, is gathering pace. It means that user organisations will need real-time access to the data.
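The 'Fast Data' idea – analytics computed continuously over a moving window of recent events, rather than as a batch job over the full dataset – can be sketched in a few lines of plain Python. This is a toy illustration only; a production deployment would use a streaming engine such as Apache Spark rather than anything like this:

```python
from collections import deque
import time

# Toy sliding-window aggregator sketching the 'Fast Data' idea: metrics over
# a moving window of recent events, not a batch job over the whole dataset.
class SlidingWindow:
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, value) pairs, oldest first

    def add(self, value, now=None):
        now = time.time() if now is None else now
        self.events.append((now, value))
        # Evict events that have fallen out of the window.
        while self.events and self.events[0][0] < now - self.window:
            self.events.popleft()

    def average(self):
        if not self.events:
            return 0.0
        return sum(v for _, v in self.events) / len(self.events)

w = SlidingWindow(window_seconds=60)
w.add(10, now=0)
w.add(20, now=30)
w.add(30, now=90)   # the event at t=0 has now dropped out of the 60s window
print(w.average())  # → 25.0, the average of the events still in the window
```

Every arriving event updates the answer immediately and old data ages out – which is exactly why this style of analytics demands the continuous, real-time connectivity discussed below.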

This means bandwidth that is fast, guaranteed and very broad. It may not happen everywhere or very soon, but the rise of Big Data – and in particular Big Data-as-a-Service (BDaaS) and Fast Data – will drive demand for even bigger, faster, more resilient connectivity.

For ISPs or resellers serving enterprise or mid-market companies, it may be a good time to ask customers and prospects about their plans for Big Data and whether or not they have looked at the underlying infrastructure and connectivity that they will need to support it.

Have your say!

Do you proactively analyse data to improve your business processes or as a marketing tool? Are you looking for new ways to collate and analyse the information you have available? Do you see Big Data usage generating more opportunities for your business? Have your say by leaving us a comment below.
