
Showing posts from September, 2019

Post #3: Growth of Big Data

The 1990s saw incredible internet growth, and personal computers became steadily more powerful and more flexible. That growth was driven by Tim Berners-Lee’s efforts, CERN’s decision to make the web freely available, and widening access to personal computers. In 2005, Roger Mougalas gave a name to a phenomenon that had long existed without one: Big Data, meaning data sets so large that, at the time, they were almost impossible to manage and process using the traditional business intelligence tools available. That same year, Hadoop was created to handle Big Data. Hadoop grew out of Nutch, an open-source web-crawler project, and incorporated Google’s MapReduce programming model. As an open-source software framework, Hadoop can process structured and unstructured data from almost any digital source, and because of this flexibility, Hadoop (and its sibling frameworks) can process Big Data. Big Data is revolutionising entire industries and changing human culture and behaviour.
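The MapReduce model that Hadoop adopted from Google can be sketched in a few lines: a map step emits key/value pairs from raw input, and a reduce step aggregates all values sharing the same key. Below is a minimal, single-machine illustration in Python using the classic word-count example; a real Hadoop job would be written against the Hadoop API and distributed across a cluster, but the two-phase shape is the same.

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce step: sum the counts for each distinct word (key)."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

docs = ["Big Data is big", "Hadoop can process Big Data"]
word_counts = reduce_phase(map_phase(docs))
# e.g. word_counts["big"] == 3
```

Because the map step treats each document independently and the reduce step only needs pairs grouped by key, both phases parallelise naturally across machines, which is what lets frameworks like Hadoop scale to Big Data volumes.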

Post #2: History of Big Data

Big Data has been described by some Data Management pundits (with a bit of a snicker) as “huge, overwhelming, and uncontrollable amounts of information.” In 1663, John Graunt dealt with “overwhelming amounts of information” as well, while he studied the bubonic plague, which was then ravaging Europe. Graunt is credited with being the first person to use statistical data analysis. In the early 1800s, the field of statistics expanded to include collecting and analysing data. The evolution of Big Data includes a number of preliminary steps for its foundation, and while looking back to 1663 isn’t necessary to explain the growth of data volumes today, the point remains that “Big Data” is a relative term depending on who is discussing it. Big Data to Amazon or Google is very different from Big Data to a medium-sized insurance organisation, but no less “Big” in the minds of those contending with it. Such foundational steps to the modern conception of Big Data invol

Post #1: Definition of Big Data

Big Data is a term used to describe a massive volume of both structured and unstructured data that is so large it is difficult to process using traditional database and software techniques. In most enterprise scenarios the volume of data is too big, it moves too fast, or it exceeds current processing capacity. Big Data comes from text, audio, video, and images. Organisations and businesses analyse Big Data to discover patterns and trends in human behaviour and our interaction with technology, which can then inform decisions that shape how we live, work, and play. Big Data can also be analysed for insights that lead to better decisions and strategic business moves.