My Own Definition of Big Data

Big Data refers to collections of data that reach the limits of the processing capabilities of commodity data manipulation and analysis tools. Therefore, more robust solutions are needed to understand, relate, and profit from the information contained in the processed data. The threshold at which data becomes too large, or too costly, to process shifts dynamically with technological advances and the availability of that technology.

Big Data represents a growing business opportunity, since new market segments, or preferences within a market, can be discovered through the analysis of data. Furthermore, entirely new products or industries can emerge from the analysis of large data sets.

Other definitions of Big Data

McKinsey Global Institute

“Big data” refers to datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze. This definition is intentionally subjective and incorporates a moving definition of how big a dataset needs to be in order to be considered big data—i.e., we don’t define big data in terms of being larger than a certain number of terabytes (thousands of gigabytes). We assume that, as technology advances over time, the size of datasets that qualify as big data will also increase. Also note that the definition can vary by sector, depending on what kinds of software tools are commonly available and what sizes of datasets are common in a particular industry. With those caveats, big data in many sectors today will range from a few dozen terabytes to multiple petabytes (thousands of terabytes).

Source: Big data: The next frontier for innovation, competition, and productivity, May 2011. Authors: James Manyika, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, Angela Hung Byers.

Gartner

http://www.gartner.com/it-glossary/big-data/

Big data is high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making. Big data – information of extreme size, diversity and complexity – is everywhere. This disruptive phenomenon is destined to help organizations drive innovation by gaining new and faster insight into their customers. So, what are the business opportunities? And what will they cost?

WIKIPEDIA & THE ECONOMIST

http://en.wikipedia.org/wiki/Big_data#cite_note-Economist-1

Big data is an all-encompassing term for any collection of data sets so large or complex that it becomes difficult to process them using traditional data processing applications.

The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. The trend to larger data sets is due to the additional information derivable from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data, allowing correlations to be found to “spot business trends, prevent diseases, combat crime and so on.”[1]

http://www.economist.com/node/15557443

Big Data Summary

Chapter 1 – Now

The author, Viktor Mayer-Schoenberger, introduces the reader to the world of big data by emphasizing major world problems that were addressed, and valuable companies and acquisitions that came about, through the analysis of information with this technique and through people's curiosity.

The first example provided by the author concerns the spread of diseases around the world and the slow response of governments in tracking and containing them with appropriate measures. The case discussed is the H1N1 outbreak, where the only way to restrain it was to identify which regions had been affected by the sickness. Regrettably, the spread of the disease could only be determined after a two-week delay.

Subsequently, Google released a paper showing how the company could predict the presence of the disease by monitoring keyword searches in each region and testing millions of mathematical models against the search data, validating them against previous flu outbreaks and their spread through the country. Others had tried this, but only Google had enough data and the technological muscle to achieve it.
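The core idea can be illustrated with a very small sketch: for each candidate search term, measure how well its weekly search frequency correlates with historical flu case counts, and keep only the strongly correlated terms as predictors. This is a minimal, hypothetical illustration of the correlation-screening approach described above; the terms and numbers below are invented for the example and are not Google's actual model or data.

# Minimal sketch: rank candidate search terms by how well their weekly
# frequency tracks historical flu case counts. All terms and figures are
# hypothetical illustrations, not real data.

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Weekly counts of reported flu cases (hypothetical historical data).
flu_cases = [120, 150, 210, 340, 500, 470, 300]

# Weekly search volume per candidate term (hypothetical).
search_terms = {
    "fever remedies":  [90, 110, 160, 250, 390, 360, 240],
    "cough medicine":  [80, 95, 140, 230, 350, 330, 220],
    "holiday recipes": [300, 280, 260, 290, 310, 270, 300],
}

# Keep only terms whose search volume moves closely with the case counts.
predictive = {
    term: round(pearson(volume, flu_cases), 3)
    for term, volume in search_terms.items()
    if pearson(volume, flu_cases) > 0.8
}
print(predictive)  # prints only the terms that correlate strongly with flu cases

In this toy run the two health-related terms pass the threshold while the unrelated one does not, which mirrors the screening step: the model does not ask why a term tracks the flu, only whether it does.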

Another example mentioned in the first chapter tells how Farecast helped its users save billions of USD on flight tickets simply by showing them price trends together with the likelihood of a price increase or decrease over time. Again, all of this was done by analyzing a huge set of airline data and its historical behavior.
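A very reduced sketch of that "buy now or wait" idea: look at historical price trajectories for a route at the same number of days before departure and report how often the fare later rose. The route, prices, and threshold below are hypothetical and only illustrate the kind of trend estimate described above, not Farecast's actual method.

# Minimal sketch of a "buy or wait" recommendation from historical fares.
# Each trajectory lists the fare observed at 30, 20, 10 and 3 days before
# departure. All figures are hypothetical.

historical_fares = [
    [320, 310, 380, 450],
    [300, 305, 340, 410],
    [330, 290, 285, 300],
    [310, 320, 360, 430],
]

idx = 0  # the traveller is currently at the 30-days-out observation point

# Fraction of past trajectories in which the fare rose after this point.
rose_later = sum(1 for t in historical_fares if max(t[idx + 1:]) > t[idx])
likelihood_increase = rose_later / len(historical_fares)

advice = "buy now" if likelihood_increase > 0.5 else "wait"
print(f"Probability of a later increase: {likelihood_increase:.0%} -> {advice}")

With these sample trajectories three out of four fares rose later, so the sketch would advise buying now; the point is that the recommendation comes purely from historical patterns, not from any explanation of airline pricing.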

Based on these examples, one definition proposed in the book can be better understood: “Big data refers to things one can do at a large scale that cannot be done at a smaller one.” To better understand the variables of big data, it should be kept in mind that technology is not the only factor at play; human creativity in seeking interconnections across data sets to find new information matters just as much.

Improvements in data processing tools alone don't offer a radical enhancement to business processes. Therefore, the new data manipulation capabilities should be used to explore combinations of data and create new information at levels that weren't foreseen. Examples of this growing capacity to gather data can be found in astronomy, where new telescopes capture more information in a few weeks than had been learned in the entire previous history of humankind.

Although Big Data is sometimes considered a branch of artificial intelligence, it is far from that: Big Data is a predictive tool focused on finding out which events will happen next, based on a known context with enough information to simulate scenarios. Furthermore, Big Data does not seek to explain why the data behaves in specific ways, but to find out what is happening and, possibly, how it will continue.

Big Data manipulates great amounts of information that may be of low quality, which can result in inexactness. Therefore, the better the quality of the data the system is fed with, the better the outputs that will result.