It was projected that by 2020, 1.7 megabytes of data would be generated every second for every person in the world, and the proportion of data that needs to be protected is growing faster than the digital universe itself. Data arriving in such large volumes from so many different sources is what we call Big Data.
Big data refers to sets of structured or unstructured data whose volume and complexity are so great that traditional data processing software cannot handle them within a reasonable amount of time. The information mined from these sets is then analyzed and put to good use. Big data involves more than just the volume and complexity of data, however. Doug Laney famously characterized big data in terms of 3 V's.