A “big data” problem can be defined as one that exceeds the processing capacity of conventional database systems: the data is too big, moves too fast, or doesn’t fit the structures of your database architectures. Big data rests on the premise that the data being analyzed contains information of great business value, and that if you can extract those insights, you can make far better decisions.
Big data problems that aren’t so big. Some of the problems that fall into this category do not necessarily require large volumes of data to be processed, but do involve data that arrives at high frequency.
The data streaming from a hundred thousand sensors on an aircraft qualifies as big data. However, the size of the data set is not as large as might be expected. Even a hundred thousand sensors, each producing an eight-byte reading every second, would produce less than 3GB of data in an hour of flying (100,000 sensors x 60 minutes x 60 seconds x 8 bytes).
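The back-of-the-envelope arithmetic above can be checked with a few lines of code (the sensor count and reading size are the figures assumed in the example, not real aircraft specifications):

```python
# Estimate the data volume from the aircraft-sensor example.
sensors = 100_000          # assumed number of sensors on the aircraft
reading_bytes = 8          # one eight-byte reading per sensor per second
seconds_per_hour = 60 * 60

total_bytes = sensors * reading_bytes * seconds_per_hour
total_gb = total_bytes / 1e9   # decimal gigabytes

print(f"{total_gb:.2f} GB per flight hour")  # 2.88 GB per flight hour
```

High frequency, then, does not automatically mean high volume: under three gigabytes an hour is well within the reach of ordinary hardware.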
The three V’s of big data. The 3Vs (volume, variety and velocity) are three defining properties, or dimensions, of big data. Volume refers to the amount of data; variety refers to the number of different types of data; and velocity refers to the speed at which the data is processed.
Uses. The value of big data to an organization falls into two main categories:
- Analytical value
- Enabling new products
Some of its most common uses include:
- Weather forecasting
- Disease prediction
- Doctor performance
- Consumer habits
- Fraud detection
You can find more related applications in our last post, “Five challenges facing computer science businesses in the next ten years”.