As we enter a new computing age, with connected devices all around us and ambient sensing intelligence pervasive, the data generated will run into hundreds of terabytes.
A major chunk of this data will be generated by the IoT devices around us (sensors, actuators, wearables, smartphones, etc.). This data eventually accumulates into Big Data, and gaining actionable insights from it, both in real time and offline, will be a major challenge.
Once a large number of sensors are deployed and activated for a specific end application, they begin generating data and pushing it to analytics platforms, which derive actionable insights and send back actuating signals accordingly.
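The sense, analyze, and actuate loop described above can be sketched in a few lines. This is only an illustration of the pattern: the rolling-mean analytic, the window size, the threshold, and the sample values are all assumptions, not part of the original text.

```python
# Minimal sketch of the sense -> analyze -> actuate loop.
# Window size, threshold, and readings are illustrative assumptions.
from collections import deque

WINDOW = 5          # rolling analytics window (assumed)
THRESHOLD = 100.0   # alert threshold on the rolling mean (assumed)

def analyze(readings, window=WINDOW, threshold=THRESHOLD):
    """Stream sensor readings through a rolling-mean check and
    yield one actuation signal (True = actuate) per reading."""
    buf = deque(maxlen=window)
    for value in readings:
        buf.append(value)
        mean = sum(buf) / len(buf)
        # Actuate when the rolling mean exceeds the threshold --
        # the "actionable insight" driving the actuating signal.
        yield mean > threshold

# Hypothetical stream of sensor readings:
signals = list(analyze([90, 95, 120, 130, 140]))
```

In a real deployment the readings would arrive from a message broker rather than a Python list, and the actuation signal would be sent back to the device, but the control flow is the same.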
Consider, for example, healthcare wearables such as ECG devices: each would generate at least 1,000 events every second, and millions of such wearables, once deployed, would generate a large volume of data running into tens of millions of data points.
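A back-of-the-envelope calculation puts this scale in perspective. The per-device rate of 1,000 events per second comes from the ECG example above; the fleet size and bytes-per-event figures are assumed purely for illustration.

```python
# Throughput estimate for a fleet of ECG wearables.
# Per-device rate (1,000 events/s) is from the text; fleet size and
# event size are illustrative assumptions.

EVENTS_PER_DEVICE_PER_SEC = 1_000   # from the ECG example
DEVICES = 100_000                   # assumed fleet size
BYTES_PER_EVENT = 16                # assumed: timestamp + sample + metadata
SECONDS_PER_DAY = 86_400

events_per_sec = EVENTS_PER_DEVICE_PER_SEC * DEVICES
bytes_per_day = events_per_sec * BYTES_PER_EVENT * SECONDS_PER_DAY

print(f"{events_per_sec:,} events/s")          # 100,000,000 events/s
print(f"{bytes_per_day / 1e12:.1f} TB/day")    # 138.2 TB/day
```

Even this modest assumed fleet yields hundreds of millions of events per second and a daily volume in the hundreds-of-terabytes range cited at the start of this section.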
This data arrives in real time as a stream, the characteristic known as Velocity in the Big Data definition.
The data also exhibits Variety, since it is both structured and unstructured, with diverse data models drawn from different sources. Data uncertainty, referred to as Veracity, arises from incompleteness, inconsistency, ambiguity, latency, and so on.
Of the four challenges mentioned above for the Big Data generated by IoT, the velocity at which data flows in, in real time and as streams, poses the greatest obstacle to gaining actionable insights.
Companies that address these challenges of Volume, Velocity, Variety, and Veracity and gain actionable insights will be at the forefront of IoT Big Data analytics. Moreover, a siloed approach should be avoided in favor of sharing information across complementary IoT platforms.