Pushparaj Zala

Connected world, Cloud and Analytics

Updated: Aug 12, 2021




When we talk about connected things, a lot of development is going on across all industry segments. We saw quite a few product launch announcements in this area last year. Still, I feel there are many challenges to implementation, including remote connectivity, device management, network protocol standards, energy consumption, privacy/security, and others. Maybe this is why we are not yet seeing large numbers of connected devices in our day-to-day lives, even though IoT has been talked about for more than a few years now. But that is not the case for industrial use of IoT. Industry is investing heavily in IoT, and many implementations are already in production, helping with real-time operations, cost optimization, and resource utilization. Please check out this video for further details on how Microsoft Azure IoT helps the industry.

The evolution of the public cloud will help boost connected devices and their applications. It will not solve the basic problem of Internet availability for things, but it will definitely solve the problem of connectivity and make it easier to process data. We had implemented an end-to-end IoT solution on Amazon Web Services (AWS) before the AWS IoT service was launched. Here, I will discuss the differences between the architecture before and after the AWS IoT launch, to give some insight into how to leverage this new service for applications in the data mining and analytics field.

Before AWS IoT service:

Below is the architecture in which sensor nodes connect directly to AWS Kinesis and send sensor data.

(Figure: architecture before AWS IoT)
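To make the "before" picture concrete, here is a minimal sketch of a sensor node pushing readings straight into Kinesis with boto3. The region, stream name, and payload fields are placeholders, not the original implementation. Notice that the device itself has to hold AWS credentials, which is exactly the problem discussed further below.

```python
# Minimal sketch: a sensor node writing readings directly to an Amazon
# Kinesis stream with boto3. Stream name, region, and payload fields are
# placeholders; the AWS credentials must live on the device itself
# (picked up here from the environment or a local config).
import json
import time

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")


def publish_reading(sensor_id: str, temperature: float) -> None:
    record = {
        "sensor_id": sensor_id,
        "temperature": temperature,
        "timestamp": int(time.time()),
    }
    kinesis.put_record(
        StreamName="sensor-stream",          # hypothetical stream name
        Data=json.dumps(record).encode(),
        PartitionKey=sensor_id,              # keeps one sensor's data on one shard
    )


if __name__ == "__main__":
    publish_reading("node-01", 24.7)
```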


After this, we have multiple options for reading data from the AWS Kinesis stream. We can use Apache Storm for real-time streaming analytics; a sample Kinesis Storm spout is available here. To display real-time data on a dashboard, we used Kibana: Elasticsearch reads the Kinesis stream, and the processed data is then visualized in Kibana. As AWS keeps adding features to its services, it now provides an Amazon Elasticsearch Service out of the box; for more detail, please check out this blog by Jeff Barr. We can also use AWS Elastic MapReduce and process the Kinesis stream with MapReduce tasks. Storing data to DynamoDB or other services is also possible.
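As a simple illustration of the consumer side, the sketch below polls one shard of the same hypothetical sensor-stream with boto3. A real deployment would use the Kinesis Client Library or the Storm spout mentioned above rather than a bare polling loop.

```python
# Minimal sketch: reading the Kinesis stream on the analytics side.
# Stream name and region are placeholders; records printed here would be
# handed off to Elasticsearch/Kibana or another sink in practice.
import json
import time

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Look up the first shard of the stream (a single-shard stream is assumed).
stream = kinesis.describe_stream(StreamName="sensor-stream")
shard_id = stream["StreamDescription"]["Shards"][0]["ShardId"]

iterator = kinesis.get_shard_iterator(
    StreamName="sensor-stream",
    ShardId=shard_id,
    ShardIteratorType="LATEST",              # only records arriving from now on
)["ShardIterator"]

while True:
    out = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for rec in out["Records"]:
        reading = json.loads(rec["Data"])
        print(reading)                       # hand off to Elasticsearch/Kibana here
    iterator = out["NextShardIterator"]
    time.sleep(1)                            # stay under per-shard read limits
```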

After AWS IoT service:

The first part of the above architecture needs several fixes. For example, we have to manually manage every component connected to the network. Also, to send data to a specific AWS service (Kinesis in our case), AWS API keys with specific roles need to be present on the things/devices themselves. AWS IoT provides an excellent solution for all of that: we can manage things/devices with all the features of AWS IAM, including certificate provisioning for each device, and we can revoke the certificate associated with any node at any time.
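To illustrate that device-management workflow, here is a minimal sketch using the boto3 IoT control-plane client, with hypothetical thing and policy names, of provisioning a certificate for a node and revoking it later. The policy document is deliberately broad and would be tightened in a real deployment.

```python
# Minimal sketch: provisioning and revoking a device certificate through
# the AWS IoT control-plane API via boto3. Thing name, policy name, and
# the wide-open policy document are placeholders.
import json

import boto3

iot = boto3.client("iot", region_name="us-east-1")

# 1. Create a certificate and key pair for the device.
cert = iot.create_keys_and_certificate(setAsActive=True)
cert_id = cert["certificateId"]
cert_arn = cert["certificateArn"]

# 2. Attach a policy allowing the device to connect and publish.
iot.create_policy(
    policyName="sensor-node-policy",         # hypothetical policy name
    policyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Allow", "Action": "iot:*", "Resource": "*"}],
    }),
)
iot.attach_policy(policyName="sensor-node-policy", target=cert_arn)

# 3. Register the thing and bind the certificate to it.
iot.create_thing(thingName="node-01")        # hypothetical thing name
iot.attach_thing_principal(thingName="node-01", principal=cert_arn)

# Later, the certificate can be revoked at any time to cut the node off.
iot.update_certificate(certificateId=cert_id, newStatus="REVOKED")
```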

Below is the architecture after using the AWS IoT service:


(Figure: architecture after AWS IoT)
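As a device-side illustration, here is a minimal sketch using the AWS IoT Device SDK for Python (AWSIoTPythonSDK), with a placeholder endpoint, certificate paths, and topic, of a node publishing the same reading over MQTT secured by an X.509 certificate instead of embedded AWS API keys.

```python
# Minimal sketch: the device side after AWS IoT. The node publishes its
# reading over MQTT/TLS using an X.509 certificate. Endpoint, file paths,
# client id, and topic are placeholders.
import json
import time

from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

client = AWSIoTMQTTClient("node-01")                           # MQTT client id
client.configureEndpoint("xxxxxxxx-ats.iot.us-east-1.amazonaws.com", 8883)
client.configureCredentials("root-ca.pem", "node-01.key", "node-01.crt")

client.connect()
client.publish(
    "sensors/node-01/data",                                    # topic the rule below listens on
    json.dumps({
        "sensor_id": "node-01",
        "temperature": 24.7,
        "timestamp": int(time.time()),
    }),
    1,                                                         # QoS 1
)
client.disconnect()
```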

With the Rules Engine of AWS IoT, we can route messages to different AWS services, as shown in the sketch below. The service also provides much-needed support for the MQTT protocol. Other notable features include Device Shadows and the Device SDKs. The rest of the architecture remains the same for data analytics and visualization, which still uses Storm, Elasticsearch, and the other methods above. But with AWS IoT we can now also talk back to devices, which lets us design a wide range of real-time applications. The ultimate goal is to use the historical data generated, find patterns in it, and let those patterns drive key decisions.
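As an example of that routing, the sketch below creates a hypothetical topic rule with boto3 that forwards everything published under sensors/+/data into the existing Kinesis stream, so the downstream analytics pipeline stays untouched. The rule name, stream name, and IAM role ARN are placeholders.

```python
# Minimal sketch: a Rules Engine rule that routes MQTT messages from the
# sensor topic into the existing Kinesis stream. Rule name, topic filter,
# stream name, and role ARN are placeholders.
import boto3

iot = boto3.client("iot", region_name="us-east-1")

iot.create_topic_rule(
    ruleName="sensor_to_kinesis",            # hypothetical rule name
    topicRulePayload={
        # Select every message published to sensors/<device-id>/data.
        "sql": "SELECT * FROM 'sensors/+/data'",
        "ruleDisabled": False,
        "actions": [{
            "kinesis": {
                "roleArn": "arn:aws:iam::123456789012:role/iot-to-kinesis",  # placeholder role
                "streamName": "sensor-stream",
                "partitionKey": "${topic(2)}",   # the device-id segment of the topic
            }
        }],
    },
)
```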

Conclusion:

With reduced hardware costs and the availability of excellent cloud services, there is immense opportunity in applications ranging from factory automation, healthcare, logistics and warehouse management, and remote monitoring of devices/things to home automation.

