Processing Critical Data with MongoDB and Pentaho

November 9, 2015 / wot.io, Critical Mention, MongoDB, Pentaho / Posted By: wotio team

As we mentioned in a previous post about NGDATA and scriptr.io, we have a partnership with Critical Mention that gives us access to their enriched real-time media stream, containing transcribed broadcast content from radio and television in the US, Canada, Mexico, the UK, and other countries. As we showed there, this rich feed can be processed in many ways, and the stream itself can be transformed with data services into new, useful feeds.

As another example of wot.io data services operating on the Critical Mention media feed, we routed the feed into a MongoDB instance. Working with one of our systems integration partners, DataArt, we then set up transformations on the data in a Pentaho instance. In addition to the transcribed text of each broadcast, the messages in the feed carry metadata such as the country where the broadcast originated, the network, and so on. Building Pentaho transformations on this metadata, we were able to quickly produce graphs showing how frequently each country appears in the feed.
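To give a concrete sense of the frequency-by-country rollup the Pentaho transformations chart, here is a minimal sketch of the same grouping done directly against MongoDB with an aggregation pipeline. The connection string, database, collection, and field names (criticalmention, broadcasts, country) are hypothetical stand-ins for illustration, not the actual deployment details.

```python
# Minimal sketch: count Critical Mention broadcast messages per originating
# country, assuming the feed has been routed into a MongoDB collection.
# Database, collection, and field names below are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
broadcasts = client["criticalmention"]["broadcasts"]

# Group messages by their "country" field and sort most-frequent first --
# the same frequency-by-country view graphed in Pentaho.
pipeline = [
    {"$group": {"_id": "$country", "count": {"$sum": 1}}},
    {"$sort": {"count": -1}},
]

for row in broadcasts.aggregate(pipeline):
    print(f'{row["_id"]}: {row["count"]}')
```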

This is a great example of how wot.io can route high-volume data to multiple deployed data services for processing. It also offers a glimpse of what is possible with the Critical Mention feed. We captured some of the details in a video. Enjoy!