Big data analytics on System z transactions

When Gartner asked users which data sources they want to analyze with big data technologies, the top response (70%) was transactional data. The second and third responses were log data (55%) and machine or sensor data (42%).

Paul DiMarzio, writing on the Smarter Computing Blog, looks at this finding in light of the huge volume of transactions processed by mainframe servers and explains the business case for deploying Hadoop on System z:

I’ve discussed the use of Hadoop for log processing with several clients who have been hesitant to do so because of concerns over moving sensitive data to distributed systems. Having the HDFS remain on System z maintains mainframe security over the data and simplifies compliance with enterprise data governance controls. Keeping mainframe data on the mainframe is a critical requirement for many clients, and [Veristorm’s] zDoop allows them to use Hadoop to process those logs without that data ever leaving the security zone of their mainframe box.
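The log processing DiMarzio describes typically follows Hadoop's map/reduce pattern, which with Hadoop Streaming can be written as ordinary scripts. The sketch below is a minimal, locally runnable illustration of that pattern; the access-log format, field positions, and status-code counting task are illustrative assumptions, not details of zDoop or any specific client workload:

```python
# Sketch of a Hadoop-streaming-style log job: count HTTP status codes.
# The log layout (Apache-style access log) is an assumption for illustration;
# in a real deployment the lines would be read from HDFS, not a list.
from itertools import groupby

def mapper(lines):
    """Emit (status_code, 1) for each access-log line."""
    for line in lines:
        fields = line.split()
        if len(fields) >= 8:       # assumed format: status code is field 8
            yield fields[7], 1

def reducer(pairs):
    """Sum counts per status code (Hadoop delivers keys sorted/grouped)."""
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield key, sum(count for _, count in group)

if __name__ == "__main__":
    sample = [
        '10.0.0.1 - - [01/Jan/2024] "GET / HTTP/1.1" 200 512',
        '10.0.0.2 - - [01/Jan/2024] "GET /x HTTP/1.1" 404 128',
        '10.0.0.1 - - [01/Jan/2024] "GET / HTTP/1.1" 200 512',
    ]
    for code, count in reducer(mapper(sample)):
        print(code, count)
```

In an actual Hadoop Streaming job the mapper and reducer would run as separate scripts reading stdin and writing stdout, with the framework handling the sort-and-group step between them; the point of keeping HDFS on System z is that these stages all run inside the mainframe's security zone.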

Read the full article: http://www.smartercomputingblog.com/big-data/the-elephant-on-the-mainframe/