Bitwise Brings Mainframe Data into Hadoop


Bitwise, a data management services company, announced the launch of its Hadoop Adaptor for Mainframe Data, intended for converting any mainframe data in EBCDIC format to Hadoop-friendly formats such as ASCII, Avro, and Parquet. The data conversion solution addresses compatibility issues of ingesting mainframe data into the Hadoop data lake for advanced analytics, combining mainframe data with any other data sources in the data lake, and achieving faster analysis on mainframe data in Hadoop.

Mainframes store data in EBCDIC format, which presents a challenge since Hadoop expects ASCII-based formats. Most leading ETL tools on the market have EBCDIC conversion capabilities, but they perform the conversion on their own infrastructure rather than natively on Hadoop. Furthermore, data structures defined in COBOL CopyBooks can become very complex due to COMP fields (packed and binary numeric data), nested fields, arrays, and REDEFINES clauses within flat files, making fast and accurate code translation difficult to achieve.
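To illustrate the encoding gap, the sketch below uses Python's built-in "cp037" codec (one common US EBCDIC code page; the actual code page varies by installation) to decode EBCDIC text, and shows why COMP-3 (packed decimal) fields cannot simply be decoded as text: they store two digits per byte with a trailing sign nibble and must be unpacked. This is a minimal, hypothetical illustration, not Bitwise's implementation.

```python
# EBCDIC text -> Unicode/ASCII using the CP037 code page (assumed here).
ebcdic_bytes = b"\xc8\x85\x93\x93\x96"  # "Hello" encoded in EBCDIC (CP037)
print(ebcdic_bytes.decode("cp037"))     # -> Hello


def unpack_comp3(data: bytes) -> int:
    """Unpack a COBOL COMP-3 (packed decimal) field into an int.

    Each byte holds two decimal digits; the final nibble is the sign
    (0xC = positive, 0xD = negative).
    """
    digits = []
    for b in data[:-1]:
        digits.append(str(b >> 4))
        digits.append(str(b & 0x0F))
    last = data[-1]
    digits.append(str(last >> 4))          # last digit
    sign = -1 if (last & 0x0F) == 0x0D else 1
    return sign * int("".join(digits))


print(unpack_comp3(b"\x12\x34\x5c"))  # -> 12345
```

Interpreting such fields correctly requires the CopyBook metadata, since the raw bytes alone do not distinguish text from packed numeric data.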

Bitwise Hadoop Adaptor for Mainframe Data is a standalone EBCDIC-to-Hadoop format conversion utility that understands all the different ways data is stored in mainframes (including COBOL CopyBooks) and can convert these complex data structures to the right structure for Hadoop.
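The core idea of CopyBook-driven conversion can be sketched as slicing fixed-width EBCDIC records according to the field layout the CopyBook defines. The layout and field names below are purely illustrative, and the code page ("cp037") is an assumption; a real adaptor would also handle COMP fields, arrays (OCCURS), and REDEFINES.

```python
# Hypothetical CopyBook layout, expressed as (name, offset, length):
#   01 CUSTOMER-REC.
#      05 CUST-ID    PIC X(5).
#      05 CUST-NAME  PIC X(10).
LAYOUT = [("CUST-ID", 0, 5), ("CUST-NAME", 5, 10)]

def parse_record(record: bytes) -> dict:
    """Slice a fixed-width EBCDIC record per the layout and decode each field."""
    return {
        name: record[off:off + length].decode("cp037").strip()
        for name, off, length in LAYOUT
    }

rec = "00042Jane Smith".encode("cp037")  # a sample 15-byte EBCDIC record
print(parse_record(rec))  # {'CUST-ID': '00042', 'CUST-NAME': 'Jane Smith'}
```

The resulting dictionaries map naturally onto row-oriented Hadoop formats such as Avro records or Parquet rows.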

Hadoop Adaptor for Mainframe Data handles all mainframe, AS/400, and other COBOL data files of any complexity, and delivers an optimized process for converting mainframe EBCDIC data to ASCII, Avro, and Parquet that increases overall data conversion efficiency and maximizes ROI.

For more information, visit Bitwise.