Many companies are beginning to acquire IoT data and other external datasets for new forms of analytics. At the same time, they are realizing the value of integrating this data with their existing operational systems and databases.
This has given way to a demand for new architecture patterns and data engineering best practices that are being incorporated into cloud and hybrid cloud strategies.
DBTA recently held a webinar featuring John O'Brien, principal advisor and CEO at Radiant Advisors, and Mark Van de Wiel, CTO at HVR, who discussed proven strategies, new technologies, and lessons learned from adopting hybrid cloud architectures.
Data lakes, built on HDFS or cloud object storage, are becoming a core part of cloud architecture, O'Brien explained. Large datasets are easier to handle in the cloud than in the data center, and companies want to realize the benefits of real-time and operational analytics, O'Brien said.
Database replication using non-intrusive change data capture (CDC) can continue to support the local on-premises database for operational reporting, and it can be paired with a cloud replication server for high-performance, secure data uploads.
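To illustrate the idea behind CDC-based replication, the sketch below applies an ordered stream of change events to a replica table. This is a generic, simplified illustration, not HVR's implementation: the event format and the dict-based target table are assumptions for demonstration only.

```python
# Minimal sketch of applying change-data-capture (CDC) events to a replica.
# A log-reading replicator emits ordered insert/update/delete events; here
# the target table is modeled as a plain dict keyed by primary key.

def apply_cdc_events(target, events):
    """Apply an ordered stream of CDC events to a target table (dict by key)."""
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            target[key] = event["row"]   # upsert the new row image
        elif op == "delete":
            target.pop(key, None)        # remove the row if present
    return target

# Example: replicate three changes captured from a source database log.
replica = {1: {"name": "Alice", "city": "Austin"}}
events = [
    {"op": "insert", "key": 2, "row": {"name": "Bob", "city": "Boston"}},
    {"op": "update", "key": 1, "row": {"name": "Alice", "city": "Chicago"}},
    {"op": "delete", "key": 2},
]
apply_cdc_events(replica, events)
# replica is now {1: {"name": "Alice", "city": "Chicago"}}
```

Because events are applied in log order, the replica converges to the same state as the source without intrusive queries against the operational database.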
Users can create streaming hubs in their data lakes for ingesting external and IoT data, and achieve high scalability and reliability with Apache Kafka. They can then stream web and mobile app logs, high-volume social media feeds, and readings from IoT device sensors. Cloud marketplaces offer a variety of data technologies, O'Brien explained, and there needs to be a balance between SaaS, PaaS, and IaaS.
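As a sketch of how IoT sensor readings might be prepared for such a streaming hub, the example below serializes a reading into a keyed message of the kind a Kafka producer would send. A real producer would use a Kafka client library; the field names and message shape here are assumptions for illustration. Keying by device id ensures each device's readings land in one partition and stay ordered.

```python
import json

# Sketch: turn one IoT sensor reading into a (key, value) pair of bytes,
# the shape a Kafka producer sends to a topic. The device id is used as
# the partition key so readings from one device preserve their order.

def to_kafka_message(reading):
    """Serialize a sensor reading dict into (key_bytes, value_bytes)."""
    key = reading["device_id"].encode("utf-8")
    value = json.dumps(
        {"ts": reading["ts"], "metric": reading["metric"], "value": reading["value"]},
        sort_keys=True,
    ).encode("utf-8")
    return key, value

key, value = to_kafka_message(
    {"device_id": "sensor-42", "ts": 1700000000, "metric": "temp_c", "value": 21.5}
)
# key == b"sensor-42"; value is the JSON-encoded reading
```

With a client such as kafka-python, the pair would then be passed to the producer's send call for the chosen topic; the serialization step itself is independent of the client used.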
Van de Wiel said that HVR can enable continuous integration of these systems to help users understand and utilize their hybrid-cloud systems.
An archived on-demand replay of this webinar is available here.