Data Fabric, Data Mesh, And the Cloud: Data Management Architectures for the Future



Moving to next-generation data architectures is a journey, not an overnight sprint. “They can be complex and challenging to design and implement, requiring specialized knowledge and expertise,” said Jerod Johnson, senior technology evangelist at CData. “This can make it difficult for organizations to fully understand and take advantage of the capabilities that these architectures offer.” In addition, “adopting a next-generation data architecture may require significant changes to existing systems and processes, which can be difficult and time-consuming. Integrating new technologies with legacy systems can be a challenging task and could require additional resources, expertise, and testing.”

There are funding issues and business commitments as well. “Modern architectures can be expensive to implement and maintain, requiring significant investments in hardware, software, and personnel,” Johnson said. “This can be a barrier for some organizations, especially smaller ones.”

Even at large enterprises, there aren’t “enough people with knowledge and experience to run these things appropriately,” said Grant Fritchey, product advocate for Redgate Software. “Not only could you end up with insecure or non-functional data stores, you could lose data or, worse yet, run up unneeded costs.”

In addition, implementing data mesh or data fabric may see more success within larger enterprises, but be too elaborate for smaller companies or startups. “It might be suitable for big organizations where each team owns their specific domain, but not be applicable to smaller organizations which have limited IT staff managing and owning all the data for a company,” said Dangol.

Additional challenges with mesh include data consistency, data governance, data quality, complexities, and interoperability, Dangol added. With data fabric, challenges include “complex data integration for various source systems, ensuring proper data governance in centralized environments, maintaining scalable infrastructure, data quality, and cost to maintain data in a central environment.”

Data security is also an issue. “New systems can also introduce new security risks, especially when dealing with large amounts of sensitive data, such as cloud-based data architecture,” said Johnson. “Organizations will need to invest in security measures to protect against these risks, which can be costly and complex. Some organizations may have regulatory requirements that must be met, and next-generation data architectures may not be able to comply with these regulations. This can make it difficult for organizations in certain industries, such as finance or healthcare, to adopt these architectures.”

Data governance also needs to be stepped up as these next-generation architectures come into play. “Data is an incredible asset, but if it’s not managed well, it can also be an incredible liability, especially around compliance,” Webber said. “While mesh infrastructure helps with this, at the end of the day you have a people problem. People publish data from their team in good faith only to find it really shouldn’t be shared. Governance can hinder fabric as well. Mastering an enterprise’s data, even with such tools, remains a difficult task with often multiple owners who think their data is authoritative. Cutting that Gordian knot is hard because it’s a people problem that involves many people.”

There is demand on the part of businesses to “be accountable for data, with actual business KPIs linked to data effectiveness and where failure to deliver is seen as a business risk and cost,” said Jones. In addition, he added, new approaches need to overcome “a data team aligned to IT that is still wedded to traditional post-transactional, reporting-centric approaches for data. This means that the need for operational control and accuracy is replaced with a focus on data-quality pipelines and manual cleanup. The challenge in shifting from a reporting-centric, post-transactional data warehouse to a business-owned, operational speed and insight-driven data mesh is a cultural one—the technology just industrializes that transformation.”


As with any groundbreaking technology, the needs of the business come first. “Getting started on next-generation data architectures almost always requires starting with a clear understanding of your business—strategic goals, key stakeholders, pain points in the current environment, financial constraints, and costs of the current environment,” said Greenstein. “These are the essential ingredients in defining the right target to start an architecture and roadmap for your business.”

Open communication and collaboration are paramount. “Ensure that all the relevant stakeholders are aware of the changes and that the teams that will be working with the new architecture are properly trained,” said Johnson. “This will help to minimize disruptions to your organization’s business and will help to ensure a smooth transition to the new architecture.”

The move also requires addressing a range of questions and concerns. “You need to address your culture and identify your tech challenges as decentralization versus centralization, scalability, product-oriented data mindset, and cultural shift,” said Dangol. “Data mesh or data fabric isn’t a tool or software, but a new way of thinking about managing data. In data mesh, we need to map out the departmental ownership of data to find the right balance. Once the groundwork has been laid, you need to find the right tools and ensure you have the right architecture and quality control over your data.”

Building a next-generation data architecture “requires diligence and good planning,” said Sarkar. “Many organizations are still struggling to move away from the traditional approach of central data team building and managing the entire data platform. In the new scheme of things, the core team is tasked to build and manage the core platform with reusable components and common frameworks to ingest, transform, and work with the data, which is then leveraged by other teams to build and manage their data products. Every organization’s journey will look different. However, the key tenets and principles will mostly remain the same.”
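The pattern Sarkar describes—a core team maintaining reusable pipeline components that domain teams extend into their own data products—can be sketched in code. The following is a minimal, hypothetical illustration (all class and field names are invented for this example, not drawn from any particular platform): the platform team owns the shared ingest-transform-validate flow and a common governance check, while a domain team supplies only its source-specific logic.

```python
# Hypothetical sketch of a "core platform + domain data products" split.
# The base class is owned by the central platform team; domain teams
# subclass it to build their own data products.
from abc import ABC, abstractmethod
from typing import Iterable


class DataProduct(ABC):
    """Reusable pipeline skeleton maintained by the core platform team."""

    @abstractmethod
    def ingest(self) -> Iterable[dict]:
        """Pull raw records from the domain's source system."""

    @abstractmethod
    def transform(self, record: dict) -> dict:
        """Apply domain-specific business rules to one record."""

    def publish(self) -> list[dict]:
        # Shared pipeline: every domain product gets the same
        # ingest -> transform -> validate flow without rewriting it.
        published = []
        for record in self.ingest():
            cleaned = self.transform(record)
            self._validate(cleaned)
            published.append(cleaned)
        return published

    def _validate(self, record: dict) -> None:
        # Common governance hook: every published record must name an owner.
        if "owner" not in record:
            raise ValueError("record missing required 'owner' field")


class OrdersProduct(DataProduct):
    """Example product owned by a (hypothetical) orders domain team."""

    def ingest(self) -> Iterable[dict]:
        # Stand-in for a real source-system connector.
        return [{"order_id": 1, "amount": 42.0}]

    def transform(self, record: dict) -> dict:
        # Domain rule: tag each record with its owning team.
        return {**record, "owner": "orders-team"}


print(OrdersProduct().publish())
```

The point of the split is that governance and pipeline mechanics live once, in the base class, while ownership of the data itself stays with the domain team—mirroring the "common frameworks, leveraged by other teams" arrangement described above.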
