Architecting For Speed and Scale at Data Summit 2022

Around 85% of analytics, big data, and AI projects fail despite massive investments of money. That's not news, but it underscores how powerfully design affects speed, scale, and adoption.

At Data Summit 2022, Brian O'Neill, founder and principal, Designing for Analytics, presented his session, “Technically Right, Effectively Wrong: How to Avoid Creating the ML or Analytics Application No Customer Wants to Use.”

The annual Data Summit conference returned in-person to Boston, May 17-18, 2022, with pre-conference workshops on May 16.

A "people first, technology second" approach can minimize the chance of failure and drive your analytics/AI/data/product team to create innovative and indispensable software solutions.

“A lot of times in the big data world we really get lost in the output,” O’Neill said. “Your mission isn’t to produce more data outputs. It’s to change somebody’s life.”

If you make things with data, you’re already a designer, he explained; what matters is the intent you apply. There are three tenets to designing a useful application, he noted: researching what a user needs, clarifying unarticulated problems, and crafting the application.

Expecting users to simply adopt a solution is one way to develop a bad application. An application needs to be designed to fit easily into a consumer’s existing routine.

A good design inspires action, respects users’ time, feels helpful and insightful, and is ethical, among other qualities, O’Neill explained.

“You have to get your team to care. If you don’t, you’re screwed,” O’Neill said. “Your solution will see more use if it fits into their workflow—not yours.”

Good design also begins lo-fi, so the early versions serve the learning process. Companies can run workshops with customers to see what level of accuracy is needed to do a good job, and tool usability can be tested—design is not necessarily art, he said.

“You can’t create business value if users can’t or won’t use your data products,” O’Neill said.

After O’Neill’s presentation, Jean Noutoua, engineering lead, Federal Reserve Bank of San Francisco, examined the data mesh in his presentation, “Big Data Architecture—Mesh Anyone?”

Separating hype from reality, Noutoua looked at the current problems that data mesh needs to address.

“Cloud is driving modern data architecture, machine learning, and AI,” Noutoua said.

Data helps people make decisions, improves lives, and solves problems, he explained. However, there are fundamental issues with data, including the difficulty of integrating data from many sources and security gaps that allow sensitive information to fall into the wrong hands.

Data architecture has evolved from the data warehouse to more decentralized solutions such as the data mesh.

Data fabric and data mesh are often intertwined. With a data fabric, the focus is not just on the data itself but on the processes used to uncover it. Data mesh, by contrast, is a decentralized approach that enables domain teams to perform cross-domain data analysis on their own.

Many Data Summit 2022 presentations are available for review at