12 Key Takeaways about Data and Analytics from Data Summit 2018


Data Summit 2018 was recently held in Boston. Big data technologies, AI, analytics, cloud, and software licensing best practices in a hybrid world were among the key areas considered during three days of thought-provoking presentations, keynotes, panel discussions, and hands-on workshops.

Data Summit 2019, presented by DBTA and Big Data Quarterly, is tentatively scheduled for May 21-22, 2019, at the Hyatt Regency Boston with pre-conference workshops on May 20.

  1. Data has been called many things, such as the new oil and the new electricity, but it is really the new capital, on a par with financial and human capital for creating new products and services. When we say that data is a kind of capital, it’s not a metaphor; it is literal. “In economics, capital is an asset produced through some process and is then a necessary input to some other good or service. Data fulfills this definition.” - Paul Sonderegger, senior data strategist, Oracle
  2. Speed to value is the new metric that companies care about, and data is a key differentiator. Putting data to use must now be accomplished in a day, or at most a week. The reason data science is critical now, said Caserta, is that the costs of compute and storage are dramatically lower than just a few years ago; data generated by all aspects of society has dramatically increased; and there is a need to efficiently learn what there is to know about our data. - Joe Caserta, founder and president, Caserta
  3. While still in its early days, blockchain is already in its third stage. First was the development and acceptance of Bitcoin, which showed it can work. The second stage was the addition of the idea of smart contracts. Now, the third wave is coming, with new companies taking on private and public key management and building more of an operating system around blockchain technology. Like most of IT, blockchain is changing and adding features, moving “at 100 miles an hour.” In the next 12-18 months, there will be an acceleration of blockchain deployments as people begin to be more comfortable with it. - Paul Tatro, founder, Blockchain U Online
  4. Software vendors will continue to introduce new licensing metrics to end users because the game is to generate as much revenue as possible and to ensure there is no software piracy. Increasing complexity around cloud offerings, and the difficulty of determining exactly what is required versus what is a premium feature, mean that overcharging is running rampant. - Michael Corey, co-founder, LicenseFortress
  5. 2018 will be the year of the graph. As organizations undergo digital transformation to analyze and query large amounts of data at high speeds, they are increasingly leveraging graph databases to illuminate information about connections. Several factors are contributing to graph technology uptake now: the technology is maturing; there are use cases in areas such as financial services, healthcare, pharmaceuticals, and oil and gas; it is being applied beyond classical graph problems; and the ecosystem is growing. - Sean Martin, CTO, Cambridge Semantics
  6. Traditional relational databases and many NoSQL systems are not suited for certain use cases because those technologies focus primarily on entities rather than on the relationships between them. This is where graph databases are handy: they make it easy to discover, explore, and make sense of complex relationships. By leveraging the insights in data relationships, you can deliver more relevant, real-time experiences for your customers, proactively fight fraud, and ensure the health and seamless operation of your network (see the first sketch following this list). - Scott Heath, CRO, Expero
  7. DataOps takes DevOps to the next level, recognizing that many DevOps projects have data integrated into them and require that data to move at the same speed as the rest of development and testing. DataOps is emerging as a methodology for data scientists, developers, and other data-focused professionals to enable an agile workflow while also adhering to data governance requirements. - Kellyn Pot'Vin-Gorman, technical intelligence manager, Delphix
  8. AI poses many challenges, including moral and ethical issues. While it has been suggested that AI must be made understandable to humans in order to make it “explainable,” it is possible that this could result in less favorable outcomes, such as a smaller reduction in traffic deaths from self-driving cars, or fewer successes in identifying patients who may be at risk for certain illnesses. As a result, limiting AI to what humans can understand and confirm, and thus limiting its potential, will perhaps present its own moral dilemma. - David Weinberger, senior researcher, Harvard's Berkman Center for Internet & Society
  9. In order to harness the power of their data, businesses need a solid strategy that incorporates everything from security to data governance to choosing the right technologies. A data strategy must start with a business goal, and people should keep asking “why” to understand what the data will be used for. - Lynda Partner, VP, Marketing & Analytics as a Service, Pythian
  10. Organizations that embrace digital transformation will succeed. In the analytics economy, there are five imperatives for transformation: Analytics, with data at the core; Identification, of who you want to be and who your customers and prospects expect you to be; Consumption, understanding the needs and expectations of all of your data consumers; Monetization of data, in order to differentiate information products, solutions, and services; and Communication, the heartbeat of a successful analytics culture because, without it, nothing else survives. - Anne Buff, business solutions manager, SAS Best Practices, SAS Institute
  11. Knowing your customers starts with knowing where they are located, detailed intelligence about what is around them, and how to reach them at the right time with the right message. By using spatial information about people, places, and things, companies can glean relevant business insights by understanding the relationships among them. To achieve this, companies should employ address cleansing and standardization, geocoding, and reverse geocoding (see the second sketch following this list). This converts raw data into business assets. - Dan Adams, VP, data product management, Pitney Bowes
  12. Today, it is not enough to understand what happened; organizations want a view of what is happening now so they can affect the future. Cloud is a key piece of the new real-time infrastructure, and change data capture is a critical piece of enabling rapid data movement (see the third sketch following this list). But as organizations move to the cloud and data lakes, there are many challenges to getting the data that is needed. Gartner has estimated that 9 out of 10 data lake strategies have failed. It is easy to get data in, but hard to get meaningful insights out. Automating the end-to-end pipeline of the data lake enables continuous updating and merging of data. - Dan Potter, vice president, product management and marketing, Attunity
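To make the graph point in takeaway 6 concrete, here is a minimal sketch of relationship-first modeling. It uses the open source networkx library purely as an illustrative stand-in (the talks did not name specific tools), with a toy fraud-style question: which accounts share a device?

```python
import networkx as nx

# Model entities (accounts, devices) as nodes and their relationships
# as edges, rather than as rows that must be joined through keys.
G = nx.Graph()
G.add_edge("account:alice", "device:laptop-7", rel="USES")
G.add_edge("account:bob", "device:laptop-7", rel="USES")
G.add_edge("account:carol", "device:phone-2", rel="USES")

# A connection-centric question, "which accounts share a device?",
# becomes a direct neighborhood traversal instead of a self-join.
for device in (n for n in G.nodes if n.startswith("device:")):
    accounts = sorted(G.neighbors(device))
    if len(accounts) > 1:
        print(f"{device} is shared by {accounts}")  # potential fraud ring
```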
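For takeaway 11, the sketch below illustrates geocoding and reverse geocoding in general terms. The open source geopy library and the public Nominatim service are assumed stand-ins (the talk concerned Pitney Bowes' own offerings), and the example needs network access to run.

```python
from geopy.geocoders import Nominatim

# geopy/Nominatim are illustrative stand-ins for a commercial
# geocoding product. Requires network access to run.
geolocator = Nominatim(user_agent="data-summit-geocoding-demo")

# Geocoding: a cleansed, standardized address becomes coordinates.
location = geolocator.geocode("77 Massachusetts Ave, Cambridge, MA")
if location is not None:  # real code should always handle misses
    print(location.latitude, location.longitude)

    # Reverse geocoding: coordinates become a standardized address,
    # which can then be enriched with intelligence about what is nearby.
    address = geolocator.reverse((location.latitude, location.longitude))
    print(address.address)
```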
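And for takeaway 12, a toy sketch of the change data capture idea: rather than periodically reloading a source table, a stream of insert/update/delete change records is merged continuously into the target. The record shape and field names here are hypothetical.

```python
# A toy change-data-capture merge: apply a stream of change records
# to a target table (a dict keyed by primary key) instead of
# periodically reloading the whole source.
target = {}  # pk -> row, standing in for a data lake table

changes = [
    {"op": "insert", "pk": 1, "row": {"name": "Ada", "city": "Boston"}},
    {"op": "insert", "pk": 2, "row": {"name": "Grace", "city": "NYC"}},
    {"op": "update", "pk": 2, "row": {"city": "Boston"}},
    {"op": "delete", "pk": 1},
]

for change in changes:  # in production, a continuous stream
    if change["op"] == "insert":
        target[change["pk"]] = dict(change["row"])
    elif change["op"] == "update":
        target[change["pk"]].update(change["row"])  # merge changed columns
    elif change["op"] == "delete":
        target.pop(change["pk"], None)

print(target)  # {2: {'name': 'Grace', 'city': 'Boston'}}
```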
