Big Data Quarterly Articles



Today, whether it is company leaders addressing customer and business concerns or public health experts discussing the COVID-19 pandemic, what you hear again and again is that they are relying heavily on data. In this issue, we look at the range of data management challenges and opportunities.

Posted May 28, 2020

Powering the next generation of renewable energy with IoT reduces running costs while ensuring that wind power remains both an economically viable way to produce energy and a benefit to the environment.

Posted May 28, 2020

The pandemic is revealing gaps in our critical infrastructure security, the fragility of supply chains, and the underuse of modern technologies to help communities around the globe mitigate and recover. Although large international corporations have long used advanced technology platforms, the pandemic is exposing the fact that emergency management and public health agencies are behind the curve, underutilizing data science, open source software, and high-performance computing resources.

Posted May 28, 2020

As companies have evolved toward digital business models and undertaken digital transformation initiatives, they have increasingly faced two challenges. First, the data they need to drive their real-time business processes is typically spread across multiple, siloed datastores. Second, their existing applications often cannot scale to address the increase in end-user demands for real-time engagement.

Posted May 28, 2020

A combination of factors is heightening the need for high-quality, well-governed data. These include the need for trustworthy data to support AI and machine learning initiatives, new data privacy and data management regulations, and the appreciation of good data as the fuel for better decision making.

Posted May 28, 2020

Scott Zoldi is chief analytics officer at FICO (www.fico.com).

Posted May 28, 2020

Even the most ambitious data analytics initiatives tend to get buried by the 80/20 rule—with data analysts or scientists only able to devote 20% of their time to actual business analysis, while the rest is spent simply finding, cleansing, and organizing data. This is unsustainable, as the pressure to deliver insights in a rapid manner is increasing.

Posted May 21, 2020

As companies shop for a cloud-based solution, it's critical to understand that there are some major differences in business practices among cloud software vendors. Some of these practices could have a deep impact on the company's overall business operations—costing more time, money, and resources in the end.

Posted May 19, 2020

It is a matter of when, not if, your organization will confront a never-before-seen data source—a source that, if managed improperly, could result in catastrophic consequences to your brand and bottom line. In some cases, that data will be imported from outside your four walls. In others, the data will spring from new business processes or the fertile minds of your employees manipulating existing assets to create altogether new analytic insights.

Posted May 19, 2020

As data sizes have grown over the last decade, so has the amount of time it takes to run ETL processes to support the myriad downstream workloads. A decade ago, most people were only thinking about making their KPI dashboards faster. As time rolled forward, they started to think about getting more intelligent analytics out of their data, and the data sizes quickly grew from gigabytes to terabytes.

Posted May 18, 2020

With a multi-cloud strategy, businesses are finding that they can gain scalability, resiliency, and significant economic savings. However, this approach requires businesses to transition their architecture to a much more complex and decentralized model, which makes managing the security of the entire environment extremely challenging.

Posted April 09, 2020

The amount of data needed for real-time, customer-facing applications is impossible to operationalize when managed through software alone, according to Prasanna Sundararajan, CEO and co-founder of rENIAC.

Posted March 30, 2020

The Ethical Use of Artificial Intelligence Act was recently introduced by U.S. Senators Cory Booker (D-NJ) and Jeff Merkley (D-OR) with the goal of establishing a 13-member Congressional Commission that will ensure facial recognition does not produce biased or inaccurate results. Recently, Suraj Amonkar of Fractal Analytics, an AI and analytics company, shared his views on the proposed legislation and the issues it addresses.

Posted March 27, 2020

Competition these days is no longer just about cost or quality; it is about companies offering entirely new digital business models and better customer experiences that are based on insights. How do organizations compete on that basis? They do it by unlocking the various data sources that are imprisoned within IT and business departments, systems, and databases.

Posted March 24, 2020

It takes a minimum of 18 months to develop vaccines and anti-viral drugs. The bureaucracy and human trials, combined with a discovery process that involves endless hours of iterative testing, make this a perfect target for the modeling capability of machine learning and the inference capability of new AI algorithms.

Posted March 20, 2020

To democratize data and analytics is to make them available to everyone. It is an admirable goal and one with its roots in the earliest days of the self-service movement. If an organization is to truly be data-driven, it follows that all key decisions—from tactical operational priorities to strategic vision—must be data-informed. So where is democratization going wrong?

Posted March 20, 2020

GPUs fuel AI and machine learning. Initially created for video games, they are now used in sports and business analysis by fantasy baseball enthusiasts, oddsmakers, and front office executives who want to understand the hidden value of often obscure players. Other uses of this technology's extreme processing power include recognizing animals, such as dog breeds or endangered species, allowing biologists to gain a more accurate understanding of species populations in a geographic area.

Posted March 17, 2020

As more and more organizations migrate database management and integration to the cloud, various use cases and best practices are beginning to take shape around the timing, cost, and extent to which workloads are moved.

Posted March 17, 2020

Quantum computing continues to captivate imaginations. The technology takes advantage of quantum mechanics to deliver exponentially faster processing for certain problems, using qubits and quantum gates to evaluate enormous numbers of computational states in parallel. As Jim Clarke, director of quantum hardware for Intel Labs, describes it, "by harnessing quantum mechanics, quantum computing systems promise an unprecedented ability to simulate and analyze natural phenomena, significantly accelerating the ability to process information and answer questions that would require prohibitive amounts of time even for today's supercomputers."

Posted March 17, 2020

There's no question that investing in data systems and infrastructure can make organizations more competitive and allow for new, exciting innovations. This makes every company a data company. But recently, the maxim has come into sharper focus. The big competitive advantage doesn't come from data-at-rest; instead, it comes from streaming data.

Posted March 16, 2020

Database release automation provider Datical recently announced the appointment of Dion Cornett as president. Datical is built on top of Liquibase and is the primary maintainer of the open source project that enables application teams to version, track, and deploy database schema changes. Cornett, who joined Datical with more than a decade of open source leadership experience from positions at Red Hat (which was acquired by IBM) and MariaDB, talked with BDQ about why it is critical to include the database as a key player in DevOps processes.

Posted March 04, 2020

AI is capturing attention as a transformative technology for enterprises. Fundamental to AI is the use of ontologies, says Seth Earley, CEO of Earley Information Science (EIS), a consulting firm focused on organizing information for business impact. His new book "The AI Powered Enterprise: Harness the Power of Ontologies to Make Your Business Smarter, Faster and More Profitable," due out in April, focuses on the importance of ontologies as a foundation for AI success.

Posted March 02, 2020

data.world, the cloud-native data catalog company, has expanded its partnership with Snowflake, the cloud data platform, to include integration with Snowflake Partner Connect.

Posted February 13, 2020

Red Hat OpenShift Container Platform is generally available for IBM Z and IBM LinuxONE, reinforcing the agile cloud-native world of containers and Kubernetes with the security features, scalability, and reliability of IBM's enterprise servers.

Posted February 13, 2020

Dun & Bradstreet, a provider of business decisioning data and analytics, is releasing D&B Analytics Studio, a secure, cloud-based analytics platform that will provide clients with a single, integrated solution to explore, synthesize, and operationalize data and analytics in order to remain competitive in the era of digital transformation.

Posted February 12, 2020

Oracle has been extending its cloud reach with announcements of new partnerships, additional cloud developers and engineers, and more cloud regions. Recently, Steve Daheb, senior vice president, Oracle Cloud, talked with Big Data Quarterly about how the company's cloud strategy has evolved and how customers' use of Oracle technology is changing. Gen 1 of cloud was about moving workloads from on-prem, but the next step will be about how companies can take advantage of new technologies to improve their processes in ways not possible before.

Posted February 11, 2020

2019 was a banner year for cybercrime, with hackers targeting consumers, government agencies, and private corporations alike. According to the Ponemon Institute, the average total cost of a data breach is $3.86 million, and 80% of U.S. businesses expect to suffer a critical breach this year. These numbers are not only sizable; they're alarming.

Posted December 30, 2019

Many organizations are experimenting with AI programs, but most of them face a significant and seemingly intractable problem. Although proof-of-concept (POC) projects and minimum viable products (MVPs) may show value and demonstrate a potential capability, frequently, they are difficult to scale.

Posted December 30, 2019

For many, governance remains a dirty word: It's bureaucratic, restrictive, and slows things down. This perception is diametrically opposed to data governance's true objective, which is to enable rapid yet appropriate exploitation of enterprise data assets.

Posted December 23, 2019

There's no doubt that AI has taken center stage in the enterprise data and analytics world, as evidenced by the mass quantities of related headlines, conferences, and vendor marketing. But hype aside, business executives are now discovering how to leverage AI for improved decision making through augmented or assistive intelligence solutions, and for competitive advantage in new products and services. AI is proving its viability in the real world, including enterprise data and analytics.

Posted December 23, 2019

It is said that form follows function in architecture, and we are seeing something similar in IT, where job titles follow current trends. So move over, data scientist. The "contextualist" is here. IT has seen many shifts, and today we are at the dawn of the fourth industrial revolution. This revolution started in the communications industry about a decade ago, with advances in mobile technology that helped disrupt the traditional media world by offering online news and entertainment.

Posted December 16, 2019

The big data ecosystem has changed. No longer is it true that people just want to store massive quantities of data. Action is not only needed but must be taken to sustain the viability of an organization. While some say big data is dead, the concept isn't going anywhere. Instead, it is the notion of inaction on big data that is dead. In addition, the technologies that were built to store and process big data are not loved by the industry; they are merely tolerated and often maligned. They have been difficult to put into production, to maintain and manage, and to staff with people who have the skills to do the work.

Posted December 16, 2019

Prior to 2008, whatever your database question, the answer was always "Oracle"—or sometimes "MySQL" or "SQL Server." The relational database management system (RDBMS) model—really a triumvirate of technologies combining Codd's relational data model, the ACID transaction model, and the SQL language—dominated database management systems completely.

Posted December 09, 2019

"I've looked at clouds from both sides now, from up and down and still somehow, it's cloud's illusions I recall, I really don't know clouds at all." These are lyrics from a song created in the late 1960s by Joni Mitchell. It's doubtful that the songwriter envisioned computer systems in the clouds, yet the words somehow ring true today. Cloud computing has revealed countless new dimensions to IT. We have public clouds, private clouds, distributed clouds, but most importantly, we have hybrid, multi-cloud architectures. However, the idea of "the cloud" remains as elusive as the artist who wrote the song.

Posted December 09, 2019

Business leaders understand the value of quality and consistent web experiences. Their customers demand it, and in today's always-on, digital world, it can make all the difference between keeping a customer and losing one.

Posted December 02, 2019

You've probably heard of ETL, or heard somebody talk about "ee-tee-elling" their data. It's a technology from the days of big iron for extracting data from many relational databases, transforming it to a corporate standard schema, and loading it into another unified relational database. That target database, the enterprise data warehouse (EDW), was intended to be a "single source of truth" for the enterprise. It included all the data that IT deemed fit (and available) for decision makers to use.
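
For readers who have never written one, here is a minimal Python sketch of that extract-transform-load flow; the connection strings, table names, and transformation rules are hypothetical placeholders rather than a reference to any particular product.

import pandas as pd
import sqlalchemy as sa

# Hypothetical source systems and target warehouse (placeholders only)
SOURCES = {
    "orders": "postgresql://user:pass@orders-db/prod",
    "billing": "postgresql://user:pass@billing-db/prod",
}
TARGET = "postgresql://user:pass@warehouse/edw"

def extract(conn_str, query):
    # Extract: pull rows from a source relational database
    return pd.read_sql(query, sa.create_engine(conn_str))

def transform(df):
    # Transform: conform column names and add load metadata for the warehouse schema
    df = df.rename(columns=str.lower).drop_duplicates()
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")
    return df

def load(df, table):
    # Load: append the conformed rows to the enterprise data warehouse
    df.to_sql(table, sa.create_engine(TARGET), if_exists="append", index=False)

if __name__ == "__main__":
    for name, conn in SOURCES.items():
        load(transform(extract(conn, f"SELECT * FROM {name}")), name)

The point of the pattern is that the transformation step, not the storage, encodes the "corporate standard schema" the warehouse depends on.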

Posted December 02, 2019

Let's cut right to the chase. Today, "big data" is just data, and the majority of organizations recognize the importance of being data-driven. But data, of whatever size, is only valuable if it's accessible, trustworthy, and usable.

Posted December 02, 2019

Hewlett Packard Enterprise has announced the HPE Container Platform, an enterprise-grade Kubernetes-based container platform designed for both cloud-native applications and monolithic applications with persistent storage. With the HPE Container Platform, the company says, enterprise customers can accelerate application development for new and existing apps—running on bare-metal or virtualized infrastructure, on any public cloud, and at the edge.

Posted November 18, 2019

PlanetScale has announced the general availability of PlanetScale CNDb, a fully managed cloud native database designed on Vitess, a Cloud Native Computing Foundation (CNCF)-hosted open source project that serves massive scale production traffic at large web-scale companies such as YouTube, Slack, and Square.

Posted November 18, 2019

As in the oil industry, the exploitation of data comes down to how good your refining capabilities are. Just as oil has to be refined into valuable products such as gasoline and jet fuel, data has to be refined into insights. And, just as in the old days of the oil business, the rush is on.

Posted September 26, 2019

There is no substitute for genius, and despite the awesome power of the GPU and the majesty of the new manifestations of AI, there is no substitute for the human mind.

Posted September 26, 2019

As business analytics education, including specific instruction in data visualization, becomes more solidified in higher education, the question is no longer "Are we teaching business analytics?" but rather "What are we teaching in business analytics?" To make education most valuable, it should align with what the market is looking for in potential job candidates.

Posted September 26, 2019

While on the surface Wall Street may appear conservative and risk-averse, when it comes to IT, the financial services industry has continually led the adoption of new technologies—sometimes out of a drive for innovation, and other times out of necessity.

Posted September 26, 2019

IT executives and line-of-business experts understand the importance of data for their success and are adopting modern technologies to enable the delivery of timely data and insights to enhance decision making. In line with their data-driven goals, organizations are leveraging hybrid and multi-cloud strategies. However, they are also finding that cloud approaches add their own complexity.

Posted September 26, 2019

It might be the most frequently asked question of a data governance consultant: "Who should own data governance, the business or IT?" And man, that's a loaded question! When you dig deeper into the root of the question, most people really want to know one of two things—"Who should ultimately own data decision making for our company?" or, "Where will data governance be most successful?" Let's take a closer look at those two questions.

Posted September 26, 2019

Multi-cloud offers many benefits—in terms of documented security, compliance, and savings outcomes—but every migration presents challenges. Understanding best practices around how to successfully launch a multi-cloud migration journey is crucial to avoiding the common pitfalls along the way.

Posted September 26, 2019

Has the meaning of big data changed? Many agree that data no longer has to be "big" to meet today's evolving requirements. In particular, open source and cloud tools and platforms have brought data-driven sensibilities into organizations that previously did not have such expertise, making big data more accessible.

Posted September 26, 2019

Each year, Big Data Quarterly presents the "Big Data 50," a list of forward-thinking companies that are working to expand what's possible in terms of capturing, storing, protecting, and deriving value from data.

Posted September 11, 2019

It is well-known that data scientists spend about 90% of their time performing data logistics-related tasks. Anything a data scientist can do to reduce that overhead is a good use of their time, and a benefit to the organization as a whole. Enter RAPIDS—a data science framework offering support for executing an end-to-end data science pipeline entirely on the GPU.
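
As a rough illustration of what "entirely on the GPU" can look like, here is a minimal sketch using the RAPIDS cuDF and cuML libraries; the input file, column names, and model choice are assumptions made for illustration, and running it requires an NVIDIA GPU with RAPIDS installed.

import cudf
from cuml.linear_model import LinearRegression

# Load and clean the data on the GPU (hypothetical file and columns)
df = cudf.read_csv("events.csv")
df = df.dropna(subset=["spend", "revenue"])

# Aggregate features on the GPU
daily = df.groupby("day").agg({"spend": "sum", "revenue": "sum"}).reset_index()

# Train a model on the GPU, with no copy back to host memory along the way
model = LinearRegression()
model.fit(daily[["spend"]], daily["revenue"])
print(model.coef_)

Because each step stays in GPU memory, the time spent on data logistics can shrink along with the compute time.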

Posted September 03, 2019

Databricks, a provider of unified analytics founded by the original creators of Apache Spark, is boosting its Unified Analytics Platform with automation and augmentation throughout the machine learning lifecycle.

Posted August 20, 2019
