Big Data Quarterly Articles



Cloud computing—and everything that goes with it—is dramatically changing the roles and aspirations of database administrators. No longer do DBAs need to be chained to their databases, wrestling with managing updates, applying security patches, and dealing with capacity issues. Moving to a cloud data environment is steadily shifting DBAs' roles from hands-on database overseers to value-drivers for their businesses—and enabling a range of career advancement opportunities not seen since the dawn of relational databases.

Posted April 11, 2019

Your mileage may vary, but there is an allegorical household phenomenon that provides some lessons in understanding the data lake—the coffee can that often ends up in the garage or tool space of a home to collect all the stray, old-project hardware someone in the household cannot bear to part with.

Posted April 02, 2019

What happened to the 500 million data points from the Starwood data breach?

Posted April 02, 2019

It is estimated that hybrid cloud will become a $1 trillion market by 2020. IBM's GM of GTS Cloud Services Jim Comfort explained why hybrid cloud and open source are becoming a strategic part of today's business environment.

Posted April 02, 2019

It's easy to see when the emperor has no clothes, just as it's easy to spot a truly bad technology. What's much harder is spotting an overhyped technology—one that has great promise that hasn't been fulfilled yet, or one that is great for a given purpose, but positioned as the cure for world hunger.

Posted March 27, 2019

Gaining greater insight into your business by learning how operations deep at the heart of the organization are really performing requires a level of analytical proficiency that (until recently) was only found in either very large or very specialized organizations. But before you get bogged down in the details, it is helpful to realize that there is a good approach to address those challenges. It is called "working backward," an approach made famous by Steve Jobs in the context of CX.

Posted March 08, 2019

Good data visualization should be fast, informative, and—above all—valuable. This makes data viz a critical tool in the modern analyst toolkit. Here are 3 simple questions to ensure your data visualization passes the eye candy test.

Posted March 06, 2019

Enterprise application teams are facing pressure to release applications more quickly, but most enterprises still have a manual process for reviewing, validating, and deploying database changes. This creates a bottleneck for business innovation and improving CX.

Posted March 06, 2019

Bias—whether consciously or unconsciously introduced—can ruin any project and lead to extremely damaging conclusions if inaccurate or incomplete data is fed in. We've already seen multiple examples of engineers being unable to prevent AI networks from accidentally becoming racist or sexist, and it's not hard to see how bias can lead to many worrying outcomes.

Posted March 06, 2019

Despite the attention given to new big data management technologies, Db2 remains one of the most widely used database management systems in the world and a fundamental component of many enterprise data architectures, according to Craig S. Mullins, who has written a new book and will be at Data Summit 2019 to present a session on "The New World of Database Technologies."

Posted February 26, 2019

While blockchain is believed to be a solution to many business problems, this is not necessarily the case. But, it is igniting new discussions that can help enterprises find the right solutions to some of their biggest challenges. In all the hype, people tend to focus on technical discussions and fail to realize the value in just how much this technology is transforming the way we think within the enterprise.

Posted February 19, 2019

For many companies, designing and implementing a data platform for analytics is a critical task. According to Dremio, a VC-backed firm founded in 2015, by combining capabilities and technologies into a solution that enables access, transformation, security, and governance, data as a service represents a new approach to vexing analytics challenges, delivering data at scale with high performance. Recently, Kelly Stirman, vice president of strategy at Dremio, discussed how, using open source projects, open standards, and cloud services, companies can deliver data as a service to their data consumers across critical lines of business.

Posted February 12, 2019

Veritas Technologies, a provider of enterprise data protection and software-defined storage, and its platforms Veritas NetBackup and Veritas Backup Exec have attained Amazon Web Services (AWS) Storage Competency status, reaffirming Veritas as an AWS Partner Network (APN) Advanced Technology Partner offering solutions validated by the AWS Storage Competency.

Posted February 12, 2019

Logi Analytics has acquired Jinfonet Software, maker of JReport. The acquisition will consolidate two leading embedded analytics vendors under the Logi brand.

Posted February 12, 2019

Attunity, a provider of data integration and big data management software solutions, has announced two new solutions: Attunity for Data Lakes on Microsoft Azure, designed to automate streaming data pipelines, and Attunity Compose for Microsoft Azure SQL Data Warehouse, designed to enable data warehouse automation for Azure SQL Data Warehouse.

Posted February 12, 2019

MapR Technologies, provider of a data platform for AI and analytics, has announced support for Apache Drill 1.15. The new release offers enhancements for querying complex nested data structures, including files, MapR JSON database tables, and cloud data sources, specifically Amazon S3 (Simple Storage Service).

Posted January 30, 2019

If machine learning poses a special challenge for data quality, it may also hold the key to the solution. Cloud-based data quality tools that integrate machine-learning techniques can both accelerate and enhance data quality efforts. Of course, applying new tools requires an investment.

Posted January 09, 2019

In George Orwell's novel 1984, the party slogan was, "Who controls the past controls the future. Who controls the present controls the past." And today, it is clear that the environment we now live in is a culmination of the many decisions that were made in the past. Many innovations we now take for granted—or expect to become broadly accepted—would never have been possible if past investments and research had not laid the foundation for them. Cloud technology is playing a critical role in many of these advances.

Posted January 09, 2019

It is time to rethink the approach to solving problems with software. The new approach must be data first, not application first.

Posted January 09, 2019

With the expectation that the sea of data will only get bigger and perhaps rougher, it only makes sense to batten down the hatches and put the albatross at the head of the ship. For those not familiar with the ancient mariner and to be a bit more direct—it's time to formalize data governance and position it as a directional lead in supporting key business initiatives.

Posted January 09, 2019

In many organizations, data management pros are still challenged by a collection of legacy enterprise data warehouse architectures, Hadoop, and cloud storage (to name just a few). The ever-growing volumes of data that we generate lead us to looking for places to put the data, such as the data lake. It grows and grows with data that may or may not have value, but the prevailing thought is that it may be needed someday.

Posted December 13, 2018

Evolving compliance obligations, including the EU's General Data Protection Regulation (GDPR) and the NYDFS Cybersecurity Regulation (23 NYCRR Part 500), are driving the urgent need for governance, and, in the U.S., high-profile hacks, leaks, and data breaches amplify security risks and underscore the ongoing requirement to protect corporate data from spying, harvesting, and exfiltration. As of now, all 50 U.S. states, the District of Columbia, and many U.S. territories have data breach notification laws on the books.

Posted December 12, 2018

Because companies across all industries are realizing the need for data in evidence-based decision making over gut instinct, they're increasingly relying on more minds, teamwork, and tighter collaboration for confidence and trust in analytical outcomes. Today, machine learning is improving the process by serving as a new "mind" to supplement and enhance human thought.

Posted December 12, 2018

To understand how monetization models are changing, we must look at the state of software licensing and monetization as it is now and how technology will work in the future.

Posted December 11, 2018

The speed at which technology is changing is matched only by the speed of new applications. From high-frequency trading (HFT) to the Internet of Things (IoT), sophisticated processing capabilities mean that we just keep receiving and using data faster.

Posted December 11, 2018

The newest edition of the TOP500 list of the world's fastest supercomputers puts five U.S. Department of Energy (DOE) supercomputers in the top 10 positions, with the first two captured by Summit at Oak Ridge National Laboratory (ORNL) and Sierra at Lawrence Livermore National Laboratory (LLNL).

Posted November 12, 2018

Arun Murthy, CPO and co-founder of Hortonworks, who will also serve as CPO at Cloudera after the merger is complete, recently shared his thoughts on what's ahead for 2019. According to Murthy, data at the edge, AI, IoT, open source, and cloud will all factor in strongly in organizations' plans for analytics and governance.

Posted November 12, 2018

The data lake has become accepted as an important component of a modern data architecture, enabling a wider variety of data to be accessed by users. Yet, challenges persist. Recently, John O'Brien, CEO and principal advisor, Radiant Advisors, talked about the cultural transformation underway as companies increasingly realize the power of data, and the tools and technologies helping them to expand what is possible.

Posted October 24, 2018

In my last article, I discussed how the Internet of Things market is showing early signs of maturity, but that many projects still can stumble. I identified seven "habits" that successful projects have in common, which, when used together, are powerful enough to set your IoT project on the right path.

Posted September 27, 2018

Volumes of data surround us. The internet, advertising, social media, connected cars and smart homes have driven an exponential increase in information, primarily unstructured data. But, this data is useless unless we can also comprehend, analyze and make operational, tactical and strategic (yet democratized) decisions based on the information. This realization has pushed organizations to rethink business strategies and outcomes as related to big data and digital transformation. It's also prompted an investment in technologies that enable data mining, as well as prescriptive, descriptive and predictive analytical solutions.

Posted September 27, 2018

Thanks to the dramatic uptick in GPU capabilities, gone are the days when data scientists created and ran models in a one-off manual process. This is good news because the one-off model was typically not optimized to get the best results. The bad news is that with an increase in the total number of models created—including iterations over time—the amount of data used as inputs and generated by the models quickly spirals out of control. The additional bad news is that there are a variety of complexities associated with model, data, job, and workflow management.

Posted September 26, 2018

This past summer, I had the opportunity to stay in a wide variety of hospitality establishments for both personal and professional travel, which, for me, has generated another fun way to look at data governance that I am excited to share. Because, let's face it, without good analogies, data governance on its own can be, well, kind of dry. Welcome to the Hotel Data Governance. Such a lovely place. (The tune is already in your head isn't it? You're welcome.)

Posted September 26, 2018

Many organizations nowadays are struggling with finding the appropriate data stores for their data, making it important to understand the differences and similarities between data warehouses, data marts, ODSs, and data lakes. All these data structures clearly serve different purposes and user profiles, and it is necessary to be aware of their differences in order to make the right investment decisions.

Posted September 26, 2018

Both Oracle and SQL Server have very well-established communities. While they are different, they are also similar in many ways. All DBAs worry about the performance and security of the data and the database. Out of necessity, Oracle DBAs have become more specialized. Will this happen to SQL Server DBAs now that the database is offered on Linux?

Posted September 25, 2018

New tools and technologies are becoming established in enterprises, helping organizations extract more value from data. Recently, industry experts weighed in on the trends that loom large in the future and those that appear to be waning.

Posted September 18, 2018

IT departments are under increasing pressure to keep up with the pace of business innovation. As data volumes rise, automation looms as a light at the end of the complex-process tunnel. The ability to leverage software, framework, and application tools to generate automated sequences based on various trigger events has become critical to the IT industry. But, it doesn't come without its challenges. When well-executed, IT automation can increase efficiency and productivity, reduce cost, and speed time to value. However, poorly crafted automated sequences and lack of integration between applications can make a lasting impact on application deployment and management, causing a ripple effect across the entire IT landscape.

Posted September 18, 2018

Today, organizations everywhere want to become data-driven—directed by business knowledge rather than intuition or experience from the past. But to reach that level of sophistication, they need someone at the helm who is capable of forming a strategy for data usage and governance. Enter the "chief data officer," or CDO. Recently, Big Data Quarterly spoke with Ramesh Nair, North America Financial Services leader at Accenture Applied Intelligence, about why more companies are putting executives in that role.

Posted September 17, 2018

Previous technology shifts, such as the introduction of enterprise resource planning software, tended to be inward-facing, involving little contact with the customer. Things are different this time around. The companies that are embracing today's digital transformation trend are focusing squarely on the customer—connecting customer-facing initiatives with business processes to stay ahead of the competition.

Posted September 17, 2018

Identifying new and disruptive technologies, as well as evaluating when and where they may prove useful, is a challenge in the fast-changing big data market. To contribute to the discussion each year, Big Data Quarterly presents the "Big Data 50," a list of forward-thinking companies that are working to expand what's possible in terms of collecting, storing, protecting, and deriving value from data.

Posted September 13, 2018

Today, data management environments are highly complex and often span multiple vendors with deployments across on-premises data centers, clouds, and hybrid installations. In addition to the heterogeneity of systems, the processes surrounding database development and management have also changed. DevOps, a methodology for data scientists, developers, database administrators (DBAs), and others to participate in an Agile workflow, puts a premium on speed and also means that DBAs do not wield the firm control they did in the past.

Posted September 06, 2018

Veritas Technologies, a data protection leader, today unveiled Veritas NetBackup 8.1.2 with a new and improved user interface that simplifies access for employees across an organization.

Posted August 22, 2018

Envorso, a provider of business transformation solutions, is entering a strategic partnership with Signafire Technologies, an industry-leading fusion and content analysis company. The partnership will allow Envorso to offer enterprise data fusion solutions.

Posted August 17, 2018

Today, data is understood to be the fuel that propels companies' growth and success. As a result, safeguarding that valuable resource is more important—but also harder—than ever.

Posted August 07, 2018

The way that many organizations approach the backup process has failed them, as evidenced by the hundreds of millions of dollars they have spent on ransomware payments to retrieve their data.

Posted August 02, 2018

For organizations dealing with critical infrastructure and systems, as well as governmental agencies and operators, there are several solutions that can help with IT network security.

Posted July 26, 2018

By automating the centralization process, organizations can ensure that data is up-to-date and accurate, improving their customer experience and paving the road for GDPR compliance.

Posted July 23, 2018

Better cybersecurity requires a mindset that looks beyond the technology itself and focuses on having the technology, processes, and people working together in tandem to ensure a secure infrastructure.

Posted July 18, 2018

Today, a resurgence in the power of data visualization—alongside a virtual gold rush of bigger, more diverse, and more dynamic data—is providing new tools and innovative techniques to help us transform raw data into compelling visual data narratives. Propelled by this newfound horsepower in data visualization, we are recreating the entire analytic process. We're also making it increasingly more visual—from how we explore data to discover new insights all the way to how we curate dashboards, storyboards, and interactive visualizations to share the fruits of our labor.

Posted July 16, 2018

With so many layers that may be vulnerable to attack, securing an ERP system involves much more than just one piece of cybersecurity software.

Posted July 16, 2018

Organizations need dynamic data mapping capabilities for a precise view of their data and to provide transparency around "the rights of the data subject," such as the rights to be forgotten, rights of accessibility, and rights of rectification.

Posted July 13, 2018
