Trends and Applications



As more companies continue to rely on interconnected networks, virtualized cloud services, and IoT technologies, the potential for downtime and its costs will only rise. By achieving true network resilience, companies can focus on maintaining their services, removing single points of failure and having a plan to bring the network back up to continue normal operations—before it costs them.

Posted October 01, 2019

Data continues to grow in volume, variety, and velocity, resulting in new data management technologies. Recently, Deepti Srivastava, product manager for Cloud Spanner at Google Cloud, discussed how database requirements are evolving and how Google's Cloud Spanner is advancing a relational-NoSQL-convergence approach by giving customers the combined benefits of relational database structure with non-relational horizontal scale.

Posted October 01, 2019

Digital transformation. Infrastructure modernization. Global data center demands. All these forces and more are driving enterprises around the world to seek out next generation cloud-based technologies for a wide range of applications—even those most critical to their business. In reality, however, migrating to the cloud or any other modern architecture is not as easy as it sounds.  

Posted October 01, 2019

DevOps is now widely accepted in application development because, by introducing a culture of collaboration and cooperation between development and operations teams, it enables features to be released faster to end users. As DevOps grows, there is a corresponding need to ensure the database is included so that the entire development process is seamless and free of bottlenecks.

Posted October 01, 2019

Oracle has identified a need for "augmented" analytics, leveraging machine learning and AI throughout the analytics process to help drive up the impact and value of data, and enable knowledge workers to uncover more insights. Recently, Bruno Aziza, group VP, Oracle Analytics, described this new phase in analytics, the role that cloud plays in making it possible, and what the capabilities will enable for customers.

Posted October 01, 2019

Radiant Advisors' John O'Brien identifies four focus areas for data analytics—understanding customer behavior, understanding product usage, increasing operational efficiency, and business model innovation—in his closing keynote at Data Summit 2019.

Posted September 03, 2019

Nearly every week of 2018 featured headlines of a new cyberattack on companies that people trust with their data, such as Marriott and Facebook. In years past, the biggest concern for companies was being hit with hefty fines, but now, they risk reputation damage if they breach compliance mandates and regulations when they are attacked.

Posted August 15, 2019

Streaming platforms allow individuals to see data in real-time batches, enabling businesses to analyze data in motion, simplify the development of applications, extend the value of existing systems by integrating with already-implemented applications, and support both structured and unstructured data.

Posted August 14, 2019

A relational database is a set of formally described tables from which data can be accessed or reassembled in many different ways without having to reorganize the database tables. The standard user and application programming interface (API) of a relational database is the Structured Query Language (SQL). SQL statements are used both for interactive queries for information from a relational database and for gathering data for reports.
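The point above can be made concrete: the same relational data can be "reassembled" in different ways purely through SQL, without reorganizing the tables. A minimal sketch using Python's built-in sqlite3 module (table and column names are invented for illustration):

```python
import sqlite3

# In-memory relational database with two formally described tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")
conn.execute("INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)")

# A single SQL statement joins and aggregates the tables on the fly,
# producing a view of the data the tables themselves never stored.
rows = conn.execute(
    "SELECT c.name, SUM(o.total) FROM customers c "
    "JOIN orders o ON o.customer_id = c.id "
    "GROUP BY c.name ORDER BY c.name"
).fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 75.0)]
```

The same two tables could just as easily be queried per-order, per-customer, or filtered by total, all without touching the schema.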

Posted August 14, 2019

Look to NoSQL for fast, highly scalable access to free-form data. This comes at a few costs, such as giving up the read consistency and other safeguards common to SQL databases. But for many applications, those safeguards may well be worth trading for what NoSQL offers.
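To make the "free-form data" trade-off concrete, here is a toy sketch in plain Python (not any particular NoSQL product): a key-value document store where each document can carry different fields, a shape a fixed relational schema would reject:

```python
# A toy document store: records are free-form dicts keyed by ID,
# with no schema enforced on writes.
store = {}

def put(doc_id, doc):
    store[doc_id] = doc  # no schema check: any document shape is accepted

def get(doc_id):
    return store.get(doc_id)

# Two "rows" with completely different shapes coexist happily.
put("u1", {"name": "Ada", "tags": ["admin", "ops"]})
put("u2", {"name": "Lin", "address": {"city": "Boston"}, "age": 34})

print(get("u2")["address"]["city"])  # Boston
```

The flexibility is the upside; the downside is that nothing stops a caller from writing a malformed document, which is exactly the safeguard a SQL schema provides.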

Posted August 14, 2019

Highly customized, mission-critical applications have been built on MultiValue database technology, which is sometimes called the fifth NoSQL database technology, for many years now. The MultiValue database dates back to the mid-1960s, with Don Nelson and Dick Pick widely credited as the founding fathers of the technology. A key advantage of MultiValue, also referred to as Pick or MultiDimensional, is the database structure's use of attributes that can have multiple values, rather than one single value as with relational technology.
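The multivalued-attribute idea can be sketched in a few lines of plain Python (record and field names invented for illustration): one MultiValue record holds a list of values in a single attribute, where a normalized relational design would need a separate child table with one single-valued row per entry:

```python
# MultiValue style: one record, one attribute, many values.
mv_record = {
    "id": "CUST100",
    "name": "Acme",
    "phones": ["555-0100", "555-0101", "555-0102"],  # multivalued attribute
}

# Relational style: the same data normalized into a child table,
# one single-valued row per phone number.
rel_phones = [("CUST100", phone) for phone in mv_record["phones"]]
print(len(rel_phones))  # 3
```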

Posted August 14, 2019

The Internet of Things (IoT) is the inter-networking of physical devices, vehicles, buildings, and other items (also referred to as "connected devices" and "smart devices") embedded with electronics, software, sensors, actuators, and network connectivity that enable these objects to collect and exchange data.

Posted August 14, 2019

In-memory databases and technologies enable decision makers to get to the information they are seeking rapidly and more readily. While in-memory technology has been on the market for many years, today, the demand for intelligent, interactive experiences requires back-end systems and applications operating at high performance, and incorporating movement and delivery of data faster than ever before.

Posted August 14, 2019

After more than 15 years, there is still probably no technology more aligned with the advent of big data than Hadoop. The Apache Hadoop framework allows for the distributed processing of large datasets across compute clusters, enabling scale-up from single commodity servers to thousands of machines for local computing and storage. Designed to detect and handle failures at the application layer, the framework supports high availability.
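The programming model behind that distributed processing can be sketched in miniature. The following is a single-process word count in plain Python (not Hadoop itself): map each input split to key-value pairs, shuffle by key, then reduce each group — the pattern the framework parallelizes across thousands of machines:

```python
from collections import defaultdict

# Stand-ins for input splits that would live on different cluster nodes.
splits = ["big data big", "data lake"]

# Map: each split independently emits (word, 1) pairs.
mapped = [(word, 1) for split in splits for word in split.split()]

# Shuffle: group pairs by key, as the framework does between map and reduce.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: aggregate each group independently.
counts = {word: sum(vals) for word, vals in groups.items()}
print(counts)  # {'big': 2, 'data': 2, 'lake': 1}
```

Because each map call and each reduce group is independent, a failed task can simply be rerun elsewhere, which is how the framework handles failures at the application layer.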

Posted August 14, 2019

Today's database administration solutions help to improve DBA productivity while simplifying repetitive administrative tasks, helping to locate and alleviate performance bottlenecks, and optimizing code.

Posted August 14, 2019

With enterprises juggling more than ever—from massive data volumes and multiple database platforms, to DevOps and the cloud—database monitoring is more important than ever. Companies can't afford any database downtime.

Posted August 14, 2019

Social media, the Internet of Things, demands for mobile access, and real-time insights are just some of the factors that have increased the pressure on organizations to change how data is managed. And, as a result, there have never been so many data management choices to deal with it all.

Posted August 14, 2019

For database development teams, maximizing competence, performance, adaptability, and readiness will help simplify development and allow automation to achieve repeatable processes, all while avoiding potential risks that create downtime. Companies want to generate queries and reports, perform SQL development and optimization, detect and diagnose database problems, automate administration tasks, and more.

Posted August 14, 2019

With long downtimes simply unacceptable today, organizations seek solutions with capabilities such as the ability to manage backups seamlessly, manage and monitor backups, ensure data integrity, scale efficiently, restore quickly to any point in time, and provide security features to stay in compliance with local geographic and industry mandates.

Posted August 14, 2019

Data visualization tools have evolved beyond the standard charts and graphs used in Excel spreadsheets, displaying data in more sophisticated ways such as infographics, dials and gauges, geographic maps, sparklines, heat maps, and detailed bar, pie and fever charts. The images may include interactive capabilities, enabling users to manipulate them or drill into the data for querying and analysis.

Posted August 14, 2019

With data virtualization, the business and IT sides of organizations can work more closely together in a much more agile fashion, reducing complexity and boosting productivity. Data virtualization enables artificial intelligence/machine learning and data science initiatives by delivering all the available data to algorithms in real time, said Ravi Shankar, chief marketing officer, Denodo.

Posted August 14, 2019

According to an IDC report, the Global Datasphere will grow from 33 zettabytes (ZB) in 2018 to 175 ZB by 2025. "To keep up with the storage demands stemming from all this data creation, IDC forecasts that over 22ZB of storage capacity must ship across all media types from 2018 to 2025."

Posted August 14, 2019

Increasingly stringent data privacy regulations along with a generally lower tolerance for data mishandling, are making companies even more concerned about improving their data security postures and thwarting cyber risk.

Posted August 14, 2019

Today, digitally savvy enterprises cannot afford downtime and, for this reason, many companies are developing strategies that ensure the data they need—or their customers are viewing—will always be available, regardless of what happens behind the scenes. Data replication is considered a critical solution that supports infrastructure and data services, keeping essential data available to users and customers.

Posted August 14, 2019

The old adage, garbage in, garbage out, has never been truer. Not only is the problem not going away with the advance of technology and the growth of data volumes, velocities, and varieties—but it is getting worse.

Posted August 14, 2019

Data modeling tools can help organizations create high quality data models that enable them to shape, organize, and standardize data infrastructure, change structures, and produce detailed documentation. Moreover, data modeling solutions can also help companies visualize and manage business data across platforms and extend that data to users with varying job roles and skill levels across geographies and time zones.

Posted August 14, 2019

In the era of big data, volume, variety, and velocity are often mentioned as critical issues. However, at Data Summit 2019, data integration was identified as one of the most critical and disruptive problems facing organizations today.

Posted August 14, 2019

The pressure is on. Today, organizations are under tremendous pressure from business partners, consumers, and regulators to exercise care about how they handle sensitive data of all kinds. Data governance is becoming more challenging due to a confluence of factors, including data volumes exploding, data being collected from more disparate sources, and data no longer being stored centrally, but instead in a combination of on-premise, cloud, and hybrid scenarios.

Posted August 14, 2019

One in four enterprises now regards real-time data as critical to their ongoing operations—and another one in four is actively preparing to introduce real-time data capabilities into their infrastructures. Data-driven innovation is being embraced by almost every department in the enterprise, led by outward-facing departments, with line-of-business owners and marketing departments demanding the most innovation from data.

Posted August 14, 2019

Cognitive computing pillars that are shared with AI include the ability to learn and be adaptive, be probabilistic, and use big data from diverse sources. Characteristics that are specific to cognitive computing include being meaning-based, interactive, contextual, iterative and stateful, and highly integrated.

Posted August 14, 2019

Cloud is now mainstream, a critical part of data environments, and this trend is only increasing. Gartner estimates that $206 billion will be spent on public cloud services in 2019, up 17% from 2018, while IDC estimates that nearly half of IT spending was cloud-based in 2018, "reaching 60% of all IT infrastructure and 60%-70% of all software, services and technology spending by 2020."

Posted August 14, 2019

DBAs and their organizations are moving to the cloud. A recent survey conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Amazon Web Services, found that, on average, 25% of organizations' critical enterprise data is now managed in public clouds. The survey also found that for 60% of data managers the use of public cloud-based data resources and platforms has increased over the past year.

Posted August 14, 2019

Speed to insight matters more today than ever before. But dealing with the increasing volume of data, and the speed at which this data changes, can be a hindrance to data integration, timely analytics, and rapid decision making. Change data capture supports real-time analytics with less overhead to advance a range of initiatives such as data warehousing, real-time dashboards, data quality, and more.
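The change-data-capture idea can be illustrated with a minimal sketch in plain Python, using snapshot differencing (real CDC tools typically read the database's transaction log instead, which is where the "less overhead" comes from; function and field names here are invented for illustration):

```python
def diff_changes(old, new):
    """Compare two snapshots (dicts of key -> row) and emit change events."""
    changes = []
    for key, row in new.items():
        if key not in old:
            changes.append(("insert", key, row))
        elif old[key] != row:
            changes.append(("update", key, row))
    for key in old:
        if key not in new:
            changes.append(("delete", key, old[key]))
    return changes

before = {1: {"status": "new"}, 2: {"status": "open"}}
after = {1: {"status": "closed"}, 3: {"status": "new"}}
print(diff_changes(before, after))
```

Downstream consumers (a real-time dashboard, a data warehouse loader) then apply only these change events rather than reprocessing the full dataset.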

Posted August 14, 2019

To leverage the immense power of their data, organizations need a solid strategy that incorporates everything from security to data governance to the right big data technologies. Enabling both on-prem and cloud deployments—or a hybrid strategy—big data platforms today support data warehouses, data lakes, data science, engineering, machine learning, myriad database management systems, and much more.

Posted August 14, 2019

The ability to quickly act on information to solve problems or create value has long been the goal of many businesses. But with the combination of greater volume, variety, and velocity, the situation is becoming more vexing. Fortunately, there are business intelligence solutions today with capabilities to support strong data strategies.

Posted August 14, 2019

Each year, Database Trends and Applications presents the Readers' Choice Awards, providing a unique opportunity to recognize companies whose products are selected by experts whose opinions carry more weight than any others—you, the readers. Here we present the top three vote-getters in each category. Congratulations to all and thanks to everyone who submitted nominations and voted!

Posted August 14, 2019

While IT has evolved significantly in the past decade as hardware and software innovations accelerate and customers seek solutions that increase efficiencies and lower costs, one aspect has remained the same: backup. And as the dependence on data grows for business insights and analytics, backup will only become more important in IT.

Posted August 07, 2019

As access to data and data sources continues to explode, businesses are being forced to rethink their data strategies to consider more information and power real-time intelligent decisions. That is, however, easier said than done. With the amount of data pouring in from a myriad of sources, it can be difficult to identify what provides value and what is just noise. Increased data and cloud growth has led to data integration challenges.

Posted August 07, 2019

We've reached the point where hybrid cloud arrangements have become commonplace in enterprises, and with this trend come implications for databases and data management. The rise of both hybrid and multi-cloud platforms means data needs to be managed in new ways, industry experts point out. And, there are lingering questions about which data should go into the cloud, and which should stay on-premise.

Posted August 07, 2019

With the emergence of data-intensive activities such as AI and the Internet of Things, workloads are getting heavier for data managers. Data managers have seen increases in data volume over the last 3 years and expect this trend to continue. They are also finding it difficult to keep up with this growth. Many DBAs manage more than 10 databases, with some handling hundreds.

Posted August 07, 2019

No single tool can ensure perfect security. That's why layering multiple tools and approaches is considered a best practice to reduce vulnerability to attacks. It has become more apparent in recent years that management tools form a vital security layer. Think of it this way: A window lock is a security tool. Closing the windows is management. Neither is enough by itself.

Posted August 07, 2019

"Hey Google, what's the cost of noncompliance?" Failing to comply can cost you $56.8 million—the dollar figure Google owes for breaking the EU's General Data Protection Regulation (GDPR). While this multi-million dollar mistake delivers a hit to the tech giant's reputation and checkbook, it has far-reaching implications for any organization with an online presence.

Posted August 07, 2019

Earlier this year, Google became the first major tech giant to be hit with a General Data Protection Regulation (GDPR) fine—approximately $56.8 million. The stated reason: not giving users enough information about consent policies and sufficient control over how their personal data was being used. However, according to a recent report, 86% of businesses use live customer data for application testing because testers believe this provides the most realistic assessment of how an application will perform "in the wild" for real people. This poses significant risk to organizations.

Posted July 18, 2019

Data Summit 2019 in Boston drew industry experts with deep knowledge spanning all areas of enterprise IT, including AI and machine learning, analytics, cloud, data warehousing, and software licensing, who presented 3 days of thought-provoking sessions, keynotes, panel discussions, and hands-on workshops. Here are some key takeaways from Data Summit 2019.

Posted July 18, 2019

Considering the importance that applications play in operating and/or driving commerce in many organizations, it is no wonder that DevOps is the crucible for these businesses' long-term health and realization of their roadmaps. In some enterprises—such as Amazon, Airbnb, Netflix, and Uber—the application set is the business. In others, there is more to the business than just the app—but the app plays a critical role to an initiative or process. The monumental significance of apps in either case has required the DevOps team to work with ingenuity, clarity, speed, and deep pragmatism in turning thoughts, ideas, and intended experiences into reality.

Posted July 18, 2019

Everyone leaves an employer at some point. Better opportunities, reduction in workforce actions, termination, or management issues can all result in an employee departure. No matter the reason, everyone eventually leaves the company they work for.

Posted July 18, 2019

Many enterprises started migrating to the cloud sooner (or migrated faster) than they even realized—through a rogue marketing department deploying a cloud lead tracking application, a finance group that stood up a cloud-based accounting service—or others that IT may not have vetted, secured, procured, and continuously monitored. Most enterprises have scores of these "shadow cloud" applications deployed with little-to-no planning, strategy, or skilled technical staff involved, posing risk to the organization.

Posted July 18, 2019

Ransomware attacks faded from the headlines after the notorious WannaCry outbreak in 2017 and the frequency of attacks declined in 2018. And yet, with ransomware threats seemingly in the rearview mirror, cybersecurity experts and the Information Security Forum's 2018 Global Security Threat Outlook are suddenly forecasting a major resurgence of ransomware this year.

Posted July 18, 2019

The stakes for expert data management continue to escalate. Data volumes are expanding rapidly and it is well-understood today that companies that harness the value of this precious commodity will be the winners. With a growing appreciation for data's value, spending on data management and analytics products and services is also on the rise. IDC forecasts revenues for big data and business analytics solutions will reach $189.1 billion in 2019 with double-digit annual growth projected through 2022, according to its "Worldwide Semiannual Big Data and Analytics Spending Guide."

Posted June 12, 2019
