



Trends and Applications



In today's data-driven world, it's hard to find a company that doesn't take its data management and protection seriously. Google CEO Eric Schmidt has been quoted as saying that every 2 days we create as much information as we did from the dawn of civilization up until 2003. This statistic alone highlights the need for top-notch database administrators (DBAs) to manage the fast-growing data landscape in today's heterogeneous database environments.

Posted October 18, 2017

Marketers, IT professionals, cloud experts, and regulatory teams all rely heavily on the availability of data on a near-daily basis. And while some of that data is within the walls of the organization, much of it is compiled from outside sources—which is often taken and used without permission. Enter GDPR.

Posted October 18, 2017

The way MarkLogic CEO Gary Bloom sees it, interest in artificial intelligence is soaring: Everyone wants to talk about it and everyone wants to apply intelligence for better insights and better decisions. But there is just one problem.

Posted October 06, 2017

Data is the key to taking a measured approach to change, rather than a simple, imprudent reaction to an internal or external stimulus. But it's not that simple to uncover the right insights in real time, and how your technology is built can have a very real impact on data discovery.

Posted October 06, 2017

Threats against personal, organizational and government data have spawned increasing security measures and legislation, a prime example being the European Union (EU) General Data Protection Regulation (GDPR). The GDPR's official site calls it "the most important change in data privacy regulation in 20 years."

Posted October 06, 2017

Digital transformations have occurred remarkably rapidly since 2007. A decade ago, organizations had not yet turned en masse to leveraging social platforms such as Facebook. Consumer technology and its potential went largely ignored, and companies were mainly focused on data mining, search technology, and virtual collaboration. Surely it is time for a new generation of intelligent data modeling to step up and enable advertisers and marketers to target consumers more accurately.

Posted October 06, 2017

Apache Cassandra is a popular open source, distributed, key-value store columnar NoSQL database used by companies such as Netflix, eBay, and Expedia for strategic parts of their business. When combined with Apache Ignite, Apache Cassandra becomes even more powerful, allowing it to be used for today's most demanding web and cloud applications.
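As a rough illustration of the wide-column, key-value model the teaser describes, the sketch below models Cassandra-style storage in plain Python: each partition key maps to a set of columns addressed by a clustering key and read back in sorted order. This is a conceptual toy, not the actual Cassandra or Ignite API; the class and method names are invented for illustration.

```python
# Toy sketch of a wide-column store in the style of Cassandra:
# a partition key locates a row, and columns within the row are
# addressed by a clustering key and returned in sorted order.
# Plain Python only; no Cassandra driver is involved.

class WideColumnStore:
    def __init__(self):
        # partition key -> {clustering key: value}
        self.partitions = {}

    def put(self, pkey, ckey, value):
        """Insert or overwrite one column in a partition."""
        self.partitions.setdefault(pkey, {})[ckey] = value

    def slice(self, pkey, start, end):
        """Return columns with clustering keys in [start, end),
        ordered by clustering key (a 'slice query')."""
        row = self.partitions.get(pkey, {})
        return [(k, row[k]) for k in sorted(row) if start <= k < end]

store = WideColumnStore()
store.put("user:42", "2017-08-01", "login")
store.put("user:42", "2017-08-03", "purchase")
store.put("user:42", "2017-08-02", "browse")

# Range reads come back in clustering-key order:
print(store.slice("user:42", "2017-08-01", "2017-08-03"))
# → [('2017-08-01', 'login'), ('2017-08-02', 'browse')]
```

The design point the sketch captures is that ordering within a partition makes time-range reads cheap, which is one reason event-style workloads fit this model well.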

Posted October 06, 2017

Hadoop adoption in the enterprise is growing steadily, and with this momentum comes an increase in Hadoop-related projects. From real-time data processing with Apache Spark, to data warehousing with Apache Hive, to applications that run natively across Hadoop clusters via Apache YARN, these next-generation technologies are solving real-world big data challenges today.

Posted October 06, 2017

Regardless of industry, the ability to collect, manage, and intelligently leverage data will clearly be a differentiator for the foreseeable future. Executives in healthcare are acutely aware of the disruption being driven by this new paradigm and understand that this trend is impacting every sector, from banking to farming to manufacturing. Ultimately, investing time and resources in data collection and analysis is only valuable if it provides insight for making proactive, tactical decisions. Innovative companies today are using big data and analytics to drive attributable revenue and compete more effectively.

Posted September 20, 2017

Linear scaling with legacy storage appliances is no longer an option. The burden that traditional architecture feels from today's tsunami of data is surpassed only by the cost required to meet current and future demands. Aside from the huge expense, this method of increasing storage capacity would take too long. Even adding multiple servers could not accommodate storage demands. Vertical storage architecture contains bottlenecks that slow performance to an unacceptable level. Software-defined storage (SDS) scales horizontally instead, making it a popular option.

Posted September 20, 2017

It's still months away, but it is never too early to start thinking about the holiday shopping season, especially since most Americans are already anticipating outages and system failures from their favorite online retailers. According to a survey, 52% of shoppers expect to experience an outage on days like Black Friday and Cyber Monday. When web and mobile sites buckle under huge volumes of traffic and experience outages, retailers lose revenue, and their brands suffer significantly. To prepare for the busiest online shopping season, companies need to ensure their systems are ready for extreme scalability and continuous IT operations, year-round.

Posted September 20, 2017

Cybersecurity is a top-of-mind concern for CIOs, as the number of companies that fall victim to cyber attacks and data breaches increases every year. Attacks are becoming more common, sophisticated, and costly. Executives and IT professionals alike know the importance of enterprise security, yet companies' security policies frequently fail to address one key area of vulnerability to organizations—employee mobile devices.

Posted September 07, 2017

Each year, tens of thousands of data professionals from well over 100 countries gather at Oracle OpenWorld in San Francisco. Leaders of two major Oracle users' groups—David Start, president of the Independent Oracle Users Group, and Alyssa Johnson, president of the Oracle Applications Users Group—share what they have planned for their members at Oracle OpenWorld 2017, taking place Oct. 1-5.

Posted September 07, 2017


On Sunday at Oracle OpenWorld, several Oracle user groups, including the IOUG, will bring the experiences of our users and experts to San Francisco and share with thousands of our peers. If you're coming to OpenWorld, I can't say enough about how important it is to participate in the Sunday Program.

Posted August 24, 2017

An education, networking and advocacy forum for users of Oracle Applications, the Oracle Applications Users Group (OAUG) helps members connect to find the solutions they need to do their jobs better and to improve their organizations' ROI in Oracle Applications. Each year during Oracle OpenWorld, the group hosts geographic (Geo) and special interest group (SIG) meetings as well as additional opportunities for networking and education.

Posted August 24, 2017

We've heard the term "digital transformation" used almost to the point of exhaustion in the past couple of years—but it's not just a lot of hot air: It's the future. Organizations must update their legacy architecture to remain current in the new enterprise landscape, and mainframe rehosting offers 10 key advantages.

Posted August 18, 2017

There are plenty of pronouncements about artificial intelligence—both in terms of the miracles it can produce and the threat it poses to humanity. But according to Ali Ghodsi, co-founder and CEO of Databricks, there is actually a "1% problem" in that there are a handful of companies such as Google, Amazon, and a few others that are actually accomplishing their goals with it. AI has vast potential but some of the claims, as well as the fears, are overstated and a little premature right now, he contends.

Posted August 18, 2017

At the SHARE summer 2017 event in Providence, RI, Harry Williams, SHARE's president, reflected on the changes taking place in the enterprise technology ecosystem, and what's ahead for the IBM users group and the industry overall.

Posted August 18, 2017

We're still very much in the early days of artificial intelligence (AI). However, money is pouring into AI initiatives at astounding rates, and enterprises need to move at a deliberate speed to adopt and leverage AI across their systems, applications, and data.

Posted August 09, 2017

The future value of hybrid cloud computing is to empower customers to embrace a cloud strategy of their own, rather than one dictated by a vendor. A hybrid cloud environment is defined by the customer; a hybrid cloud solution should not dictate where or which cloud the customer must use with their on-premises installation. Although this may seem obvious, large vendors often ignore this critical point, as they dictate choices based on their (lack of) capabilities.

Posted August 09, 2017

Hackers are rarely far from the news these days, whether they're perpetrating cyber-intrusions into political campaigns or take-downs of major retail websites, social media sites, movie studios, or entertainment conglomerates. But some of the "hacking" headlines can be deceiving. In fact, a significant number of cybersecurity breaches around the digital world actually represent a kind of all-too-familiar crime that is as old as the abacus.

Posted August 09, 2017

Data modeling tools can help organizations create high-quality data models that enable them to shape, organize, and standardize data infrastructure, change structures, and produce detailed documentation. Moreover, data modeling solutions can also help companies visualize and manage business data across platforms and extend that data to users with varying job roles and skill levels across geographies and time zones.

Posted August 02, 2017

Who makes the best relational database? What is the best NoSQL database? Which company has the best Hadoop platform? To find out, Database Trends and Applications magazine went straight to the experts. Each year, DBTA allows subscribers to vote for the DBTA Readers' Choice Awards. Unlike other awards programs conducted by DBTA, this one is unique in that the nominees are submitted, and the winners chosen, by DBTA readers.

Posted August 02, 2017

Coined over a decade ago, the term "polyglot persistence" has come to be seen as a shorthand phrase in the database world for using the best tool for the job, or in other words, the right database for the data storage need at hand. Today, that might mean NoSQL, NewSQL, in-memory databases, and cloud databases—also known as database as a service—approaches.

Posted August 02, 2017

The rise of new data types is leading to new ways of thinking about data, and newer data storage and management technologies, such as Hadoop, Spark, and NoSQL, are disruptive forces that may eventually result in what has been described as a post-structured world. But while Hadoop and NoSQL are undoubtedly growing in use, and a wider array of database management technologies in general is being adopted, the reality is that, at least for now, the relational database management system still reigns supreme and is responsible for storing the great majority of enterprise data.

Posted August 02, 2017

Cloud databases, also referred to as database as a service (DBaaS), can be SQL or NoSQL, open source or relational, and allow customers to choose the environment best suited to their particular use cases, including support for streaming data pipelines or data warehousing for analytics. The cloud is increasingly being used by organizations to support a range of users, including data managers, developers, and testers, who require almost instantaneous access to a database environment without going through their respective IT, finance, or top managers or needing to acquire new servers or storage arrays to support their efforts.

Posted August 02, 2017

Addressing the need to store and manage increasingly large amounts of data that does not fit neatly in rows and columns, NoSQL databases can run on commodity hardware, support the unstructured, non-relational data flowing into organizations from the proliferation of new sources, and are available in a variety of structures that open up new types of data sources, providing ways to tap into the institutional knowledge locked in PCs and departmental silos.

Posted August 02, 2017

MultiValue database technology maintains a strong following of loyal supporters. The NoSQL database technology continues to have advocates and is entrenched in many industry verticals, including retail, travel, oil & gas, healthcare, government, banking, and education.

Posted August 02, 2017

As data stores grow larger and more diverse, and greater focus is placed on competing on analytics, processing data faster is becoming a critical requirement. In-memory technology has become a relied-upon part of the data world, now available through most major database vendors. In-memory processing can run workloads up to 100 times faster than disk-based configurations, enabling business at the speed of thought.

Posted August 02, 2017

In the new world of big data and the data-driven enterprise, data has been likened to the new oil, a company's crown jewels, and, in its transformative effect, the advent of electricity. Whatever you liken it to, the message is clear: enterprise data is of high value. And, after more than 10 years, there is no technology more aligned with the advent of big data than Hadoop.

Posted August 02, 2017

In today's highly complex data environments, with multiple data platforms across physical data centers and the cloud, managing systems and processes manually is no longer sufficient. What is needed is the ability to manage and monitor business-critical assets with automated precision. Indeed, with so many technologies involved in enterprise data management today, the key words in database administration are "comprehensive" and "automated."

Posted August 02, 2017

Geographically spread-out development teams with varied skill levels and different areas of proficiency, ever-faster software development cycles, multiple database platforms—all this translates to a challenging development environment that database development solutions providers seek to streamline.

Posted August 02, 2017

If data is the new oil, then getting that resource pumped out to more users faster with fewer bottlenecks is a capability that must be achieved. The issue of under-performing database systems cannot be seen as simply an IT problem. Remaining up and running, always available and functioning with lightning speed today is a business necessity.

Posted August 02, 2017

In today's always-on economy, database downtime can inflict a mortal wound on the life of a business. That's why having a trusted backup solution that can ensure that databases can be back up and running quickly in the event of an outage is critical. Today, close to one-fourth of organizations have SLAs of four nines of availability or greater, meaning they require less than 52 minutes of downtime per year, according to a recent Unisphere Research survey.

Posted August 02, 2017

The unending stream of bad news about data breaches shows no sign of abating, with a raft of breaches far and wide involving organizations such as health insurance companies, businesses, government agencies, and the military. The average consolidated cost of a data breach grew to $4 million in 2016, while the average cost for each lost or stolen record containing sensitive and confidential information reached $158, according to IBM's 2016 annual cost of a data breach study.

Posted August 02, 2017

With more data streaming in from more sources, in more varieties, and being used more broadly than ever by more constituents, ensuring high data quality is becoming an enterprise imperative. In fact, as data is increasingly appreciated as the most valuable asset a company can have, says DBTA columnist Craig S. Mullins, data integrity is not just an important thing; it's the only thing. If the data is wrong, then there is no reason to even keep it, says Mullins.

Posted August 02, 2017

Strong data governance solutions support data quality, protect sensitive data, promote effective sharing of information, and help manage information through its lifecycle from creation to deletion with adherence to regulatory and government requirements. As organizations gather, store, and access increasing volumes of data, strong data governance allows them to have confidence in the quality of that data for a variety of tasks, as well as adhere to security and privacy standards.

Posted August 02, 2017

Data integration is critical to many organizational initiatives, such as business intelligence, sales and marketing, customer service, R&D, and engineering. However, big data integration involves projects, not products, and there is often a lack of available expertise. Data integration frees employees to concentrate on analysis and forecasting, tasks that require a human touch. It also vastly reduces the chances for errors to be introduced during the data translation process.

Posted August 02, 2017

Robust data replication capabilities deliver high volumes of (big) data with very low latency, making data replication ideal for multi-site workload distribution and continuous availability.

Posted August 02, 2017

Change data capture (CDC) is a function within database management software that keeps data consistent across an environment. CDC identifies and tracks changes made to source data and then applies those changes to the corresponding target data.
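The CDC pattern described above can be sketched in a few lines of Python: changes to a source table are captured as an ordered log of events and replayed against a target to bring it in sync. This is a conceptual toy under assumed names (nothing here is tied to any real CDC product); real implementations typically read the database's transaction log rather than instrumenting writes.

```python
# Minimal sketch of the change-data-capture (CDC) pattern:
# every change to the source is captured as an event in an
# ordered log, then replayed against a target replica.

source = {}       # source table: key -> value
target = {}       # target replica to keep in sync
change_log = []   # ordered log of captured change events

def apply_to_source(op, key, value=None):
    """Apply a change to the source and capture it in the log."""
    if op == "upsert":
        source[key] = value
    elif op == "delete":
        source.pop(key, None)
    change_log.append((op, key, value))

def replay(log, table):
    """Replay captured change events, in order, against a target."""
    for op, key, value in log:
        if op == "upsert":
            table[key] = value
        elif op == "delete":
            table.pop(key, None)

apply_to_source("upsert", "cust:1", {"name": "Ada"})
apply_to_source("upsert", "cust:2", {"name": "Grace"})
apply_to_source("delete", "cust:1")

replay(change_log, target)
assert target == source  # target now mirrors the source
```

Because the log is ordered, replay is deterministic: applying the same events to any number of targets leaves them all identical to the source.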

Posted August 02, 2017

Data virtualization allows the business and IT sides of an organization to work together more closely and in a much more agile fashion, and it helps to reduce complexity.

Posted August 02, 2017

The Internet of Things (IoT) is the inter-networking of physical devices, vehicles, buildings, and other items (also referred to as "connected devices" and "smart devices") embedded with electronics, software, sensors, actuators, and network connectivity, which enables these objects to collect and exchange data.

Posted August 02, 2017

Streaming data solutions can evaluate data quickly and enable predictive analytics based on sources of rapidly changing data, including social media, sensors, and financial services data, to head off problems such as machine failures and natural disasters and to take advantage of unfolding opportunities.
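As a minimal sketch of the kind of streaming evaluation described above, the function below keeps a sliding window of recent sensor readings and flags any reading that spikes well above the window's average, a crude stand-in for the predictive alerting a real streaming platform would provide. The function name and thresholds are invented for illustration.

```python
from collections import deque

def detect_spikes(stream, window_size=5, factor=2.0):
    """Flag readings that exceed `factor` times the average of the
    last `window_size` readings. Returns (index, reading) pairs."""
    window = deque(maxlen=window_size)  # sliding window of recent values
    alerts = []
    for i, reading in enumerate(stream):
        if len(window) == window_size:
            avg = sum(window) / window_size
            if reading > factor * avg:
                alerts.append((i, reading))
        window.append(reading)
    return alerts

# A steady sensor feed with one sudden spike at index 5:
readings = [10, 11, 9, 10, 10, 31, 10, 9]
print(detect_spikes(readings))  # the spike at index 5 is flagged
```

The key property of this style of processing is that each reading is examined once, with only a small fixed amount of state retained, which is what lets streaming systems keep up with data that arrives faster than it could be stored and queried.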

Posted August 02, 2017

Business intelligence incorporates a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.

Posted August 02, 2017

Big data analytics examines large amounts of data to uncover hidden patterns, correlations, and other insights. Organizations have an increasing number of concerns today with respect to big data. There is the challenge posed by the sheer volume of information that is being created and saved every day as well as the lack of understanding about the information in data stores and how best to leverage it.

Posted August 02, 2017

Data visualization allows enterprises to create and study visual representations of data, empowering organizations to "see" trends and patterns. A primary goal of data visualization is to communicate information clearly and efficiently via statistical graphics, plots, and information graphics.

Posted August 02, 2017

The cloud continues to transform computing, making it less expensive, easier, and faster to create, deploy, and run applications, as well as to store enormous quantities of data. Offering services of all types, including software as a service, database as a service, infrastructure as a service, and platform as a service, the cloud puts a range of technologies within users' easy reach: when they need it, where they need it, and with the ability to scale elastically.

Posted August 02, 2017

Storage solutions run the gamut, assisting a variety of use cases including backup and archiving, content storage and distribution, big data analytics, static website hosting, cloud-native application data, and disaster recovery.

Posted August 02, 2017

At the core of big data lie the three Vs: volume, velocity, and variety. What's required are solutions that can extract valuable insights from new sources such as social networks, email, sensors, connected devices, the web, and smartphones. Addressing the three Vs are the combined forces of open source, with its rapid crowd-sourced innovation; the cloud, with its unlimited capacity and on-demand deployment options; and NoSQL database technologies, with their ability to handle unstructured, or schema-less, data.

Posted August 02, 2017
