Data Modeling

Data modeling and database design and development, including popular approaches such as Agile and Waterfall, provide the basis for the visualization and management of business data in support of initiatives such as big data analytics, business intelligence, data governance, data security, and other enterprise-wide data-driven objectives.



Data Modeling Articles

Qlik is releasing two new big data offerings this month: the latest version of the Podium Data product and the initial release of Qlik's Associative Big Data Index. With these latest releases, Qlik continues to streamline the enterprise data journey from raw data source to end-user insights.

Posted September 17, 2018

Informatica, an enterprise cloud data management provider, is introducing a new solution for Apache Spark-based big data cloud environments. These innovations, powered by the CLAIRE engine, enable organizations to stream, ingest, process, cleanse, protect, and govern even more big data with less effort.

Posted September 17, 2018

Machine learning and AI continue to be the hottest topics at tech conferences around the country, and Strata Data was no different. Data professionals converged at the Jacob Javits Center in New York City from September 11-13, and the event was humming with the latest talk of how ML and AI will change the future.

Posted September 14, 2018

Identifying new and disruptive technologies, as well as evaluating when and where they may prove useful, is a challenge in the fast-changing big data market. To contribute to the discussion each year, Big Data Quarterly presents the "Big Data 50," a list of forward-thinking companies that are working to expand what's possible in terms of collecting, storing, protecting, and deriving value from data.

Posted September 13, 2018

Zoomdata, a business intelligence company, is launching ZAP!, the Zoomdata Application Partner program. The announcement was made at the Strata Data Conference held September 11-13 in New York City.

Posted September 13, 2018

Pepperdata, a provider of Application Performance Management (APM) solutions, has unveiled new enterprise-grade features for its APM suite, including auto-tuning, enhanced recommendations, and management and operational reporting. The company is also introducing a set of service offerings covering best practices, performance planning, capacity planning, and architecture design for big data success.

Posted September 13, 2018

Trifacta, a provider of data preparation solutions, is partnering with Sumo Logic, a cloud-native machine data analytics platform, to form a joint integration that drives improved business intelligence.

Posted September 13, 2018

Syncsort, a provider of Big Iron to Big Data software, is introducing new functionalities in its Syncsort Integrate family of products, enabling a continuous, real-time data stream from multiple data sources. The new products provide speed, flexibility, and resiliency for the most demanding enterprise IT environments.

Posted September 13, 2018

Moogsoft, a provider of artificial intelligence for IT operations, has announced Moogsoft Observe, intended to provide core AIOps platform capabilities from centralized analytics to the data source. Moogsoft ingests time-series and metrics data in real-time and applies AI to detect incidents at the source of the problem. Observe stores only anomalous and contextual data, providing IT teams specific knowledge to improve their online services and applications. It is targeted at IT generalists, including site reliability engineers and DevOps.

Posted September 10, 2018

Our friends at CA Technologies recently shared some survey results with us: As application containerization adoption continues to rise, new types of challenges emerge. Despite their benefits, containerized application environments have created exponential complexity in cloud-based application management and monitoring. While Docker popularity is rising, more than 50% of organizations surveyed say their use of Docker container technology is "just getting started." Primary use cases at this time include app development and testing. And the business impact of Docker container performance is largely unmeasured today. More than half of executives (56%) said they are not monitoring Docker container performance problems for business impact yet.

Posted September 10, 2018

SQream is releasing an upgraded version of its GPU-accelerated data warehouse for rapidly analyzing massive data stores. SQream DB v3.0 enables enterprises to quickly and easily load massive volumes of data in the range of terabytes to petabytes for analysis.

Posted September 07, 2018

Alation Inc., the data catalog company, and First San Francisco Partners, a business advisory and information management consultancy, are entering a strategic partnership to meet the needs of Chief Data Officers (CDOs). The partnership will focus on delivering new, field-tested methodologies for agile and modern data governance, made possible by data catalog technology.

Posted September 06, 2018

Today, data management environments are highly complex and often span multiple vendors with deployments across on-premise data centers, clouds, and hybrid installations. In addition to the heterogeneity of systems, the processes surrounding database development and management have also changed. DevOps, a methodology for data scientists, developers, database administrators (DBAs) and others to participate in an Agile workflow, puts a premium on speed and also means that DBAs do not wield the firm control they did in the past.

Posted September 06, 2018

A simple search of the internet for the term "DevOps" reveals many different definitions, writes Tim Boles in a new IOUG SELECT article. "It is hard to just use one or two sentences to convey the meaning of DevOps," he notes, explaining that most definitions go into great detail about the methodology and whys, hows, and hoped-for results of DevOps. "However," says Boles, "they all use words and phrases like 'reduce the time,' 'rapid IT service delivery,' and 'agile.' "

Posted September 05, 2018

The big data landscape is constantly changing, and the latest IOUG study reflects those changes as organizations continue to flock to the cloud, changing the role and mission of IT teams. DBTA recently held a webinar with Lee Levitt, director, business development, Oracle, who discussed how companies are navigating the change and capitalizing on the benefits the cloud brings when it comes to data management and getting more out of analytics.

Posted September 05, 2018

Every year, thousands of data experts and professionals from over 100 countries converge at Oracle OpenWorld in San Francisco to discover the newest updates to Oracle's ecosystem of technologies. Designed for attendees who want to connect, learn, explore, and be inspired, Oracle OpenWorld offers more than 2,200 educational sessions led by more than 2,000 customers and partners sharing their experiences firsthand.

Posted September 04, 2018

Informatica, a cloud data management provider, is launching new product innovations that will enhance customer engagement with trusted, governed, and secure data management. The updates transform Informatica Master Data Management (MDM), Intelligent Cloud Services (IICS), and Data Privacy and Protection, enabling enterprises with intelligent hybrid data management to transform the customer experience.

Posted August 31, 2018

Moogsoft, a provider of artificial intelligence for IT operations, is unveiling Moogsoft Observe, extending Moogsoft's core AIOps platform capabilities from centralized analytics to the data source. Moogsoft ingests time-series and metrics data in real time and applies AI to detect incidents at the source of the problem. Observe stores only anomalous and contextual data, giving IT teams highly advanced, specific knowledge to improve their online services and applications while dramatically lowering data transport, ingestion, and storage costs.

Posted August 30, 2018

Amazon Web Services, Inc. (AWS), an Amazon.com company, unveiled Amazon Aurora Serverless, a new deployment option for Amazon Aurora that automatically starts, scales, and shuts down database capacity with per-second billing for applications with less predictable usage patterns. Amazon Aurora Serverless offers database capacity without the need to provision, scale, and manage any servers.

Posted August 30, 2018

The concepts of Agile methodology and continuous delivery have become popular in software development, yet they are somewhat less mature among DBAs and database developers. Shay Shmeltzer, director of product management for Oracle Cloud Development Tools, discussed how database administrators (DBAs) and SQL developers can take advantage of newer development approaches while also dealing with the unique challenges that exist in the world of database development.

Posted August 30, 2018

Alteryx is updating its analytics platform (2018.3), introducing new features that will enhance how users prepare, analyze, share and collaborate on data across the organization. The update reveals a new tool called Visualytics that will provide real-time, interactive visualizations across the Alteryx platform, enabling any data worker in an organization to easily visualize and understand their data throughout the entire analytics workflow, generating data-driven insights.

Posted August 29, 2018

SAP is deepening its partnership with Google and Intel to offer GCP virtual machines supporting the upcoming Intel Optane DC Persistent Memory for SAP HANA workloads. These GCP VMs will be powered by the future Intel Xeon Scalable processors (code-named Cascade Lake) thereby expanding VM resource sizing and providing cost benefits for customers.

Posted August 29, 2018

Silwood Technology Ltd., provider of self-service metadata discovery software, is entering a global reseller agreement with IDERA, a provider of database productivity tools. Through this agreement, IDERA will market and resell Safyr to complement its ER/Studio data modeling and architecture product suite.

Posted August 29, 2018

Kellyn Pot'Vin-Gorman recently summarized how data pods ease delivery with virtualized databases during her Data Summit 2018 session, "Making Big Data Bite-Size with DataOps."

Posted August 28, 2018

Envorso, a provider of business transformation solutions, is entering a strategic partnership with Signafire Technologies, an industry-leading fusion and content analysis company. The partnership will allow Envorso to offer enterprise data fusion solutions.

Posted August 17, 2018

AppViewX, a provider of network automation, is launching AppViewX 12.4, allowing NetOps and SecOps teams to leverage low-code automation on an enterprise-ready platform. The AppViewX 12.4 release will include more pre-packaged automation workflows and elements to enable speedy and efficient service delivery.

Posted August 16, 2018

CognitiveScale Inc., a provider of augmented intelligence and AI software, has been selected by Dell to help transform customer experience and marketing productivity through AI. Dell chose CognitiveScale and its Cortex 5 software to power and transform the core of its customer journeys.

Posted August 14, 2018

At Data Summit 2018, Jeff Fried, director of product management at InterSystems, gave attendees a primer on how multi-model databases reduce data modeling complexity in his session, "Polyglot Persistence Versus Multi-Model Databases."

Posted August 14, 2018

Alation Inc., the data catalog company, is launching the Alation Partner Program which will be dedicated to the successful enterprise-wide deployment of data catalogs. One focus of the partner program is fulfilling the needs of customers to achieve success with enterprise-wide metadata management.

Posted August 09, 2018

In the big data world of today, issues abound. People discuss structured data versus unstructured data; graph versus JSON versus columnar data stores; even batch processing versus streaming. Differences between each of these kinds of things are important. How they are used can help direct how best to store content for use. Therefore, deep understanding of usage is critical in determining the flavor of data persistence employed.
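
As a minimal, hypothetical sketch of that point (plain Python, not tied to any product covered on this page), the same three records can be laid out document-style or column-style, and it is the intended usage that makes one layout more convenient than the other:

```python
# Hypothetical sketch: the same three orders held document-style (JSON) versus
# column-style, illustrating how intended usage can drive the flavor of data
# persistence employed. No specific product from this page is implied.
import json

# Document / JSON flavor: each order is a self-contained record, handy when an
# application fetches and updates whole orders at a time.
orders_as_documents = [
    {"id": 1, "region": "East", "amount": 120.00},
    {"id": 2, "region": "West", "amount": 75.50},
    {"id": 3, "region": "East", "amount": 42.25},
]
print(json.dumps(orders_as_documents[0]))

# Columnar flavor: values for one attribute sit together, handy when analytics
# scan a single column (here, the amounts) across many records.
orders_as_columns = {
    "id": [1, 2, 3],
    "region": ["East", "West", "East"],
    "amount": [120.00, 75.50, 42.25],
}
print(sum(orders_as_columns["amount"]))
```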

Posted August 08, 2018

We are living in the age of polyglot persistence, which really just means that it makes sense to store data using the technology that best matches the way the data will be used by applications. The age of trying to force everything into a relational DBMS is over, and we now have NoSQL, NewSQL, in-memory, and Hadoop-based offerings that are being used to store data. But you really should also be looking at the algorithmic approach offered by Ancelus Database.

Posted August 08, 2018

This year is an expansive one for the database ecosystems that have evolved around the major platforms. Artificial intelligence (AI), machine learning, the Internet of Things (IoT), and cloud computing are now mainstream offerings seen within the constellations of database vendors, partners, and integrators.

Posted August 08, 2018

This may seem contradictory at first glance: Fresh data from the database user community finds that data lakes continue to increase within the enterprise space as big data flows get even bigger. Yet, at the same time, enterprises appear to have pulled back on Hadoop implementations.

Posted August 08, 2018

Still relatively new solutions coming into their own, streaming platforms allow individuals to see data in real-time batches. Streaming solutions can help businesses analyze data in motion, simplify the development of applications, and extend the value of existing systems by integrating with already implemented applications, along with supporting both structured and unstructured data.

Posted August 08, 2018

A relational database is a set of formally described tables from which data can be accessed or reassembled in many different ways without having to reorganize the database tables. The standard user and application programming interface (API) of a relational database is the Structured Query Language (SQL). SQL statements are used both for interactive queries for information from a relational database and for gathering data for reports.
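
As a minimal sketch of SQL serving as that standard interface (using Python's built-in sqlite3 module as a stand-in relational database; the table and data here are hypothetical), the same language defines a table, answers an interactive query, and gathers data for a report:

```python
# Minimal, hypothetical sketch of SQL as the standard API of a relational
# database: define a formally described table, run an interactive query, and
# reassemble the same data for a report-style aggregate.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory example database
cur = conn.cursor()

# Formally described table: named, typed columns.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("East", 120.00), ("West", 75.50), ("East", 42.25)],
)

# Interactive query: fetch matching rows without reorganizing the table.
cur.execute("SELECT id, amount FROM orders WHERE region = ?", ("East",))
print(cur.fetchall())

# Report-style query: reassemble the same data as a per-region total.
cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
print(cur.fetchall())

conn.close()
```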

Posted August 08, 2018

A brand new study fielded among Database Trends and Applications readers and members of the Independent Oracle Users Group reveals that database professionals are being tasked with managing more database instances and platforms, and in greater sizes, than ever before - both on premises and in the cloud. As a result, it's no surprise that the biggest priorities for database teams this year are improving database performance, database upgrades, and data integration.

Posted August 08, 2018

Providing an integrated environment that simplifies software development, these solutions are valued for their ability to improve database development with an end-to-end approach that helps developers stay on top of the latest technology innovations and build modern applications. For database development teams, maximizing competence, performance, adaptability, and readiness will help simplify development and allow automation to achieve repeatable processes, all while avoiding potential risks that create downtime.

Posted August 08, 2018

Database downtime can inflict a fatal wound on the life of a business, and having a trusted backup solution that can ensure databases are back up and running quickly in the event of an outage is critical. With long downtimes simply unacceptable, organizations seek solutions with capabilities such as the ability to manage and monitor backups seamlessly, ensure data integrity, scale efficiently, restore quickly to any point in time, and provide security features to stay in compliance with local geographic and industry mandates.

Posted August 08, 2018

Today's database administration solutions help to improve DBA productivity while simplifying repetitive administrative tasks, helping to locate and alleviate performance bottlenecks, and optimizing code. Businesses are utilizing highly complex data environments with multiple data platforms across physical data centers and the cloud; managing systems and processes manually is no longer sufficient. What is needed is the ability to manage and monitor business-critical assets with automated precision.

Posted August 08, 2018

Companies are increasingly looking for the right database for the data storage need at hand. That might mean NoSQL, NewSQL, in-memory database, or cloud database (also known as database as a service) approaches.

Posted August 08, 2018

Today's data visualization tools go beyond the standard charts and graphs used in Excel spreadsheets, displaying data in more sophisticated ways such as infographics, dials and gauges, geographic maps, sparklines, heat maps, and detailed bar, pie and fever charts. The images may include interactive capabilities, enabling users to manipulate them or drill into the data for querying and analysis.

Posted August 08, 2018

While the data growth rate, number of database instances, and number of platforms that each DBA must support has not changed radically in the last few years, the database infrastructure has become more complicated. Two key factors are at play in the increasing complexity.

Posted August 08, 2018

Top data modeling solutions enable organizations to discover, design, visualize, standardize, and deploy high-quality data assets through an intuitive GUI. With the ability to view "any data" from "anywhere" for consistency, clarity, and artifact reuse across large-scale data integration, master data management, big data, and business intelligence/analytics initiatives, data modeling is today a critical component of many enterprise projects.

Posted August 08, 2018

The votes have been counted and the results are in. Now, it's time to offer congratulations as Database Trends and Applications magazine unveils the 2018 Readers' Choice Awards winners. Many of the vendors and products are well-known, with market-leading positions established over many years. However, there are also newer names in the mix, representing the rapidly evolving nature of information technology solutions and services.

Posted August 08, 2018

Redgate Software is adding Nexus Technology to its worldwide partner roster to help organizations successfully introduce DevOps to their database development. Redgate's software is already used by 91% of companies in the Fortune 100, and the new partnership with Nexus now means companies in the Channel Islands are in an ideal position to explore how they, too, can take advantage of database DevOps.

Posted August 07, 2018

At Data Summit 2018, Jeff Fried, director of product management at InterSystems, discussed how to create and maintain both polyglot and multi-model databases in his session, "Polyglot Persistence Versus Multi-Model Databases."

Posted August 07, 2018
