Data Modeling

Data Modeling and Database Design and Development, including popular approaches such as Agile and Waterfall design, provide the basis for the visualization and management of business data in support of initiatives such as Big Data Analytics, Business Intelligence, Data Governance, Data Security, and other enterprise-wide data-driven objectives.



Data Modeling Articles

CognitiveScale Inc., a provider of augmented intelligence and AI software, has been selected by Dell to help transform customer experience and marketing productivity through AI. Dell chose CognitiveScale and its Cortex 5 software to power and transform the core of its customer journeys.

Posted August 14, 2018

At Data Summit 2018, Jeff Fried, director of product management at InterSystems, gave attendees a primer on how multi-model databases reduce data modeling complexity in his session, "Polyglot Persistence Versus Multi-Model Databases."

Posted August 14, 2018

Amazon Web Services, Inc. (AWS), an Amazon.com company, unveiled Amazon Aurora Serverless, a new deployment option for Amazon Aurora that automatically starts, scales, and shuts down database capacity with per-second billing for applications with less predictable usage patterns. Amazon Aurora Serverless offers database capacity without the need to provision, scale, and manage any servers.

Posted August 10, 2018

Alation Inc., the data catalog company, is launching the Alation Partner Program, which is dedicated to the successful enterprise-wide deployment of data catalogs. One focus of the partner program is helping customers succeed with enterprise-wide metadata management.

Posted August 09, 2018

In the big data world of today, issues abound. People discuss structured data versus unstructured data; graph versus JSON versus columnar data stores; even batch processing versus streaming. The differences between these options are important, and how the data will be used can help direct how best to store it. Therefore, a deep understanding of usage is critical in determining the flavor of data persistence employed.

Posted August 08, 2018

We are living in the age of polyglot persistence, which really just means that it makes sense to store data using the technology that best matches the way the data will be used by applications. The age of trying to force everything into a relational DBMS is over, and we now have NoSQL, NewSQL, in-memory, and Hadoop-based offerings that are being used to store data. But you really should also be looking at the algorithmic approach offered by Ancelus Database.

Posted August 08, 2018

This year is an expansive one for the database ecosystems that have evolved around the major platforms. Artificial intelligence (AI), machine learning, the Internet of Things (IoT), and cloud computing are now mainstream offerings seen within the constellations of database vendors, partners, and integrators.

Posted August 08, 2018

This may seem contradictory at first glance: Fresh data from the database user community finds that data lakes continue to increase within the enterprise space as big data flows get even bigger. Yet, at the same time, enterprises appear to have pulled back on Hadoop implementations.

Posted August 08, 2018

The concepts of Agile methodology and continuous delivery have become popular in software development, yet they are somewhat less mature among DBAs and database developers. Shay Shmeltzer, director of product management for Oracle Cloud Development Tools, discussed how database administrators (DBAs) and SQL developers can take advantage of newer development approaches while also dealing with the unique challenges that exist in the world of database development.

Posted August 08, 2018

Still a relatively new class of solutions coming into its own, streaming platforms allow users to see data in real-time batches. Streaming solutions can help businesses analyze data in motion, simplify the development of applications, and extend the value of existing systems by integrating with already implemented applications, along with supporting both structured and unstructured data.

Posted August 08, 2018

A relational database is a set of formally described tables from which data can be accessed or reassembled in many different ways without having to reorganize the database tables. The standard user and application programming interface (API) of a relational database is the Structured Query Language (SQL). SQL statements are used both for interactive queries for information from a relational database and for gathering data for reports.
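
As a minimal sketch of the relational concepts described above (the table names, columns, and data are purely illustrative, not drawn from the article), the following Python snippet uses the standard-library sqlite3 module to create two related tables and then reassemble the data with a SQL join, without reorganizing the tables themselves.

```python
import sqlite3

# In-memory relational database; tables and data are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two formally described tables related by a foreign key.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, "
    "customer_id INTEGER REFERENCES customers(id), total REAL)"
)

cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
cur.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(10, 1, 250.0), (11, 1, 75.5), (12, 2, 120.0)],
)

# Interactive SQL query: reassemble the data a different way with a join,
# without touching how the tables are organized.
cur.execute("""
    SELECT c.name, COUNT(o.id) AS order_count, SUM(o.total) AS revenue
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
""")
print(cur.fetchall())  # e.g. [('Acme', 2, 325.5), ('Globex', 1, 120.0)]
```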

Posted August 08, 2018

A brand new study fielded among Database Trends and Applications readers and members of the Independent Oracle Users Group reveals that database professionals are being tasked with managing more database instances and platforms, of greater sizes, than ever before - both on premises and in the cloud. As a result, it's no surprise that the biggest priorities for database teams this year are improving database performance, database upgrades, and data integration.

Posted August 08, 2018

Providing an integrated environment that simplifies software development, these solutions are valued for their ability to improve database development with an end-to-end approach that helps developers stay on top of the latest technology innovations and build modern applications. For database development teams, maximizing competence, performance, adaptability, and readiness will help simplify development and allow automation to achieve repeatable processes, all while avoiding potential risks that create downtime.

Posted August 08, 2018

Database downtime can inflict a fatal wound on the life of a business, and having a trusted backup solution that can ensure that databases can be back up and running quickly in the event of an outage is critical. With long downtimes simply unacceptable, organizations seek solutions with capabilities such as the ability to manage and monitor backups seamlessly, ensure data integrity, scale efficiently, restore quickly to any point in time, and provide security features to stay in compliance with local geographic and industry mandates.

Posted August 08, 2018

Today's database administration solutions help to improve DBA productivity while simplifying repetitive administrative tasks, helping to locate and alleviate performance bottlenecks, and optimizing code. Businesses are utilizing highly complex data environments with multiple data platforms across physical data centers and the cloud, and managing systems and processes manually is no longer sufficient. What is needed is the ability to manage and monitor business-critical assets with automated precision.

Posted August 08, 2018

Companies are increasingly looking for the right database for the data storage need at hand. That might mean NoSQL, NewSQL, in-memory database, or cloud database (also known as database as a service) approaches.

Posted August 08, 2018

Today's data visualization tools go beyond the standard charts and graphs used in Excel spreadsheets, displaying data in more sophisticated ways such as infographics, dials and gauges, geographic maps, sparklines, heat maps, and detailed bar, pie and fever charts. The images may include interactive capabilities, enabling users to manipulate them or drill into the data for querying and analysis.
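
As a small, hypothetical illustration of one of the display types mentioned above, the sketch below uses Python with matplotlib (a general-purpose plotting library, not one of the visualization products the article refers to) to render a basic heat map from made-up data; commercial tools layer interactivity and drill-down on top of views like this.

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up data: 7 regions x 12 months of "sales" values.
rng = np.random.default_rng(42)
data = rng.uniform(0, 100, size=(7, 12))

fig, ax = plt.subplots(figsize=(8, 4))
im = ax.imshow(data, cmap="viridis", aspect="auto")  # the heat map itself
ax.set_xticks(range(12))
ax.set_xticklabels([f"M{m + 1}" for m in range(12)])
ax.set_yticks(range(7))
ax.set_yticklabels([f"Region {r + 1}" for r in range(7)])
fig.colorbar(im, ax=ax, label="Sales (illustrative units)")
ax.set_title("Heat map of illustrative monthly sales by region")
plt.show()
```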

Posted August 08, 2018

While the data growth rate, number of database instances, and number of platforms that each DBA must support has not changed radically in the last few years, the database infrastructure has become more complicated. Two key factors are at play in the increasing complexity.

Posted August 08, 2018

Top data modeling solutions enable organizations to discover, design, visualize, standardize, and deploy high-quality data assets through an intuitive GUI. With the ability to view "any data" from "anywhere" for consistency, clarity, and artifact reuse across large-scale data integration, master data management, big data, and business intelligence/analytics initiatives, data modeling is today a critical component of many enterprise programs.

Posted August 08, 2018

Redgate Software is adding Nexus Technology to its worldwide partner roster to help organizations successfully introduce DevOps to their database development. Redgate's software is already used by 91% of companies in the Fortune 100, and the new partnership with Nexus means companies in the Channel Islands are now in an ideal position to explore how they, too, can take advantage of database DevOps.

Posted August 07, 2018

At Data Summit 2018, Jeff Fried, director of product management at InterSystems, discussed how to create and maintain both polyglot and multi-model databases in his session, "Polyglot Persistence Versus Multi-Model Databases."

Posted August 07, 2018

Informatica, the enterprise cloud data management provider, is launching new customer success offerings that leverage big data, AI, and machine learning. Informatica's new program promises faster time to business value along with consistent and superior experiences for customers worldwide.

Posted August 01, 2018

SAP is deepening its partnership with Google and Intel to offer GCP virtual machines supporting the upcoming Intel Optane DC Persistent Memory for SAP HANA workloads. These GCP VMs will be powered by the future Intel Xeon Scalable processors (code-named Cascade Lake) thereby expanding VM resource sizing and providing cost benefits for customers.

Posted August 01, 2018

Redgate Software is upgrading SQL Prompt, elevating it from helping to develop and standardize new code to discovering problems in legacy code. The latest update to SQL Prompt allows users to analyze an entire script of legacy code, no matter how large, and see a list of all the issues contained within it.

Posted July 31, 2018

Silwood Technology Ltd., provider of self-service metadata discovery software, is entering a global reseller agreement with IDERA, a provider of database productivity tools. Through this agreement, IDERA will market and resell Safyr to complement its ER/Studio data modeling and architecture product suite.

Posted July 31, 2018

Machine learning continues to grow with the advent of new technologies and solutions. It has emerged as a truly revolutionary technology innovation across all verticals and industries; according to analyst firm IDC, spending on machine learning will grow from $12B to $57.6B by 2021.

Posted July 30, 2018

The role of the CDO is growing—as are data-based threats to businesses.

Posted July 30, 2018

The big data landscape is constantly changing and the latest IOUG study is reflecting those changes as organizations continue to flock to the cloud, changing the role and mission of IT teams. DBTA recently held a webinar with Lee Levitt, director, business development, Oracle, who discussed how companies are navigating the change and capitalizing on the benefits cloud brings when it comes to data management and getting more out of analytics.

Posted July 27, 2018

Splice Machine is releasing its intelligent application platform on the Microsoft Azure cloud service, giving customers another way to use the data platform while having the choice to deploy on-premises, on AWS, or on Azure. Splice Machine is a scale-out SQL data platform that can run fast OLTP and in-memory OLAP on the same platform, along with machine learning and streaming.

Posted July 26, 2018

MemSQL is updating its flagship platform, advancing its performance along with adding capabilities to accelerate time to insight. MemSQL 6.5 delivers an integrated database for data capture, management, and operations that integrates seamlessly with existing systems, infrastructure, and employee skills. This allows customers to get faster time to value and reduces cost and complexity.

Posted July 25, 2018

Businesses are overwhelmed by a proliferation of data sources, types, and stores. The abundance of information and tools is increasing the challenge of combining data into meaningful, valuable insights. DBTA recently held a webinar featuring Kevin Petrie, senior director and technology evangelist, Attunity, Danny Sandwell, director of product marketing, erwin, Inc., and Jake Freivald, vice president, product marketing, Information Builders, who discussed the key technologies and best practices for overcoming big data integration and governance challenges.

Posted July 18, 2018

CloudJumper, provider of a Workspace as a Service (WaaS) platform, is introducing Cloud Workspace for Azure. The enhanced platform integrates the Cloud Workspace Management Suite with Microsoft's Remote Desktop modern infrastructure (RDmi) for increased visibility into the user's Azure, Office365, and Cloud Workspace experience.

Posted July 17, 2018

Companies are committed to delivering higher levels of customer satisfaction for their online services. To create a path to continuous service delivery optimization, you need to start with a review of your current approach and toolset against your business needs.

Posted July 13, 2018

Arcadia Data, provider of visual analytics and BI software, is launching Arcadia Enterprise, designed to transform modern business intelligence for big data.

Posted July 11, 2018

Dome9 Security, the public cloud security company, is introducing new capabilities in the Dome9 Compliance Engine that extend the scope of the platform's automation beyond security and compliance monitoring. Using this new Compliance Engine functionality, enterprises can accelerate the resolution of dangerous misconfigurations and minimize the window of vulnerability in their public cloud environments.

Posted July 06, 2018

Pure Storage, an all-flash storage platform, is releasing Pure Service Orchestrator, delivering container storage-as-a-service. Pure Service Orchestrator equips customers with effortless, self-managed storage that drives data centric architecture with public cloud-like agility on-premises, backed by all-flash speed and enterprise reliability, according to the vendor.

Posted July 06, 2018

Expanding its range of services, in 2017, Dell Boomi acquired ManyWho, a provider of a unified cloud and low-code development platform that helps simplify workflow automation. Recently, Steve Wood, chief product officer at Boomi, talked about what's ahead for Boomi, what it means to be a connected business in the new hybrid cloud era, and the importance of low-code development.

Posted July 02, 2018

When we hear the term "think outside the box," how often do we really examine what that phrase truly means? First, one needs a box. And it is on this issue where most folks fail. Before one can consider what is "outside the box," one must clearly understand what exactly is meant by "inside the box." People often consider random approaches the same as being "outside the box." However, just different is not enough.

Posted July 02, 2018

Databricks is partnering with RStudio, providers of a free and open-source integrated development environment for R, to increase the productivity of data science teams and allow both companies to integrate Databricks' Unified Analytics Platform with the RStudio Server. The RStudio and Databricks integration removes the barriers that stop most R-based machine learning and artificial intelligence (AI) projects.

Posted June 29, 2018

Oracle recently announced general availability of Oracle Application Express (APEX) 18.1, a low-code rapid application development platform that can run in any Oracle Database and is included with every Oracle Database Cloud Service.  APEX enables users to develop, design, and deploy data-driven desktop and mobile applications using only a browser. 

Posted June 20, 2018

Hewlett Packard Enterprise is planning to invest $4 billion in Intelligent Edge technologies and services over the next four years. Specifically, HPE will invest in research and development to advance and innovate new products, services and consumption models across a number of technology domains such as security, AI and machine learning, automation, and edge computing.

Posted June 20, 2018

Innovative vendors are helping to point the way forward with technologies and services to take advantage of the wealth of data that is pouring into companies. This sixth DBTA 100 list spans a wide variety of companies that are each addressing the evolving demands for hardware, software, and services. Some are long-standing companies with well-established offerings that have evolved over time, while others are much newer to the data scene.

Posted June 14, 2018

Boundless, a provider of open and scalable GIS, is releasing Boundless Server Enterprise as a managed cloud service using the most advanced IT infrastructure available. With this release, customers are able to take full advantage of the unmatched geospatial technology of Boundless Server Enterprise, all packaged and ready to support the most complex business challenges immediately.

Posted June 06, 2018

IBM and partners launched a "Call for Code" initiative, an ambitious effort to bring startup, academic, and enterprise developers together to solve one of the most pressing societal issues of our time: preventing, responding to, and recovering from natural disasters.

Posted June 04, 2018

CloudJumper, a Workspace as a Service (WaaS) platform provider for agile business IT, is releasing a business-class Streaming App Services platform, offering flexible application and data delivery for cloud-forward independent software vendors (ISVs) and application service providers (ASPs).

Posted June 04, 2018

SolarWinds is making a series of updates to its network management product portfolio, allowing the platform to support networks up to four times larger. This improvement makes it easy to consolidate monitoring solutions to a single provider enterprise-wide, and gives IT professionals far greater flexibility to scale up and support larger data center networks as workloads increase, or scale out to address complex distributed networks.

Posted June 04, 2018

As Hadoop adoption in the enterprise continues to grow, so does commitment to the data lake strategy. DBTA recently held a webinar with Mark Van de Wiel, CTO, HVR, Dale Kim, Sr. director, products/solutions, Arcadia Data, and Rick Golba, product marketing manager, Percona, who discussed unlocking the power of the data lake.

Posted June 04, 2018

Under usual circumstances, the one-to-many or many-to-many relationship, alone, drives the pattern used within the database model. Certainly, the logical database model should represent the proper business semantics of the situation. But on the physical side, there may exist extenuating circumstances that would cause a data modeler to consider including an associative table construct for a one-to-many relationship.
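
As a hedged sketch of that physical-side choice (all table and column names here are hypothetical), the following Python/sqlite3 snippet first models a one-to-many relationship between departments and employees with a plain foreign key, and then shows the associative table construct a modeler might adopt instead, for example when the relationship is expected to become many-to-many later.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Option 1: the usual pattern -- a one-to-many relationship carried by a
# foreign key on the "many" side.
cur.execute("CREATE TABLE department (dept_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""
    CREATE TABLE employee (
        emp_id  INTEGER PRIMARY KEY,
        name    TEXT,
        dept_id INTEGER REFERENCES department(dept_id)
    )
""")

# Option 2: the same one-to-many relationship realized with an associative
# table. The UNIQUE constraint on emp_id preserves "one department per
# employee" today, while the structure already accommodates many-to-many
# if the business rule is expected to relax later.
cur.execute("DROP TABLE employee")
cur.execute("CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""
    CREATE TABLE department_employee (
        dept_id INTEGER REFERENCES department(dept_id),
        emp_id  INTEGER UNIQUE REFERENCES employee(emp_id),
        PRIMARY KEY (dept_id, emp_id)
    )
""")
conn.commit()
```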

Posted June 01, 2018

It's spring-cleaning season, meaning many diligent homeowners are busy trying to organize their closets and homes in preparation for the spring and summer. Organizations, both small and large, also must keep up with cleaning their virtual dust bunnies on a regular basis, just as one would do within their own home.

Posted June 01, 2018
