Data Modeling

Data modeling and database design and development, including popular approaches such as Agile and Waterfall, provide the basis for the visualization and management of business data in support of initiatives such as big data analytics, business intelligence, data governance, data security, and other enterprise-wide data-driven objectives.



Data Modeling Articles

Cockroach Labs, provider of the distributed SQL database CockroachDB, is receiving $55 million in Series C funding, enabling the company to further grow CockroachDB. The round is led by Altimeter Capital, Tiger Global, and existing investor GV, with participation from existing investors Benchmark, Index Ventures, Redpoint Ventures, FirstMark Capital, and Work-Bench, and brings Cockroach Labs' total funding to $108.5 million.

Posted August 06, 2019

Dgraph Labs, the cloud-native graph database company, is closing an $11.5 million Series A funding round that will enable the company to expand enterprise features and its user base, support next-generation GraphQL applications, and build Dgraph-as-a-Service.

Posted July 31, 2019

BMC has announced the BMC AMI DevOps for Db2 solution, which is designed to accelerate the delivery of new and updated applications to the market. The solution integrates with the Jenkins Pipeline suite of plug-ins to provide an automated way of receiving, evaluating, and implementing Db2 schema changes as part of an application update. 

Posted July 30, 2019

A cloud migration occurs when a company moves some or all of its data center capabilities into the cloud, usually to run on infrastructure provided by a cloud service provider. However, there are always risks and challenges in migrating systems and databases to new environments.

Posted July 25, 2019

Data Summit 2019 in Boston drew industry experts with deep knowledge spanning all areas of enterprise IT, including AI and machine learning, analytics, cloud, data warehousing, and software licensing, who presented three days of thought-provoking sessions, keynotes, panel discussions, and hands-on workshops. Here are some key takeaways from Data Summit 2019.

Posted July 25, 2019

While change has always been a part of the database credo, the growing emphasis on data-driven decision making in today's economy has resulted in a dizzying plethora of technologies and methodologies entering the market. The game-changing technologies are too numerous in scope and number to cover fully, and one thing is certain: Database management will never be the same. We have identified some of the most promising technology initiatives, based on discussions with and input from data experts from across the industry spectrum, gathering their views on the key technologies, well-known or under the radar, that are worth watching.

Posted July 25, 2019

Percona, a provider of open source database software and services, is releasing the Percona Cloud Native Autonomous Database Initiative, a series of products that expand support for cloud-native applications and make it easier for organizations to manage their hybrid multi-cloud environments.

Posted July 24, 2019

SAP is further deepening its commitment to the developer and open source community with the contribution of UI5 Web Components, a comprehensive library for Web developers. This library enables them to create enterprise-grade Web applications more easily.

Posted July 24, 2019

Data virtualization makes it possible to have one or more data stores that would otherwise break the bank processing-wise, because the data can physically exist once while logically existing in multiple transformed structures. Occasionally, IT managers get the idea that data virtualization is a more generic answer, presuming that if it works for big data, it can work for all data.

Posted July 18, 2019

Collibra, the data intelligence company, is acquiring SQLdep, a leading SaaS provider of automated data lineage, empowering data citizens to uncover faster and deeper insights from data. SQLdep automates the discovery and visualization of technical data lineage and enables organizations to better capture context around data, understand data quality, support compliance initiatives, and increase trust.

Posted July 10, 2019

Compuware has introduced new zAdviser analytics to enable application development and delivery teams to make data-driven decisions that improve mainframe software quality, velocity, and efficiency. Compuware is also announcing expanded integration of ISPW and Git, the popular version control software. The integration helps developers with little mainframe experience work with mainframe source code. Changes in Git are automatically synchronized back into the mainframe where ISPW's automated build, deploy, and fallback capabilities can be leveraged in a CI/CD pipeline.

Posted July 08, 2019

SolarWinds, a provider of IT management software, is integrating AppOptics, the company's SaaS-based application performance monitoring solution, with its Loggly and Papertrail cloud-hosted log monitoring and log analytics solutions. AppOptics together with Loggly and Papertrail now combine application performance management (APM), distributed tracing, and log management to help technology professionals identify performance and availability issues before they affect users, pinpoint the root cause, and reduce mean time to repair (MTTR).

Posted July 08, 2019

Oracle has introduced the new Autonomous Database Dedicated service to provide customers with more control for their most mission-critical workloads. "Our Autonomous Database Dedicated service eliminates the concerns enterprise customers previously had about security, isolation, and operational policies when moving to cloud," said Juan Loaiza, executive vice president, Mission-Critical Database Technologies, Oracle.

Posted July 03, 2019

source{d}, a data platform for the software development life cycle (SDLC), is releasing a new Enterprise Edition with built-in visualization, management capabilities, and advanced analytic functions. source{d} enables enterprises to aggregate all SDLC data sources into one data lake where they can easily extract, load and transform source code, version control data, project tracking data, build systems data, configuration files and more.

Posted July 02, 2019

GridGain Systems, provider of enterprise-grade in-memory computing solutions based on Apache Ignite, has introduced new GridGain Developer Bundles, which include support, consulting, and training for GridGain Community or Enterprise Edition.

Posted July 02, 2019

RavenDB, a provider of database infrastructure solutions, is releasing its new RavenDB Cloud managed database service. RavenDB Cloud handles daily tasks such as hardware maintenance, installation, configuration, internal monitoring, and security for its users worldwide.

Posted July 02, 2019

MongoDB, a provider of a general-purpose data platform, is releasing an enhanced version of its core database, MongoDB 4.2. Key features include distributed transactions, field-level encryption, and an updated Kubernetes Operator.
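
For readers curious about the distributed transactions feature, here is a minimal sketch using PyMongo's with_transaction helper; the connection string, database, collection, and document IDs are hypothetical, and a replica set or sharded cluster is assumed.

```python
# A minimal sketch, not MongoDB's official example: a multi-document
# transaction via PyMongo's with_transaction helper. The connection string,
# database, collection, and document IDs below are hypothetical, and a
# replica set (or sharded cluster for 4.2 distributed transactions) is assumed.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")
accounts = client.bank.accounts

def transfer(session, from_id, to_id, amount):
    # Both updates commit together or not at all.
    accounts.update_one({"_id": from_id}, {"$inc": {"balance": -amount}}, session=session)
    accounts.update_one({"_id": to_id}, {"$inc": {"balance": amount}}, session=session)

with client.start_session() as session:
    # with_transaction retries transient errors and commits on success.
    session.with_transaction(lambda s: transfer(s, "alice", "bob", 100))
```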

Posted June 28, 2019

Hewlett Packard Enterprise (HPE) is introducing new edge solutions, research labs, and programs to simplify and accelerate Intelligent Edge adoption. These initiatives will enable customers to create unique digital experiences and leverage analytics and machine learning to adapt to changes in real-time.

Posted June 19, 2019

Tamr has announced the general availability of the Spring 2019 release of the company's patented data unification system. Purpose-built to leverage machine learning, human knowledge, and, where appropriate, rules to solve data integration challenges, Tamr enables organizations to create unified data assets that fuel analytic insights and operational improvements.

Posted June 18, 2019

Scale Computing, a provider of edge computing solutions, announced that the KVM-based hypervisor in the HC3 product family is now fully supported by Parallels Remote Application Server 17 (Parallels RAS). When combined with Parallels RAS, Scale Computing HC3 enables administrators to rapidly provision and manage virtual machine (VM) thin clones centrally from the Parallels RAS Console to make VDI solutions faster, more affordable, and easier to use.

Posted June 18, 2019

Time series databases are optimized for collecting, storing, retrieving, and processing time series data. It's critical that businesses use a time series database for time series data and not one of the traditional data stores. DBTA recently held a webinar with Daniella Pontes, product manager, InfluxData, who discussed how time series databases are built with specific workloads and requirements in mind, including the ability to ingest millions of data points per second.
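
As a rough illustration of the write path a time series database is optimized for, the sketch below uses the influxdb-client Python library for InfluxDB 2.x; the URL, token, org, bucket, and measurement names are hypothetical.

```python
# A rough sketch, assuming InfluxDB 2.x and the influxdb-client library;
# the URL, token, org, bucket, tag, and field names are all hypothetical.
from datetime import datetime, timezone

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
write_api = client.write_api(write_options=SYNCHRONOUS)

# Each point carries a measurement, tags, fields, and a timestamp.
point = (
    Point("cpu")
    .tag("host", "server01")
    .field("usage_percent", 73.5)
    .time(datetime.now(timezone.utc))
)
write_api.write(bucket="metrics", record=point)
client.close()
```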

Posted June 17, 2019

Alteryx has introduced Assisted Modeling, an interactive guide built into the Alteryx Platform to walk users through the creation of machine learning models. With the introduction of Assisted Modeling, the company plans to amplify the capabilities of analysts and citizen data scientists, delivering transparent machine learning in a code-free environment.

Posted June 12, 2019

At times, there is a need for security within the database to be a bit more sophisticated than what is available. On specific tables, there may be a need to limit access to a subset of rows, or a subset of columns, for specific users. Yes indeed, views have always existed, and views can be established that limit the rows or columns displayed. However, views can only go so far.
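
For context, here is a minimal sketch of the view-based approach the article says can only go so far; the table, view, and column names are hypothetical, and sqlite3 stands in for an engine where GRANT statements would restrict users to the view.

```python
# A minimal sketch of the view-based approach; table, view, and column names
# are hypothetical, and sqlite3 stands in for an engine where GRANT would
# restrict users to the view rather than the base table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (id INTEGER, name TEXT, dept TEXT, salary REAL);
    INSERT INTO employees VALUES (1, 'Ada', 'ENG', 150000), (2, 'Bob', 'HR', 90000);

    -- The view hides the salary column and any rows outside one department.
    CREATE VIEW eng_employees AS
        SELECT id, name, dept FROM employees WHERE dept = 'ENG';
""")
print(conn.execute("SELECT * FROM eng_employees").fetchall())  # [(1, 'Ada', 'ENG')]
```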

Posted June 10, 2019

Change data capture (CDC) can greatly minimize the amount of data processed, but the cost is that the processes themselves become more complicated and overall storage may be higher. Costs are moved around: the final level of processing focuses on only the minimal changes, and this minimization is the efficiency to be gained. Moving forward, using the data becomes standardized and ultimately straightforward.
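
As a rough sketch of the idea, the example below captures changes with a simple high-watermark query so that downstream processing touches only the deltas; the table, columns, and timestamps are hypothetical, and sqlite3 stands in for any SQL source.

```python
# A minimal sketch of high-watermark change data capture, illustrating how
# downstream processing sees only the deltas. The orders table, updated_at
# column, and timestamps are hypothetical; sqlite3 stands in for any SQL source.
import sqlite3

def extract_changes(conn, last_watermark):
    """Return rows modified since the last run, plus the new watermark."""
    rows = conn.execute(
        "SELECT id, status, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark

conn = sqlite3.connect("source.db")
watermark = "2019-06-01T00:00:00"  # in practice, persisted between runs
changes, watermark = extract_changes(conn, watermark)
for row in changes:
    print(row)  # only the changed rows flow to the next stage
```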

Posted June 10, 2019

SnapLogic, provider of an Intelligent Integration Platform, is now a certified CoupaLink Technology Partner after successfully completing the certification to integrate with the Coupa Business Spend Management (BSM) Platform. SnapLogic's new Coupa connector uses Coupa's REST-based APIs to enable seamless data flow to and from the Coupa BSM Platform.

Posted June 07, 2019

Data lake adoption is on the rise. Right now, 38% of DBTA subscribers have data lakes deployed to support data science, data discovery and real-time analytics initiatives, and another 20% are considering adoption. Today, most data lakes are on-premises. However, the cloud is becoming an increasingly attractive location as well. While data lakes have evolved and matured over the past few years of enterprise use, many challenges still exist.

Posted June 06, 2019

DBTA's next Data Summit conference will be held May 19-20, 2020, with pre-conference workshops on Monday, May 18. The conference will return to the Hyatt Regency Boston.

Posted June 06, 2019

SolarWinds, a provider of IT management software, is adding a broad refresh to its network management portfolio, boosting network security features and more. The addition of Network Insight support for Palo Alto Networks and key enhancements to the SolarWinds Orion Platform bring greater data visibility and scalability to IT pros.

Posted June 04, 2019

Datical, a provider of database release automation solutions, is receiving a patent for its Database Change Management Simulator. The patented technology proactively forecasts and determines the impact of database schema and stored procedure changes before they are deployed.

Posted May 31, 2019

Container usage is now being adopted by organizations of all sizes, from small startups to companies with huge, established microservices platforms. At Data Summit 2019, Jeff Fried, director of product management, InterSystems and BA Insight, MIT, and Joe Carroll, product specialist, InterSystems, presented their session, "Understanding Database Containerization," focusing on helping practitioners navigate the minefield of database containerization and avoid some of the major pitfalls that can occur.

Posted May 30, 2019

Enterprise agility isn't a single initiative but rather a collection of activities and technologies that lead toward that goal. This includes adoption of microservices, containers, and Kubernetes to increase the flexibility of systems, applications, and data by releasing them from underlying hardware. In addition, practices such as DevOps are helping to increase the level of collaboration possible for fast-moving enterprises.

Posted May 30, 2019

Red Hat is contributing to Microsoft KEDA, a new open source project aimed at providing an event-driven scale capability for any container workload. Using KEDA, Red Hat puts Azure Functions on top of its OpenShift Container Platform in Developer Preview. It is designed to behave the same way it does when running on Azure as a managed service, but now running anywhere OpenShift runs, which means on the hybrid cloud and on-premises.

Posted May 28, 2019

Due to exponentially growing data stores, organizations today are facing slowdowns and bottlenecks at peak processing times, with queries taking hours or days. Some complex queries simply cannot be executed. Data often requires tedious and time-consuming preparation before queries can be run. David Leichner, CMO, SQream, demonstrated how the power of GPUs can help conquer these challenges with his presentation, "Accelerating Analytics in a New Era of Data," during Data Summit 2019.

Posted May 24, 2019

As Data Summit 2019 comes to a close, John O'Brien, principal advisor and CEO, Radiant Advisors, looked at lessons learned from this year's conference during his closing keynote. Companies are not lacking in technology options; in most cases, more advanced technologies exist than can be absorbed into the organization all at once.

Posted May 23, 2019

The ability for knowledge graphs to amass information and relationships and connect facts is showing potential for a range of use cases. Bob Kasenchak, director of business development, Access Innovations, Inc., USA, discussed the rise of knowledge graphs in his presentation, "From Structured Text to Knowledge Graphs: Creating RDF Triples From Published Scholarly Data" at Data Summit 2019.

Posted May 23, 2019

There are new technologies that contribute to the speed and scale of a modern data platform. But as data size and complexity increase with big data, data quality and data integration issues must still be addressed. At Data Summit 2019, Prakriteswar Santikary, VP and global chief data officer, ERT, discussed how to create the architecture of a modern, cloud-based, real-time data integration and analytics platform that ingests any type of clinical data (structured, unstructured, binary, lab values, etc.) at scale from any data source, during his presentation, "Designing a Fast, Scalable Data Platform."

Posted May 22, 2019

DataOps is an emerging set of practices, processes, and technologies for building and enhancing data and analytics pipelines to better meet the needs of the business. The list of failed big data projects is long. They leave end users, data analysts, and data scientists frustrated with long lead times for changes.

Posted May 22, 2019

Whether the enterprise is completely new to machine learning (ML) or it has already trained and deployed a model from scratch, Google Cloud Platform has a variety of tools to help the company start using ML right now. Sara Robinson, developer advocate, Google, centered her presentation around the basics of machine learning during her Data Summit 2019 presentation, "Exploring Machine Learning on the Google Cloud Platform."

Posted May 21, 2019

Data science, the ability to sift through massive amounts of data to discover hidden patterns and predict future trends, may be in demand, but it requires an understanding of many different elements of data analysis. Extracting actionable knowledge from all your data to make decisions and predictions requires a number of skills, from statistics and programming to data visualization and business domain expertise.

Posted May 20, 2019

Machine learning (ML) is on the rise at businesses hungry for greater automation and intelligence with use cases spreading across industries. At the same time, most projects are still in the early phases. From selecting data sets and data platforms to architecting and optimizing data pipelines, there are many success factors to keep in mind.

Posted May 20, 2019

Apache Airflow is turning heads these days. It integrates with many different systems and it is quickly becoming as full-featured as anything that has been around for workflow management over the last 30 years. This is predominantly attributable to the hundreds of operators for tasks such as executing Bash scripts, executing Hadoop jobs, and querying data sources with SQL.
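
To make that concrete, here is a minimal sketch of an Airflow 1.10-style DAG chaining two BashOperator tasks; the dag_id, schedule, and commands are hypothetical.

```python
# A minimal sketch of an Airflow 1.10-style DAG, not a production pipeline;
# the dag_id, schedule, and shell commands are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG(
    dag_id="nightly_ingest",
    start_date=datetime(2019, 5, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo 'pull source files'")
    load = BashOperator(task_id="load", bash_command="echo 'load into warehouse'")

    extract >> load  # load runs only after extract succeeds
```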

Posted May 16, 2019
