Data Modeling

Data modeling and database design and development - including popular approaches such as Agile and Waterfall - provide the basis for the visualization and management of business data in support of initiatives such as big data analytics, business intelligence, data governance, data security, and other enterprise-wide data-driven objectives.



Data Modeling Articles

Anyone who has ever attended Oracle OpenWorld knows that you must plan ahead. The conference held in San Francisco each fall is vast, and the upcoming conference, scheduled for September 18-22, 2016, promises to be equally expansive. Just as in years before, tens of thousands of attendees from well over 100 countries can be expected to converge to learn more about Oracle's ever-expanding ecosystem of technologies, products, and services during thousands of sessions held at the Moscone Center and multiple additional venues in downtown San Francisco. Here, Database Trends and Applications presents the annual Who to See @ Oracle OpenWorld special section.

Posted August 04, 2016

As one works through the normal forms, be it a journey to the placid shores of third normal form, the more arid climes of Boyce-Codd normal form, or even the cloudy peaks of fourth normal form and beyond, the database designer has already covered a lot of groundwork. Before thinking of normalizing, one needs to have conceptualized the relations that might fall within the solution's scope.
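The building block behind every normal form is the functional dependency. As a minimal sketch (with hypothetical data and a hypothetical helper, not any specific tool), a dependency X -> Y holds in a table when every value of X maps to exactly one value of Y; a transitive dependency among non-key attributes is what third normal form rules out:

```python
# Sketch: check whether a functional dependency X -> Y holds in a set of rows.
# Hypothetical example; rows are plain dicts standing in for a relation.

def fd_holds(rows, determinant, dependent):
    """Return True if every value of `determinant` maps to one value of `dependent`."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in determinant)
        val = tuple(row[a] for a in dependent)
        if seen.setdefault(key, val) != val:
            return False
    return True

# Orders relation: order_id -> zip -> city is a transitive dependency,
# so this design is not in third normal form until city is split out.
orders = [
    {"order_id": 1, "zip": "02101", "city": "Boston"},
    {"order_id": 2, "zip": "02101", "city": "Boston"},
    {"order_id": 3, "zip": "94105", "city": "San Francisco"},
]

print(fd_holds(orders, ["zip"], ["city"]))       # True: zip determines city
print(fd_holds(orders, ["city"], ["order_id"]))  # False: city does not determine order_id
```

Spotting that the non-key attribute zip determines city is exactly the kind of observation that prompts decomposing the relation into an orders table and a separate zip-to-city table.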

Posted August 04, 2016

Oracle announced today that it is acquiring NetSuite in a transaction valued at approximately $9.3 billion. The transaction is expected to close in 2016. In a statement, Zach Nelson, CEO of NetSuite, said that NetSuite will benefit from Oracle's global scale and reach to accelerate the availability of its cloud solutions.

Posted August 03, 2016

To manage growing data volumes and pressing SLAs, many companies are leveraging Apache™ Kafka and award-winning Attunity Replicate with next-generation change data capture (CDC) for streaming data ingest and processing.

Posted August 03, 2016

The potential of your business intelligence or data processing application is limited without a comprehensive data connectivity solution. Staying competitive and relevant requires a breadth of data connectivity options.

Posted August 03, 2016

Our goal at Amazon Web Services (AWS) is to enable our customers to do things that were previously not possible, and make things that our customers can already do simpler and better at a much lower cost.

Posted August 03, 2016

It is both exciting and validating to be selected as the number 1 data modeling solution by DBTA's discerning readers for the third consecutive year!

Posted August 03, 2016

Data replication advances a number of enterprise goals, supporting scenarios such as distributing information as part of business intelligence and reporting initiatives, facilitating high availability and disaster recovery, and enabling no-downtime migrations.

Posted August 03, 2016

Everyone knows the three Vs of big data - volume, velocity, and variety - but what's required is a solution that can extract valuable insights from new sources such as social networks, email, sensors, connected devices, the web, and smartphones.

Posted August 03, 2016

Query and reporting solutions are part of a comprehensive business intelligence approach in every organization. As long as enterprises need to gather data, BI groups will look to query and reporting programs as primary applications for producing output from information systems.

Posted August 03, 2016

As data grows, organizations are looking for ways to dig up insights from underneath layers of information. Data mining solutions provide the tools that enable them to view those hidden gems and facilitate better understanding of new business opportunities, competitive situations, and complex challenges.

Posted August 03, 2016

Business intelligence encompasses a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.

Posted August 03, 2016

Data virtualization allows the business and IT sides of an organization to work together more closely and in a much more agile fashion, while helping to reduce complexity.

Posted August 03, 2016

Data integration is critical to many organizational initiatives such as business intelligence, sales and marketing, customer service, R&D, and engineering. For many enterprises, however, the road to data-driven nirvana is stymied by the inflexible, calcified systems and processes that were laid out decades earlier and still control the data flow within many enterprises, according to Joe McKendrick, lead research analyst at Unisphere Research.

Posted August 03, 2016

The last thing companies want is tainted data merging with incorrect information. The process of maintaining data integrity enhances the reliability of information for use by a business. This is where tools to ensure data quality come in.

Posted August 03, 2016

Data is increasingly appreciated by companies as their most valuable asset. The problem is that this view is not held only by organizations themselves; there are others, including hackers, who see it that way as well. IT and data managers can play a pivotal role in enterprise security because they are insiders with trusted status who know where the data is stored and how best to reduce or eliminate threats. Newer security technology can also relieve many of the manual burdens associated with database monitoring.

Posted August 03, 2016

For fast-paced, turn-on-a-dime digital enterprises, with demands for 24-by-7 uptime, no activity is more vital than keeping database systems up and running. Today, database availability is no longer just a critical IT issue; it is a critical business issue. To be prepared in the event of system failures, infrastructure owners and DBAs have developed strategies to increase resiliency and assure availability of data.

Posted August 03, 2016

Database development is growing more challenging all the time. Releases are expected to come out faster than ever, and teams are more spread out across global geographies, time zones, and skill levels. Software deployment, meanwhile, needs to span cloud and on-site environments and be accessible through more devices than ever. What's needed is an integrated end-to-end environment with multi-platform support that helps simplify development and provides automation to achieve repeatable processes and avoid potential risks that can translate into unanticipated delays.

Posted August 03, 2016

If information is the lifeblood of organizations today, then delivering information where it is needed faster can be considered a matter of business health, and in some cases, even business survival.

Posted August 03, 2016

There is more data available to organizations than ever before, but the goal remains the same - to unlock nuggets of gold, the useful information that will result in competitive advantage for the organization, allowing it to react to customers' needs with lightning speed, uncover new opportunities, and act fast to counter competitive threats.

Posted August 03, 2016

Over the years, MultiValue technology has maintained its base of committed advocates despite the decades-long trend toward relational database management systems. And, now with an expanding appreciation for polyglot persistence, or put more simply, the selection of the best tool for the job, there is a growing recognition that different data management systems offer different benefits with some simply better suited for certain requirements than others.

Posted August 03, 2016

Today, data is being recognized and appreciated as an asset, and even, some have suggested, a kind of currency. But beyond the obvious businesses built on data - such as Airbnb's rental business, Uber's car service app, and Alibaba's online marketplace - every business today is striving to become a data-driven organization, with turn-on-a-dime agility and rapid insights into customer behaviors and desires.

Posted August 03, 2016

Splice Machine is teaming up with Incedo, a technology solutions provider specializing in data and analytics, product engineering, and emerging technologies, to generate solutions that will help enterprises manage data and accelerate data processing.

Posted August 03, 2016

Compuware has introduced a mainframe release automation solution that enables enterprises to bring continuous delivery best practices to their IBM z/OS environments. ISPW Deploy, built on the ISPW technology Compuware acquired in January of this year, facilitates faster and more reliable mainframe software deployment.

Posted July 11, 2016

DBmaestro, a provider of DevOps for Database solutions, has improved its open application programming interface (API) for DBmaestro TeamWork, enabling software vendor partners and customers to integrate with DBmaestro TeamWork faster.

Posted July 08, 2016

The next major release of MarkLogic's enterprise NoSQL database platform is expected to be generally available by the end of this year. Gary Bloom, president and CEO of the company, recently reflected on the changing database market and how new features in MarkLogic 9 address evolving requirements for data management in a big data world. "For the first time in years, the industry is going through a generational shift of database technology - and it is a pretty material shift," observed Bloom.

Posted June 30, 2016

The data manager now sits in the center of a revolution swirling about enterprises. In today's up-and-down global economy, opportunities and threats are coming in from a number of directions. Business leaders recognize that the key to success in hyper-competitive markets is the ability to leverage data to draw insights that predict and provide prescriptive action to stay ahead of markets and customer preferences. For that, they need to keep up with the latest solutions and approaches in data management. Here are 12 of the key technologies turning heads—or potentially opening enterprise wallets—in today's data centers.

Posted June 22, 2016

Talend is "going all-in" with Amazon Web Services (AWS), now providing its entire product line on AWS cloud. The latest release of Talend Integration Cloud extends the company's ability to allow IT organizations to quickly "spin up and spin down" big data and data integration workloads running on Amazon Redshift or Amazon EMR.

Posted June 10, 2016

Splice Machine has announced that it is releasing its database management system, a dual-engine RDBMS powered by Hadoop and Spark, as an open source platform. The community edition will be free for members of the open source community, allowing them to use as well as modify source code, whereas the enterprise edition will include a set of proprietary tools focused on operational features that enable DBAs and DevOps teams to maintain, tune, and keep the platform secure while it's live.

Posted June 10, 2016

Alteryx, Inc., a provider of self-service data analytics, is expanding its self-service analytics capabilities for analysts using the Salesforce platform. Alteryx is improving its integration with Salesforce Wave Analytics as well as updating read/write capabilities across the Salesforce Customer Success Platform.

Posted June 10, 2016

Cloudera is collaborating with Microsoft to build a new open source platform that will reduce the burden on application developers leveraging Spark. The two entities, together with other open source contributors, have built a new open source Apache licensed REST-based Spark Service, called Livy, which is still in early alpha development.

Posted June 09, 2016

Emerging and newer vendors can offer fresh, innovative ways of dealing with data management and analytics challenges. Here, DBTA looks at the 10 companies whose approaches we think are worth watching.

Posted June 06, 2016

Dynatrace, a digital performance software company, is teaming up with Pivotal to deploy its application monitoring solutions for the Pivotal Cloud Foundry (PCF) platform. The integration of Dynatrace with Pivotal Cloud Foundry will enable companies to collect analytics for applications running on PCF, allowing them to detect and act on performance shortcomings and optimize end-to-end transaction latencies.

Posted May 26, 2016

COLLABORATE, the annual conference presented by the OAUG, IOUG, and Quest, provides the opportunity to reflect on key changes in the Oracle ecosystem and allows the user groups to engage with their constituents about the areas of greatest importance. With the COLLABORATE 16 conference now behind her, Dr. Patricia Dues, the new president of the OAUG, talked with DBTA about what OAUG members are concerned with now and how the OAUG is helping them address emerging challenges.

Posted May 25, 2016

Redmonk's annual programming language rankings - based on GitHub and StackOverflow traffic - were recently released, and to no one's surprise, JavaScript was ranked as the most popular programming language.

Posted May 18, 2016

Not too long ago, IT embraced the pattern language concepts of Christopher Alexander. An architect of the more traditional variety, Alexander based his ideas on creating spaces in which people felt good, even if they didn't comprehend exactly why. Architected spaces need to express multiple qualities, including being alive, whole, comforting, free, exact, egoless, and eternal. The more those qualities were embodied, the better people responded to the desirability of the space.

Posted May 04, 2016

Dell is releasing an upgraded version of its Statistica advanced analytics platform, providing tools to address Internet of Things (IoT) analytics requirements and leverage heterogeneous data environments.

Posted May 04, 2016

Analytics provider Lavastorm is partnering with Qlik, a provider of visual analytics software, to offer an integrated solution for a broader swath of users of all skill levels through Qlik Sense.

Posted May 03, 2016

Qubole is announcing two major changes: it is releasing an open source version of its StreamX tool and forming a partnership with Looker.

Posted May 02, 2016

Magnitude Software, a provider of Enterprise Information Management (EIM) software, unveiled a new master data management offering designed to fuel business processes with accurate customer data for informed decision making.

Posted April 27, 2016

BackOffice Associates, a provider of information governance and data modernization solutions, is acquiring CompriseIT, a U.K. consulting firm specializing in helping enterprises adopt SAP Business Suite 4 SAP HANA (SAP S/4HANA). BackOffice Associates' acquisition of CompriseIT is the latest initiative in a move to strengthen its expertise in helping customers as they embark on their journey to implement SAP S/4HANA.

Posted April 27, 2016

Cisco is launching an appliance that includes the MapR Converged Data Platform for SAP HANA, making it easier and faster for users to take advantage of big data. The UCS Integrated Infrastructure for SAP HANA is easy to deploy, speeds time to market, and reduces operational expenses, while providing users with the flexibility to choose a scale-up (on-premises) or scale-out (cloud) storage strategy.

Posted April 27, 2016

OpenText, a provider of enterprise information management (EIM) solutions, is releasing an enhanced version of its namesake platform, addressing key areas of the user experience, machine-to-machine integration, automation, and more.

Posted April 27, 2016

The core reason for implementing in-memory technology is to improve performance. To help accelerate adoption of in-memory technologies and provide a universal standard for columnar in-memory processing and interchange, the lead developers of 13 major open source big data projects have joined forces to create Apache Arrow, a new top level project within the Apache Software Foundation (ASF).

Posted April 24, 2016

GridGain Systems, provider of enterprise-grade in-memory data fabric solutions based on Apache Ignite, is releasing a new version of its platform. GridGain Professional Edition includes the latest version of Apache Ignite plus LGPL libraries, along with a subscription providing monthly maintenance releases containing bug fixes that have been contributed to the Apache Ignite project but would otherwise not appear until the next quarterly Ignite release.

Posted April 20, 2016
