Data Quality

Solutions and Services for Data Quality include Master Data Management, Data Cleansing, Data Deduplication, Address Verification, Customer Contact Data Management, Customer Relationship Management (CRM), Golden Record Creation, Geocoding, Data Integration, Data Management, and Mailing Software for Adherence to U.S. Postal Regulations.



Data Quality Articles

Business analytics provider Pentaho has partnered with Human Inference to deliver improved data quality for Pentaho Business Analytics. The combination of the two open source products is intended to provide highly accurate and consistent data to business applications for enterprises on premises and in the cloud. According to the vendors, the new integration with Human Inference technology will enable Pentaho Business Analytics customers to rapidly create business intelligence applications with higher quality data, allowing for better and quicker decision making.

Posted April 24, 2012

Talend, a provider of open source integration software, has announced the availability of Talend Open Studio for Big Data, to be released under the Apache Software License. Talend Open Studio for Big Data is based on Talend Open Studio, augmented with native support for Apache Hadoop. In addition, Talend Open Studio for Big Data will be bundled in Hortonworks' Apache Hadoop distribution, Hortonworks Data Platform, constituting a key integration component of Hortonworks Data Platform.

Posted March 13, 2012

Melissa Data Corp., a developer of high performance data quality and address management solutions, has announced a corporate initiative evolving its solutions structure. The company is forming two distinct business divisions - the Data Quality division, which will include developer tools and enterprise plug-ins dedicated to helping customers clean and maintain high quality U.S., Canadian, and international contact data; and the Mailers Software division, which will focus on the needs of the direct marketer, with an emphasis on the company's flagship MAILERS+4 bulk mail software, mailing lists and sales leads, change-of-address processing, and data appending services.

Posted February 13, 2012

Melissa Data, a provider of contact data quality and integration solutions, today announced Contact Zone, its open source data integration software optimized for sophisticated contact data quality. Contact Zone provides a simple approach to data quality, using a streamlined graphical user interface to map data transformations from any type of source database to any type of data warehouse.

Posted January 17, 2012

The latest version of expressor software's flagship data integration platform, expressor 3.5, features cloud integration with Melissa Data's Data Quality Tools and Salesforce.com to provide comprehensive BI reporting and CRM integration with on-premises applications. The new Salesforce.com and Melissa Data capabilities ship with expressor 3.5 Desktop Edition and Standard Edition.

Posted January 10, 2012

In this, our last E-Edition of Database Trends and Applications for 2011, we're taking a look back at some of the most widely read articles of the past year. These articles cover a range of topics. Some provide an examination of just-emerging or quickly evolving technologies, others highlight best practices in a specific discipline, while others comment on trends observed by industry experts. Click on the "December 2011 E-Edition UPDATE" headline above to access the articles. If you missed one earlier in the year, here's your second chance. All DBTA E-Editions are archived by month on the DBTA website.

Posted December 16, 2011

At IBM's 2011 Business Analytics Forum, IBM unveiled new software that brings the power of managing and analyzing big data to the workplace. The new offerings span a variety of big data and business analytics technologies across multiple platforms from mobile devices to the data center to IBM's SmartCloud, enabling employees from any department inside an organization to explore unstructured data such as Twitter feeds, Facebook posts, weather data, log files, genomic data and video, as part of their everyday work experience.

Posted October 24, 2011

Talend, an open source software provider, has announced an expansion of its "Powered by Talend" OEM Partner Program designed to help software vendors and SaaS providers embed its enterprise-grade open source integration technologies in their offerings. With the program, partners can leverage all components of the Talend Unified Platform, a set of data management and application integration technologies that Talend says can deliver core functionality at a fraction of the cost of custom development.

Posted October 20, 2011

Melissa Data Corp, a provider of contact data quality and integration solutions, introduced its SmartMover Component for Microsoft's SQL Server Integration Services (SSIS) at the PASS Summit 2011. SmartMover allows users to update U.S. and Canadian customer records with new, move-updated addresses, helping businesses stay in contact with their customers while reducing the time, money, and postage wasted on undeliverable-as-addressed mail. SmartMover is a "unique component for SSIS," Greg Brown, director of marketing for Melissa Data, tells 5 Minute Briefing.

Posted October 18, 2011

As companies learn to embrace "big data" - gigabytes and terabytes of bits and bytes, strung across constellations of databases - they face a new challenge: making the data valuable to the business. To accomplish this, data needs to be brought together to give decision makers a more accurate view of the business.
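Bringing data together for a single view can be sketched with a small consolidation routine. This is an illustrative example only - the field names and sources are hypothetical - showing one common rule: merge records per customer key and let later sources fill gaps left by earlier ones.

```python
# Hypothetical sketch: consolidate records from several systems into
# one "single view" record per customer key. The first non-null value
# seen for each field wins; later sources only fill remaining gaps.
def consolidate(records, key="customer_id"):
    merged = {}
    for rec in records:
        view = merged.setdefault(rec[key], {})
        for field, value in rec.items():
            if value is not None and view.get(field) is None:
                view[field] = value  # keep the first non-null value
    return merged

# Two hypothetical sources: CRM lacks the email, billing supplies it.
crm = [{"customer_id": "C1", "name": "Acme Corp", "email": None}]
billing = [{"customer_id": "C1", "name": "ACME", "email": "ap@acme.example"}]

single_view = consolidate(crm + billing)
print(single_view["C1"])
```

Real consolidation logic also needs survivorship rules (which source is most trusted per field), but the gap-filling merge above is the usual starting point.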

Posted September 21, 2011

HP is offering a series of new software solutions designed to improve collaboration among application development and delivery teams. The new HP ALM software solutions include HP Service Virtualization 1.0, HP Application Lifecycle Intelligence (ALI), and HP Agile Accelerator 5.0. "Without a performance management system, it's difficult to measure success," Matthew Morgan, senior director of worldwide product marketing at HP Software, said at a press and blogger briefing at launch day for the product line. Application metrics "should be digitized and automated, and not sit on an Excel desktop."

Posted July 25, 2011

BackOffice Associates, LLC, a provider of ERP data migration, data governance and master data management solutions for SAP, Oracle and other ERP vendors, announced the release of a new data quality as a service offering, QCloud.

Posted July 19, 2011

Is the day of reckoning for big data upon us? To many observers, the growth in data is nothing short of incomprehensible. Data is streaming into, out of, and through enterprises from a dizzying array of sources-transactions, remote devices, partner sites, websites, and nonstop user-generated content. Not only are the data stores resulting from this information driving databases to scale into the terabyte and petabyte range, but they occur in an unfathomable range of formats as well, from traditional structured, relational data to message documents, graphics, videos, and audio files.

Posted June 22, 2011

Queplix Corp., a provider of data integration and data management products, has introduced the new Data Quality Manager for QueCloud, enabling companies to create and maintain data consistency throughout the data migration, integration and management lifecycle with a single cloud-based platform. As a central component of the QueCloud dashboard, Data Quality Manager is tightly coupled with the solution's core data integration and data management functionality.

Posted June 21, 2011

Talend, a developer and distributor of open source middleware, has announced Talend Cloud, a cloud-enabled integration platform that provides a unified integration platform for on-premise systems, cloud-based systems and SaaS applications. Based on Talend's Unified Integration Platform, it also provides a common environment for users to manage the entire lifecycle of integration processes including a graphical development environment, a deployment mechanism and runtime environment for operations and a monitoring console for management - all built on top of a shared metadata repository.

Posted June 21, 2011

Data quality, MDM, and data governance software vendor Ataccama Corporation announced that it has entered into a cooperative partnership with Teradata Corporation, a leader in data warehousing and enterprise analytics. The partnership is aimed at enabling joint customers to improve data quality within their data warehouses. Ataccama is a global software company with headquarters in Prague, and offices in Toronto, Stamford, London, and Munich, and the new partnership with Teradata represents a worldwide geographical relationship, according to Michal Klaus, CEO, Ataccama Corp.

Posted May 24, 2011

Oracle has agreed to acquire Datanomic Limited, a provider of customer data quality software and related applications for risk and compliance screening. According to Oracle, the Datanomic technology combined with Oracle Product Data Quality will provide a complete data quality solution to reduce the cost and complexity of managing data across its customers' businesses. The transaction is expected to close in the first half of calendar year 2011, and Datanomic's management and employees are expected to join Oracle.

Posted April 21, 2011

Trillium Software has formed an alliance with Microsoft to provide integration between the Trillium Software System data quality solution and Microsoft Dynamics CRM 2011 customer relationship management (CRM) software. As a result, Microsoft Dynamics CRM users who choose to leverage integrated Trillium Software data quality services can ensure that global customer data is accurate and fit-for-purpose, whether using on-premises or cloud deployment models (utilizing the Windows Azure platform).

Posted April 19, 2011

Melissa Data Corp, a developer of data quality and address management solutions, has announced that customers can now access detailed property and mortgage data on more than 140 million U.S. properties by using the company's new WebSmart Property Web Service. The comprehensive solution is available for sourcing nearly any information on a given property - from parcel and owner information to square footage to zoning and more.

Posted April 12, 2011

Melissa Data Corp, a developer of data quality and address management solutions, today announced completion of CASS Cycle N certification. Certification of Melissa's software comes months ahead of the USPS July 31, 2011 expiration of CASS Cycle M. In order to continue to qualify for postal automation discounts, CASS vendors must deliver CASS Cycle N to their customers beginning May 1, 2011. With CASS Cycle N, SuiteLink, a USPS product that improves mail delivery by adding known secondary information (suite numbers) to business addresses, will be required for processing.

Posted February 15, 2011

When designing a system, an architect must conform to all three corners of the CIA (Confidentiality, Integrity and Accessibility) triangle. System requirements for data confidentiality are driven not only by business rules but also by legal and compliance requirements. As such, data confidentiality (when required) must be preserved at any cost, irrespective of performance, availability or any other implications. Integrity and Accessibility, the other two sides of the triangle, may allow some flexibility in design.

Posted January 07, 2011

With the holiday season just behind us - and all the cards, catalogs and promotional mailings that go along with it - Melissa Data, a provider of data quality and address management solutions, reminds us that it is more important than ever to correct and cleanse data for customers and constituents in the new year. It's a green approach on many levels.

Posted January 04, 2011

Melissa Data, a developer of high performance data quality and address management solutions, has announced expanded coverage for its GeoCoder Object. Available as a multiplatform API or as part of WebSmart Services, GeoCoder Object now provides accurate location-based information on 95% of all rooftops in the U.S.

Posted December 14, 2010

The IOUG has completed a number of ground-breaking studies in 2010 through the IOUG ResearchWire program. Conducted among IOUG members by Unisphere Research, 2010 IOUG ResearchWire Executive Summaries are available to all on the IOUG website.

Posted December 01, 2010

The year 2010 brought many new challenges and opportunities to data managers' jobs everywhere. Companies, still recovering from a savage recession, increasingly turned to the power of analytics to turn data stores into actionable insights, and hopefully gain an edge over less data-savvy competitors. At the same time, data managers and administrators alike found themselves tasked with managing and maintaining the integrity of rapidly multiplying volumes of data, often presented in a dizzying array of formats and structures. New tools and approaches were sought, and the market churned with promising new offerings embracing virtualization, consolidation and information lifecycle management. Where will this lead in the year ahead? Can we expect an acceleration of these initiatives and more? DBTA looked at new industry research, and spoke with leading experts in the data management space, to identify the top trends for 2011.

Posted November 30, 2010

Sybase has issued the third and final installment of results from a study on the business impacts of effective data. The study benchmarked some of the world's leading companies across a range of vertical industries by measuring the direct correlation between a company's IT investments and overall business performance.

Posted November 17, 2010

At InformaticaWorld last week, Informatica announced the general availability of the latest release of its master data management (MDM) product, Informatica 9 MDM.

Posted November 09, 2010

Estimates put the amount of data in existence at this time at more than a zettabyte (or a trillion gigabytes), which would be the equivalent of 75 billion fully loaded iPads. All this data is streaming into and through enterprises from transactions, remote devices, partner sites and user-generated content, with formats varying from structured, relational data to graphics and videos.

Posted October 06, 2010

Melissa Data, a developer of high performance data quality and address management solutions, showcased the Contact Verification Server at Oracle OpenWorld. Providing a turnkey solution, the appliance is built by Dell and incorporates six WebSmart components for contact data verification and enrichment, including address, phone, and email verification, name parsing, geocoding and change-of-address processing. The server can verify more than 7 million records per hour and additional servers can be clustered together for increased scalability, throughput and redundancy.

Posted October 06, 2010

Sybase, Inc., an SAP company, has revealed the results of a new report, "Measuring the Business Impacts of Effective Data." The study benchmarks leading enterprises across a wide range of industries by measuring the direct correlation between a company's IT investments and overall business performance.

Posted September 22, 2010

Trillium Software, a business of Harte-Hanks, Inc., has introduced the latest version of the Trillium Software System. Designed to visualize data issues, improve business rule validation and enhance data quality monitoring in operational environments, the new Trillium Software System is aimed at helping business users and analysts, data stewards and IT professionals work together to improve business decisions and outcomes through better, more accurate information.

Posted September 22, 2010

Oracle has introduced Oracle GoldenGate 11g and Oracle Data Integrator Enterprise Edition 11g, new releases of the two products that form the foundation of Oracle's data integration product line. Oracle GoldenGate 11g delivers real-time data integration and continuous availability for mission-critical systems through its low-impact, low-latency data acquisition, distribution and delivery capabilities, and Oracle Data Integrator Enterprise Edition 11g provides loading and transformation of data into a data warehouse environment through its high-performance extract, load and transform (E-LT) technology.

Posted September 14, 2010

Organizations turn to master data management (MDM) to solve many business problems - to reach compliance goals, improve customer service, power more accurate business intelligence, and introduce new products efficiently. In many cases, the need for an MDM implementation is dictated by the business challenge at hand, which knows no single data domain. Take a manufacturing customer, for example. The company decided to deploy an MDM solution in order to solve buy-side and sell-side supply chain processes, to more effectively manage the procurement of direct and indirect materials and to improve the distribution of products. To meet these goals the solution must be capable of managing vendor, customer, material and product master data. Unfortunately, quite a few vendors sell technology solutions that focus exclusively on either customer data integration (CDI) or product information management (PIM), which solves only a piece of the business problem.

Posted September 07, 2010

Melissa Data, a developer of data quality and address management solutions, has released a 25th anniversary special edition catalog. The catalog provides detailed listings on over 60 different Melissa Data products and services including enterprise data quality platforms, developer tools, address management software, mailing lists, and data hygiene services. The catalog also offers links to many white papers on data quality and direct marketing, as well as information on the Melissa Data Independent Software Vendor Program, and their Data Quality Challenge.

Posted July 20, 2010

When integrating data, evaluating objects from multiple sources aids in determining their equivalence. Each source may identify customers, but determining which customer from each system represents the same customer can prove daunting. Sometimes matching things is straightforward; for example, if all sources should have an accurate social security number or taxpayer ID, success involves simply linking the matching numbers.
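The straightforward case described above - linking records that share a reliable identifier - can be sketched in a few lines. The record layout and field names here are hypothetical, and a real implementation would add fuzzy matching for records lacking the identifier.

```python
# Hypothetical record-linkage sketch: match customer records across two
# source systems when both carry the same taxpayer ID. Records without
# an ID are simply left unmatched here (the hard part in practice).
def link_by_tax_id(source_a, source_b):
    """Return (a, b) pairs of records that share a taxpayer_id."""
    index = {r["taxpayer_id"]: r for r in source_b if r.get("taxpayer_id")}
    matches = []
    for rec in source_a:
        tid = rec.get("taxpayer_id")
        if tid and tid in index:
            matches.append((rec, index[tid]))
    return matches

# Two hypothetical systems naming the same company differently.
crm = [{"name": "Acme Corp", "taxpayer_id": "12-3456789"},
       {"name": "Bolt LLC", "taxpayer_id": None}]
billing = [{"name": "ACME Corporation", "taxpayer_id": "12-3456789"}]

pairs = link_by_tax_id(crm, billing)
for a, b in pairs:
    print(a["name"], "<->", b["name"])
```

Note that "Acme Corp" and "ACME Corporation" link despite the name difference, because the shared identifier does the work; without it, name-similarity scoring would be needed.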

Posted July 12, 2010

Everybody seems to agree with the need for organizations to do a better job of protecting personal information. Every week the media brings us reports of more data breaches, and no organization is immune. Hospitals, universities, insurers, retailers, and state and federal agencies all have been the victims of breach events, often at significant costs. State privacy laws such as the new Massachusetts privacy statutes have placed the burden of protecting sensitive information squarely on the shoulders of the organizations that collect and use it. While some managers might view this as yet one more compliance hurdle to worry about, we feel it presents an excellent opportunity to evaluate existing practices and procedures. The good news is that there are some great solutions available today that can help organizations of all stripes address these requirements while at the same time tightening data security practices, streamlining operations, and improving governance.

Posted July 12, 2010

Quality can be a hard thing to define. What is good and what is bad may not be easily identified and quantified. When a data mart accurately reflects data exactly as found in the source, should that be considered a quality result? If the source data is bad, is the data mart of high quality or not? If the data mart differs from the source, when is the difference an improvement of quality and when is said difference evidence of diminished quality?

Posted June 23, 2010

Pervasive Software Inc., a global leader in cloud-based and on-premises data integration software, has formed a partnership with Melissa Data, a provider of data quality software and services. The partnership includes out-of-the-box connectivity to Melissa Data's data quality offerings, giving both companies' customers the benefits of seamless data quality and data integration capabilities.

Posted June 15, 2010

Quality can be a hard thing to define. What is good and what is bad may not be easily identified and quantified. When a data mart accurately reflects data exactly as found in the source, should that be considered a quality result? If the source data is bad, is the data mart of high quality or not? If the data mart differs from the source, when is the difference an improvement of quality and when is said difference evidence of diminished quality?

While it may seem self-evident that correcting the source of load data would be the "right" thing to do, in practice that direction is not necessarily self-evident. The reasons supporting this nonintuitive approach are varied. Sometimes changes to the source impact other processes that must not change, or the changes will expose problems that may provoke undesired political fallout, or it may simply be that making the proper adjustments to the source application would prove too costly to the organization.

For all these reasons and more, in the world of business intelligence, the dependent data often is expected to be of higher quality than the source data. In order for that improvement to occur, data placed within the dependent mart or data warehouse must be altered from the source. Sometimes these alterations become codified within the process migrating data from the source. Other times changes are made via one-time ad hoc updates. Either way, this alteration leads to a situation in which the dependent data will no longer equate one-for-one to the source data. Superficial comparisons of this altered content will highlight the disparity that what exists for analytics is not the same as what exists for the operational system.
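A load-time cleanup rule makes the point concrete. In this illustrative sketch (the correction table and field names are invented), the mart deliberately stops matching the source one-for-one the moment the rule is applied.

```python
# Hedged illustration: a codified load-time correction. The mart row is
# standardized while the operational source is left untouched, so the
# two will no longer compare equal field-for-field.
STATE_FIXES = {"Calif.": "CA", "N.Y.": "NY"}  # hypothetical cleanup table

def load_row(source_row):
    """Copy a source row into the mart, standardizing the state code."""
    row = dict(source_row)  # never mutate the operational record
    row["state"] = STATE_FIXES.get(row["state"], row["state"])
    return row

source = [{"customer": "A", "state": "Calif."},
          {"customer": "B", "state": "TX"}]
mart = [load_row(r) for r in source]

# The mart now holds "CA" where the source still holds "Calif." - a
# deliberate disparity, not a defect.
```

A naive row-by-row reconciliation between mart and source would flag customer A as a mismatch, which is exactly the "superficial comparison" trap the passage describes.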

Posted June 07, 2010

Varonis Systems Inc., a provider of data governance software, will soon be shipping Version 5.5 of its data management and governance toolsets. The updated editions of DatAdvantage and DataPrivilege represent the latest evolution of Varonis' Meta-data Framework, which enables customers to identify sensitive unstructured and semi-structured data on their file systems, SharePoint sites and network-attached storage (NAS) devices, find areas with excessive permissions and abnormal access activity, understand who can access, who is accessing, who shouldn't have access, and who owns the data, and remediate risk faster than traditional data protection products.

Posted June 01, 2010

At its Information On Demand conference in Rome, IBM unveiled new software to place the power of predictive analytics into the hands of business users for faster, more insightful decision making. According to the company, with three clicks, business users can now build a predictive model within a configurable Web browser interface, and run simulations and "what-if" scenarios that compare and test the best business outcomes before the model is ever deployed into an operational system. Business users now have full control over the analytic process, enabling them to make accurate decisions in real-time, based on changes in strategy, customer buying patterns and behaviors, or fluctuating market conditions.

Posted June 01, 2010

HiT Software, a provider of data integration and change data capture software products, has been acquired by BackOffice Associates, a provider of data migration, data governance and master data management solutions for Oracle, SAP and other ERP vendors.

Posted May 20, 2010

Mergers and acquisitions often come quickly and when they do, it is critical to have tools and utilities capable of scaling to meet new challenges so operations continue seamlessly, customer service standards are upheld, and costs are contained. This was the case for UGI Utilities, a large natural gas and electric service provider in the eastern U.S. In 2006, UGI acquired the natural gas utility assets of PG Energy from Southern Union Company. A longtime customer of BMC, UGI found it was aligned with the right software company to provide implementation of mainframe service management solutions as well as first class support to get the job done and successfully integrate the newly acquired company's data into its environment, saving time and money.

Posted May 10, 2010

Informatica Corporation has announced new customer support offerings and proactive support capabilities as part of the Informatica Global Customer Support program. The additional services will be bundled into existing programs to further accelerate customer time-to-value, reducing cost-of-ownership and helping ensure ongoing project success.

Posted May 04, 2010

Sybase has announced the availability of Sybase PowerBuilder 12. The new release of Sybase's rapid application development tool enables developers to easily and cost effectively create or migrate their business applications on the Microsoft .NET Framework, for modern and visually appealing application user experiences.

Posted April 29, 2010

Quest Software, Inc., a Visual Studio Industry Partner and maker of Toad for Oracle, has announced the launch of Toad Extension for Visual Studio. Toad Extension for Visual Studio is a database schema provider that will support complete application lifecycle management (ALM) for Oracle in Visual Studio 2010, unifying Oracle developers with the rest of the Visual Studio development team.

Posted April 20, 2010
