DBTA Downloads: White Papers
This white paper discusses how online applications and social media have changed expectations about the speed of data delivery. Mobile apps and connected devices have altered notions of how data is collected and transmitted, while GPS, sensor, and other machine-to-machine data sources are disrupting perceptions of what data should or could look like. Moreover, with cloud infrastructure implementation time frames measured in weeks, not months, both business and Information Technology (IT) stakeholders have changed the way they think about data management projects, and in particular how data models and business and technical metadata are addressed.
This paper provides guidelines for adopting Model as a Service (MaaS) through technology that enables Open Data MaaS engagement and application along the Database as a Service (DaaS) lifecycle. Accordingly, we introduce MaaS agile design and deployment enabled by CA ERwin and explain how to map Open Data requirements and best practices.
With MaaS, data models can be shared, tested offline, and verified to define data design requirements, data topology, performance, placement, and deployment, so that models themselves can be supplied as a service. Data models allow you to verify “on-premise” how and where data should be designed to meet the Cloud service’s requirements. As a consequence, models can be tuned based on real usage and then mapped to the Cloud service. Further, MaaS inherits the defined service’s properties, so data models can be reused, shared, and classified for new Cloud Services design and publication.
This white paper outlines the most important aspects and ingredients of successful DB2 for z/OS performance management from DBTA columnist and data management expert Craig Mullins. It offers multiple guidelines and tips for improving performance within the three major performance tuning categories required of every DB2 implementation: the application, the database, and the system.
If your organization relies on data, optimizing the performance of your database can increase your earnings and savings. Many factors, both large and small, can affect performance, so fine-tuning your database is essential. Performance-tuning expert Chuck Ezell sheds light on the right questions to ask to get the answers you need, using a defined approach to performance tuning referred to as the 5 S’s.
The next generation of databases and data platforms is coming to fruition to help enterprises more effectively store, process, analyze, and deliver value from Big Data. This report homes in on the key challenges and opportunities ahead, and provides in-depth information on leading-edge technologies and solutions. Download your copy today to stay ahead of the latest developments in NoSQL, NewSQL, and Hadoop.
This special research report provides valuable information for database administrators, IT managers, and decision makers who are concerned with meeting demand for database services in a world where both the number of requests and the associated data volumes are steadily climbing. Download the report for its key findings.
With Big Data, the world has gotten far more complex for IT managers and those in charge of keeping the business moving forward. How can any organization pull all of the right information, consistently, from multiple sources and ensure a complete view for analysis? The adoption of new data stores such as MongoDB, Cassandra, and Hadoop has given rise to new approaches for the storage and analysis of both structured and multi-structured data. The key element to fully taking advantage of Big Data is being able to connect the various types of BI applications with this variety of data sources in a way that is consistent, scalable, efficient, and fast.
This DBTA Thought Leadership Series discusses new approaches to planning and laying out the tracks and infrastructure for real-time analytics. Moving to real-time analytics requires new thinking and strategies to upgrade database performance: new tools, new methodologies, new architectures, and a new philosophy toward managing data performance.
Oracle Data Integrator (ODI) is a best-of-breed data integration platform focused on fast bulk data movement and handling complex data transformations. The 12c version of Oracle Data Integrator continues to push this state-of-the-art data integration technology further ahead of the rest of the industry, and Oracle continues to invest in this strategic data integration platform.
This white paper describes in detail some of the new features and capabilities offered in the Oracle Data Integrator 12c platform.
Oracle GoldenGate 12c Release 1 improves businesses’ ability to manage transactional processing in complex and critical environments. Oracle GoldenGate 12c advances Oracle’s leadership in real-time data replication technology with simplified configuration, extreme performance, and improved high availability solutions. The new release includes optimizations for Oracle Database 12c, intelligent and integrated delivery capabilities, integration with Oracle Data Guard Fast-Start Failover (FSFO), and tighter security.
Managers of database administrators have a recurring problem: they need to hire experts to keep their systems running, only to see their high-priced talent maddeningly chained to pesky requests and problems that could be handled by less-expensive employees. Outsourcing allows organizations to have people with the exact skills required at the moment they are needed. In this white paper, we explore the top 10 issues facing managers of DBAs and how outsourcing solves some of these pressing challenges by providing reliable and flexible staffing.
A problem of epidemic proportions exists with how Microsoft SQL Server is deployed and maintained across all industries and organizations of every size. If not addressed, deployment and maintenance issues can cause serious disruptions to database environments and the ongoing business of Microsoft SQL Server users.
Today’s 24/7 enterprises require a well-designed, next-generation data integration architecture. Why is data integration so difficult? For many organizations, data integration has been handled as a dark art over the years, implemented behind the scenes with ad hoc scripts; extract, transform, and load (ETL) operations; connectors; manual coding; and patching. Often, front-end applications to get at needed data are built and deployed one at a time, requiring considerable IT staff time and creating a waiting period for business decision makers. This one-off, manual approach to data integration will not work in today’s competitive global economy. Decision makers need information, at a moment’s notice, that is timely and consistent, yet they are challenged by their organizations’ outdated data integration systems and methods: information may be delayed for weeks, if not months, by the time it takes to develop hand-coded scripts to deliver requested reports.
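To make the “dark art” concrete, below is a minimal sketch of the kind of hand-coded, one-off ETL script the paper describes; the file, table, and column names are hypothetical, and every such script tends to carry its own undocumented transformation logic.

```python
# Hypothetical one-off ETL script of the kind described above; the file,
# table, and column names are illustrative assumptions.
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from a departmental CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalization logic that lives only in this script."""
    return [
        (r["customer_id"], r["region"].strip().upper(), float(r["amount"]))
        for r in rows
        if r.get("amount")  # silently drops bad rows -- a typical ad hoc patch
    ]

def load(records, conn):
    """Load: push the cleaned rows into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales_report "
        "(customer_id TEXT, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO sales_report VALUES (?, ?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    load(transform(extract("sales_export.csv")), sqlite3.connect("reporting.db"))
```

Each new report request spawns another script like this one, which is precisely the waiting period for decision makers that the paper describes.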
Hadoop is marching steadily into the enterprise, but key challenges remain, from manual coding demands to a lack of real-time capabilities and the time it takes to bring a Hadoop project into production. At the same time, brand-new startups and veteran software companies alike are delivering new offerings to the marketplace to make it easier to deploy, manage, and analyze Big Data on Hadoop. From data integration and business intelligence tools to integrated analytical platforms and a new wave of SQL-on-Hadoop solutions, the common goal is to help companies unleash the power of Hadoop for Big Data analytics. Download this special report to learn about the key solutions. Sponsored by MarkLogic, RainStor, Tableau, Qubole, Karmasphere, Appfluent, and Hadapt.
The frequency and sophistication of attacks show no signs of declining, and today’s economic climate does not allow for more resources to be applied to the challenging task of testing and applying physical patches promptly upon their release. Combined with the compliance frameworks of SOX, HIPAA, PCI DSS, and the like, the need to demonstrate adherence to a strict patching policy will only become more demanding, and the use of McAfee Virtual Patching for Databases makes perfect sense for closing the window of risk while saving dollars and minimizing business disruption.
Many organizations have found themselves drawn to virtualization and cloud computing architectures for their many benefits, only to find that the complexity of ensuring adequate data security was simply too great an obstacle. But the adoption of these technologies is inevitable. By deploying memory-based solutions for distributed database monitoring, enterprises will find that it is not only possible to protect sensitive information in these emerging computing models, but that the same architecture also provides both more effective and more efficient data security across their dedicated database servers as well.
A serious data breach brings monetary damage in its many forms: business disruption, bad publicity, stiff fines for noncompliance, and undermined customer confidence. But most damaging of all is the trouble it creates when it comes to signing up new customers. A tarnished reputation is a big objection for sales and business development to overcome. That’s why data security in general, and database security in particular, are a crucial part of any company’s overall corporate health.
Every database, from the largest corporate data store to a small-scale MySQL deployment, contains valuable information. Hackers remain as determined as ever in their efforts to cash in on this treasure. The most effective way to protect any database is with effective multilayered database protection. Compliance officers and regulators know this. So does McAfee. That’s why McAfee has recently bridged the MySQL security gap with a unique solution that combines an open source auditing plug-in with industry-leading database security modules—the McAfee MySQL Audit Plug-In.
Databases are the number one target of cybercriminals and disgruntled insiders. Traditional perimeter and network security, as well as built-in database security measures, offer only limited protection when it comes to securing the organization’s most sensitive data. That’s why compliance officers as well as auditors are taking a much closer look at database security and compliance, and why four main database security vendors have entered the market.
Many organizations still rely on security solutions with inherent limitations. Given the complexities of today’s database platforms and the sophistication of today’s cybercriminals, deploying a comprehensive and dedicated database security solution is a must. Here are five reasons why.
UNSTRUCTURED DATA: Managing, Integrating, and Extracting Value. While unstructured data may represent one of the greatest opportunities of the big data revolution, it is one of its most perplexing challenges. In many ways, the very core of big data is that it is unstructured, and many enterprises are not yet equipped to handle this kind of information in an organized, systematic way. Effectively capturing and capitalizing on unstructured data isn’t just a technical challenge; it represents an organizational challenge. A flexible and agile enterprise environment, supported and embraced by all business units, will elevate unstructured data processing and analysis to a position in which it can help drive the business. This Thought Leadership Series is sponsored by Objectivity and Database Plugins.
THE WORLD OF BUSINESS INTELLIGENCE IS EVOLVING. Not only do organizations need to make decisions faster, but the data sources available for reporting and analysis are growing tremendously, in both size and variety. This special report from Database Trends and Applications examines the key trends reshaping the business intelligence landscape and the key technologies you need to know about. This Best Practices feature is sponsored by Oracle, Attunity, Tableau, Objectivity and Pentaho.
It’s IT’s dirty little secret: every time you touch the database schema, there’s a 76% chance it will break your application. Download the white paper, Bringing sexy back…to the database, to learn why it’s not a tools issue but a process issue. Understand how a model-based approach and a rich interface for authoring schema changes across multiple platforms can eliminate manual scripting, provide fast, accurate application deployment, and break through change request backlogs. After all, you don’t need more SQL. You need less.
This white paper explores the advantages of EMC for Oracle DBAs and discusses how EMC is delivering real advantages in performance, management, and cost.
Read about the challenges affecting multi-database environments, how server virtualization is being deployed, and how EMC private cloud solutions for Oracle can meet those challenges.
This IDC Vendor Spotlight examines the issues of choosing the right infrastructure; fast, successful adoption of databases, application versions, and vendors; and, finally, integrated support between database and application providers without adding risk to the business.
See how Callaway Golf implemented a “split-mirror” backup solution to ensure high performance, high reliability, and recoverability, as well as to offload backup processing from its primary SAP servers.
THE IDEA OF THE REAL-TIME ENTERPRISE is straightforward: increase your organizational responsiveness through automated processes, and raise organizational effectiveness and competitiveness. If your organization can fulfill orders, manage inventory, resolve customer issues, and implement strategies to address changing circumstances faster and more efficiently, your organization is going to be more successful. However, for most enterprises, this is still an unrealized objective. Increasing data volumes, data varieties, and business demands are now stretching the limitations of traditional data management technologies and intensifying the challenge of integrating and analyzing data in real time. Consequently, many organizations are looking beyond their current IT infrastructures. Download this report to learn about the leading technologies enabling organizations to deliver data across the enterprise in real time. Sponsored by Oracle, SAP, Objectivity, JackBe and BackOffice Associates.
The use case introduced here identifies key actions, requirements, and practices that can help formulate a plan for successfully moving data to the Cloud.
This white paper summarizes the results of the Edison Group’s evaluation of web portals for viewing and interacting with data models. The web portals evaluated were created for the following data modeling tools: CA ERwin Data Modeler (ERwin), Embarcadero ER/Studio (ER/Studio), and Sybase PowerDesigner (PowerDesigner).
CA ERwin Data Modeler enables organizations to collect and serve data models from and to any web data source and data management system in the cloud.
This white paper by industry expert Alec Sharp illustrates these points and provides specific guidelines and techniques for a business-oriented approach to data modeling, with examples demonstrating how business professionals can apply them.
Relational database management systems (RDBMSs) are software systems that manage databases as structured sets of tables containing rows and columns, with references to one another through key values. They include the ability to optimize storage, process transactions, perform queries, and preserve the integrity of data structures. When used with applications, they provide the beating heart of the collection of business functions supported by those applications. They vary considerably in terms of the factors that impact the total cost of running a database application, yet users seldom perform a disciplined procedure to calculate such costs. Most users choose instead to remain with a single vendor’s RDBMS and never revisit the question of ongoing hardware, software, and staffing fees.
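As a minimal, self-contained illustration of those core RDBMS concepts (tables of rows and columns linked by key values, transactions, queries, and integrity enforcement), the sketch below uses SQLite through Python; the schema is a hypothetical example standing in for any vendor’s RDBMS.

```python
# Minimal illustration of core RDBMS concepts using SQLite; the schema
# is a hypothetical example, not tied to any particular vendor.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # have the engine enforce key integrity

# Two tables related through a key value: orders.customer_id references customers.id.
conn.executescript("""
    CREATE TABLE customers (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total       REAL NOT NULL
    );
""")

# Transaction processing: both inserts commit together or not at all.
with conn:
    conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Acme Corp')")
    conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 250.0)")

# A query that follows the key relationship across tables.
for name, total in conn.execute(
    "SELECT c.name, o.total FROM customers c JOIN orders o ON o.customer_id = c.id"
):
    print(name, total)

# Integrity preservation: a row referencing a nonexistent customer is rejected.
try:
    conn.execute("INSERT INTO orders (customer_id, total) VALUES (99, 10.0)")
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```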
The sheer volume and complexity of information generated by today’s enterprises have created the pressing need for a next generation of business applications and database solutions. These solutions must be capable of handling a massive number of transactions and meeting the demands of an ever-growing population of concurrent users. SAP Sybase Adaptive Server Enterprise (SAP Sybase ASE) is a high-performance transactional database platform that is optimized for SAP Business Suite software and designed to deliver the exceptional performance and reliability required in the big data era. IBM Power Systems is a highly scalable UNIX platform tuned for SAP application and data management software. In collaboration, the SAP-IBM alliance creates the optimal landscape for heightened performance to keep up with real-time business, and increased operational efficiency in the face of growing complexity: the best of both worlds.
While the database landscape is growing in information quantity and variety of data types, it is evident that the RDBMS will retain its position as the system of record for the enterprise. The old criteria for selecting an RDBMS platform have been superseded, and today’s RDBMS evaluation principles seek to enhance reliability, adaptability, scalability, predictability, and manageability. With its remarkable heritage and bright future as part of SAP’s flagship information management technology, SAP Sybase ASE merits inclusion on any enterprise’s RDBMS vendor selection checklist.
Today’s database technology landscape is more dynamic than ever before. With growing data volumes, data types, and data uses, many organizations are using an array of specialized information management technologies to manage their data. Despite fresh breakthroughs and noteworthy solutions, identifying and selecting the right relational database management system (RDBMS) platform remains a vital obligation, and transactional databases continue to sit at the heart of the enterprise’s information processing responsibilities. In optimizing data management technologies in the organization, one must learn how the latest trends in information management are affecting IT organizations, why a ‘one size fits all’ database no longer makes sense, and how integrated solutions such as the SAP Real Time Data Platform can provide a logical, unified approach to data management.
Find out how SAP Sybase Adaptive Server Enterprise helps companies address key challenges in the areas of performance, reliability, and efficiency when it comes to exponential data growth. SAP Sybase ASE offers high data-processing responsiveness and throughput, and predictable, consistent performance.
Cloud databases are on the rise as more and more businesses look to capitalize on the advantages of cloud computing to power their business applications. In fact, a recent Unisphere Research study found that nearly one-third of organizations are currently using or plan to use a cloud database system within the next 12 months. Download this complimentary report, sponsored by NuoDB, GenieDB, 10gen, Cloudant, Progress DataDirect, Clustrix, Objectivity and TransLattice, to gain a deeper understanding of the different types of cloud databases, their unique benefits and how they are revolutionizing the IT landscape.
The good, old-fashioned relational database has faced disruption since the turn of the millennium, and that disruption is now peaking. Amid the explosion of NoSQL databases, read how MarkLogic stands out as a clear leader in the “Enterprise NoSQL” category.
Data virtualization solves the problem of consolidating critical data scattered across silos, providing a comprehensive, actionable view of data assets. Learn how MarkLogic presents a unified view of multi-structured data across organizational silos.
Oracle GoldenGate 11g Release 2 is the most feature-rich, robust, and flexible data replication product on the market today. Written for business project owners, key stakeholders, and the entire IT organization, this white paper provides an overview of the new features in Oracle GoldenGate 11g Release 2.
This white paper provides a broad overview of the state of the data integration market. It examines the trends that are driving data integration technology forward and motivating businesses to undertake data-intensive projects such as business intelligence, data warehousing, data quality, consolidation, cloud computing and IT modernization initiatives.
This white paper describes Oracle’s mature, well-crafted strategy for meeting the new data integration requirements. Oracle Data Integration delivers pervasive and continuous access to timely and trusted data across heterogeneous systems. It includes a broad family of products designed to deliver maximum performance with low cost of ownership, ease of use, and reliability. Its comprehensive capabilities support the fundamental requirements of the enterprise including real-time and bulk data movement, data transformation, bi-directional replication, data services, data federation, and data quality for customer and product domains.
BIG DATA, a well-used term defining the growing volume, variety, velocity, and value of information surging through organizations, has become more than a buzz phrase thrown about at conferences and in the trade press. Big Data is now seen as the core of enterprise growth strategies. Business leaders recognize the rewards of effectively capturing and building insights from Big Data, and see the greatest opportunities for Big Data in competing more effectively and growing business revenue streams. As the amount and variety of data grows, so do the skills required to capture, manage, and analyze this data. This specialized issue of Best Practices from Oracle, Attunity, Couchbase, HiT Software Inc, Progress DataDirect, LexisNexis, Confio, and Objectivity focuses on a more formidable challenge: making Big Data valuable to the business. Complimentary from DBTA.
Database usage is increasing sharply at a time when license and maintenance fees are also on the rise. Motivated by successful migrations to Linux and JBoss, IT organizations are turning to open source-based RDBMS alternatives to control costs and improve operating leverage. Migrating from Oracle to open source-based alternatives like Postgres Plus Advanced Server requires a clear understanding of human, technical, and operational risks, and a game plan by which to manage those risks over an extended time horizon. That game plan includes a comprehensive technical assessment of your application databases, Oracle compatibility software that eliminates migration risks and costs, and a partner with deep domain expertise in Oracle database migrations. This paper provides best practices for migrating Oracle applications to lower-cost PostgreSQL alternatives, quantifying and mitigating the associated risks, and reducing the operational pain associated with migration.
It’s no surprise that organizations still struggle with some level of bad data. Many don’t know where to start, who’s responsible for it, or how to sustain a data quality initiative over the long haul. Written by leading industry analyst Dr. Elliot King, this ebook answers these questions and offers insight into the complex role of a data steward.
The requirements of interactive online applications are dramatically increasing, with users demanding ever-higher levels of availability (uptime). At the same time, datacenter and cloud infrastructures are becoming increasingly complex and failure-prone. This paper presents a novel approach to multi-regional replication of MySQL that, in combination with several other minor modifications to the application stack, can make 100% availability a reality, and offers a vision for the next-generation applications enabled by this innovative solution.
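The paper’s specific application-stack modifications aren’t reproduced here, but one common building block in this kind of availability story is client-side failover across regional database endpoints. The sketch below is a hypothetical illustration of that idea; the endpoint names and the injected connect_fn are assumptions, not the design the paper presents.

```python
# Hypothetical sketch of client-side failover across regional MySQL endpoints.
# Endpoint names and connect_fn are illustrative assumptions.
from typing import Callable, Sequence

REGIONAL_ENDPOINTS = [
    "mysql-us-east.example.com",
    "mysql-eu-west.example.com",
    "mysql-ap-south.example.com",
]

def run_with_failover(endpoints: Sequence[str],
                      connect_fn: Callable[[str], object],
                      query: str):
    """Try each regional replica in turn and return the first successful result.

    If multi-regional replication keeps every endpoint current, the application
    survives a full regional outage by simply moving down the list.
    """
    last_error = None
    for host in endpoints:
        try:
            conn = connect_fn(host)  # e.g., a DB-API driver's connect() call
            try:
                cur = conn.cursor()
                cur.execute(query)
                return cur.fetchall()
            finally:
                conn.close()
        except Exception as exc:  # connection refused, timeout, etc.
            last_error = exc      # note the failure and try the next region
    raise RuntimeError(f"all regions unavailable: {last_error}")
```

Client-side retry is only half the story; it is safe only when the replication layer keeps the regions consistent, which is the problem the paper’s approach addresses.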
The appeal of in-memory technology is growing as organizations face the challenge of Big Data, in which decision-makers seek to harvest insights from terabytes and petabytes worth of structured, semi-structured and unstructured data that is flowing into their enterprises. This special thought leadership series provides context and insight on the use of in-memory technology, and detailed explanations of new solutions from SAP, Tableau Software, Tibco Spotfire, JackBe, Terracotta and MemSQL. Complimentary from DBTA.
Are your organization’s systems and data environments ready for the Big Data surge? If not, you are not alone. A recent study conducted among Independent Oracle User Group members by DBTA’s Unisphere Research finds that fewer than one in five data managers are confident their IT infrastructure will be capable of handling the surge of Big Data. This special Best Practices section from DBTA provides context and insight on the need to address this issue now, and detailed explanations of new technologies for dealing with Big Data from Aster/Teradata, MarkLogic, Akiban, Progress/Data Direct, Infinitegraph, HP-Vertica, and Denodo. Complimentary from DBTA.
Big data offers both vast opportunities and vexing challenges for every organization it touches. To be of value, information needs to be readily available, as close to real time as possible, to users in any location. Though the onset of big data has made that task more daunting, the enormous competitive advantages that come with effectively managing big data can far outweigh the performance issues. The problem is that many existing databases and data environments are not prepared, as performance may degrade when massive amounts of data in various formats are brought into current environments. Along with the technical issues associated with managing data growth, many organizations are simply not ready from a management point of view.
To compete in today’s economy, organizations need the right information, at the right time, at the push of a button. But the challenge of providing end users access to actionable information when they need it has never been greater. Enterprise data environments are not only growing in size, but in complexity, with a dizzying array of different data sources, types, and formats. The September 2012 Best Practices in Data Integration, Master Data Management, and Data Virtualization report examines the data integration challenges and opportunities that Big Data is currently presenting to data-driven organizations.
DBTA plunges into the topic of unstructured data in this thought-provoking and solutions-providing PDF section. Analyst Joseph McKendrick pens some of the timeliest and most distinctive perspectives to date on leveraging and managing unstructured data, drawing on industry leaders and groundbreaking Unisphere Research user studies. Then eight market-leading organizations map their solutions for accessing, managing, and leveraging unstructured data, targeting your IT and business needs with ten pages of clear, concise, and cutting-edge content. Get up to speed rapidly with this brand-new section from DBTA. Complimentary.
With the rise of big data, the database and data management tools market is in a state of flux, the likes of which have not been seen in this sector before. Companies are now awash in big data, and end users are demanding greater capability and integration to mine and analyze new sources of information. As a result, organizations are supplementing their relational database environments with new platforms and approaches that address the variety and volume of information being handled. In this special section in Database Trends and Applications, analyst Joseph McKendrick brings you up to date on the current thinking and strategies users and vendors are pursuing to extract value from large, often unwieldy data stores. This is followed by nine separate sponsored content pieces focusing on in-memory technology, real-time data integration, data virtualization, BI, columnar databases, NoSQL, and Hadoop.
The rise of Big Data is challenging many long-held assumptions about the way data is organized, managed, ingested, and digested. However, for many organizations, Big Data is still a new frontier that they have only begun to explore. "Many organizations leave their data to pile up; they are aware of it as a resource but haven't analyzed it. They don't know what's useful and what's worthless." This fourteen-page section from the March edition of Database Trends and Applications is an invaluable resource that provides multiple perspectives on the chief challenges our readers face and the solutions that will enable organizations to begin tapping into the power of their Big Data assets.
Key extracts from the December print edition of Database Trends and Applications focus on "Data Security and Compliance".
Onsystex is changing the way the world views MultiValue applications by increasing the value of those critical business applications that have provided organizations with a competitive edge for many years. As an alternative to re-engineering or replacing proven, functional, and stable applications, Onsystex provides the tools and services to allow these applications to be executed on recognized, standard relational databases such as Oracle, SQL Server, MySQL, and PostgreSQL, integrated with standard business intelligence (BI) tools, and interfaced with the latest Internet technologies using languages such as .NET and Java.