Setting up a replication configuration is a fairly standard way to enable disaster recovery (DR) for business-critical databases. In such a configuration, changes from a production or primary system are propagated to a standby or secondary system. One of the important technology decisions that organizations make upfront is the choice of the replication architecture.
Posted May 15, 2009
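The change-propagation pattern described in the replication piece above can be sketched in a few lines. This is a toy Python model under stated assumptions (all class and field names are hypothetical, not any vendor's API): the primary appends every write to a change log, and the standby replays entries it has not yet applied.

```python
class Primary:
    """Toy primary: every write is applied locally and appended to a change log."""
    def __init__(self):
        self.data, self.log = {}, []

    def write(self, key, value):
        self.data[key] = value
        self.log.append((key, value))

class Standby:
    """Toy standby: replays change-log entries it has not yet seen."""
    def __init__(self):
        self.data, self.applied = {}, 0

    def catch_up(self, log):
        # Apply only the entries beyond the last position we replayed.
        for key, value in log[self.applied:]:
            self.data[key] = value
        self.applied = len(log)

primary, standby = Primary(), Standby()
primary.write("balance:42", 100)
primary.write("balance:42", 90)
standby.catch_up(primary.log)   # asynchronous apply: standby now mirrors primary
```

Whether the standby catches up continuously (synchronous) or on a schedule (asynchronous) is exactly the kind of architectural choice the article refers to.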
IT GRC—or, IT governance, risk and compliance—is rapidly gaining the attention of CIOs and CISOs in businesses across the country. After all, the objective of IT GRC is to strike an appropriate balance between business reward and business risk more efficiently, a balance these executives must achieve. How does IT GRC help? By replacing traditional, siloed approaches to addressing individual components with a more unified approach that takes advantage of the many commonalities and interrelationships that exist among governance, compliance and risk management.
Posted May 15, 2009
IT managers from organizations of all sizes know the importance of maintaining access to critical applications and data. From irritating "system unavailable" messages to the most unfortunate natural and manmade disasters where entire systems may be lost, the challenge is particularly acute for database-driven, transactional applications and data—the lifeblood of the business. The dynamic, transactional data and applications that comprise, process, manage and leverage critical customer accounts and history, sales, marketing, engineering and operational components keep the organization thriving.
Posted April 15, 2009
Ed Boyajian joined EnterpriseDB, the open source database company whose products and services are based on PostgreSQL, in June 2008 as president and CEO. Before that, he spent six years in sales leadership roles at Red Hat, including vice president and general manager for North American sales, and vice president, worldwide OEM and North American channels. Recently Boyajian chatted with DBTA about the looming challenges and opportunities for open source in general as well as for EnterpriseDB's Postgres Plus product family.
Posted April 15, 2009
Those of us in the data security industry, practitioners and vendors alike, have been conditioned to think of data protection in terms that are analogous to physical security. Blocking devices and sensors are akin to locks and security systems. This is why for years we have been investing in those technologies that will block out unauthorized connections all the while making information more and more accessible. There is, however, a new world order at hand. Data creation rates now far outpace the ability of IT managers to write security rules, and data breaches and threats originating from network insiders have proven far more frequent and insidious than even our most dire predictions of five years ago.
Posted April 15, 2009
Every data integration initiative—whether it supports better decision making, a merger/acquisition, regulatory compliance, or other business need—requires a set of processes to be completed before the data can be made available to business users. Though this set of processes is fairly well understood by industry practitioners, there are still many areas left unaddressed and, therefore, the process is time-consuming, inefficient, unpredictable, and costly.
Posted April 15, 2009
Business intelligence (BI) and analytics solutions have been available for years now, and companies have learned to employ these tools for a variety of purposes, from simple report generation and delivery to more sophisticated data integration, executive dashboards, and data mining. They also recognize the need to get beyond spreadsheets, and to be able to provide more sophisticated, pervasive, and automated BI solutions to more end-user decision makers. However, most see their efforts stymied by the historically high cost of BI software and the complexity of available solutions.
Posted March 15, 2009
The time is past when the unique attributes of the MultiValue database model alone provided sufficient justification for the use of the technology, according to Pete Loveless. He explains why MV companies must support interoperability and integration from the ground up, in order to meet the challenges presented by the market now, and in the future.
Posted March 15, 2009
Over the past year, we have seen a number of new entrants in the data warehouse appliance market. What user requirements are driving the launch of these new appliance solutions and are appliances a niche solution, or is this the beginning of a broader-based trend?
Posted March 15, 2009
Many IT and business managers are now familiar with the concept of virtualization, especially as it pertains to the ability to run a secondary operating system within the same hardware that already supports a separate OS brand. Seasoned data center professionals have been aware of virtualization as a capability available on mainframes for years. The ability of virtualization to provide advantages to data center operations in terms of systems consolidation and simplifying administration has been well-documented.
Posted March 15, 2009
Decision-making is no longer restricted to the confines of the office. The need for critical financial metrics for an off-site board meeting, the latest market share reports for a client visit, or timely sales data for a supplier meeting are all examples that highlight the need for anytime, anywhere access to insightful information. If mobile technology is allowing users to check email, download ringtones, play games, manage schedules, and plan tasks, then why should work-related information be left behind? It is not. Mobile business intelligence (MBI), a convergence of business intelligence software, mobile technology, and Internet connectivity, is ensuring that information travels with the mobile workforce.
Posted February 15, 2009
Alvion Technologies provides a web-enabled platform that allows compilers, resellers and managers of marketing lists to easily deliver their product to end-users in support of targeted marketing efforts. Individual customers submit their data and then Alvion runs customer-specific data transformation and uploads the data to production servers, for access by end-users who are the customers of the data owners. If you need, for example, to find consumers within a 35-mile radius of your business who meet a certain profile, you can go online and find lists within Alvion, put in the criteria you are looking for, and those names will be provided to you, via electronic delivery, be it email or download.
Posted February 15, 2009
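The radius search mentioned in the Alvion piece above reduces to a great-circle distance filter over geocoded records. The Python sketch below is illustrative only (the record layout and coordinates are hypothetical, not Alvion's actual platform) and uses the haversine formula:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two latitude/longitude points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def within_radius(consumers, center_lat, center_lon, radius_miles=35.0):
    """Keep only the records inside the requested radius of the business."""
    return [c for c in consumers
            if haversine_miles(center_lat, center_lon, c["lat"], c["lon"]) <= radius_miles]

# Hypothetical consumer records around a business in downtown Sacramento.
consumers = [
    {"name": "A", "lat": 38.58, "lon": -121.49},   # at the business itself
    {"name": "B", "lat": 38.90, "lon": -121.07},   # roughly 30 miles northeast
    {"name": "C", "lat": 37.77, "lon": -122.42},   # San Francisco, well outside 35 miles
]
nearby = within_radius(consumers, 38.58, -121.49)
```

A production system would push this predicate into the database (for example, with a spatial index) rather than filtering in application code, but the selection logic is the same.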
Virtualization transforms individual physical machines into multiple virtual machines (VMs), which can be cloned instantly at no perceived cost and moved seamlessly from one physical machine to another. While the power of virtualization is enticing, its management implications are daunting. A completely new management protocol is needed to match the dynamic nature of virtual environments and keep pace with their evolution as they move beyond the enterprise and into the cloud.
Posted February 15, 2009
A leading supplier of data integration software for businesses, finding that its developers were spending too much time grappling with data management inside each of its products, adapted its architecture to a service-oriented architecture (SOA) and built its own data services platform (DSP). However, problems arose that required a complete rebuilding of the architecture. The underlying cause of those problems? Poorly architected data access.
Posted February 15, 2009
Complex Event Processing is only a few years old, but it is rapidly entering the mainstream in a large number of fields that require continuous analysis of large volumes of real-time data.
Posted January 15, 2009
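The continuous analysis that complex event processing performs can be illustrated with a minimal sliding-window detector. This is an illustrative sketch under stated assumptions (the class and event shapes are hypothetical, not any particular CEP engine's API): simple input events are aggregated over a rolling window, and a derived "complex" event is emitted when a condition holds.

```python
from collections import deque

class SlidingWindowDetector:
    """Minimal CEP sketch: emit a derived ALERT event whenever the rolling
    average over the last `size` readings exceeds `threshold`."""
    def __init__(self, size, threshold):
        self.window = deque(maxlen=size)
        self.threshold = threshold

    def on_event(self, value):
        """Feed one reading from the stream; return a derived event or None."""
        self.window.append(value)
        if len(self.window) == self.window.maxlen:
            avg = sum(self.window) / len(self.window)
            if avg > self.threshold:
                return {"type": "ALERT", "avg": avg}
        return None

detector = SlidingWindowDetector(size=3, threshold=100.0)
readings = [90, 95, 110, 120, 130]
alerts = [e for v in readings if (e := detector.on_event(v)) is not None]
```

Real CEP engines generalize this idea to declarative pattern languages, time-based windows and joins across multiple streams, but the core loop, state plus incremental evaluation per event, is the same.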
Database administrators are critically important contributors in modern enterprises, ensuring that key infrastructure is performing optimally in support of the organization's goals. Like employees in every department, the best DBAs are constantly seeking to increase the value of their contributions and, correspondingly, to increase their compensation and to advance in their organizations. Increasing knowledge and skills and taking on more important responsibilities are time-honored methods for career advancement. Here are 10 concrete suggestions for DBAs looking to get ahead.
Posted January 15, 2009
Combining Database Clustering and Virtualization to Consolidate Mission-Critical Servers
Posted January 15, 2009
The old maxim, "may you live in interesting times," certainly holds true for IT managers and professionals these days. The year 2008 was full of changes and challenges, and 2009 promises even more.
Posted December 15, 2008
Data is the byproduct of the information age and is being generated, processed and stored at an exponential rate. Storage area networks (SANs) have become the infrastructure of choice for networking, transporting and storing data traffic. As this trend continues, many IT managers are faced with network congestion and I/O bottlenecks. To alleviate congestion and increase network bandwidth, enterprises are looking to 8Gb/s Fibre Channel technology.
Posted October 15, 2008
Data transmission is growing rapidly, and the digitization of everything from financial transactions to video is enabling organizations to quickly share information with global partners both inside and outside their trusted network. However, many organizations do not recognize the operational, financial and security risks associated with this growing proliferation of user-managed file transfer systems that are perceived to be secure.
Posted October 15, 2008
The industry is buzzing with talk of endpoint virtualization. This innovation is often seen as a means to reduce enterprise endpoint costs and increase the agility of new endpoint deployments. However, as many organizations discovered as they implemented server virtualization, unless such technologies are integrated within a single infrastructure framework that spans both the physical and virtual, they can add rather than reduce complexity and cost.
Posted October 15, 2008
We all know software piracy causes huge financial losses. It has been estimated that the world's software companies are now losing $40 billion in revenue to unlicensed installations. Yet, with all the security technology at our disposal, why isn't piracy going away? While some areas have been able to squelch a certain percentage of software theft, the problem is here to stay. The huge influx of new PC users, the ubiquitous nature of piracy tools over peer-to-peer networks, and the near-impossibility of enforcement across the globe stand in the way of significant progress. Moreover, the outsourcing of development work opens up new worries for those dealing with countries with weak intellectual property (IP) enforcement laws.
Posted September 15, 2008
Implementing comprehensive database security solutions can be an onerous task. Security requirements are always changing and new compliance requirements are constantly emerging. Despite this dynamic environment, there are simple steps that can be undertaken to dramatically and quickly reduce risk. Database security solutions are only as secure as the weakest link. Forward-thinking organizations should begin by addressing the vulnerabilities that are the most obvious and easiest to exploit.
Posted September 15, 2008
Claims that the mainframe is a near-death technology in the mission-critical world of today's robust business intelligence (BI) applications are exaggerated. Conventional wisdom says the mainframe, the "powerhouse" of corporate computing, is simply too costly, too complex and incapable of supporting a comprehensive BI system. Not so.
Posted September 15, 2008
The Capitol Corridor Joint Powers Authority (CCJPA) manages an Amtrak intercity passenger train service in eight Northern California counties, partnering with Amtrak, the Union Pacific Railroad, and Caltrans, the California Department of Transportation. Serving 16 stations along a 170-mile rail corridor, CCJPA offers a convenient way to travel between the Sierra Foothills, Sacramento, the San Francisco Bay Area, San Jose and the rest of the Silicon Valley.
Posted September 15, 2008
Early discussions on SQL Server 2008 seemed to suggest that it would really only be a point release, quite unlike what occurred with SQL Server 2005. Anyone looking at the new and upgraded features in SQL Server 2008 would soon realize that it offers much more than that. Given that SQL Server 2005 took some time to achieve mass adoption, the question that arises is how fast users will migrate to SQL Server 2008.
Posted September 15, 2008
Now more than ever, data has evolved into an asset more strategic and valuable than any raw material or capital construction project. Companies are scrambling to "compete on analytics," recognizing that the organizations that most effectively leverage the information coming out of their systems gain the greatest competitive advantage.
Posted September 15, 2008
When you pick up the morning paper or turn on the news, you don't expect to be reading or listening to a story about your credit or debit card information being at risk. However, recent events, such as the announced security breaches at the Hannaford supermarket chain and the Okemo Mountain Resort in Vermont, indicate that this will become an all-too-common occurrence.
Posted August 15, 2008
Any system needs to be tested. And it's a simple fact that testing is better done by people independent of the system being tested. A different perspective can often highlight new areas of weakness, and there is no conflict of interest in managing a "pass."
Posted August 15, 2008
Psychologist Philip Zimbardo once said, "Situational variables can exert powerful influences over human behavior, more so than we recognize or acknowledge." That certainly appears to be true when we look at how we work with people who provide services to us in our personal lives versus those who do it in the business world. In our personal lives, we tend to hire specialists. Yet, in the business world we always seem to want to take the "holistic" route, i.e., find that one supplier who can do everything for us.
Posted August 15, 2008
Embarcadero Technologies, a provider of multi-platform tools that companies use to design, develop and manage databases and the data they contain, was acquired by Thoma Cressey Bravo a little more than a year ago in a $200 million go-private transaction. One year later, Embarcadero completed the purchase of CodeGear from Borland Software Corp. for approximately $24.5 million. DBTA talked with Wayne Williams, CEO of Embarcadero, about how the companies fit together to provide system-wide capabilities and also how in a larger sense the worlds of application and database development are converging.
Posted August 15, 2008
Data integration (DI) technology, specifically extract, transform, and load (ETL) middleware, combined with an intermediate data store such as a warehouse or mart, has played a key role in advancing business intelligence (BI) and performance management since the mid-1990s. Virtualized DI evolved from these technologies in the mid-2000s. Alternatively known as virtual data federation or enterprise information integration (EII), virtual DI eliminates the intermediate data store by leveraging high-performance query techniques that let the consuming application pull data directly from the source, in real time.
Posted August 15, 2008
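The warehouse-free, query-at-the-source pattern described in the virtual DI piece above can be sketched as follows. The two in-memory "sources" stand in for live remote systems, and every name here is hypothetical: the point is that the join and aggregation happen at request time, with nothing staged in an intermediate store.

```python
# Two independent "sources" stand in for live systems; in a real federated
# setup these would be remote queries against a CRM and an order system.
def query_crm():
    return [{"cust_id": 1, "name": "Acme"},
            {"cust_id": 2, "name": "Globex"}]

def query_orders():
    return [{"cust_id": 1, "total": 250.0},
            {"cust_id": 1, "total": 100.0},
            {"cust_id": 2, "total": 75.0}]

def federated_revenue_by_customer():
    """Join both sources at request time; no warehouse or mart is populated."""
    orders_by_cust = {}
    for o in query_orders():
        orders_by_cust[o["cust_id"]] = orders_by_cust.get(o["cust_id"], 0.0) + o["total"]
    # Combine with customer names pulled live from the second source.
    return {c["name"]: orders_by_cust.get(c["cust_id"], 0.0) for c in query_crm()}

report = federated_revenue_by_customer()
```

The trade-off the article alludes to is visible even in this toy: the result is always current, but every report re-queries the underlying systems, which is why federation engines invest heavily in query optimization and caching.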
IT process automation (ITPA) has stepped into the spotlight as a hot commodity among IT professionals worldwide. With IT process automation, IT can deliver services to the business faster and with fewer errors, allowing for increased business agility. The reasons for automating IT processes within the data center and across the IT ecosystem are numerous: increased productivity, reduced human error, elimination of repetitive manual efforts and, most importantly, reduction of IT management costs. The true driving force behind the increasing interest in IT process automation, however, is the business value that it provides.
Posted July 15, 2008
In today's global economy, more and more companies recognize a growing need for a single point of accountability for all security aspects and are creating the position of information security officer (ISO). One of the main tasks of an ISO is to protect the company's main asset: the data. An ISO has to recognize that an intruder has two ways of stealing data: while it is in transmission, or directly from the database. Traditionally, the main emphasis has been placed on network controls to prevent unauthorized access and, to a lesser extent, on protecting data in transmission. However, database security is often overlooked.
Posted July 15, 2008
Microsoft SQL Server version migrations are one of the more difficult activities for a DBA to execute, yet they are a very common change to the infrastructure. Databases eventually have to be migrated to another location, perhaps even a clustered instance. DBAs are tasked with making the database migration succeed despite all of the complexities and failures that are possible. The concerns around database migration can be alleviated through an automated process which results in an infrastructure that will be easier for DBA teams to manage, more compliant, easier to document, faster in turnaround time for changes, and more reliable with less downtime.
Posted July 15, 2008
Non-relational cloud databases such as Google's BigTable, Amazon's SimpleDB and Microsoft's SQL Server Data Services (SSDS) have emerged. But while these new data stores may well fill a niche in cloud-based applications, they lack most of the features demanded by enterprise applications, in particular transactional support and business intelligence capabilities.
Posted July 15, 2008
IDS 11.5, the newest version of Informix Dynamic Server, was officially announced by IBM at the International Informix User Conference in Kansas City in April 2008. The announcement generated tremendous interest among customers, analysts and the press.
Posted June 15, 2008
An entire industry has sprung up in response to the never-ending battle against complexity, server sprawl, and rising power consumption. Virtualization is now the mantra for beleaguered data center managers looking for ways to consolidate, better utilize, or abstract away their farms of physical servers and hardware. However, in many cases, virtualization itself can lead to even more complexity and offer uncertain value to the business. Many businesses are finding that virtualization is not ready for core mission-critical applications.
Posted June 15, 2008
MENTISoftware offers security solutions for companies using Oracle-based systems and is adding support for an expanded range of ERP suites and database platforms. DBTA talked with Rajesh Parthasarathy about why top management cares more than ever about protecting sensitive data.
Posted June 15, 2008
For the first time in over 20 years, there appear to be cracks forming in the relational model's dominance of the database management systems market. The relational database management system (RDBMS) of today is increasingly being seen as an obstacle to the IT architectures of tomorrow, and, for the first time, credible alternatives to the relational database are emerging. While it would be reckless to predict the demise of the relational database as a critical component of IT architectures, it is certainly feasible to imagine the relational database as just one of several choices for data storage in next-generation applications.
Posted June 15, 2008
Historically, database auditing and database performance have been like oil and water; they don't mix. So auditing is often eliminated, because performance drag on critical systems is unacceptable.
Posted May 15, 2008
Data and its analysis have become an important economic battleground for many industries, and nowhere is this more apparent than in the financial industry. Regulation is mandating greater data transparency across firms and trading practices. The increase in automated trading and the continuing search for new trading opportunities have led to exponential increases in the amount of data that must be captured, cleaned, managed and analyzed within a financial institution. To give you some idea of the size of the problem, the Options Price Reporting Authority (OPRA) in the U.S. is anticipating trade volumes at peak levels of around one million messages per second by mid-2008. Real-time data processing and the ability to store data for historical analysis have become particular pressure points for many investment banks, asset managers and hedge funds.
Posted May 15, 2008
Database sizes have grown exponentially, with more than half of all databases in use globally projected to exceed 10TB by 2010, according to The Data Warehousing Institute. But as the size of data warehouses has exploded and business requirements have forced companies to conduct more ad hoc queries on those warehouses, response times have slowed, requiring increasingly expensive database hardware investments.
Posted May 15, 2008
In an effort to consolidate data across six separate business divisions, MassHousing, a leading provider of affordable housing for individuals and major developments across the state of Massachusetts, deployed business intelligence (BI) to consolidate large quantities of disparate data in a fast and efficient way. In tandem with its executive information system (EIS), MassHousing established a business intelligence competency center (BICC), to ensure consistent deployment across the organization and efficiency of all BI systems. MassHousing is one best practice organization that is taking its business intelligence initiative a step further with a BICC to help enable true business optimization.
Posted May 15, 2008
In the 1958 IBM Journal article that is generally acknowledged as the first usage of the term "business intelligence," author Hans Peter Luhn described the challenges and goals of the BI community in terms that are profoundly resonant nearly a half-century later.
Posted April 15, 2008
The complexities of today's IT environments make protecting data an ongoing challenge. Effectively securing data necessitates knowing where that data resides at any given point in time. However, as companies outsource tasks such as order processing, customer service, and fulfillment, this information becomes increasingly difficult to ascertain. Many of the external systems that house such data are not visible and are often not understood by those responsible for certifying compliance with regulations such as the Payment Card Industry Data Security Standard (PCI DSS).
Posted April 15, 2008
Lately, I have been rereading one of my favorite books on change: Our Iceberg Is Melting by John Kotter and Holger Rathgeber. The book shares a fable in which a colony of penguins discovers that their Antarctic iceberg is melting. If the penguins do nothing, the iceberg will shortly melt away and dump the penguins into the cold, dark waters of the Antarctic Ocean, which will eventually lead to their deaths from cold and exhaustion. The manner in which the penguins deal with this change holds some great lessons for all of us.
Posted April 15, 2008
The concept of a configuration management system (CMS) is an idea whose time has come, particularly since the release of IT Infrastructure Library (ITIL) Version 3, known as V3. ITIL V3 devotes considerable attention to the importance of a CMS. If you're not already familiar with a CMS and its functions, you may be wondering what's included in it, and how it differs from a configuration management database, known as a CMDB. So let's take a look at this vital component of a long-term IT management strategy.
Posted April 15, 2008
In recent years, disaster recovery has garnered attention from the company boardroom to the Office of the CIO. Despite this fact, many companies have yet to implement an effective DR solution to safeguard their applications and data. This inertia is attributed to two factors: perception of the term "disaster" ("when the disaster happens, we'll deal with it then") and shortcomings of existing solutions ("we don't have budget for machines to sit idle").

Due to the rarity of catastrophic disasters such as earthquakes, floods and fires, organizations rarely implement comprehensive disaster protection measures. However, there is another set of "technical disasters" caused by much more mundane events that regularly lead to significant system outages. These span faulty system components (server, network, storage and software), data corruption, backup/recovery of bad data, wrong batch jobs, bad installations/upgrades/patches, operator errors and power outages, among others.
Posted March 15, 2008
Despite efforts to "democratize" business intelligence, it has remained stubbornly confined to a chosen few within organizations. Although vendors have worked hard to convince enterprises that their BI solutions could be extended to line-of-business managers and employees, high-end analytic tools have remained confined to power users or analysts with statistical skills, while the remainder of the organization relies on spreadsheets to cobble together limited pieces of information.

This disconnect was confirmed in a 2007 survey conducted by Unisphere Research for the Oracle Applications Users Group, which found that most companies are still a long way off from the ideal of BI for all. The OAUG survey found that for the most part, BI reporting remains tied up in IT departments, and is still limited to analysts or certain decision makers. The majority of survey respondents said that it takes three to five days or more to get a report out of IT. Overall, the survey found, fewer than 10 percent of employees have access to BI and corporate performance management tools.
Posted March 15, 2008