
Trends and Applications



The industry is buzzing with talk of endpoint virtualization. This innovation is often seen as a means to reduce enterprise endpoint costs and increase the agility of new endpoint deployments. However, as many organizations discovered as they implemented server virtualization, unless such technologies are integrated within a single infrastructure framework that spans both the physical and virtual, they can add rather than reduce complexity and cost.

Posted October 15, 2008

We all know software piracy causes huge financial losses. By one estimate, the world's software companies now lose $40 billion in revenue to unlicensed installations. Yet, with all the security technology at our disposal, why isn't piracy going away? While some regions have managed to squelch a certain percentage of software theft, the problem is here to stay. The huge influx of new PC users, the ubiquitous availability of piracy tools over peer-to-peer networks, and the near-impossibility of enforcement across the globe stand in the way of significant progress. Moreover, the outsourcing of development work raises new worries for those dealing with countries with weak intellectual property (IP) enforcement laws.

Posted September 15, 2008

Implementing comprehensive database security solutions can be an onerous task. Security requirements are always changing and new compliance requirements are constantly emerging. Despite this dynamic environment, there are simple steps that can be undertaken to dramatically and quickly reduce risk. Database security solutions are only as secure as the weakest link. Forward-thinking organizations should begin by addressing the vulnerabilities that are the most obvious and easiest to exploit.

Posted September 15, 2008

Claims that the mainframe is a near-death technology in the mission-critical world of today's robust business intelligence (BI) applications are exaggerated. Conventional wisdom says the mainframe, the "powerhouse" of corporate computing, is simply too costly, too complex and incapable of supporting a comprehensive BI system. Not so.

Posted September 15, 2008

The Capitol Corridor Joint Powers Authority (CCJPA) manages an Amtrak intercity passenger train service in eight Northern California counties, partnering with Amtrak, the Union Pacific Railroad, and Caltrans, the California Department of Transportation. Serving 16 stations along a 170-mile rail corridor, CCJPA offers a convenient way to travel between the Sierra Foothills, Sacramento, the San Francisco Bay Area, San Jose and the rest of the Silicon Valley.

Posted September 15, 2008

Early discussions on SQL Server 2008 seemed to suggest that it would really only be a point release, quite unlike what occurred with SQL Server 2005. Anyone looking at the new and upgraded features in SQL Server 2008 would soon realize that it offers much more than that. Given that SQL Server 2005 took some time to achieve mass adoption, the question that arises is how fast users will migrate to SQL Server 2008.

Posted September 15, 2008

Now more than ever, data has evolved into an asset more strategic and valuable than any raw material or capital construction project. Companies are scrambling to "compete on analytics," recognizing that the company that most effectively leverages the information coming out of its systems gains the greatest competitive advantage.

Posted September 15, 2008

When you pick up the morning paper or turn on the news, you don't expect to be reading or listening to a story about your credit or debit card information being at risk. However, recent events - as illustrated by the announced security breaches at the Hannaford supermarket chain and the Okemo Mountain Resort in Vermont - indicate that this is becoming an all too common occurrence.

Posted August 15, 2008

Any system needs to be tested. And it's a simple fact that testing is better done by people independent of the system being tested. A different perspective can often highlight new areas of weakness, and there is no conflict of interest in managing a "pass."

Posted August 15, 2008

Psychologist Philip Zimbardo once said, "Situational variables can exert powerful influences over human behavior, more so than we recognize or acknowledge." That certainly appears to be true when we look at how we work with people who provide services to us in our personal lives versus those who do it in the business world. In our personal lives, we tend to hire specialists. Yet, in the business world we always seem to want to take the "holistic" route, i.e., find that one supplier who can do everything for us.

Posted August 15, 2008

Embarcadero Technologies, a provider of multi-platform tools that companies use to design, develop and manage databases and the data they contain, was acquired by Thoma Cressey Bravo a little more than a year ago in a $200 million go-private transaction. One year later, Embarcadero completed the purchase of CodeGear from Borland Software Corp. for approximately $24.5 million. DBTA talked with Wayne Williams, CEO of Embarcadero, about how the companies fit together to provide system-wide capabilities and also how in a larger sense the worlds of application and database development are converging.

Posted August 15, 2008

Data integration (DI) technology, specifically extract, transform, and load (ETL) middleware, combined with an intermediate data store such as a warehouse or mart, has played a key role in advancing business intelligence (BI) and performance management since the mid-1990s. Virtualized DI evolved from these technologies in the mid-2000s. Alternatively known as virtual data federation or enterprise information integration (EII), virtual DI eliminates the intermediate data store by leveraging high-performance query techniques that let the consuming application pull data directly from the source, in real time.
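As a rough illustration of the federation idea, the sketch below pulls rows live from two separate sources and joins them at query time, with no intermediate warehouse or mart in between. The in-memory SQLite databases, table names, and sample rows are illustrative stand-ins for real source systems, not part of any actual product described here.

```python
import sqlite3

# Two independent "source systems" (stand-ins; real virtual DI would
# connect to live operational databases, not in-memory SQLite).
orders_db = sqlite3.connect(":memory:")
orders_db.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
orders_db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                      [(1, 101, 250.0), (2, 102, 75.5)])

crm_db = sqlite3.connect(":memory:")
crm_db.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
crm_db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(101, "Acme Corp"), (102, "Globex")])

def federated_customer_totals():
    """Join data pulled live from both sources at query time.

    No staging table or warehouse is ever materialized: each call
    queries the sources directly, so results reflect current data.
    """
    names = dict(crm_db.execute("SELECT customer_id, name FROM customers"))
    totals = {}
    for _order_id, cust_id, amount in orders_db.execute("SELECT * FROM orders"):
        totals[names[cust_id]] = totals.get(names[cust_id], 0.0) + amount
    return totals

print(federated_customer_totals())  # {'Acme Corp': 250.0, 'Globex': 75.5}
```

The trade-off this sketch hints at: an ETL pipeline would have copied both tables into a warehouse on a schedule, while the federated query pays the cost of touching every source at read time in exchange for real-time freshness.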

Posted August 15, 2008

IT process automation (ITPA) has stepped into the spotlight as a hot commodity among IT professionals worldwide. With ITPA, IT can deliver better services in support of the business - faster and with fewer errors - allowing for increased business agility. The reasons for automating IT processes within the data center and across the IT ecosystem are numerous - increased productivity, reduced human error, elimination of repetitive manual effort, and, most importantly, reduction of IT management costs. The true driving force behind the increasing interest in IT process automation, however, is the business value it provides.

Posted July 15, 2008

In today's global economy, companies recognize a growing need for a single point of accountability for all aspects of security, and more and more are creating the position of information security officer (ISO). One of the ISO's main tasks is to protect the company's chief asset - its data. An ISO has to recognize that an intruder can steal data in two ways: while it is in transmission, or directly from the database. Traditionally, the main emphasis has been placed on network controls to prevent unauthorized access and, to a lesser extent, on protecting data in transmission. Database security itself, however, is often overlooked.

Posted July 15, 2008

Microsoft SQL Server version migrations are one of the more difficult activities for a DBA to execute, yet they are a very common change to the infrastructure. Databases eventually have to be migrated to another location, perhaps even a clustered instance. DBAs are tasked with making the database migration succeed despite all of the complexities and failures that are possible. The concerns around database migration can be alleviated through an automated process which results in an infrastructure that will be easier for DBA teams to manage, more compliant, easier to document, faster in turnaround time for changes, and more reliable with less downtime.
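One way to picture such automation is a script that generates the backup/restore T-SQL for moving a database to a new instance, so every migration follows the same reviewed template instead of hand-typed commands. The database name, paths, and logical file names below are illustrative assumptions; on a real instance the logical names must match what the database actually reports, and this sketch only builds the script rather than executing it.

```python
def build_migration_script(db_name, backup_dir, target_data_dir, target_log_dir):
    """Generate backup/restore T-SQL for migrating one database.

    Assumes the default logical-file naming convention (db_name and
    db_name_log); a production version would look these names up
    from the source database before generating the RESTORE.
    """
    backup_file = f"{backup_dir}\\{db_name}.bak"
    return "\n".join([
        f"BACKUP DATABASE [{db_name}] TO DISK = N'{backup_file}' WITH INIT;",
        f"RESTORE DATABASE [{db_name}] FROM DISK = N'{backup_file}'",
        f"    WITH MOVE N'{db_name}' TO N'{target_data_dir}\\{db_name}.mdf',",
        f"         MOVE N'{db_name}_log' TO N'{target_log_dir}\\{db_name}_log.ldf';",
    ])

# Example: script for moving a hypothetical SalesDB to new drive layout.
print(build_migration_script("SalesDB", r"E:\backups", r"F:\data", r"G:\log"))
```

Because the output is plain T-SQL, it can be checked into source control and reviewed before execution, which is where much of the "more compliant, easier to document" benefit described above comes from.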

Posted July 15, 2008

Non-relational cloud databases such as Google's BigTable, Amazon's SimpleDB and Microsoft's SQL Server Data Services (SSDS) have emerged. But while these new data stores may well fill a niche in cloud-based applications, they lack most of the features demanded by enterprise applications - in particular, transactional support and business intelligence capabilities.

Posted July 15, 2008

IDS 11.5 - the newest version of Informix Dynamic Server - was officially announced by IBM at the International Informix User Conference in Kansas City in April 2008. The announcement created a tremendous interest from customers, analysts and the press.

Posted June 15, 2008

An entire industry has sprung up in response to the never-ending battle against complexity, server sprawl, and rising power consumption. Virtualization is now the mantra for beleaguered data center managers looking for ways to consolidate, better utilize, or abstract away their farms of physical servers and hardware. However, in many cases, virtualization itself can lead to even more complexity and offer uncertain value to the business. Many businesses are finding that virtualization is not ready for core mission-critical applications.

Posted June 15, 2008

MENTISoftware offers security solutions for companies using Oracle-based systems and is adding support for an expanded range of ERP suites and database platforms. DBTA talked with Rajesh Parthasarathy about why top management cares more than ever about protecting sensitive data.

Posted June 15, 2008

For the first time in over 20 years, there appear to be cracks forming in the relational model's dominance of the database management systems market. The relational database management system (RDBMS) of today is increasingly being seen as an obstacle to the IT architectures of tomorrow, and - for the first time - credible alternatives to the relational database are emerging. While it would be reckless to predict the demise of the relational database as a critical component of IT architectures, it is certainly feasible to imagine the relational database as just one of several choices for data storage in next-generation applications.

Posted June 15, 2008

Historically, database auditing and database performance have been like oil and water; they don't mix. So auditing is often eliminated, because performance drag on critical systems is unacceptable.

Posted May 15, 2008

Data and its analysis have become an important economic battleground for many industries, and nowhere is this more apparent than in the financial industry. Regulation is mandating greater data transparency across firms and trading practices. The increase in automated trading and the continuing search for new trading opportunities have led to exponential increases in the amount of data that must be captured, cleaned, managed and analyzed within a financial institution. To give some idea of the size of the problem, the Options Price Reporting Authority (OPRA) in the U.S. anticipates peak trade volumes of around one million messages per second by mid-2008. Real-time data processing and the ability to store data for historical analysis have become particular pressure points for many investment banks, asset managers and hedge funds.

Posted May 15, 2008

Database sizes have grown exponentially, with more than half of all databases in use globally projected to exceed 10TB by 2010, according to The Data Warehousing Institute. But as the size of data warehouses has exploded and business requirements have forced companies to conduct more ad hoc queries on those warehouses, response times have slowed, requiring increasingly expensive database hardware investments.

Posted May 15, 2008

In an effort to consolidate data across six separate business divisions, MassHousing, a leading provider of affordable housing for individuals and major developments across the state of Massachusetts, deployed business intelligence (BI) to consolidate large quantities of disparate data in a fast and efficient way. In tandem with its executive information system (EIS), MassHousing established a business intelligence competency center (BICC), to ensure consistent deployment across the organization and efficiency of all BI systems. MassHousing is one best practice organization that is taking its business intelligence initiative a step further with a BICC to help enable true business optimization.

Posted May 15, 2008

In the 1958 IBM Journal article that is generally acknowledged as the first usage of the term "business intelligence," author Hans Peter Luhn described the challenges and goals of the BI community in terms that are profoundly resonant nearly a half-century later.

Posted April 15, 2008

The complexities of today's IT environments make protecting data an ongoing challenge. Effectively securing data necessitates knowing where that data resides at any given point in time. However, as companies outsource tasks such as order processing, customer service, and fulfillment, this information becomes increasingly difficult to ascertain. Many of the external systems that house such data are not visible and are often not understood by those responsible for certifying compliance with regulations such as the Payment Card Industry Data Security Standard (PCI DSS).

Posted April 15, 2008

Lately, I have been rereading one of my favorite books on change: Our Iceberg Is Melting by John Kotter and Holger Rathgeber. The book shares a fable in which a colony of penguins discovers that their Antarctic iceberg is melting. If the penguins do nothing, the iceberg will shortly melt away and dump the penguins into the cold, dark waters of the Antarctic Ocean, eventually leading to their deaths from cold and exhaustion. The manner in which the penguins deal with this change holds some great lessons for all of us.

Posted April 15, 2008

The concept of a configuration management system (CMS) is an idea whose time has come - particularly since the release of IT Infrastructure Library (ITIL) Version 3, known as V3. ITIL V3 devotes considerable attention to the importance of a CMS. If you're not already familiar with a CMS and its functions, you may be wondering what's included in it, and how it differs from a configuration management database, known as a CMDB. So let's take a look at this vital component of a long-term IT management strategy.

Posted April 15, 2008

In recent years, disaster recovery has garnered attention from the company boardroom to the Office of the CIO. Despite this fact, many companies have yet to implement an effective DR solution to safeguard their applications and data. This inertia can be attributed to two factors - perception of the term "disaster" ("when the disaster happens, we'll deal with it then") and shortcomings of existing solutions ("we don't have budget for machines to sit idle").

Due to the rarity of catastrophic disasters such as earthquakes, floods and fires, organizations rarely implement comprehensive disaster protection measures. However, there is another set of "technical disasters," caused by much more mundane events, which regularly lead to significant system outages. These span faulty system components (server, network, storage, and software), data corruption, backup/recovery of bad data, wrong batch jobs, bad installations/upgrades/patches, operator errors, and power outages, among others.

Posted March 15, 2008

Despite efforts to "democratize" business intelligence, it has remained stubbornly confined to a chosen few within organizations. Although vendors have worked hard to convince enterprises that their BI solutions could be extended to line-of-business managers and employees, high-end analytic tools have remained confined to power users or analysts with statistical skills, while the remainder of the organization relies on spreadsheets to cobble together limited pieces of information.

This disconnect was confirmed in a 2007 survey conducted by Unisphere Research for the Oracle Applications Users Group, which found that most companies are still a long way off from the ideal of BI for all. The OAUG survey found that for the most part, BI reporting remains tied up in IT departments, and is still limited to analysts or certain decision makers. The majority of survey respondents said that it takes more than three to five days to get a report out of IT. Overall, the survey found, fewer than 10 percent of employees have access to BI and corporate performance management tools.

Posted March 15, 2008

When database administrators swap war stories, they are likely to relate similar tales about the woes of managing time-series data that may include clogs, jams, and general inefficiency. Why the ubiquitous complaints? Because a standard, relational database is not equipped to handle the rigorous demands this kind of data dishes out to its handlers.

Posted February 15, 2008

Emerging as the face of business intelligence (BI), dashboard technology has proven to be an integral component of any enterprise-wide BI strategy. Dashboards allow companies to benefit from a wealth of data and leverage their information assets through visually rich, responsive, and personalized BI indicators. Moreover, through effective BI dashboards, business leaders gain heightened insight into and visibility across the organization, allowing them to detect and solve problems quickly and make informed decisions on the spot.

Posted February 15, 2008

The data explosion driving data warehouse equipment purchases in the last few years has just begun. Equipment proliferation is already putting pressure on data center energy requirements. Fortunately, a column-based analytics server can help companies with both kinds of green - the environment and money - by offering enormous energy and cost reductions while significantly boosting performance.

Posted February 15, 2008

Efficiently sharing and managing the backup of data are common problems facing every organization, especially those with multiple, geographically-dispersed sites. Providing an adequate solution to both problems can be a vexing challenge. Businesses are under increased pressure from users and from auditors to facilitate secure, reliable, and auditable data transfer with near instantaneous access, data reliability, version coherency, and file security.

Posted January 15, 2008

There is perhaps no area within database administration more time-consuming or fraught with difficulty than the need to accurately shepherd the varied and ongoing vectors of change across an organization's database infrastructure. A typical company has hundreds of databases, each with thousands of database objects, instantiated across multiple environments. The process of database change management touches many different people in the organization, including analysts, architects, modelers, developers, and DBAs; it also invokes common umbrella functions, such as change management, corporate security, data governance and SOA.

Posted January 15, 2008

High-profile Internet security violations are on the evening news every week. Although the publicized computer break-ins seem to command the most attention, a wide range of other Internet violations and computer crimes now populate the IT landscape. An array of stakeholders - ranging from those in the executive suite to customers to regulators - are increasingly coming to view data as one of the most critical assets of the enterprise and the pressure is growing to treat it as such.

Posted January 15, 2008
