
Guy Harrison

Guy Harrison is an executive director of R&D at Dell and has more than 20 years of experience in database design, development, administration, and optimization. Harrison is an Oracle ACE, and is the author of the Oracle Performance Survival Guide (Prentice Hall, 2009) and MySQL Stored Procedure Programming (O'Reilly, with Steven Feuerstein), as well as other books, articles and presentations on database technology. He is the architect of Dell's Spotlight family of diagnostic products, and has led the development of Dell's Toad for Cloud Databases. 

Harrison can be found on the internet at www.guyharrison.net, via email at guy_harrison@dell.com, and as @guyharrison on Twitter. 

Articles by Guy Harrison

Big data analytics is a complex field, but if you understand the basic concepts—such as the difference between supervised and unsupervised learning—you are sure to be ahead of the person who wants to talk data science at your next cocktail party!
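
As a rough illustration of that distinction (not from the article itself), here is a minimal Python sketch assuming the scikit-learn library: a supervised classifier learns from labelled examples, while an unsupervised clustering algorithm must find structure in the same data without any labels.

```python
# Minimal sketch of supervised vs. unsupervised learning (assumes scikit-learn).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: the algorithm is given measurements X together with known labels y,
# and learns to predict the label for new measurements.
classifier = LogisticRegression(max_iter=1000).fit(X, y)
print("Predicted species for the first flower:", classifier.predict(X[:1])[0])

# Unsupervised: the algorithm sees only the measurements, with no labels,
# and must discover structure (here, three clusters) on its own.
clusters = KMeans(n_clusters=3, n_init=10).fit_predict(X)
print("Cluster assigned to the same flower:", clusters[0])
```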

Posted June 11, 2014

The "Internet of Things" (IoT) is shifting from aspirational buzzword to a concrete and lucrative market. New-generation computing devices require new types of operating systems and networks. While many have been initially based on some variation of the Linux OS and connect using existing Wi-Fi and Bluetooth wireless protocols, new operating systems and networking protocols are emerging.

Posted May 08, 2014

About 3 years ago, the AMP (Algorithms, Machines, People) lab was established at U.C. Berkeley to attack the emerging challenges of advanced analytics and machine learning on big data. The resulting Berkeley Data Analytics Stack—particularly the Spark processing engine—has shown rapid uptake and tremendous promise.

Posted April 04, 2014

Ironically, although the thin client advocates were right about many things - the success of browser-based applications, in particular - they were dead wrong about the diminishing role of the OS. More than ever, the OS is the source of competitive differentiation between various platforms, and a clear focus of innovation for the foreseeable future.

Posted March 12, 2014

Solid State Disk (SSD)—particularly flash SSD—promised to revolutionize database performance by providing a storage medium orders of magnitude faster than magnetic disk, offering the first significant improvement in disk I/O latency in decades. Aerospike is a NoSQL database architected to fully exploit the I/O characteristics of flash SSD.

Posted February 10, 2014

New devices promise to open up ways for us to improve our mental functioning and perhaps to further revolutionize social networking and big data. A world in which Facebook "likes" are generated automatically might not be far off, and mining the big data generated from our own brains has some amazing - though sometimes creepy - implications.

Posted January 07, 2014

Not all Hadoop packages offer a unique distribution of the Hadoop core, but all attempt to offer a differentiated value proposition through additional software utilities, hardware, or cloud packaging. Against that backdrop, Intel's distribution of Hadoop might appear to be an odd duck since Intel is not in the habit of offering software frameworks, and the brand, while ubiquitous, is not associated specifically with Hadoop, databases or big data software. However, given its excellent partnerships across the computer industry, Intel has support from a variety of vendors, including Oracle and SAP, and many of the innovations in its distribution show real promise.

Posted December 04, 2013

Two new approaches to application quality have emerged: "risk-based testing" - pioneered in particular by Rex Black - and "exploratory testing" - as evangelized by James Bach and others. Neither claims to eradicate issues of application quality, which most likely will continue as long as software coding involves human beings. However, along with automation of the more routine tests, these techniques form the basis for higher-quality application software.

Posted November 13, 2013

Security for NoSQL continues to evolve rapidly in order to attract wider enterprise adoption. Robust security is a must-have for any database in the enterprise, and over the decades since the emergence of the relational model, security and authentication capabilities have continually improved. The first new-generation non-relational "NoSQL" databases, like the early relational databases, had very simplistic security mechanisms. Here is how security for NoSQL is changing.

Posted October 09, 2013

We've seen a lot of progress in the nearly 50 years between 1965 and 2013: the modern smartphone and the World Wide Web, in particular, have been transformative. However, it's probably true that our current world would not astonish a time traveler from 1965 as much as 1965 would have surprised a visitor from 1915. Where are the jetpacks, flying cars, colonies on Mars? Still, it does look like we are finally going to see one of the long-awaited benefits of the future: the self-driving car.

Posted September 11, 2013

In many ways, Hadoop is the most concrete technology underlying today's big data revolution, but it certainly does not satisfy those who want quick answers from their big data. Hadoop - at least Hadoop 1.0 - is a batch-oriented framework that allows for the economical execution of massively parallel workloads, but provides no capabilities for interactive or real-time execution.

Posted August 07, 2013

Like many of my generation, my early visions of the future were influenced by films like "2001: A Space Odyssey" and the original "Star Trek" TV series. In each of these, humans interact with computers using conversational English, posing complex questions and getting intelligent, relevant responses. So, you can imagine how primed someone like me is to hear that Google has been explicitly trying to create that Star Trek computer. At the Google I/O conference in San Francisco in May, Amit Singhal, Google senior vice president, spoke of his early childhood experiences watching "Star Trek," and his dreams of one day building that computer.

Posted July 09, 2013

When NoSQL first hit the IT consciousness in 2009, an explosion of NoSQL databases seemed to appear out of thin air. Some of these contenders had in fact been around for some time, with others thrown together rather quickly to exploit the NoSQL buzz. The NoSQL pack thinned out as leaders in specific categories emerged, but for some time, there was no clear leading key-value NoSQL database.

Posted June 13, 2013

The term "NoSQL" is widely acknowledged as an unfortunate and inaccurate tag for the non-relational databases that have emerged in the past five years. The databases that are associated with the NoSQL label have a wide variety of characteristics, but most reject the strict transactions and stringent relational model that are explicitly part of the relational design. The ACID (Atomic-Consistent-Independent-Durable) transactions of the relational model make it virtually impossible to scale across data centers while maintaining high availability, and the fixed schemas defined by the relational model are often inappropriate in today's world of unstructured and rapidly mutating data.

Posted April 10, 2013

Google's dominance of internet search has been uncontested for more than 12 years now. Before Google, search engines such as AltaVista indexed web pages and allowed for keyword search with an interface and functionality superficially similar to that provided by Google. However, these first-generation search engines provided relatively poor ordering of results. Because an internet search would return pages ranked by the number of times a term appeared on the website, unpopular or irrelevant sites would be just as likely to achieve top rank as popular sites.
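
To make the contrast concrete, here is a small, hedged Python sketch of the power-iteration idea behind PageRank: a page scores highly when other high-scoring pages link to it, rather than when a term simply appears many times. The four-page link graph and the damping factor are invented for illustration and bear no relation to Google's production system.

```python
# Hypothetical four-page web: each page lists the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Naive power-iteration PageRank: importance flows along links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                share = damping * rank[page] / len(pages)
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

print(pagerank(links))  # "C" scores highest: every other page links to it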

Posted March 14, 2013

Hadoop is the most significant concrete technology behind the so-called "Big Data" revolution. Hadoop combines an economical model for storing massive quantities of data - the Hadoop Distributed File System - with a flexible model for writing massively scalable programs - MapReduce. However, as powerful and flexible as MapReduce might be, it is hardly a productive programming model. Programming in MapReduce reminds one of programming in assembly language - the simplest operations require substantial code.
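
To give a feel for the programming model (and for how much ceremony even a trivial job involves), here is a toy word count written in MapReduce style in plain Python; a real Hadoop job would typically implement the mapper and reducer in Java and add driver, configuration, and input/output-format code around them.

```python
# An illustrative (non-Hadoop) word count expressed in MapReduce style.
# The "shuffle" step stands in for Hadoop's sort-and-group phase.
from collections import defaultdict

def mapper(line):
    # Emit a (word, 1) pair for every word in the input line.
    for word in line.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Group all values by key, as the framework does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reducer(key, values):
    # Sum the counts for a single word.
    return key, sum(values)

lines = ["the quick brown fox", "the lazy dog", "the quick dog"]
mapped = (pair for line in lines for pair in mapper(line))
counts = dict(reducer(k, v) for k, v in shuffle(mapped))
print(counts)  # {'the': 3, 'quick': 2, ...}
```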

Posted February 13, 2013

Coverage of Windows 8 has understandably focused on the revolutionary Metro interface. Many believe that this new interface, while fine for tablets and phones, is a step backwards for desktop productivity. By forcing users to switch between two modes of operation - desktop and Metro - Windows 8 diminishes productivity and imposes a steep learning curve on new users. The Metro interface itself supports only very limited multi-tasking, so serious work often must be done in the traditional Windows desktop. Microsoft implicitly acknowledges these limitations by providing the latest version of Microsoft Office, not in Metro format, but as traditional "desktop" applications.

Posted January 03, 2013

As the undisputed pioneer of big data, Google established most of the key technologies underlying Hadoop and many of the NoSQL databases. The Google File System (GFS) allowed clusters of commodity servers to present their internal disk storage as a unified file system and inspired the Hadoop Distributed File System (HDFS). Google's column-oriented key value store BigTable influenced many NoSQL systems such as Apache HBase, Cassandra and HyperTable. And, of course, the Google Map-Reduce algorithm became the foundation computing model for Hadoop and was widely implemented in other NoSQL systems such as MongoDB.

Posted December 06, 2012

Five years ago, Radio Frequency ID (RFID) seemed poised to revolutionize commerce. Way back in 2003, Wal-Mart announced that it would be requiring that RFID tags - so-called "electronic barcodes" - be attached to virtually all merchandise. Many - myself included - became convinced that the Wal-Mart directive would be the tipping point leading to universal adoption of RFID tags in consumer goods and elsewhere.

Posted November 13, 2012

It's in the nature of hype bubbles to obscure important new paradigms behind a cloud of excitement and exaggerated claims. For example, the phrase "big data" has been so widely and poorly applied that the term has become almost meaningless. Nevertheless, beneath the hype of big data there is a real revolution in progress, and more than anything else it revolves around Apache Hadoop. Let's look at why Hadoop is creating such a stir in database management circles, and identify the obstacles that must be overcome before Hadoop can become part of mainstream enterprise architecture.

Posted October 10, 2012

Google is the pioneer of big data. Technologies such as Google File System (GFS), BigTable and MapReduce formed the basis for open source Hadoop, which, more than any other technology, has brought big data within reach of the modern enterprise.

Posted October 10, 2012

The first computer program I ever wrote (in 1979, if you must know) was in the statistical package SPSS (Statistical Package for the Social Sciences), and the second computer platform I used was SAS (Statistical Analysis System). Both of these systems are still around today—SPSS was acquired by IBM as part of its BI portfolio, and SAS is now the world's largest privately held software company. The longevity of these platforms—they have essentially outlived almost all contemporary software packages—speaks to the perennial importance of data analysis to computing.

Posted September 11, 2012

Throughout the 2000s, a huge number of website developers rejected the Enterprise Java or .NET platforms for web development in favor of the "LAMP" stack - Linux, Apache, MySQL and Perl/Python/PHP. Although the LAMP stack was arguably less scalable or powerful than the Java or .NET frameworks, it was typically easier to learn, faster in early stages of development - and definitely cheaper. When enterprise architects designed systems, they often chose commercial application servers and databases (Oracle, Microsoft, IBM). But, when web developers or startups faced these decisions, the LAMP stack was often the default choice.

Posted July 25, 2012

Seriously chronic geeks like me usually were raised on a strong diet of science fiction that shaped our expectations of the future. Reading Heinlein and Asimov as a boy led me to expect flying cars and robot servants. Reading William Gibson and other "cyberpunk" authors as a young man led me to expect heads-up virtual reality glasses and neural interfaces. Flying cars and robot companions don't seem to be coming anytime soon, but we are definitely approaching a world in which virtual - or at least augmented - reality headsets and brain control interfaces become mainstream.

Posted July 11, 2012

One of the earliest of the new generation of non-relational databases was CouchDB. CouchDB was born in 2005 when former Lotus Notes developer Damien Katz foresaw the nonrelational wave that only fully arrived in 2009. Katz imagined a database that was fully compatible with web architectures — and more than a little influenced by Lotus Notes document database concepts.

Posted June 13, 2012

Websites such as MySpace, Facebook, and LinkedIn have brought social networking and the concept of online community to a huge cross-section of our society. Penetration and usage of these platforms may vary depending on demographic (age and geography, in particular), but no one can dispute the impact of Facebook and Twitter on both everyday life and society in general.

Posted May 09, 2012

It's hard to overestimate Amazon's influence on cloud computing and on NoSQL databases. Amazon Web Services (AWS) was the first and still is the leading concrete example of an infrastructure as a service (IaaS) cloud - a collection of cloud-based services such as compute (EC2), storage (S3) and other application building blocks.

Posted April 11, 2012

Knowing how your customers feel about your products is arguably as important as actual sales data but often much harder to determine. Traditionally, companies have used surveys, focus groups, customer visits, and similar active sampling techniques to perform this sort of market research. Opposition or lack of faith in market research takes a number of forms. Henry Ford once said, "If I had asked people what they wanted, they would have said faster horses," while Steve Jobs said, "People don't know what they want until you show it to them." The real problem with market research is more pragmatic: It's difficult and expensive to find out what people think.

Posted March 07, 2012

In years to come, we might remember October 2011 as the month the big database vendors gave in to the dark side and embraced Hadoop. In October, both Microsoft and Oracle announced product offerings which included and embraced Hadoop as the enabler of their "big data" solution. The last of the big three database vendors - IBM - embraced Hadoop back in 2010.

Posted February 09, 2012

Along with thousands of IT professionals, I was in the San Francisco Moscone Center main hall last October listening to Larry Ellison's 2011 Oracle OpenWorld keynote. Larry can always be relied upon to give an entertaining presentation, a unique blend of technology insights and amusingly disparaging remarks about competitors.

Posted January 11, 2012

As the leading provider of relational database software, it's hardly surprising that Oracle initially gave little or no credence to the NoSQL movement that emerged in 2009. Indeed, an Oracle white paper from May 2011 concluded with the recommendation to "Go for the tried and true path," and avoid NoSQL databases.

Posted December 01, 2011

My 20-year-old daughter recently remarked that Facebook isn't as cool as it used to be. Sure, everyone has to be on Facebook, but that very ubiquity removes its mystique. The recently released Google+ is clearly targeted at Facebook and adds some features - particularly "Circles" - that are not available on Facebook. Facebook dominance may be indisputable today, but it is not guaranteed for all time. If I were Mark Zuckerberg, I would fear losing my cool status more than anything else.

Posted November 10, 2011

One of the greatest achievements in artificial intelligence occurred earlier this year when IBM's Watson supercomputer defeated the two reigning human champions in the popular Jeopardy! TV show. Named after the IBM founder Thomas Watson and not - as you may have thought - Sherlock Holmes' famous assistant, Watson was the result of almost 5 years of intensive effort by IBM, and the intellectual successor to "Deep Blue," the first computer to beat a chess grand master.

Posted October 15, 2011

The term "machine learning" evokes visions of massive super computers that eventually turn on and enslave humanity - think SkyNet from Terminator or HAL from 2001: A Space Odyssey. But the truth is that machine learning algorithms are common in web applications that we use every day and have a growing relevance to enterprise applications.

Posted September 14, 2011

Michael Stonebraker is widely recognized as one of the pioneers of the relational database. While at Berkeley, he co-founded the INGRES project, which implemented the relational principles published by Edgar Codd in his seminal papers. The INGRES project became the basis for the commercial Ingres RDBMS, which, during the 1980s, provided some of the most significant competition to Oracle.

Posted August 11, 2011

One of the funniest moments in the classic Star Trek motion pictures is the scene when the engineer "Scotty" - who has traveled back in time to the 1980s with his comrades - attempts to use a computer. "Computer!" he exclaims, attempting to initiate a dialogue with the PC. Embarrassed, a contemporary engineer hands him a mouse. "Aha," says Scotty who then holds the mouse to his mouth only to again exclaim, "Computer!" The idea that computers in the future would be able to understand human speech was common a few decades ago. Speech generation and recognition is so fundamental to the human experience that we tend to underestimate the incredible complexity of human information processing that makes it possible.

Posted July 07, 2011

Both HBase and Cassandra can deal with large data sets, and provide high transaction rates and low latency lookups. Both allow map-reduce processing to be run against the database when aggregation or parallel processing is required. Why, then, would a merge of Cassandra and Hadoop be a superior solution?

Posted June 08, 2011

The rise of "big data" solutions - often involving the increasingly common Hadoop platform - together with the growing use of sophisticated analytics to drive business value - such as collective intelligence and predictive analytics - has led to a new category of IT professional: the data scientist.

Posted May 12, 2011

The relational database is primarily oriented toward the modeling of objects (entities) and relationships. Generally, the relational model works best when there is a relatively small and static number of relationships between objects. Working with dynamic, recursive, or complex relationships has long been a tricky problem for the RDBMS. For instance, it's a fairly ordinary business requirement to print out all the parts that make up a product - including parts which, themselves, are made up of smaller parts. However, this "explosion of parts" is not consistently supported across relational databases. Oracle, SQL Server and DB2 have special, but mutually inconsistent, syntax for these hierarchical queries, while MySQL and PostgreSQL lack specific support.
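
To illustrate what an "explosion of parts" query has to do, here is a small Python sketch with an invented bill of materials; in databases that do support hierarchical queries, the same traversal is expressed declaratively (for example, with Oracle's CONNECT BY or a recursive common table expression).

```python
# A toy bill of materials: each part maps to the sub-parts it is assembled from.
bill_of_materials = {
    "bicycle": ["frame", "wheel", "wheel"],
    "wheel": ["rim", "hub", "spoke"],
    "frame": ["tube", "fork"],
}

def explode(part, depth=0):
    """Recursively list a part and every sub-part it contains."""
    print("  " * depth + part)
    for sub_part in bill_of_materials.get(part, []):
        explode(sub_part, depth + 1)

explode("bicycle")
```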

Posted April 05, 2011

When computers first started to infringe on everyday life, science fiction authors and society in general had high expectations for "intelligent" systems. Isaac Asimov's "I, Robot" series from the 1940s portrayed robots with completely human intelligence and personality, and, in the 1968 movie "2001: A Space Odyssey," the onboard computer HAL (Heuristically programmed ALgorithmic computer) had a sufficiently human personality to suffer a paranoid break and attempt to murder the crew!

Posted March 09, 2011

Salesforce.com is well known as the pioneer of software as a service (SaaS) - the provision of hosted applications across the internet. Salesforce launched its SaaS CRM (Customer Relationship Management) product more than 10 years ago, and today claims over 70,000 customers. It's less widely known that Salesforce.com also has been a pioneer in platform as a service (PaaS), and is one of the first to provide a comprehensive internet-based application development stack. In 2007 - way before the current buzz over cloud development platforms such as Microsoft Azure - Salesforce launched the Force.com platform, which allowed developers to run applications on the same multi-tenant architecture that hosts the Salesforce.com CRM.

Posted February 02, 2011

The NoSQL acronym suggests it's the SQL language that is the key difference between traditional relational and newer non-relational data stores. However, an equally significant divergence is in the NoSQL consistency and transaction models. Indeed, some have suggested that NoSQL databases would be better described as "NoACID" databases - since they avoid the "ACID" transactions of the relational world.

Posted January 07, 2011

Because any database that does not support the SQL language is, by definition, a "NoSQL" database, some very different databases coexist under the NoSQL banner. Massively scalable data stores like Cassandra, Voldemort, and HBase sacrifice structure to achieve scale-out performance. However, the document-oriented NoSQL databases have very different architectures and objectives.

Posted November 30, 2010

Oracle CEO Larry Ellison has been notoriously critical of cloud computing - or at least of the way in which the term "cloud" has been applied. He often has expressed his frustration when "cloud" is applied to long established patterns such as software as a service (SaaS), especially when this is done by Salesforce.com. While there's widespread agreement that "cloud" has become a faddish, over-hyped and often abused term, some have speculated that Ellison's obvious frustration has been fueled by Oracle's inability to fully engage in the cloud computing excitement prior to the conclusion of the Sun acquisition.

Posted November 09, 2010

The relational database - or RDBMS - is a triumph of computer science. It has provided the data management layer for almost all major applications for more than two decades, and when you consider that the entire IT industry was once described as "data processing," this is a considerable achievement. For the first time in several decades, however, the relational database stranglehold on database management is loosening. The demands of big data and cloud computing have combined to create challenges that the RDBMS may be unable to adequately address.

Posted October 12, 2010

In Greek mythology, Cassandra was granted the gift of prophecy, but cursed with an inability to convince others of her predictions - a sort of unbelievable "oracle," if you like. Ironically, in the database world, the Cassandra system is fast becoming one of the most credible non-relational databases for production use - a believable alternative to Oracle and other relational databases.

Posted October 12, 2010

The promises of public cloud computing - pay as you go, infinite scale and outsourced administration - are compelling. However, for most enterprises, security, geography and risk mitigation concerns make private cloud platforms more desirable. Enterprise customers like the idea of on-demand provisioning, but are often unwilling to take the performance, security and risk drawbacks of moving applications to remote hardware that is not under their direct control.

Posted September 07, 2010

NoSQL - probably the hottest term in database technology today - was unheard of only a year ago. And yet, today, there are literally dozens of database systems described as "NoSQL." How did all of this happen so quickly? Although the term "NoSQL" is barely a year old, in reality, most of the databases described as NoSQL have been around a lot longer than the term itself. Many databases described as NoSQL arose over the past few years as reactions to strains placed on traditional relational databases by two other significant trends affecting our industry: big data and cloud computing.

Posted August 10, 2010

In biology, we are taught that survival favors diversity. Organisms that reproduce without variation die out during periods of rapid change, while organisms that show variation in their features tend to survive and adapt. Likewise, ecosystems consisting of relatively few homogeneous species thrive only when conditions stay static. Does IT diversity create a competitive advantage in the business application ecosystem? Predictably, large vendors with vertically integrated stacks argue that mixing software components is a Bad Thing. These vendors claim that reducing the diversity in the application stack leads to better efficiency and maintainability.

Posted July 12, 2010

Although VMware continues to hold the majority share of the commercial virtualization market, other virtualization technologies are increasingly significant, though not necessarily as high profile. Operating system virtualization - sometimes called partial virtualization - allows an operating system such as Solaris to run multiple partitions, each of which appears to contain a distinct running instance of the same operating system. However, these technologies cannot be used to host different operating system versions, making them less appealing to enterprises seeking to consolidate workloads using virtualization.

Posted June 07, 2010

Until recently, IT professionals have been conditioned to regard response time, or throughput, as the ultimate measure of application performance. It's as though we were building automobiles and only concerned with faster cars and bigger trucks. Yet, just as the automotive industry has come under increasing pressure to develop more fuel-efficient vehicles, so has the IT industry been challenged to reduce the power drain associated with today's data centers.

Posted May 10, 2010

Spreadsheets have long been a disruptive force in enterprise IT; to some extent, they are the "killer" applications that helped drive the adoption of personal computers (PCs) in the enterprise. Spreadsheet products such as Lotus 1-2-3 - and early versions of Excel on the Mac - saw rapid adoption by business users. Inevitably, these users pushed the boundaries of the spreadsheet model, using spreadsheets as databases, and even to develop simple business applications. In the late 1980s, it was typical to see corporate IT rolling out massively expensive mainframe-based solutions, while departmental users got their real work done on spreadsheets running on cheap PCs.

Posted April 07, 2010

Open source applications were somewhat niche at the beginning of the decade but now are clearly mainstream. Credible open source alternatives now exist for almost every category of application, as well as every component of the application.

Posted March 04, 2010

In 1995, Netscape founder Marc Andreessen famously claimed that applications of the future would run within a web browser, relegating the role of the operating system - Windows, in particular - to "a poorly debugged set of device drivers." Fifteen years later, we can see that although rich applications such as Microsoft Office are still dominant, the web browser has become a platform that can deliver almost any conceivable type of business or consumer application.

Posted February 09, 2010

Google's first "secret sauce" for web search was the innovative PageRank link analysis algorithm which successfully identifies the most relevant pages matching a search term. Google's superior search results were a huge factor in their early success. However, Google could never have achieved their current market dominance without an ability to reliably and quickly return those results. From the beginning, Google needed to handle volumes of data that exceeded the capabilities of existing commercial technologies. Instead, Google leveraged clusters of inexpensive commodity hardware, and created their own software frameworks to sift and index the data. Over time, these techniques evolved into the MapReduce algorithm. MapReduce allows data stored on a distributed file system - such as the Google File System (GFS) - to be processed in parallel by hundreds of thousands of inexpensive computers. Using MapReduce, Google is able to process more than a petabyte (one million GB) of new web data every hour.

Posted January 11, 2010

When a company like Microsoft talks about the future of computing, you can expect a fair bit of self-serving market positioning - public software companies need to be careful to sell a vision of the future that doesn't jeopardize today's revenue streams. But, when a company like Microsoft releases a new version of its fundamental development framework - .NET, in this case - you can see more clearly the company's technical vision for the future of computing.

Posted December 14, 2009

There's an old but clever internet parody describing the "Built-in Orderly Organized Knowledge device (BOOK)." This device is described as a "revolutionary breakthrough in technology" that is compact and portable, never crashes and supports both sequential and indexed information access. Though satirical, the article makes excellent points: the printed book is indeed an information technology device, arguably the oldest in widespread use today.

Posted November 11, 2009

The idea of "virtual" reality—immersive computer simulations almost indistinguishable from reality—has been a mainstay of modern "cyberpunk" science fiction since the early 1980s, popularized in movies such as The Thirteenth Floor and The Matrix. Typically, a virtual reality environment produces computer simulated sensory inputs which include at least sight and sound, and, perhaps, touch, taste and smell. These inputs are presented to the user through goggles, earphones and gloves or—in the true cyberpunk sci-fi—via direct brain interfaces.

Posted October 13, 2009

Google introduced the MapReduce algorithm to perform massively parallel processing of very large data sets using clusters of commodity hardware. MapReduce is a core Google technology and key to maintaining Google's website indexes.

Posted September 14, 2009

Attendees at the O'Reilly Velocity conference in June were treated to the unusual phenomenon of a joint presentation by Google and Microsoft. The presentation outlined the results of studies by the two companies on the effects of search response time. Aside from the novelty of Microsoft-Google cooperation, the presentation was notable both in terms of its conclusions and its methodology.

Posted August 14, 2009

Predictive Analytics - sometimes referred to as Predictive Data Mining - is a branch of Business Intelligence that attempts to use historical data to make predictions about future events. At its simplest, predictive analytics utilizes statistical techniques, such as correlation and regression, which many of us have encountered in college or even high school. Correlation analysis determines if there is a statistically significant relationship between two variables. For instance, height and age are highly correlated, while IQ and height are very weakly correlated. Regression attempts to find an equation relating two or more variables, so that you can predict one from the others.
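
As a minimal, hedged illustration of those two techniques, the Python sketch below uses NumPy with invented age-and-height figures: the correlation coefficient measures how strongly the variables move together, and a first-degree fit gives the regression line used for prediction.

```python
# Minimal sketch of correlation and regression (assumes NumPy; data is invented).
import numpy as np

age = np.array([5, 8, 10, 12, 14, 16], dtype=float)             # years
height = np.array([110, 128, 139, 149, 160, 172], dtype=float)  # cm

# Correlation: how strongly are the two variables related? (+1 = perfect positive)
r = np.corrcoef(age, height)[0, 1]
print(f"correlation between age and height: {r:.2f}")

# Regression: fit height = slope * age + intercept, then predict for a new age.
slope, intercept = np.polyfit(age, height, deg=1)
print(f"predicted height at age 11: {slope * 11 + intercept:.1f} cm")
```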

Posted July 13, 2009

Virtualization has changed the IT landscape more dramatically than perhaps any other technology introduced over the past decade. Virtualized environments are omnipresent in the modern data center due to their economic advantages in hardware consolidation and manageability.

Posted June 15, 2009

Both open source software (OSS) and cloud computing continue to experience strong interest and growth despite the economic downturn. Clearly, both provide the promise of reduced operating and software licensing costs.

Posted May 19, 2009

Both open source software (OSS) and cloud computing continue to experience strong interest and growth despite the economic downturn. Clearly, both provide the promise of reduced operating and software licensing costs. For instance, corporations looking to reduce the cost incurred by Microsoft Office licensing are looking more closely at the open source OpenOffice alternative, or at Google's online application suite, Google Apps. There's understandable resistance to moving from the rich experience offered by Microsoft to these lower-cost alternatives, but resistance has a way of disappearing in the face of financial imperatives.

Posted May 15, 2009

The business intelligence (BI) market is big: at least $10 billion in 2008 and much more if you include data warehousing projects. The tough economic environment may slow the growth of the BI market, but cost constraints, compliance and similar measures demanded by the current economy require accurate and timely business data, so BI is expected to remain a vigorous market segment regardless of the macro-economic situation.

Posted April 15, 2009

Way back in 2003, Walmart announced that it would require Radio Frequency ID (RFID) tags—so-called "electronic barcodes"—to be attached to virtually all merchandise. Walmart pioneered the use of the printed bar code back in the 1970s, and many—myself included—became convinced that the company's directive would be the tipping point leading to universal adoption of RFID tags in consumer goods and elsewhere.

Posted March 15, 2009

A few years ago, it seemed as though the days of the "micro-ISV" - very small independent software vendors consisting of one or two developers - were over. The role once played by shareware Windows applications had been supplanted by free web applications financed by advertising revenue. The start-up costs for such web applications - including funding a scalable and reliable web hosting infrastructure - were beyond the reach of most small software entrepreneurs.

Posted February 15, 2009

In the classic comedy "The Hitchhiker's Guide to the Galaxy," a frustrated Ford Prefect can't understand why a bunch of marketing consultants shipwrecked on prehistoric Earth can't invent the wheel.

Posted January 15, 2009

Moore's law—first expressed by Intel cofounder Gordon Moore in 1965—predicts that computing power will increase exponentially, doubling roughly every 18 months. Moore's law has proved remarkably accurate and we have all benefited from the rapid growth in CPU and computer memory available for our desktop computers.

Posted December 15, 2008

Posted November 15, 2008

Posted September 15, 2008

Non-relational cloud databases such as Google's BigTable, Amazon's SimpleDB and Microsoft's SQL Server Data Services (SSDS) have emerged. But while these new data stores may well fill a niche in cloud-based applications, they lack most of the features demanded by enterprise applications - in particular, transactional support and business intelligence capabilities.

Posted July 15, 2008

For the first time in over 20 years, there appear to be cracks forming in the relational model's dominance of the database management systems market. The relational database management system (RDBMS) of today is increasingly being seen as an obstacle to the IT architectures of tomorrow, and - for the first time - credible alternatives to the relational database are emerging. While it would be reckless to predict the demise of the relational database as a critical component of IT architectures, it is certainly feasible to imagine the relational database as just one of several choices for data storage in next-generation applications.

Posted June 15, 2008
