It is widely accepted that to realize the full potential of blockchain technology we will need a next-generation blockchain to supplement the one provided in the Bitcoin implementation. Ethereum represents just such a next-generation blockchain.
Posted September 02, 2016
By the mid-2000s, a huge number of web apps were built upon the so-called LAMP stack. LAMP applications utilize the Linux operating system, Apache web server and MySQL database server, and implement application logic in PHP or another language starting with the letter "P," such as Python or Perl. But the LAMP stack is now essentially obsolete, and the MEAN stack offers substantial productivity advantages, especially for modern, highly interactive websites. The MEAN stack is not without compromise, however. Here's why.
Posted July 12, 2016
Posted May 04, 2016
Few of us working in the software industry would dispute that agile methodologies represent a superior approach to older waterfall-style development methods. However, many software developers would agree that older enterprise-level processes often interact poorly with the agile methodology, and long for agility at the enterprise level. The Scaled Agile Framework (SAFe) provides a recipe for adopting agile principles at the enterprise level.
Posted March 03, 2016
The development of a functional and practical quantum computing system has been "pending" for some decades now, but there are real signs that this technology may soon become practical. The implications for cryptography alone are encouraging major government investment - the U.S. and China, in particular, are investing heavily in quantum computing technology. The arms race to develop functional quantum computing has begun.
Posted January 07, 2016
Almost every commercial endeavor and, indeed, almost every human undertaking, has software at its core. Yet, with software at the core of so much of our society, it's surprising to realize it's getting harder and harder to actually make a living selling software. In his recent book, "The Software Paradox," Stephen O'Grady - co-founder of analyst firm RedMonk - provides a cohesive and persuasive analysis of what those of us in the software business have been experiencing for several years - it's getting increasingly difficult to generate revenues selling "shrink-wrapped" software.
Posted November 09, 2015
Dystopian visions of a future in which automation eliminates the vast majority of jobs are nothing new. However, even though previous predictions of doom have been misplaced, there is new concern about the impact of the latest generation of automation on the nature of work and the prospects for universal employment in the future. In particular, we're increasingly seeing automation disrupt jobs that were long considered to require human judgment or abilities.
Posted July 08, 2015
You would have to have been living under a rock for the past few years not to have heard of Bitcoin. Bitcoin is an electronic "crypto" currency which can be used like cash in many web transactions. At the time of writing, there are about 14 million bitcoins in circulation, trading at approximately $250 each, for a total value of about $3.5 billion.
Posted May 14, 2015
Until recently, mention of Alan Turing outside of computer science circles would fail to generate any recognition. In recent months, however, the British technology pioneer has been brought into the public eye following the release of the film, "The Imitation Game." The film itself is a combination of Turing's personal biography and an account of his pivotal work cracking the German Enigma cipher machine during World War II.
Posted March 12, 2015
Smart watches can perform continuous biometric validation (through pulse signatures and other cues), and are always at hand - or at least, at wrist. Coupled with the ability to continually monitor health and fitness, and perhaps even the ability to include basic phone capability, there are real advantages to be had as smart watches mature.
Posted January 07, 2015
We are now seeing a seismic shift and increase in the significance of social network data for marketing and brand analysis. The next wave of social network exploitation promises to allow companies to narrowly target consumers and leads, to predict market trends, and to more actively influence consumer behavior.
Posted November 12, 2014
In this month's column, Guy Harrison writes about Docker, an open source project based on Linux containers that is showing rapid adoption. "Unlike virtual machines, Docker containers do not have to include a copy of the guest OS - each Docker container essentially shares the same copy of the underlying OS," explains Harrison. "This allows Docker containers to be much smaller, which, in turn, allows them to be more easily deployed, provides for greater density (more containers per host) and permits faster initialization."
Posted September 10, 2014
Competitive differentiation in the mobile space depends only peripherally on the hardware - all modern smartphones have similar processing, display and networking capabilities. And, while some are notably superior in terms of a niche capability - cameras, for instance - it's the software, rather than the hardware, that separates the platforms.
Posted July 03, 2014
The "Internet of Things" (IoT) is shifting from aspirational buzzword to a concrete and lucrative market. New-generation computing devices require new types of operating systems and networks. While many have been initially based on some variation of the Linux OS and connect using existing Wi-Fi and Bluetooth wireless protocols, new operating systems and networking protocols are emerging.
Posted May 08, 2014
Ironically, although the thin client advocates were right about many things - the success of browser-based applications, in particular - they were dead wrong about the diminishing role of the OS. More than ever, the OS is the source of competitive differentiation between various platforms, and a clear focus of innovation for the foreseeable future.
Posted March 12, 2014
New devices promise to open up ways for us to improve our mental functioning and perhaps to further revolutionize social networking and big data. A world in which Facebook "likes" are generated automatically might not be far off, and mining the big data generated from our own brains has some amazing - though sometimes creepy - implications.
Posted January 07, 2014
Two new approaches to application quality have emerged: "risk-based testing" - pioneered in particular by Rex Black - and "exploratory testing" - as evangelized by James Bach and others. Neither claims to eradicate issues of application quality, which most likely will persist as long as software coding involves human beings. However, along with automation of the more routine tests, these techniques form the basis for higher quality application software.
Posted November 13, 2013
We've seen a lot of progress in the nearly 50 years between 1965 and 2013: the modern smartphone and the World Wide Web, in particular, have been transformative. However, it's probably true that our current world would not astonish a time traveler from 1965 as much as 1965 would have surprised a visitor from 1915. Where are the jetpacks, flying cars, colonies on Mars? Still, it does look like we are finally going to see one of the long-awaited benefits of the future: the self-driving car.
Posted September 11, 2013
Like many of my generation, my early visions of the future were influenced by films like "2001: A Space Odyssey" and the original "Star Trek" TV series. In each of these, humans interact with computers using conversational English, posing complex questions and getting intelligent, relevant responses. So, you can imagine how primed someone like me is to hear that Google has been explicitly trying to create that Star Trek computer. At the Google I/O conference in San Francisco in May, Amit Singhal, Google senior vice president, spoke of his early childhood experiences watching "Star Trek," and his dreams of one day building that computer.
Posted July 09, 2013
Computer games have been front-runners in many important developments in the IT industry, including digital distribution, cloud storage, user-driven design, and crowdsourcing. So it's not surprising that game developers are in a leading position when it comes to big data analytics and machine learning. Online games can monitor all aspects of player behavior. Just as Google refines your search results by analyzing your previous searches and comparing them with the billions of searches performed every day, online game companies can modify game behavior to deliver a better game experience by observing what works - and what doesn't - in the gamer's world.
Posted May 09, 2013
Google's dominance of internet search has been uncontested for more than 12 years now. Before Google, search engines such as AltaVista indexed web pages and allowed for keyword search with an interface and functionality superficially similar to that provided by Google. However, these first-generation search engines provided relatively poor ordering of results. Because an internet search returned pages ranked by the number of times the search term appeared on the page, unpopular or irrelevant sites were just as likely to achieve top rank as popular sites.
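The weakness of that first-generation approach is easy to demonstrate. Here is a minimal sketch, using hypothetical page data, of pure term-frequency ranking - the scheme described above - showing how a keyword-stuffed page outranks a genuinely relevant one:

```python
def term_frequency_rank(pages, term):
    """Rank pages purely by raw occurrences of `term`, first-generation style."""
    return sorted(pages, key=lambda p: p["text"].lower().count(term), reverse=True)

# Hypothetical corpus: one useful page, one keyword-stuffed spam page.
pages = [
    {"url": "popular-site.example", "text": "databases explained clearly and well"},
    {"url": "spam-site.example",    "text": "databases databases databases databases"},
]

ranked = term_frequency_rank(pages, "databases")
print([p["url"] for p in ranked])  # the keyword-stuffed page comes out on top
```

PageRank's insight was to rank pages by the link structure of the web rather than by term counts alone, making this kind of manipulation far harder.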
Posted March 14, 2013
Coverage of Windows 8 has understandably focused on the revolutionary Metro interface. Many believe that this new interface, while fine for tablets and phones, is a step backwards for desktop productivity. By forcing users to switch between two modes of operation - desktop and Metro - Windows 8 diminishes productivity and imposes a steep learning curve on new users. The Metro interface itself supports only very limited multitasking, so serious work must often be done in the traditional Windows desktop. Microsoft implicitly acknowledges these limitations by providing the latest version of Microsoft Office, not in Metro format, but as traditional "desktop" applications.
Posted January 03, 2013
Five years ago, Radio Frequency ID (RFID) seemed poised to revolutionize commerce. Way back in 2003, Wal-Mart announced that it would be requiring that RFID tags - so-called "electronic barcodes" - be attached to virtually all merchandise. Many - myself included - became convinced that the Wal-Mart directive would be the tipping point leading to universal adoption of RFID tags in consumer goods and elsewhere.
Posted November 13, 2012
The first computer program I ever wrote (in 1979, if you must know) was in the statistical package SPSS (Statistical Package for the Social Sciences), and the second computer platform I used was SAS (Statistical Analysis System). Both of these systems are still around today—SPSS was acquired by IBM as part of its BI portfolio, and SAS is now the world's largest privately held software company. The longevity of these platforms—they have essentially outlived almost all contemporary software packages—speaks to the perennial importance of data analysis to computing.
Posted September 11, 2012
Seriously chronic geeks like me usually were raised on a strong diet of science fiction that shaped our expectations of the future. Reading Heinlein and Asimov as a boy led me to expect flying cars and robot servants. Reading William Gibson and other "cyberpunk" authors as a young man led me to expect heads-up virtual reality glasses and neural interfaces. Flying cars and robot companions don't seem to be coming anytime soon, but we are definitely approaching a world in which virtual - or at least augmented - reality headsets and brain control interfaces become mainstream.
Posted July 11, 2012
Websites such as MySpace, Facebook, and LinkedIn have brought social networking and the concept of online community to a huge cross-section of our society. Penetration and usage of these platforms may vary depending on demographic (age and geography, in particular), but no one can dispute the impact of Facebook and Twitter on both everyday life and society in general.
Posted May 09, 2012
Knowing how your customers feel about your products is arguably as important as actual sales data but often much harder to determine. Traditionally, companies have used surveys, focus groups, customer visits, and similar active sampling techniques to perform this sort of market research. Opposition or lack of faith in market research takes a number of forms. Henry Ford once said, "If I had asked people what they wanted, they would have said faster horses," while Steve Jobs said, "People don't know what they want until you show it to them." The real problem with market research is more pragmatic: It's difficult and expensive to find out what people think.
Posted March 07, 2012
Along with thousands of IT professionals, I was in the San Francisco Moscone Center main hall last October listening to Larry Ellison's 2011 Oracle OpenWorld keynote. Larry can always be relied upon to deliver an entertaining presentation - a unique blend of technology insights and amusingly disparaging remarks about competitors.
Posted January 11, 2012
My 20-year-old daughter recently remarked that Facebook isn't as cool as it used to be. Sure, everyone has to be on Facebook, but that very ubiquity removes its mystique. The recently released Google+ is clearly targeted at Facebook and adds some features - particularly "Circles" - that are not available on Facebook. Facebook's dominance may be indisputable today, but it is not guaranteed for all time. If I were Mark Zuckerberg, I would fear losing my cool status more than anything else.
Posted November 10, 2011
The term "machine learning" evokes visions of massive super computers that eventually turn on and enslave humanity - think SkyNet from Terminator or HAL from 2001: A Space Odyssey. But the truth is that machine learning algorithms are common in web applications that we use every day and have a growing relevance to enterprise applications.
Posted September 14, 2011
One of the funniest moments in the classic Star Trek motion pictures is the scene when the engineer "Scotty" - who has traveled back in time to the 1980s with his comrades - attempts to use a computer. "Computer!" he exclaims, attempting to initiate a dialogue with the PC. Embarrassed, a contemporary engineer hands him a mouse. "Aha," says Scotty, who then holds the mouse to his mouth only to again exclaim, "Computer!" The idea that computers in the future would be able to understand human speech was common a few decades ago. Speech generation and recognition are so fundamental to the human experience that we tend to underestimate the incredible complexity of the information processing that makes them possible.
Posted July 07, 2011
The rise of "big data" solutions - often involving the increasingly common Hadoop platform - together with the growing use of sophisticated analytics to drive business value - such as collective intelligence and predictive analytics - has led to a new category of IT professional: the data scientist.
Posted May 12, 2011
When computers first started to impinge on everyday life, science fiction authors and society in general had high expectations for "intelligent" systems. Isaac Asimov's "I, Robot" series from the 1940s portrayed robots with completely human intelligence and personality, and, in the 1968 movie "2001: A Space Odyssey," the onboard computer HAL (Heuristically programmed ALgorithmic computer) had a sufficiently human personality to suffer a paranoid break and attempt to murder the crew!
Posted March 09, 2011
The NoSQL acronym suggests it's the SQL language that is the key difference between traditional relational and newer non-relational data stores. However, an equally significant divergence is in the NoSQL consistency and transaction models. Indeed, some have suggested that NoSQL databases would be better described as "NoACID" databases - since they avoid the "ACID" transactions of the relational world.
Posted January 07, 2011
Oracle CEO Larry Ellison has been notoriously critical of cloud computing - or at least of the way in which the term "cloud" has been applied. He often has expressed his frustration when "cloud" is applied to long established patterns such as software as a service (SaaS), especially when this is done by Salesforce.com. While there's widespread agreement that "cloud" has become a faddish, over-hyped and often abused term, some have speculated that Ellison's obvious frustration has been fueled by Oracle's inability to fully engage in the cloud computing excitement prior to the conclusion of the Sun acquisition.
Posted November 09, 2010
The promises of public cloud computing - pay as you go, infinite scale and outsourced administration - are compelling. However, for most enterprises, security, geography and risk mitigation concerns make private cloud platforms more desirable. Enterprise customers like the idea of on-demand provisioning, but are often unwilling to accept the performance, security and risk drawbacks of moving applications to remote hardware that is not under their direct control.
Posted September 07, 2010
In biology, we are taught that survival favors diversity. Organisms that reproduce without variation die out during periods of rapid change, while organisms that show variation in features tend to survive and adapt. Likewise, ecosystems consisting of relatively few homogeneous species thrive only when conditions stay static. Does IT diversity create a competitive advantage in the business application ecosystem? Predictably, large vendors with vertically integrated stacks argue that mixing software components is a Bad Thing. These vendors claim that reducing the diversity in the application stack leads to better efficiency and maintainability.
Posted July 12, 2010
Although VMware continues to hold the majority share of the commercial virtualization market, other virtualization technologies are increasingly significant, though not necessarily as high profile. Operating system virtualization - sometimes called partial virtualization - allows an operating system such as Solaris to run multiple partitions, each of which appears to contain a distinct running instance of the same operating system. However, these technologies cannot be used to host different operating system versions, making them less appealing to enterprises seeking to consolidate workloads using virtualization.
Posted June 07, 2010
Until recently, IT professionals have been conditioned to regard response time, or throughput, as the ultimate measure of application performance. It's as though we were building automobiles and only concerned with faster cars and bigger trucks. Yet, just as the automotive industry has come under increasing pressure to develop more fuel-efficient vehicles, so has the IT industry been challenged to reduce the power drain associated with today's data centers.
Posted May 10, 2010
Spreadsheets, which have long been a disruptive force to enterprise IT, to some extent are the "killer" applications that helped drive the adoption of personal computers (PCs) in the enterprise. Spreadsheet products such as Lotus 1-2-3 - and early versions of Excel on the Mac - saw rapid adoption by business users. Inevitably, these users pushed the boundaries of the spreadsheet model, using spreadsheets as databases, and even to develop simple business applications. In the late 1980s, it was typical to see corporate IT rolling out massively expensive mainframe-based solutions, while departmental users got their real work done on spreadsheets running on cheap PCs.
Posted April 07, 2010
Open source applications were somewhat niche at the beginning of the decade but are now clearly mainstream. Credible open source alternatives now exist for almost every category of application, as well as for every component of the application stack.
Posted March 04, 2010
In 1995, Netscape founder Marc Andreessen famously claimed that applications of the future would run within a web browser, relegating the role of the operating system - Windows, in particular - to "a poorly debugged set of device drivers." Fifteen years later, we can see that although rich applications such as Microsoft Office are still dominant, the web browser has become a platform that can deliver almost any conceivable type of business or consumer application.
Posted February 09, 2010
Google's first "secret sauce" for web search was the innovative PageRank link analysis algorithm which successfully identifies the most relevant pages matching a search term. Google's superior search results were a huge factor in their early success. However, Google could never have achieved their current market dominance without an ability to reliably and quickly return those results. From the beginning, Google needed to handle volumes of data that exceeded the capabilities of existing commercial technologies. Instead, Google leveraged clusters of inexpensive commodity hardware, and created their own software frameworks to sift and index the data. Over time, these techniques evolved into the MapReduce algorithm. MapReduce allows data stored on a distributed file system - such as the Google File System (GFS) - to be processed in parallel by hundreds of thousands of inexpensive computers. Using MapReduce, Google is able to process more than a petabyte (one million GB) of new web data every hour.
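The MapReduce model described above can be illustrated with a toy, single-process word-count sketch: a map phase emits (key, value) pairs, a shuffle groups values by key, and a reduce phase aggregates each group. This is only an illustration of the programming model - Google's real implementation distributes these phases across thousands of machines reading from GFS.

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word occurrence.
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values - here, a simple sum.
    return {key: sum(values) for key, values in groups.items()}

docs = ["the web is big", "the web is distributed"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'the': 2, 'web': 2, 'is': 2, 'big': 1, 'distributed': 1}
```

Because the map and reduce functions are independent per document and per key, each phase can be partitioned across as many machines as are available - which is what lets Google scale the same pattern to petabytes.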
Posted January 11, 2010
When a company like Microsoft talks about the future of computing, you can expect a fair bit of self-serving market positioning - public software companies need to be careful to sell a vision of the future that doesn't jeopardize today's revenue streams. But, when a company like Microsoft releases a new version of its fundamental development framework - .NET, in this case - you can see more clearly the company's technical vision for the future of computing.
Posted December 14, 2009
There's an old but clever internet parody describing the "Built-in Orderly Organized Knowledge device (BOOK)." This device is described as a "revolutionary breakthrough in technology" that is compact and portable, never crashes and supports both sequential and indexed information access. Though satirical, the article makes excellent points: the printed book is indeed an information technology device, arguably the oldest in widespread use today.
Posted November 11, 2009
The idea of "virtual" reality—immersive computer simulations almost indistinguishable from reality—has been a mainstay of modern "cyberpunk" science fiction since the early 1980s, popularized in movies such as The Thirteenth Floor and The Matrix. Typically, a virtual reality environment produces computer simulated sensory inputs which include at least sight and sound, and, perhaps, touch, taste and smell. These inputs are presented to the user through goggles, earphones and gloves or—in the true cyberpunk sci-fi—via direct brain interfaces.
Posted October 13, 2009
Google introduced the MapReduce algorithm to perform massively parallel processing of very large data sets using clusters of commodity hardware. MapReduce is a core Google technology and key to maintaining Google's website indexes.
Posted September 14, 2009
Attendees at the O'Reilly Velocity conference in June were treated to the unusual phenomenon of a joint presentation by Google and Microsoft. The presentation outlined the results of studies by the two companies on the effects of search response time. Aside from the novelty of Microsoft-Google cooperation, the presentation was notable both in terms of its conclusions and its methodology.
Posted August 14, 2009
Predictive Analytics - sometimes referred to as Predictive Data Mining - is a branch of Business Intelligence that attempts to use historical data to make predictions about future events. At its simplest, predictive analytics utilizes statistical techniques, such as correlation and regression, which many of us have encountered in college or even high school. Correlation analysis determines whether there is a statistically significant relationship between two variables. For instance, height and age are highly correlated, while IQ and height are very weakly correlated. Regression attempts to find an equation relating two or more variables, so that you can predict one from the others.
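Both techniques can be sketched in a few lines. This example, using made-up age/height data for a hypothetical sample of children, computes the Pearson correlation coefficient and then fits a simple least-squares regression line to predict height from age:

```python
def mean(xs):
    return sum(xs) / len(xs)

def correlation(xs, ys):
    # Pearson correlation: covariance scaled by the spread of each variable.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5  # sqrt of summed squared deviations
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def linear_regression(xs, ys):
    # Ordinary least squares for a single predictor: y = slope * x + intercept.
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

ages    = [2, 4, 6, 8, 10, 12]           # years (hypothetical sample)
heights = [86, 102, 115, 128, 138, 149]  # centimeters

r = correlation(ages, heights)
slope, intercept = linear_regression(ages, heights)
predicted = slope * 7 + intercept  # predicted height at age 7
print(round(r, 3), round(predicted, 1))
```

As the teaser suggests, age and height in a growing child are very strongly correlated (r is close to 1 here), which is exactly what makes the regression line useful for prediction.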
Posted July 13, 2009
Virtualization has changed the IT landscape more dramatically than perhaps any other technology introduced over the past decade. Virtualized environments are omnipresent in the modern data center due to their economic advantages in hardware consolidation and manageability.
Posted June 15, 2009