Trends and Applications

There is no doubt that virtualization is radically changing the shape of IT infrastructure, transforming the way applications are deployed and services delivered. Databases are among the last of the tier 1 applications to be hosted on virtual servers, but the past year has seen a surge in production Oracle, SQL Server and other databases running on VMware platforms. For all the benefits of virtualization, including cost-effectiveness, there are some impacts on the IT staff involved. Unfortunately for DBAs, virtualization often means losing control of, and visibility into, their systems, which can ultimately hinder their ability to deliver database-oriented business solutions. While in the past DBAs had full visibility into the physical servers hosting their databases, the virtualization layers and the tools to manage them are typically off-limits to them. So while all the excitement of late has centered on VMware and other virtual machine systems, DBAs have valid reason for skepticism.

Posted July 27, 2011

Given all of the recent discussion around big data, NoSQL and NewSQL, this is a good opportunity to visit a topic I believe will be (or should be) at the forefront of our minds for the next several years: high-velocity transactional systems. Let's start with a description of the problem. High-velocity transactional applications have input streams that can reach millions of database operations per second under load. To complicate matters, many of these systems simply cannot tolerate data inconsistencies.

Posted July 27, 2011

Oracle has introduced the Oracle Exadata Storage Expansion Rack to offer customers a cost-effective way to add storage to an Oracle Exadata Database Machine. "There are customers, earlier Exadata Database Machine customers, that have now started to fill up the disks that they have on the Database Machine and they are starting to look for ways to expand their storage capacity, and so this is going to be really welcome for them," says Tim Shetler, vice president of Product Management, Oracle.

Posted July 27, 2011

Sybase, an SAP company, has announced the general availability of Sybase IQ 15.3, which aims to help enterprise IT departments overcome the scalability limitations of many data warehouse approaches. By implementing a business analytics information platform that allows sharing of computing and data resources through the new Sybase IQ PlexQ technology, the company says enterprises can break down user and information silos to increase analytics adoption throughout their entire organization. There is a lot of talk about big data, but how to manage it and analyze it is only half the problem, observes David Jonker, senior product marketing manager of Sybase IQ. "The other half is how do you make it more pervasive throughout the enterprise and from our perspective that is where a lot of the existing data warehousing solutions fall down."

Posted July 27, 2011

The big data playing field grew larger with the formation of Hortonworks and HPCC Systems. Hortonworks is a new company consisting of key architects and core contributors to the Apache Hadoop technology pioneered by Yahoo. In addition, HPCC Systems, which has been launched by LexisNexis Risk Solutions, aims to offer a high performance computing cluster technology as an alternative to Hadoop.

Posted July 27, 2011

Time series data is a sequence of data points, typically measured at successive times and often spaced at uniform intervals. Time-stamped data can be analyzed to extract meaningful statistics or other characteristics, and it can be used to forecast future events based on known past events. Time series data enables applications such as economic forecasting, census analysis and forecasting, fleet management, stock market analysis, and smart energy metering. Because it is time-stamped, time series data has a special internal structure that differs from relational data. Additionally, many applications, such as smart metering, store data at such frequent intervals that massive storage capacity is required. For these reasons, it is not sufficient to manage time series information using the traditional relational approach of storing one row for each time series entry.
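To make the storage pressure concrete, here is a minimal sketch in Python, using SQLite purely for illustration (the table layouts and the smart-meter scenario are hypothetical, not any particular product's design). It contrasts the one-row-per-entry relational approach with a packed one-row-per-day layout that exploits the regular time structure:

```python
# A minimal sketch contrasting the classic one-row-per-reading relational
# layout with a packed, one-row-per-day layout for smart-meter readings
# taken every 15 minutes (96 readings per meter per day).
import sqlite3
import struct

conn = sqlite3.connect(":memory:")

# Classic relational approach: one row per time series entry. At 96
# readings/meter/day, a million meters produce ~96 million rows per day.
conn.execute("CREATE TABLE readings_rowwise (meter_id INTEGER, ts TEXT, kwh REAL)")

# Packed approach: one row per meter per day; the 96 readings are stored
# as a fixed-order binary array, exploiting the uniform time intervals.
conn.execute("CREATE TABLE readings_packed (meter_id INTEGER, day TEXT, kwh_blob BLOB)")

day_of_readings = [0.25 + 0.01 * i for i in range(96)]  # 15-minute intervals
conn.execute(
    "INSERT INTO readings_packed VALUES (?, ?, ?)",
    (42, "2011-07-01", struct.pack("96d", *day_of_readings)),
)

# Because intervals are uniform, slot k of a day is an offset calculation,
# not a per-row index lookup.
blob = conn.execute(
    "SELECT kwh_blob FROM readings_packed WHERE meter_id = 42 AND day = '2011-07-01'"
).fetchone()[0]
print(struct.unpack("96d", blob)[8])  # the 02:00-02:15 reading
```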

Posted July 07, 2011

Today, we operate in a global economy at internet speed. Globalization of our workforce has shifted the way work gets done. The explosion of wireless and edge technology has raised the expectations of consumers, who are more informed, educated, and knowledgeable about products and services. This changing landscape places immense pressure on business applications in organizations worldwide. Critical application outages caused by software defects can cost the business millions of dollars in revenue for every hour of downtime.

Posted July 07, 2011

As the economy shifts to expansion mode, and businesses start hiring again, a familiar challenge is rearing its head. Companies are scrambling to find the talent needed to effectively run, maintain, and expand their technology platforms. This is not a new problem by any means, but this time around, it is taking on a greater urgency, as just about every organization relies on information technology to be competitive and responsive to growth opportunities. A new survey of 376 employers finds a majority depend on the educational sector - universities and colleges - to provide key IT skills, often in conjunction with their own internal training efforts. However, few of the executives and managers hiring out of colleges are entirely satisfied with the readiness of graduates.

Posted July 07, 2011

Representing a continued expansion of its big data analytics portfolio, IBM has introduced a new addition to the Netezza product family of analytics appliances that is designed to help organizations uncover patterns and trends from extremely large data sets. The appliance is the first to be delivered by IBM since it acquired Netezza in November 2010. According to IBM, using the new appliance, businesses can now more easily sift through petabytes of data, including banking and mobile phone transactions, insurance claims, electronic medical records, and sales information, and they can also analyze this information to reveal new trends on consumer sentiment, product safety, and sales and marketing effectiveness. "This new appliance takes the scalability to a completely new dimension," says Razi Raziuddin, senior director of product management at IBM Netezza.

Posted June 24, 2011

BMC Software has acquired the portfolio of IMS (Information Management System) database products and customers from NEON Enterprise Software, a mainframe management software company. According to BMC, adding NEON Enterprise Software's IMS products to its existing offerings will satisfy the critical need organizations have for industry-leading, high-performance solutions that not only help manage, optimize and support IMS environments, but also reduce operating costs and improve business service delivery. All told, BMC gains more than 20 products through the acquisition, says Robin Reddick, director of MSM Solutions Marketing at BMC Software.

Posted June 24, 2011

EMC Corporation, a provider of storage and infrastructure solutions, announced it will be shipping a data warehouse appliance that leverages the Apache Hadoop open-source software used for data-intensive distributed applications. The company's high-performance, data co-processing Hadoop appliance - the Greenplum HD Data Computing Appliance - integrates Hadoop with the EMC Greenplum Database, allowing the co-processing of both structured and unstructured data within a single solution. EMC also says the solution will run either Hadoop-based EMC Greenplum HD Community Edition or EMC Greenplum HD Enterprise Edition software.

Posted June 24, 2011

Oracle has announced the availability of Oracle JDeveloper 11g Release 2. Part of Oracle Fusion Middleware 11g, Oracle JDeveloper is a free, full-featured IDE. "It's a very broad and very productive environment targeted toward Oracle developers in the Java environment," Bill Pataky, vice president of product management, Oracle, tells 5 Minute Briefing. The new release enhances the overall development experience by delivering an improved IDE, including support for new technologies and standards, as well as updates to Oracle Application Development Framework (ADF) 11g. "Java developers will immediately notice the difference," Pataky adds.

Posted June 24, 2011

Composite Software has introduced Composite 6, a new version of its flagship data virtualization software that provides "big data" integration support for the Cloudera Distribution including Apache Hadoop (CDH), IBM Netezza and HP Vertica data sources. In addition, Composite 6, which is now completing beta testing and will be commercially available in July, includes performance optimizations, cache enhancements, new data governance capabilities and ease-of-use features. "Data virtualization is emerging as an ideal solution for managing today's complex data integration challenges," says Jim Green, CEO of Composite Software.

Posted June 24, 2011

HP has unveiled the HP IT Performance Suite, a new suite of software which it says is designed to rationalize, measure and improve IT performance. The suite gives CIOs insight from across a comprehensive range of solutions to manage and optimize application development, infrastructure and operations management, security, information management, and financial planning and administration. Each product in the HP Software portfolio improves the performance of the discrete IT functions it addresses, while a new IT Executive Scorecard helps technology executives optimize overall IT investments and outcomes.

Posted June 24, 2011

When you think of mission-critical services, perhaps none is as critical as electrical service. Not much can happen in modern businesses, government offices, or even homes without it. Central Vermont Public Service (CVPS) is the largest electric company in Vermont. More than 159,000 customers in 163 communities rely on the electrical service CVPS provides. And, according to J.D. Power and Associates, a global marketing and surveying company, for overall customer satisfaction, CVPS continues to rank in the top tier of utilities in the eastern region, more than 50 points above the regional average.

Posted June 08, 2011

Is the day of reckoning for big data upon us? To many observers, the growth in data is nothing short of incomprehensible. Data is streaming into, out of, and through enterprises from a dizzying array of sources - transactions, remote devices, partner sites, websites, and nonstop user-generated content. Not only are the data stores resulting from this information driving databases to scale into the terabyte and petabyte range, but they occur in an unfathomable range of formats as well, from traditional structured, relational data to message documents, graphics, videos, and audio files.

Posted June 08, 2011

Jaded IT professionals and managers, as well as market analysts, weary and wary from decades of overblown analyst claims about emerging new technologies, "paradigm shifts" and "enchanted quadrants," will take heart in a new series of Unisphere Research studies being released over the next several months. The first of these, "The Post-Relational Reality Sets In: 2011 Survey on Unstructured Data," has just been released, and tackles the current dimensions and impact of unstructured data on enterprise IT practices, technologies, policies, purchasing priorities and the evaluation of new technologies.

Posted June 08, 2011

The service management world of today is all about linking business services to the underlying IT infrastructure, creating an effective bridge between the business and technology. In theory, this provides a clear window into the IT environment to increase accountability, productivity and efficiency. Effective service management also provides business context, so IT can take action to avert service-impacting events by understanding business priority. However, current business service management (BSM) does not provide enough guidance about how to manage services proactively and effectively. This issue is now more important than ever, because on the horizon lurks an exciting new arena for service management - virtualization and cloud computing.

Posted May 25, 2011

The Oracle Exadata Database Machine X2-2 and X2-8 with the Solaris option began shipping just this month. Now in its third generation, the Database Machine combines all the components to create what the company describes as the best platform for running the Oracle Database. Here, Tim Shetler, vice president of Product Management, Oracle, talks about the performance innovations that differentiate Oracle's offering, how customers are using the system today for business advantage, and what's ahead.

Posted May 25, 2011

SnapLogic, a provider of application integration software, has introduced a solution aimed at enabling easy connection and reliable large data integration between business applications, cloud services, social media and Hadoop. The product, called SnapReduce, transforms SnapLogic data integration pipelines directly into MapReduce tasks, making Hadoop processing more accessible and resulting in optimal Hadoop cluster utilization. "This is Hadoop for humans," says Gaurav Dhillon, CEO of SnapLogic.

Posted May 25, 2011

SAP AG and Sybase, Inc., an SAP company, have announced plans to make the enterprise resource planning (ERP) application SAP ERP the first SAP Business Suite application running on Sybase Adaptive Server Enterprise (ASE). The announcement was made at SAPPHIRE NOW in Orlando where pilot customers also showcased how they are using SAP ERP on Sybase ASE. In combining SAP applications with Sybase technology, along with "harmonized" customer services and support, the companies say they will be able to offer organizations a new database option for running SAP applications and accessing critical information, providing efficiency gains and cost reductions.

Posted May 25, 2011

Informatica Corporation has announced Informatica Cloud Summer 2011, a major new release of its cloud integration service. The release enables universal cloud integration and unified hybrid deployment across both on-premise and cloud environments. It also provides ease-of-use features that simplify learning, deploying, administering, managing and configuring cloud integration, as well as enterprise-class functionality, including fine-grained access controls and delegated administration.

Posted May 25, 2011

Pervasive Software Inc., a provider of data integration solutions, has launched an online marketplace and community that is intended to fill a market void by providing simplification, ease of access and a public marketplace for data integration products, solutions, connectors, plug-ins and templates. Pervasive Galaxy is intended to serve as a community platform for data integration ecosystems to enable simple, profitable convergence between business-to-business integration producers and consumers through faster market and social connection. "It's Bazaar Voice, iTunes, and App Store, all rolled into one mashup," Ron Halversen, director of integration marketing, tells 5 Minute Briefing. "Galaxy is an app exchange for connectors."

Posted May 25, 2011

Big data provides new opportunities to improve customer care, unearth business insights, control operational costs, and in some cases, enable entirely new business models. By having access to larger and broader data sets, you can improve forecasts and projections for the business. A healthcare organization can conduct longitudinal analysis across years of data for patients treated for heart attacks in order to improve care and speed time to recovery. A retailer can conduct deeper analysis of buying behavior during recessionary times if it has access to large data sets collected during the last economic downturn. Additionally, organizations across many sectors, such as communications, financial services and utilities, face significant regulatory and legal requirements for retaining and providing fast access to historical data for inquiries, audits and reporting.

Posted May 12, 2011

North American businesses are collectively losing $26.5 billion in revenue each year as a result of slow recovery from IT system downtime, according to a recent study. To protect against unexpected outages, IT organizations attempt to prepare by creating redundant backup systems, duplicating every layer in their existing infrastructure and preparing elaborate disaster recovery processes. This approach is expensive and only partly effective, as demonstrated by the string of notable outages, and can be seen, at best, as a way to minimize downtime. Major web-scale companies, such as Google and Facebook, have figured out how to scale out their application stacks horizontally rather than scale up vertically.

Posted May 12, 2011

Through the many changes in IT over the years, one constant has always been a concern for performance. With database systems, this is especially true. Even with the many advances in relational database technology, SQL performance remains a key concern for IT professionals and management. Writing SQL for performance is one of the single biggest opportunities for professionals to contribute efficient, effective, cost-saving deliverables to projects. It can also spare you from having to respond to an urgent performance problem in production. To a considerable extent, a person can choose whether they are running because it is lunch or whether they are running because they are lunch, by following a few simple techniques for writing SQL for performance.
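By way of illustration, here is a minimal sketch of one such technique, using SQLite from Python (the orders table, index, and data are hypothetical): the same question asked two ways, where only the "sargable" version - a bare range predicate on the indexed column - lets the optimizer use the index.

```python
# A minimal sketch of a classic write-SQL-for-performance technique: keep
# predicates sargable so the optimizer can use an index instead of scanning
# every row. Table and column names here are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, order_date TEXT)")
conn.execute("CREATE INDEX idx_orders_date ON orders (order_date)")
conn.executemany(
    "INSERT INTO orders (order_date) VALUES (?)",
    [(f"2011-{m:02d}-15",) for m in range(1, 13) for _ in range(1000)],
)

# Non-sargable: wrapping the indexed column in a function hides it from the
# index, forcing a full table scan.
slow = "SELECT COUNT(*) FROM orders WHERE strftime('%Y-%m', order_date) = '2011-07'"

# Sargable: an open-ended range on the bare column lets the index do the work.
fast = ("SELECT COUNT(*) FROM orders "
        "WHERE order_date >= '2011-07-01' AND order_date < '2011-08-01'")

for label, sql in (("non-sargable", slow), ("sargable", fast)):
    plan = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    print(label, "->", plan[0][-1])  # SCAN vs. SEARCH ... USING INDEX
```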

Posted May 12, 2011

River Parishes Community College (RPCC) is an open-admission, 2-year, public institution. It is located in the small Ascension Parish town of Sorrento in what is known as the River Parishes region of the state because of the parishes' proximity to the Mississippi River. RPCC recently implemented a new self-service student portal based on Revelation Software's Web 2.0 toolkit, OpenInsight for Web (O4W). The new portal allows students to accomplish a range of tasks on their own, such as scheduling classes, without requiring assistance from school administrators.

Posted May 12, 2011

Melissa Data Corp, a developer of data quality and address management solutions, has announced that customers can now access detailed property and mortgage data on more than 140 million U.S. properties by using the company's new WebSmart Property Web Service. The comprehensive solution is available for sourcing nearly any information on a given property - from parcel and owner information to square footage to zoning and more. The information provided by the service is all publicly available information that Melissa Data is compiling from various databases, Greg Brown, director of marketing for Melissa Data, tells DBTA. The service is expected to be particularly useful for property investors, mortgage and refinancing lenders, developers, real estate professionals, risk managers, insurance agencies, and companies looking to target market products and services to homeowners.

Posted April 27, 2011

"Big data" has emerged as an often-used catch phrase over the past year to describe exponentially growing data stores, and increasingly companies are bolstering their product lines to address the challenge. But helping companies manage and derive benefit from the onslaught of data has consistently been the focus for MarkLogic Corporation, whose flagship product, MarkLogic Server, is a purpose-built database for unstructured information. The company recently announced Ken Bado as its new chief executive officer and a member of the board of directors. In terms of new directions for the company as he takes the reins, Bado says, "First of all, you are going to see a much more aggressive message from MarkLogic with respect to unstructured and specifically ‘big data.' " In addition, there will also be changes seen in the company's go-to-market approach, he says. "Right now, our business model is a direct model through an enterprise-type selling machine, that has been quite effective in getting us to where we are, but there are three other levels that we need to address pragmatically to help us build scale and grow."

Posted April 27, 2011

Application Security, Inc. (AppSec), a provider of database security, risk and compliance (SRC) solutions for the enterprise, and Securosis, a security research and analysis firm, have partnered to provide what they are describing as the industry's first comprehensive guide to quantifying enterprise database security processes. "What we wanted to do was go to some of the experts in the industry who have not only been analysts but also lived in this environment and have them systematically go through the process and document everything from organizational considerations down to specific steps, and then provide a means to quantify the man-hours, the expenses, and the technologies associated with each step in this process," Thom VanHorn, vice president of marketing, AppSec, tells DBTA.

Posted April 27, 2011

Database Trends and Applications (DBTA) met with Oracle Applications Users Group (OAUG) president Mark C. Clark during the recent COLLABORATE 11 conference in Orlando, Florida. Clark, who became president of the users group earlier this year, is a senior partner for O2Works, which specializes in configuring the Oracle Applications to adhere to best practices and to streamline business operations. Now, more than 2 years following the financial meltdown of late 2008, it is clear that more users are again out attending COLLABORATE. "We have gone through a period of very tight IT budgets, a 2-to-4 year phase of maintenance. Everybody I am talking to is looking at opportunities to do projects this year. And if they aren't doing it this year, they are planning for it next year," said Clark, commenting on the renewed enthusiasm for attending the conference.

Posted April 27, 2011

Big data is one of those terms that is quickly gaining momentum among technologists. If you watch closely, you'll notice that everyone seems to have an opinion on what "big data" means and wants to own the term. While industry experts debate what to name the problem, in 2011 companies will be tasked with bringing big data from back-office offline analytics to customer-facing, 24x7 production systems. Customers are paying attention, and they need solutions that support not only massive data sets but also mixed information types, extended feature sets, and real-time processing - without requiring technical teams to hand-code these systems from the ground up. Here are five big data solution trends we see developing as our customers work hard to solve "big data" or "big information" problems.

Posted April 05, 2011

Organizations today are beginning to understand that, second to their employees, data is their most critical asset. Consequently, they need to approach data management as they approach capital management - by employing disciplined methodologies utilizing automation and actionable intelligence. Once employed, these methodologies secure and protect data in a scalable and repeatable fashion, without requiring additional intervention from IT personnel or disturbing business processes. In the age of information overload, with the explosive growth of unstructured and semi-structured data, best practices help organizations of all sizes effectively manage, control and protect this valuable asset.

Posted April 05, 2011

On the surface, the idea of using a single source integrator to implement SAP's Enterprise Resource Planning (ERP) software seems ideal. The appeal lies in the potential to make an incredibly complex project appear simple. The single source model promises the ease of having only one vendor to pay, only one team to work with and a single source of accountability should things go wrong. Yet what works in theory doesn't always bear out in real world applications.

Posted April 05, 2011

A member of the Oracle Applications Users Group (OAUG) since 1992, Mark C. Clark recently took over as president of the organization. He spoke with DBTA about what's in store for members at the annual Oracle users conference COLLABORATE as well as for the year ahead. Helping members prepare for an upgrade to Oracle Applications Release 12, providing additional smaller, more targeted regional events, and a continued emphasis on a return to the basics with networking and education are at the top of his to-do list for 2011.

Posted March 23, 2011

McAfee has announced its intention to acquire Sentrigo, a privately owned provider of database security and compliance, assessment, monitoring and intrusion prevention solutions. In addition, McAfee has announced a comprehensive database security solution to protect business-critical databases without impacting performance and availability. McAfee's coordinated approach, based on the Security Connected initiative launched in October 2010, involves protecting a company's most important data assets from the network to the server to the database itself. The result is data protected in every state (in motion, at rest, and in use) via access controls, network security, server security, data protection and encryption - all centrally managed to minimize risk and maximize efficiency.

Posted March 23, 2011

Revolution Analytics, a commercial provider of software and services based on the open source R project for statistical computing, and IBM Netezza announced they are teaming up to integrate Revolution R Enterprise and the IBM Netezza TwinFin Data Warehouse Appliance. According to the vendors, this will enable customers to directly leverage the capabilities of the open source R statistics language as they run high-performance predictive analytics from within data warehouse platforms.

Posted March 23, 2011

Despite highly publicized data breaches - ranging from the loss of personally identifiable information such as credit card and Social Security numbers at major corporations to the WikiLeaks scandal involving sensitive U.S. Department of Defense and U.S. State Department information - and despite the "alphabet soup" of compliance regulations, data around the globe remains at grave risk, according to John Ottman, president and CEO of Application Security, Inc. Ottman has written "Save the Database, Save the World" to focus attention on the problem and present steps toward its solution. While super-secure networks are important, they alone are far from enough; a layered data security strategy with a commitment to "protecting data where it lives - in the database" must be pursued to avoid risks posed by outside hackers as well as authorized users, says Ottman. A stronger government hand may be needed as well to defend "the critical infrastructure that operates in the private sector," he suggests.

Posted March 23, 2011

Data continues growing rapidly, flowing into enterprises from traditional sources as well as new pipelines fueled by web and social media. Often presented in a range of formats and structures, this data onslaught phenomenon has come to be known as "big data." Companies, educational institutions, and government agencies are striving to meet the management challenge of this data deluge as well as mine this wealth of information for business advantage. In this special section, DBTA asks key vendors to explain their strategies for enabling customers to better handle ever-increasing data stores.

Posted March 09, 2011

NoSQL Option: Triplestore Databases

Posted March 09, 2011

The recent public release of thousands of leaked U.S. State Department cables by WikiLeaks continues to shake up governments across the world. The information captured and sent out into the wild is not only an embarrassment to U.S. government officials whose candid assessments of foreign leaders were exposed; it also demonstrates that an organization with some of the tightest and most comprehensive data security technologies, protocols, and policies in the world unknowingly fell victim to a massive data breach. Can private corporations or smaller government agencies with less-stringent security protocols and standards expect to do any better? Securing data is tough enough, and now, with the rise of initiatives such as virtualization and cloud computing, the odds of losing control and allowing sensitive data to proliferate grow even greater.

Posted March 09, 2011

A new survey of database administrators and managers reveals that a pervasive culture of complacency hampers information security efforts, and as a result of lax practices and oversight, sensitive data is being left vulnerable to tampering and theft. While tools and technologies provide multiple layers of data security both inside and outside the firewall, organizations appear to lack the awareness and will to make security stick. The study, "Data in the Dark: Organizational Disconnect Hampers Information Security," was conducted by Unisphere Research among 761 members of PASS, the Professional Association for SQL Server, in September 2010. The survey was fielded in partnership with Application Security, Inc.

Posted March 09, 2011

A new survey of 430 members of the Oracle Applications Users Group (OAUG) reveals that organizations lack a sense of urgency about securing critical data, and that the greatest challenges to securing application and data environments are primarily organizational and budget-related. The survey was conducted in December 2010 by Unisphere Research, a division of Information Today, Inc., in partnership with Application Security, Inc. (AppSec), a provider of database security, risk and compliance solutions. According to the OAUG's 2011 Data Security report, "Managing Information in Insecure Times," 53% of respondents stated that budget was the greatest impediment holding back information security efforts. Thirty-three percent claimed that a lack of understanding of the threats prevents them from rallying support for countermeasures. And more than one-quarter of respondents cited a disconnect between IT teams and executive management as a major impediment to implementing proper security measures. The study shows a serious lack of understanding of and concern for data and application security in today's organizations, according to Thom VanHorn, vice president of global marketing at AppSec. "My take-away from the study is that there is a lack of communication, there is a lack of buy-in at the highest levels, and there is not a focus on implementing best practices," VanHorn says.

Posted February 23, 2011

Cloud and Hadoop - Keys to a Perfect Marriage Explored in New DBTA Webcast On-Demand

Posted February 23, 2011

The market for data warehouse appliances - solutions consisting of integrated software and hardware - is heating up, with new twists emerging from both established and new appliance vendors. Netezza, an early proponent of the appliance approach, was acquired in November 2010 by IBM. Here, Phil Francisco, vice president, product management and product marketing for IBM Netezza, shares his views on what's changing and what's ahead for appliances. Going forward, he anticipates that there will be very specific, vertically-oriented solutions that are built on appliances, which will take into account the kinds of data models and the kind of functionality that is required for industries such as telco, retail, and financial services.

Posted February 23, 2011

The SHARE conference convenes on February 27th in Anaheim, with an agenda packed with industry initiatives and knowledge-sharing on the latest best practices and technology trends. In this Q&A, SHARE president Janet Sun provides her vision for the IBM users group in the coming years. "We see the mainframe as the center of the enterprise IT universe. If you don't think so, try unplugging it," says Sun. "Our organization focuses on enterprise IT, and that includes the mainframe. Today's SHARE membership continues to strive to leverage advances in information technology, and SHARE is a great place to do that."

Posted February 23, 2011

Data growth is driving the use of virtualization within data centers. The virtualization evolution from server to storage to desktop is catching on at many small-to-medium size businesses, as well as at large enterprises. Aimed at providing a better end-user and administrator experience than their physical counterparts, virtualized desktops promise lower cost of acquisition and management with a highly scalable, easy-to-deploy and fully protected environment. However, with virtual desktop infrastructure (VDI) comes a set of new challenges. Chief among these are storage and server resource allocation, and data protection and recovery.

Posted February 02, 2011

IBM announced the latest release of the Informix database server, version 11.7, in October 2010, thus marking the fourth major release since Informix joined the company. One of the most exciting features in Informix 11.7 is the "Flexible Grid." Wouldn't you like to administer multiple servers as easily as a single server? Wouldn't you like to mix different hardware, operating systems, and versions of software? The Informix Flexible Grid provides this capability.

Posted February 02, 2011

DBTA Webcast on How to Improve DB Security with Virtual Patching Now Available on Demand

Posted February 02, 2011

There is a wealth of information, connections and relationships within the terabytes and petabytes of data being collected by organizations on distributed cloud platforms. Utilizing these complex, multi-dimensional relationships will be the key to developing systems to perform advanced relationship analysis. From predictive analytics to the next generation of business intelligence, "walking" the social and professional graphs will be critical to the success of these endeavors.
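As a concrete illustration, here is a minimal sketch of the kind of traversal involved (hypothetical data, plain Python, not any particular graph product's API): a breadth-first walk that finds everyone within two hops of a person - the friend-of-a-friend basis for much relationship analysis.

```python
# A minimal sketch of "walking" a social graph: breadth-first traversal
# over an adjacency list, recording each reachable person's hop distance.
from collections import deque

# Adjacency list: who is directly connected to whom (hypothetical data).
graph = {
    "ann": ["bob", "cara"],
    "bob": ["ann", "dave"],
    "cara": ["ann", "dave", "eve"],
    "dave": ["bob", "cara"],
    "eve": ["cara"],
}

def within_hops(start, max_hops):
    """Return each reachable person mapped to their hop distance from start."""
    distances = {start: 0}
    queue = deque([start])
    while queue:
        person = queue.popleft()
        if distances[person] == max_hops:
            continue  # don't expand past the hop limit
        for neighbor in graph.get(person, []):
            if neighbor not in distances:
                distances[neighbor] = distances[person] + 1
                queue.append(neighbor)
    return distances

# Ann's two-hop neighborhood: direct friends plus friends-of-friends.
print(within_hops("ann", 2))  # {'ann': 0, 'bob': 1, 'cara': 1, 'dave': 2, 'eve': 2}
```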

Posted February 02, 2011
