Trends and Applications



With many organizations having established data warehouse solutions, there comes a point where the solution needs to be extended in an unusual direction. This challenge presented itself when a European mobile virtual network operator (MVNO) wanted to add both compliance and archiving functionality to its existing data warehouse. MVNOs differentiate themselves through very competitive pricing on certain parts of the call package; ethnic MVNOs, for example, target ethnic communities by providing inexpensive calls to their home countries. Enabling such offerings and tracking costs, especially where subscribers start using the service in unexpected and possibly costly ways, is a key business function, and this is where the data warehouse is used.

Posted August 09, 2012

Every now and then, the IT industry - vendors and customers alike - takes a common problem, gives it a catchy name, and drives the buzz (and market) rapidly to create new business opportunities for everybody. It's happening again with the "big data" phenomenon. It's here, it's real, and yes - it is probably going to cost you a lot of money over the next few years.

Posted August 09, 2012

Big data is tough for enterprises to handle, and adding to the challenge is the fact that much of it is unstructured data—business documents, presentations, log files, and social media data. Respondents to a survey of 264 data managers and professionals—subscribers to Database Trends and Applications—almost unanimously agree that unstructured data is on the rise and ready to engulf their current data management systems. The trouble is, their management typically does not understand the scope of the challenge and is failing to recognize the significance of unstructured data assets to the business.

Posted August 09, 2012

Flash memory is taking the data center world by storm and creating new and innovative opportunities to challenge the status quo. This is occurring across numerous use cases: turning SQL Server databases into fraud detection powerhouses for the world's largest retailers, scaling MySQL to support the infrastructure behind a premier sports league's mobile application, and capturing massive data volumes in real time with new NoSQL stores like MongoDB. Across the board, these organizations are breaking new ground with unprecedented performance, scaling beyond what was previously possible, and slashing infrastructure spending.

Posted August 09, 2012

Oracle has introduced a new migration tool that aims to make it easier for users to migrate data from SQL Server to MySQL. The new migration tool is integrated into MySQL Workbench, which allows the visual design, development and administration of MySQL Databases. According to Oracle, with MySQL Workbench, SQL Server developers and DBAs can easily convert existing applications to run on MySQL, both on Windows and other platforms. In addition, to address the growing demand from business analysts using MySQL for data marts and analytic applications, Oracle has announced a new "MySQL for Excel" application plug-in that allows users to import, export and manipulate MySQL data, without requiring prior MySQL technical knowledge.

Posted July 25, 2012

Information availability software provider Attunity has partnered with EMC to offer a high-performance big data replication solution for the EMC Greenplum Unified Analytics Platform, as well as dedicated optimizations to Attunity Replicate from EMC Greenplum. "Enterprise architects have to pay special attention to data flow. Data is their critical asset and right now enterprise architects are really challenged with a ‘data bottleneck,'" Matt Benati, Attunity's vice president of global marketing, tells 5 Minute Briefing.

Posted July 25, 2012

The volume of data now being stored by businesses is at a point where the term "big data" almost feels inadequate to describe it. The size of big data sets is a constantly moving target, ranging from a few dozen terabytes to many petabytes in a single data set. And it is estimated that, over the next 2 years, the total amount of big data stored by businesses will be four times today's volumes. As business continues its inexorable shift to the cloud, weblogs continue to fuel the big data fire. But there are plenty of other sources as well - RFID, sensor networks, social networks, social media, Internet text and documents, Internet search indexing, call detail records, scientific research, military surveillance, medical records, photography archives, video archives, and large-scale e-commerce transaction records.

Posted July 25, 2012

Big data is one of the most significant industry disruptors in IT today. Even in its infancy, it has shown significant ROI and has almost universal relevance to a wide cross-section of the industry. Why? Big data turns traditional information architecture on its head, putting into question commonly accepted notions of where and how data should be aggregated, processed, analyzed, and stored. Enter Hadoop and NoSQL, open source data-crunching technologies. Although these technologies are hotter than an Internet IPO, you simply can't ignore your current investments - those investments in SQL which drive everything from your data warehouse to your ERP, CRM, SCM, HCM and custom applications.

Posted July 25, 2012

Big data analytics provider Datameer has announced that its technology is now integrated into Dell's Emerging Solutions Ecosystem, which offers complementary hardware, software, and services in a pre-packaged bundle. Built with Dell infrastructure, Datameer analytics technology is offered on top of Cloudera's distribution of Apache Hadoop. The software is well suited to organizations that need to make quick decisions based on structured and unstructured data, such as financial services, retail, telecommunications, and Web 2.0 companies. "The advantage to the end user is they can buy all of this from one source, it's all integrated together, it runs seamlessly, and it's an easy solution for people to purchase and use," Joe Nicholson, Datameer's vice president of marketing and business development, tells DBTA.

Posted July 25, 2012

Jaspersoft, maker of business intelligence (BI) software, today announced availability of Jaspersoft Business Intelligence 4.7, with reports generated by JasperReports Server now giving casual users the ability to interact with more of their data. According to Jaspersoft, its open source business model and zero-cost per-user licensing fees make interactive reporting affordable for even the largest scale reporting projects. Additional improvements in Jaspersoft 4.7 include direct native connectivity to big data sources and expanded mobile device support. "This is exciting for a couple of reasons," Mike Boyarski, director of product marketing for Jaspersoft, tells DBTA. "One is that no BI tool out there so far has this level of interactivity for what we call casual BI users."

Posted July 25, 2012

Syncsort, a global leader in high-performance data integration solutions, has certified its DMExpress data integration software for high-performance loading of Greenplum Database. Syncsort has also joined the Greenplum Catalyst Developer Program. Syncsort DMExpress software delivers extensive connectivity that makes it easy to extract and transform data from nearly any source, and rapidly load it into the massively parallel processing (MPP) Greenplum Database without the need for manual tuning or custom coding. "IT organizations of all sizes are struggling to keep pace with the spiraling infrastructure demands created by the sheer volume, variety and velocity of big data," says Mitch Seigle, vice president, Marketing and Product Management, Syncsort.

Posted July 25, 2012

Compuware Corporation has launched a free online cloud service enabling organizations to compare the speed of their website's performance against leading competitor sites. SpeedoftheWeb.org provides a series of industry-relevant indexes with categorized collections of 1,000 of the world's most trafficked websites - as ranked by Alexa and Compuware Gomez benchmarks. As end-user expectations rise, the cost of poor website performance also grows, in terms of both lost revenue and brand loyalty. In addition, end users are showing less patience for slow-performing websites, according to Compuware. In response to these heightened expectations, organizations need to ensure performance at every point along the web application delivery chain, Alois Reitbauer, Compuware APM's Web 2.0 and Mobile Performance Expert and author of Java Enterprise Performance, tells DBTA.

Posted July 25, 2012

Imagine Google returning search results on "Lady Gaga" in 0.03 seconds, but taking 30 seconds to return results for "Lunar Eclipse." That might seem unacceptable; however, that is the reality for most of today's enterprise data analytics. Some queries can come back in seconds with others taking minutes or even hours. Perhaps with a lot of tuning ahead of time, ad hoc analysis query performance can be improved, but in most cases, it remains a huge challenge. When you have the right indexes, summary tables, and statistics, you can get answers quickly. The IBM Informix Warehouse Accelerator solves this problem using novel algorithms on modern hardware.
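The performance gap described above often comes down to whether an aggregate can be answered from a pre-computed structure or must be assembled by scanning raw data. The following is a minimal illustrative sketch of that idea using SQLite, with made-up table and column names; it is not how the IBM Informix Warehouse Accelerator itself works (which uses columnar, in-memory techniques), only why a summary table can turn a full scan into a cheap lookup:

```python
# Sketch: a pre-aggregated summary table answers the same question as a
# GROUP BY over the raw fact table, without rescanning every row per query.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE calls (region TEXT, minutes INTEGER)")
cur.executemany("INSERT INTO calls VALUES (?, ?)",
                [("EU", 10), ("EU", 20), ("US", 5), ("US", 15)])

# Ad hoc query: scans the whole fact table every time it runs.
full_scan = dict(cur.execute(
    "SELECT region, SUM(minutes) FROM calls GROUP BY region"))

# Summary table: the aggregate is computed once, then read back cheaply.
cur.execute("CREATE TABLE calls_by_region AS "
            "SELECT region, SUM(minutes) AS total FROM calls GROUP BY region")
summary = dict(cur.execute("SELECT region, total FROM calls_by_region"))

assert summary == full_scan  # same answer, far less work per repeated query
print(sorted(summary.items()))  # [('EU', 30), ('US', 20)]
```

The trade-off, as the article notes, is that summary tables and indexes must be designed ahead of time, which is exactly what truly ad hoc analysis cannot rely on.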

Posted July 11, 2012

U.S. state governments, the European Union, the U.S. Federal Trade Commission, and other governing bodies around the globe—it seems every regulatory body is debating its own definition of personally identifiable information (PII). Recent topics from behavioral marketing to GPS to Anonymous hacks have elevated privacy to the regulatory priority list. There is already significant regulatory variation about what data constitutes PII and personal health information (PHI). Existing rules were largely written in an attempt to solve known data challenges such as the problems of credit card fraud (PCI), identity theft (regional disclosure rules like the U.K. Data Privacy Act and U.S. state laws), and electronic data sharing inefficiencies (HIPAA). Some were written for loftier goals such as human rights and reputation management. They mandate controls over relatively easily characterized descriptors like credit card numbers, street addresses, and birthdates and affect physical as well as electronic disclosures.

Posted July 11, 2012

Big data is everywhere today. It fills IT headlines and keynotes technology conferences. It's become a favorite topic for both industry analysts and technology investors. With lots of computing power and better database storage techniques, big data makes it practical to store and analyze petabytes upon petabytes of detailed transactional and media data. But despite the headlines, big data is not the most compelling data need that the majority of business end users have. A far bigger challenge for most people is getting access to the right data to help them do their jobs better.

Posted July 11, 2012

Datameer has announced a new release of its big data analytics solution, which combines data integration, analytics and visualization of any data type in one application. The new capabilities offered in Datameer 2.0 fall into two main categories, Joe Nicholson, vice president of marketing, Datameer, tells DBTA. One is adding new functionality, and the other is bringing Hadoop to the desktop, with Hadoop natively embedded in two of the three new editions of the application.

Posted June 28, 2012

In a live presentation that was also made available on the web, Oracle CEO Larry Ellison unveiled Oracle's cloud strategy and introduced Oracle Cloud Social Services, a new enterprise social platform offering. "We made a decision to rebuild all of our applications for the cloud almost 7 years ago. We called that project ‘Fusion,'" he told the audience. While joking that, at the time, some competitors called it "con-fusion," Ellison also recounted the years of work, input from thousands of people, and billions of dollars that were required to enable Oracle to make the transition from being an on-premise application provider to being a cloud application provider - as well as an on-premise application provider - and to rewrite and modernize all of its applications.

Posted June 28, 2012

Lucid Imagination, a developer of search, discovery and analytics software based on Apache Lucene and Apache Solr technology, has unveiled LucidWorks Big Data, a fully integrated development stack that combines advantages of multiple open source projects including Hadoop, Mahout, R and Lucene/Solr to provide search, machine learning, recommendation engines and analytics for structured and unstructured content in one solution available in the cloud. "With more and more companies being challenged by the explosive growth of information, as has been widely reported, the vast majority of that content is unstructured or semi-structured text, and traditional business intelligence or traditional analytics methodologies don't come close to addressing the vast percentage of content," Paul Doscher, CEO of Lucid Imagination, tells DBTA.

Posted June 28, 2012

IBM stepped up its smarter computing initiative with a broad range of performance and efficiency enhancements to its storage and technical computing systems - the engines of big data. As part of its ongoing smarter computing effort, IBM has announced a new strategic approach to designing and managing storage infrastructures with greater automation and intelligence, as well as performance enhancements to several key storage systems and the Tivoli Storage Productivity Center suite. IBM also announced its first offerings that incorporate software from IBM's acquisition of Platform Computing earlier this year. "Enterprises are dealing with data that is increasing exponentially in both size and complexity," said Rod Adkins, senior vice president of IBM Systems & Technology Group. The enhanced systems and storage solutions have the performance, efficiency, and intelligence to handle this big data, he added.

Posted June 28, 2012

Data analytics vendor Teradata and information management software provider Kalido have introduced a new joint solution that they say will allow customers to build or expand a data warehouse in 90 days or less, providing deeper analytics to users for improved business decision-making. This solution combines the Teradata Data Warehouse Appliance with the Kalido Information Engine, providing customers with a streamlined data consolidation tool that aggregates disparate data into a single unified platform.

Posted June 28, 2012

Companies are scrambling to learn all the various ways they can slice, dice, and mine big data coming in from across the enterprise and across the web. But with the rise of big data — hundreds of terabytes or petabytes of data — comes the challenge of where and how all of this information will be stored. For many organizations, current storage systems — disks, tapes, virtual tapes, clouds, in-memory systems — are not ready for the onslaught, industry experts say. There are new methodologies and technologies coming on the scene that may help address this challenge. But one thing is certain: Whether organizations manage their data in their internal data centers, or in the cloud, a lot more storage is going to be needed. As Jared Rosoff, director of customer engagement with 10gen, puts it: "Big data means we need ‘big storage.'"

Posted June 13, 2012

As data continues to grow unabated, organizations are struggling to manage it more efficiently. By better leveraging their expanding data stores and making the information available more widely, organizations hope to put big data to work — helping them to achieve greater productivity and more informed decision making, as well as compete more effectively as a result of insights uncovered by analytics on their treasure troves of information. Improving the management of big data is not something to consider addressing at some point in the hazy future — the big data challenge is already here, according to a new survey of 264 data managers and professionals who are subscribers to Database Trends and Applications.

Posted June 13, 2012

Social media network-based business intelligence represents the next great frontier of data management, promising decision makers vast vistas of new knowledge gleaned from exabytes of data generated by customers, employees, and business partners. Mining data from Facebook, Twitter, blogs, wikis, and internal corporate networks potentially may surface new insights into impending market shifts, patterns in customer sentiment, and competitive intelligence. It's a rich opportunity not lost on today's organizations, a new survey of 711 business and IT managers from across the globe reveals. A majority of respondents are either planning to collect and analyze data from both proprietary and public social media networks, or are doing so already.

Posted June 13, 2012

Informix Genero, a new IBM offering developed in partnership with Four Js Development Tools, is a logical enhancement to the Informix 4GL language and environment that offers extensive capabilities for developing modern web and desktop GUI applications, reports, and web services. With IBM Informix Genero, users can recompile 4GL applications and run them as GUI and web applications while retaining the business logic.

Posted May 23, 2012

With the advance of big data and the corollary increasing demand for business intelligence (BI), the market is experiencing an ongoing challenge in finding and retaining skilled BI professionals, according to Simon Boardman, vice president of marketing, Eagle Creek Software Services. This problem is being further exacerbated by stricter visa rules for skilled foreign workers, notes Boardman, who spoke with DBTA during the COLLABORATE 12 conference in Las Vegas.

Posted May 23, 2012

As a leader in pharmacy technology, National Health Systems, Inc. provides a range of services for the retail pharmacy industry. Built on a foundation of dedication and commitment to its customers and the profession of Pharmacy, NHS companies PDX, NHIN, and Rx.com provide pharmacies with the tools they need to provide the best possible patient care, manage their businesses, and enhance their competitiveness in the marketplace.

Posted May 23, 2012

Google has announced that Google BigQuery, a web service that lets users do interactive analysis of massive data sets, is now available to the public. Billed as enabling customers to "analyze terabytes of data with just a click of a button," the company says the data is secured, replicated across data centers, and can be easily exported.

Posted May 23, 2012

Informatica Corporation, a provider of data integration software and services, has announced the latest release of its Informatica Cloud solution. Offered as an integration platform-as-a-service (iPaaS), the latest release from Informatica features the Cloud Connector Toolkit, Cloud Integration Templates, and new enterprise features, all of which are part of the new Informatica Cloud Developer Edition and allow developers to rapidly embed end-user customizable integration logic and connectivity into cloud applications.

Posted May 23, 2012

Big data has become a big topic for 2012. It's not only the size, but also the complexity of formats and speed of delivery of data that is starting to exceed the capabilities of traditional data management technologies, requiring the use of new or exotic technologies simply to manage the volume alone. In recent years, the democratization of analytics and business intelligence (BI) solutions has become a major driving force for data warehousing, resulting in the use of self-service data marts. One major implication of big data is that in the future, users will not be able to put all useful information into a single data warehouse. Logical data warehouses, which bring together information from multiple sources as needed, are replacing the single data warehouse model. Combined with the fact that enterprise IT departments are continually moving towards distributed computing environments, the need for IT process automation to automate and execute the integration and movement of data between these disparate sources is more important than ever.

Posted May 09, 2012

There has been a significant change in the IT world recently: solution developers no longer believe the answer to all data management challenges is a relational database. For 40 years, data management was considered a quiet part of IT where the products and providers were firmly decided. It is now evident that information management has again become quite dynamic, with a broad set of solutions offering new options for managing the big data challenges of today.

Posted May 09, 2012

The term "big data" refers to the massive amounts of data being generated on a daily basis by businesses and consumers alike - data which cannot be processed using conventional data analysis tools owing to its sheer size and, in many case, its unstructured nature. Convinced that such data hold the key to improved productivity and profitability, enterprise planners are searching for tools capable of processing big data, and information technology providers are scrambling to develop solutions to accommodate new big data market opportunities.

Posted May 09, 2012

CIOs and IT departments are on the frontlines of a monumental IT shift. With the number of mobile devices and applications exploding and bandwidth soaring, they are being asked to find ways to enable the brave new world of enterprise mobility. All involved - from users to IT - recognize the productivity and business efficiency benefits of this trend, but it is typically only IT that also recognizes the dangers unchecked mobility poses to sensitive corporate data.

Posted May 09, 2012

It was welcome news that a common set of privacy standards is to be applied to organizations across the entire European Union (EU) for the first time - as well as the game plan that includes immediate notification of breaches and other "data misplacements." The new requirements are sure to create a lot of moaning and groaning back and forth across the pond about the new rules, but - as we have seen with the PCI DSS governance rules - after a short while, they will become the accepted business practice and part of the data protection and management landscape.

Posted April 26, 2012

Oracle addressed the need to make IT infrastructure and business analytics technologies simpler and more efficient in a presentation to OpenWorld Tokyo 2012 attendees that was also made available via live webcast. In addition to presenting its strategy and plans for business analytics, the company also unveiled new additions to its product portfolio. In his keynote address, Oracle president Mark Hurd explained how the business users of tomorrow will require faster and more comprehensive information access. "The true question with analytics is how to get the right information to the right person at the right time to make the right decision," he said.

Posted April 26, 2012

IBM has introduced DB2 10 and InfoSphere Warehouse 10 software that integrates with big data systems, automatically compresses data into tighter spaces to prevent storage sprawl, and slices information from the past, present, and future to eliminate expensive application code. Over the past 4 years, more than 100 clients, 200 business partners, and hundreds of experts from IBM Research and Software Development Labs around the world collaborated to develop the new software.

Posted April 26, 2012

MapR Technologies, Inc., provider of the MapR distribution for Apache Hadoop, has introduced new data connection options for Hadoop to enable a range of data ingress and egress alternatives for customers. These include direct file-based access using standard tools and file-based applications, direct database connectivity, Hadoop-specific connectors via Sqoop, Flume, and Hive, as well as direct access to popular data warehouses and applications using custom connectors. Additionally, technology providers Pentaho and Talend are partnering with MapR to provide direct integration with MapR's distribution, and MapR has also entered into a partnership with data warehouse and business intelligence platform vendor Tableau Software.

Posted April 26, 2012

Helping customers manage the influx of mobile devices, networks and applications in the enterprise, SAP has unveiled a new release of the Afaria mobile device management solution. With the 7.0 release of Afaria, SAP aims to allow enterprise IT to more effectively manage mobile applications and devices through a new user interface (UI) for simplified administration, improved workflow and enterprise integration capabilities. "The consumerization of IT is driving our innovation path and commitment to providing customers with the industry's most comprehensive, robust and streamlined mobility management platform, including mobile device management," says Sanjay Poonen, president, Global Solutions, SAP.

Posted April 26, 2012

Big Data. It's a term used to characterize those applications that have such enormous data sets, they have exceeded the capabilities and capacities of traditional database management systems. These huge data sets-measured in terabytes, petabytes, exabytes, and zettabytes-must instead span large clusters of servers and storage arrays. Search engine companies were the first to face this situation, and the result is an open source solution called Hadoop.

Posted April 11, 2012

Over the years, organizations tend to acquire more and more applications, usually for good business reasons. However, they often don't have the discipline to remove older, obsolete, or duplicate applications - even when the applications are inflexible and unable to adapt quickly enough to changing business conditions. These older applications generally run on inflexible legacy systems, and while their core functions are essential to running the business, they can be a major drag at the same time. All too often, enterprises' applications portfolios are out of control, forcing them to waste precious resources supporting technology that has little or no business value.

Posted April 11, 2012

Winning in today's business world is tougher than ever. Unless companies make better decisions faster than competitors, they are sure to lose market share. The problems facing organizations are challenging, and without access to timely, relevant, and up-to-date information, businesses are at a disadvantage. Companies in almost every sector need a complete and accurate view of customers in order to optimize sales revenue opportunities and maximize customer satisfaction. The economic downturn also means that fraud has been on the rise, especially in information-intensive industries like insurance, manufacturing, and retail.

Posted April 11, 2012

The database industry is changing. Some internet applications, such as search engines and social networks, have rolled out their own scale-out technologies, such as NoSQL and MapReduce, effectively ignoring the traditional database and essentially accusing it of being underpowered. The database titans in turn remind us why consistency, schemas, and query languages matter. Both camps point out considerable weaknesses in each other's approaches. The question is: what will power the information systems of the future?

Posted April 11, 2012

There's no question that cloud computing is a hot commodity these days. Companies of all types and sizes are embracing cloud computing-both internally and from external service providers - as a way to cost-effectively build new capabilities. With the rapid growth of cloud comes new questions about responsibility within organizations, in terms of how services will be paid for, who has ultimate say over cloud decisions, and how cloud fits into the overall strategic direction of the business.

Posted March 28, 2012


Social media business intelligence is on the rise, according to a new Unisphere Research study sponsored by IBM and Marist College. The study found that while social media monitoring and analysis is in its early stages, many organizations plan to monitor, collect, stage, and analyze this data over the next 1 to 5 years. In particular, line-of-business (LOB) respondents, who are closer to customers, show appreciation for the benefits of monitoring social media networks (SMNs).

Posted March 21, 2012

MarkLogic Corporation has joined the technology partner program of Hortonworks, a leading vendor promoting the development and support of Apache Hadoop. According to the vendors, by leveraging MarkLogic and Hortonworks, organizations will be able to seamlessly combine the power of MapReduce with MarkLogic's real-time, interactive analysis and indexing on a single, unified platform. There are two main reasons that MarkLogic has chosen to partner with Hortonworks, says Justin Makeig, senior product manager at MarkLogic. One is Hortonworks' extensive experience with Hadoop installations, and the second is that its core product is 100% open source.

Posted March 21, 2012

Pentaho Corporation, an open source business analytics company, has formed a strategic partnership with DataStax, a provider of big data solutions built upon the Apache Cassandra project, a high performance NoSQL database. The relationship will provide native integration between Pentaho Kettle and Apache Cassandra. This will merge the scalable, low-latency performance of Cassandra with Kettle's visual interface for high-performance ETL, as well as integrated reporting, visualization and interactive analysis capabilities. According to the companies, organizations seeking to leverage their big data have found it difficult to implement and employ analytics technologies. "One of the big challenges today is ease of use of these tools," says Ian Fyfe, Pentaho's chief technology evangelist. Often built on open source projects, it "takes a lot of deep skills to use these systems, and these are skills that are hard to find," he explains.

Posted March 21, 2012

Novell has announced an update to its ZENworks suite, which includes integrated Mac device management and full disk encryption capabilities. ZENworks 11 Support Pack 2 enables customers to lock out threats without shutting down IT access, the vendor says. ZENworks 11 now offers a more holistic approach to supporting Mac devices in the enterprise. With this release, Mac support is provided through Remote Management for Mac, Asset Management for Mac, Mac OS X Patching, and Mac Bundles.

Posted March 21, 2012

The volume of business data under protection is growing rapidly, driven by the explosion of mobile computing, the use of powerful business applications that generate more data, and stringent regulations that require companies to retain data longer and maintain it in a format that is readily available upon request. The problem of massive data growth is particularly acute in traditional, large data-intensive enterprises that have become increasingly reliant on database-driven business automation systems such as Oracle, SQL Server, and SAP. These organizations are also increasingly adopting a new wave of data-intensive applications to analyze and manage their "big data" - further compounding the problem.

Posted March 07, 2012

Organizational focus has been placed on the emergence of "big data" - large-scale data sets that businesses and governments use to create new value with today's computing and communications power. Big data poses many opportunities, but managing its rapid growth adds challenges, including complexity and cost. Leaders must address the implications of big data: the increasing volume and detail of information captured by enterprises, and the rise of multimedia, social media, and the internet.

Posted March 07, 2012

For enterprises grappling with the onslaught of big data, a new platform has emerged from the open source world that promises to provide a cost-effective way to store and process petabytes and petabytes worth of information. Hadoop, an Apache project, is already being eagerly embraced by data managers and technologists as a way to manage and analyze mountains of data streaming in from websites and devices. Running data such as weblogs through traditional platforms such as data warehouses or standard analytical toolsets often cannot be cost-justified, as these solutions tend to have high overhead costs. However, organizations are beginning to recognize that such information ultimately can be of tremendous value to the business. Hadoop packages up such data and makes it digestible.

Posted March 07, 2012
