Newsletters

Trends and Applications
The latest IOUG study on database security finds that measures need to be taken to safeguard data from internal abuse; however, preventing negligence or malfeasance by privileged users is a serious challenge. According to this year's study, human error has beaten out internal hackers and unauthorized users as the biggest security risk. In addition, more than half of respondents say their organizations still do not have, or are unaware of, data security plans to help address contingencies as they arise. These enterprise data security challenges, and more, are highlighted in a new survey of 350 data managers and professionals by the Independent Oracle Users Group. Underwritten by Oracle Corporation and conducted by Unisphere Research, a division of Information Today, Inc., the survey covered progress within three key areas of database security - prevention, detection, and administration.

Posted December 19, 2012

Attunity Ltd., a provider of information availability software solutions, has officially launched Attunity CloudBeam, a fully-managed data transfer SaaS platform for Amazon Web Services (AWS) Simple Storage Service (S3). With its beta completed, the high-performance data transfer solution was unveiled and demonstrated live at the AWS re:Invent Customer and Partner Conference in Las Vegas, NV. "This is aimed at folks today that are using AWS or will be using AWS for all kinds of use cases where data is core to their strategy," Matt Benati, vice president of Global Marketing at Attunity, tells DBTA. "All these use cases demand the movement of data and it really has to be a frictionless movement of data at scale. That is what Attunity does best."

Posted December 19, 2012

Enterprise NoSQL vendor MarkLogic recently brought its summit series to New York. Themed as "Big Data, Beyond the Hype: Delivering Results," the one-day conference included presentations by MarkLogic executives as well as partners and customers. In his opening keynote, CEO Gary Bloom highlighted the need for a next-generation database to address the problems and opportunities posed by big data, while also cautioning that there are "a lot of shiny objects" in the market now trying to capture people's attention that may not deliver the necessary results.

Posted December 19, 2012

In the never-ending battle for enterprise data security, industry experts say there has been progress on several fronts, but there is still much work that needs to be done. There is an enormous amount of data that tends to leak out of the secure confines of data centers, creating a range of security issues. "There are many copies of data which have less security and scrutiny than production environments," Joseph Santangelo, principal consultant with Axis Technology, tells DBTA. "The increased reliance on outsourcers and internal contractors leave sensitive data within corporate walls open to misuse or mistakes." Or, as another industry expert describes it, the supply chain often proves to be the greatest vulnerability for data security. "A typical organization has a direct relationship with only 10% of the organizations in its supply chain — the other 90% are suppliers to suppliers," Steve Durbin, global vice president of the Information Security Forum, tells DBTA.

Posted December 06, 2012

Protecting databases using encryption is a basic data security best practice and a regulatory compliance requirement in many industries. Databases represent the hub of an information supply chain. However, securing only the hub by encrypting the database leaves security gaps, because sensitive data also exists alongside the database in temporary files, Extract-Transform-Load (ETL) data, debug files, log files, and other secondary sources. According to the "Verizon 2011 Payment Card Industry Compliance Report," unencrypted data that resides outside databases is commonly stolen by hackers because it is easier to access.

Posted December 06, 2012

Are organizations' systems and data environments ready for the big data surge? Far from it, a new survey shows. The survey of 298 members of the Independent Oracle Users Group (IOUG), conducted by Unisphere Research and sponsored by Oracle Corp., finds fewer than one out of five data managers and executives are confident that their IT infrastructure will be capable of handling the surge of big data. And big data is already here - more than one out of 10 survey respondents report having in excess of a petabyte of data within their organizations, and a majority report their levels of unstructured data are growing. Since big data incorporates so many different data types in varying volumes and from many different sources, it would make both data managers' and end users' lives easier if it all could be brought into a single comprehensive framework that can be easily managed and accessed. This, in fact, has long been the holy grail of the IT and database industries - a vision that, unfortunately, has yet to be realized.

Posted December 06, 2012

While no one can dispute the importance of enterprise resource planning (ERP) systems to organizational performance and competitiveness, executives in charge of these systems are under intense pressure to stay within or trim budgets. Close to half of the executives in a new survey say they have held off on new upgrades for at least a few years. In the meantime, at least one out of four enterprises either are scaling back or have had to scale back their recent ERP projects due to budget constraints.

Posted December 06, 2012

For years, data warehouses and extract, transform and load (ETL) have been the primary methods of accessing and archiving multiple data sources across enterprises. Now, an emerging approach - data virtualization - promises to advance the concept of the federated data warehouse to deliver more timely and easier-to-access enterprise data. These are some of the observations made at Composite Software's third Annual Data Virtualization Day, held in New York City. This year's gathering was the largest ever, with nearly 250 customers and practitioners in attendance, Composite reports.

Posted November 13, 2012

TVSN is a 24x7x365 television shopping network that sells clothing, health and beauty aids, electronics, home furnishings, collectibles, and jewelry in Australia. Customers can place orders at any hour of the day or night any way they desire, by phone or online. Since TVSN is always open and always on, downtime is just not an option. Originally available only on cable TV, at 8:30 am on Monday, October 24, 2012, TVSN, in combination with Network Ten Australia, flipped the switch to make television shopping available to anyone in Australia who has a television. As a result, TVSN now reaches 6.5 million households. TVSN has relied on Revelation Software since the late 1980s, when it was called Demtel, a telemarketing company that was one of the first to run infomercial-style ads Down Under, using a Revelation G application in the call center and warehouse.

Posted November 13, 2012

Companies are facing serious external challenges managing aging IT infrastructures and application portfolios. To decrease costs and risks while increasing flexibility and innovation, many are turning to cloud technologies. By adopting cloud platforms, companies enable the delivery of "everything as-a-service." This empowers the workforce with faster any-device access to solutions that are available, affordable and ready to use. However, in order to realize cloud computing benefits, organizations must first transform and modernize their applications portfolios. Modernization is a key to business success and a significant challenge for chief information officers (CIOs).

Posted November 13, 2012

Big data is here, offering vast opportunities as well as vexing challenges for every organization it touches. For a number of years, it has been understood that to be of value, information needs to be readily available, as close to real time as possible, to users in any location. Now, with the onset of "big data," the task gets more daunting. "These are all increasing the demands on both transactional and analytics data systems," says Bernie Spang, director of database software and systems for IBM.

Posted November 13, 2012

With information now being generated from all corners of the enterprise, executives, managers, and professionals can ask and get answers to questions they have never been able to consider. For companies that are able to offer business decision makers rapid and easy access to business intelligence (BI) or analytic data from which they can assemble their own interfaces and reports, this means competitive advantage. However, today's BI systems still present obstacles to realizing this vision, according to a new survey of 250 data managers and professionals, conducted by Unisphere Research, a division of Information Today, Inc., for Tableau Software. The survey findings are outlined in a new report, titled, "Opening Up Business Intelligence to the Enterprise: 2012 Survey of Data Professionals On Self-Service BI and Analytics."

Posted November 13, 2012

Software operates the products and services that we use and rely on in our daily lives. It is often the competitive differentiation for the business. As software increases in size, complexity, and importance to the business, so do the business demands on development teams. Developers are increasingly accountable for delivering more innovation under shorter development cycles, without negatively impacting quality. Compounding this complexity is today's norm of geographically distributed teams and code coming in from third-party teams. With so many moving parts, it's difficult for management to get visibility across their internal and external supply chain. Yet, without early warning of potential quality risks that could impact release schedules or create long-term technical debt, there may be little time to actually do something about it before the business or customers are impacted.

Posted October 24, 2012

The recent explosion of mobile applications has dramatically altered the consumer landscape, making it the norm for users and customers alike to expect access and support anytime, anywhere. With Cisco recently reporting that mobile-connected devices are set to exceed the world's population this year, it's no surprise that the surge is overflowing into the enterprise. While there are a few leading innovators in enterprise mobility, the vast majority of businesses are still struggling to take the first steps towards a streamlined strategy. The question is no longer "Do we?" but "How do we?"

Posted October 24, 2012

Melissa Data, a provider of contact data quality and direct marketing solutions, has announced Personator, an integrated data quality web service designed to provide identity verification and fraud prevention for e-commerce applications. Personator offers the ability to determine whether associations between different elements, such as name and address, are correct, thereby increasing accuracy by ensuring a valid and correct link between the data and identity of individual customer contacts.

Posted October 24, 2012

Simba Technologies has partnered with Hortonworks, a commercial vendor of Apache Hadoop, to provide ODBC access to Hortonworks Data Platform. The use of Simba's Apache Hive ODBC Driver with SQL Connector is aimed at providing Hortonworks customers with easy access to their data for BI and analytics using the SQL-based application of their choice. "Simba's Apache Hive ODBC Driver technology makes it easier for our customers to harness the power of their big data using popular and familiar BI and analytics applications," says Shaun Connolly, Hortonworks' VP, Strategy.

Posted October 24, 2012

Oracle CEO Larry Ellison laid out four key products in his opening keynote at Oracle OpenWorld this year. The announcements - all related to the cloud - include an Oracle IaaS offering in addition to PaaS and SaaS; the addition of an Oracle Private Cloud option; Oracle Database 12c; and the new Exadata X3.

Posted October 24, 2012

Percona Live 2012, a MySQL conference, was held in New York City. With nearly 300 attendees participating, the first day of the event featured tutorials with in-depth presentations on specific topics, while the second day focused on conference sessions. Also new at Percona Live this year was an exhibit hall for MySQL ecosystem participants to put their products on display and network with potential customers. Sponsors included Clustrix, Continuent, ScaleArc, Nimbus Data, Fusion-io, Tokutek, Codership, Couchbase, Akiban, Ospero, ParElastic, SkySQL, ScaleBase, and New Relic.

Posted October 24, 2012

Application performance management (APM) software provider Precise has announced the availability of Precise 9.5, a major product release designed to help organizations deliver a better experience for customers using their cloud and mobile applications. Precise 9.5 rapidly detects and analyzes application problems resulting from server and storage virtualization resource contention, and also addresses the challenge of managing mobile traffic growth. The new release focuses on three key themes, all with the common goal of identifying and resolving potential problems before they can affect the customer experience or cause an outage, Sherman Wood, vice president of product at Precise, tells DBTA.

Posted October 24, 2012

It is an understatement to say we're witnessing an example of Moore's Law — which states the number of transistors on a chip will double approximately every two years — as we seek to manage the explosion of big data. Given the impact this new wealth of information has on hundreds of millions of business transactions, there's an urgent need to look beyond traditional insight-generation tools and techniques. It's critical we develop new tools and skills to extract the insights that organizations seek through predictive analytics.

Posted October 10, 2012

When virtualization first emerged, IT departments went gangbusters using this revolutionary change to get better performance out of their servers. In all the excitement of implementation, something not so very small was overlooked - backup and recovery. The lack of proper planning caused backup jobs and recoveries to fail, and backup admins started feeling backed into a corner. Thankfully, times have changed, and IT departments, now very aware of these issues, have gotten savvy at avoiding the potential pains of virtualization infrastructure. But a new challenge has emerged.

Posted October 10, 2012

It's in the nature of hype bubbles to obscure important new paradigms behind a cloud of excitement and exaggerated claims. For example, the phrase "big data" has been so widely and poorly applied that the term has become almost meaningless. Nevertheless, beneath the hype of big data there is a real revolution in progress, and more than anything else it revolves around Apache Hadoop. Let's look at why Hadoop is creating such a stir in database management circles, and identify the obstacles that must be overcome before Hadoop can become part of mainstream enterprise architecture.
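The paradigm behind the stir is MapReduce: mapping input records to key-value pairs, shuffling pairs by key, and reducing each group to a result, distributed across a cluster. Here is a deliberately tiny word-count sketch of that model in plain Python; it is not Hadoop code, just the map/shuffle/reduce idea on one machine.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for each word in the input.
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big hype", "big results"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # 3
```

Hadoop's contribution is running exactly this pattern over thousands of machines with fault tolerance, which is also where the enterprise-adoption obstacles, operations, skills, and SQL integration, come in.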

Posted October 10, 2012

For a long time, data integration has been the holy grail for data organizations, promising a single, accurate picture of relevant data from across the enterprise, regardless of original source or format. Now, with the rise of new approaches, including master data management (MDM) and data virtualization, there is hope that this goal is within reach. But the challenges keep on coming, and lately, there has been a surge in unstructured data that may fall outside MDM realms.

Posted September 26, 2012

Enterprise NoSQL database company MarkLogic Corporation today rolled out a new version of its flagship product, MarkLogic 6, which includes new tools for faster application development, improved analytics and new visualization widgets to enable greater insight, and the ability to create user-defined functions for fast and flexible analysis of extremely large volumes of data. Key features of MarkLogic's NoSQL database include ACID transactions, horizontal scaling, real-time indexing, high availability, disaster recovery, government-grade security, and built-in search. With this release, in addition to MarkLogic's NoSQL flexibility, the company is focused on building features into the product that make it easier to use and more accessible to a wider group of users within the enterprise.

Posted September 26, 2012

Percona, Inc. has announced the latest release of Percona Server, which it describes as its "enhanced drop-in replacement for MySQL." According to the company, Percona Server Version 5.5.27-28.0 includes new features that make it more valuable as an alternative for MySQL users. Offered free as an open source solution, Percona Server has self-tuning algorithms and support for high-performance hardware. In addition, the company is planning a two-day Percona Live Event for NYC in October and also for London in December, with speakers and tutorials spanning multiple tracks across the MySQL ecosystem. A more expansive, four-day conference, Percona Live MySQL Conference and Expo 2013, is planned for Santa Clara in April.

Posted September 26, 2012

NoSQL company Couchbase has announced its integration solution for VMware vFabric Application Director. The solution allows deployment and management of Couchbase Server on any VMware vCloud-powered private, public or hybrid clouds so that application middleware teams can more easily leverage Couchbase Server in their virtual and cloud environments. Because so many users are deploying Couchbase on the cloud, it seemed a natural fit to support VMware vFabric Application Director, Frank Weigel, vice president of products at Couchbase, tells 5 Minute Briefing.

Posted September 26, 2012

Database security company Application Security, Inc. (AppSecInc) has announced the general availability of a major new release of its flagship platform, DbProtect. Version 6.4 incorporates insights gained from 10 years of working with customers, Josh Shaul, CTO of AppSecInc, tells DBTA. DbProtect is intended to let organizations evaluate the security of their database environment and have access to preventative controls so they can eliminate security risks without the need to patch or reconfigure databases. With this release, the product, which has been rebuilt from scratch, offers a much easier-to-use interface as well as the ability to provide various groups of stakeholders with individual views based on a single scan, thereby limiting the burden on the database as well as limiting user access based on roles, notes Shaul.

Posted September 26, 2012

Quest Software has introduced Toad Business Intelligence Suite, a packaged suite of tools to link traditional and non-traditional data sources, bridging the gap between BI environments and distributed big data sources. In addition, Toad BI Suite aims to span the divide between technical and non-technical users by offering tailored interfaces designed to meet their individual data provisioning and analytic needs.

Posted September 26, 2012

Address data quality is at the heart of all business operations. Inaccurate addresses can cost businesses anywhere from a few cents to many dollars per customer interaction. Undelivered products or information can result in dissatisfied or even lost customers, a cost that is far greater than the fee to implement an automated address check. Here is a list of the top 10 tips for selecting the right address data quality provider.
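To make the idea of an automated address check concrete, here is a toy sketch in Python. Real providers validate against postal reference data and deliverability databases; the field names, rules, and ZIP format below are illustrative assumptions only, not any vendor's actual logic.

```python
import re

# Illustrative US ZIP pattern: 5 digits, optionally followed by -4 digits.
US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")

def check_address(address):
    """Return a list of problems found in an address record (a toy check)."""
    problems = []
    if not address.get("street", "").strip():
        problems.append("missing street")
    if not address.get("city", "").strip():
        problems.append("missing city")
    if not US_ZIP.match(address.get("zip", "")):
        problems.append("invalid ZIP code")
    return problems

good = {"street": "123 Main St", "city": "Springfield", "zip": "12345-6789"}
bad = {"street": "", "city": "Springfield", "zip": "1234"}
print(check_address(good))  # []
print(check_address(bad))   # ['missing street', 'invalid ZIP code']
```

A commercial service goes much further, verifying that the street actually exists in that city and ZIP, which is why the tips below focus on reference-data coverage and accuracy.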

Posted September 11, 2012

In recent years, the networks of developers, integrators, consultants, and manufacturers committed to supporting database systems have morphed from one-on-one partnerships into huge ecosystems in which they have become interdependent, subject to crosswinds of trends and shifts that are reshaping their networks. Nowhere is this more apparent than in the huge ecosystem that has developed around Oracle. With Oracle's never-ending string of acquisitions, new functionality, and widespread adoption by enterprises, trends that shape this ecosystem are certain to have far-reaching effects on the rest of the IT world. Concerns that percolate through the ecosystem reflect - and influence - broad business concerns. New paradigms - from cloud computing to big data to competing on analytics - are taking root within the Oracle ecosystem long before anywhere else.

Posted September 11, 2012

Are today's data systems — many of which were built and designed for legacy systems of the past decade — up to the task of moving information to end users at the moment they need it? And is this information timely enough? In many cases, there's a lot of work that still needs to be done before real-time information, drawn from multiple sources, becomes a reality. A new survey of 338 data managers and professionals who are subscribers to Database Trends and Applications reveals that real-time data access is still a distant pipe dream for at least half of the companies represented in the survey. The survey, conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Attunity in March of 2012, finds that close to half of the survey respondents, 48%, report that relevant data within their organizations still take 24 hours or longer to reach decision makers. This suggests that much data is still batch-loaded overnight.

Posted September 11, 2012

Big data and cloud analytics vendor Kognitio has partnered with Xtremeinsights, a provider of solutions for leveraging Hadoop in existing data management systems. Together, the partners aim to deliver software and integration technologies to businesses that want to leverage the Hadoop platform and gain actionable insights from their big data. Using its in-memory analytical platform, Kognitio speeds up the analysis of data from Hadoop clusters, enabling ad hoc, real-time analytics at a significantly lower cost. "Xtremeinsights can build the underlying infrastructure so that your business users can do ad hoc analysis on ridiculous amounts of data and get answers in real-time," Michael Hiskey, Kognitio's vice president of marketing and business development, tells DBTA.

Posted August 23, 2012

SAP AG introduced a new solution to help organizations gain real-time insights into market trends and customer sentiment. The SAP rapid-deployment solution for sentiment intelligence with SAP HANA is intended to allow users to analyze customer sentiment from social networking sites, communities, wikis, blogs and other sources, and combine the information with CRM data. Customers that have had success getting started with big data analytics are the ones that have set out to solve a very specific use case or set out to solve a specific problem, David Jonker, director of marketing for database and technology at SAP, tells DBTA. "The rapid deployment solution for sentiment intelligence does exactly that."

Posted August 23, 2012

TransLattice, a provider of distributed databases and application platforms for enterprise, cloud and hybrid environments, has released TransLattice Elastic Database (TED), which the company describes as the world's first geographically distributed relational database management system (RDBMS). A single database can run on multiple TransLattice nodes around the world, allowing for greater data availability, performance, and scalability at a lower cost than traditional databases. "Since we have the ability to pre-position the data close to end users and have a node that's operating on their behalf in distributed queries, we can offer a much higher level of user experience than conventional systems," Michael Lyle, CTO of TransLattice, explains to DBTA. Additionally, TED makes it easier for global enterprises to comply with data jurisdiction policy requirements.

Posted August 23, 2012

Informatica, Talend, Jaspersoft, and Pervasive Software Inc. have joined the Google Cloud Platform Partner Program as Technology Partners. To help customers get the most out of its cloud platform products, explains Eric Morse, head of Sales and Business Development for Google's Cloud Platform, Google works closely with technology companies that provide powerful complementary solutions integrated with the platform.

Posted August 23, 2012

Pentaho's Business Analytics 4.5 is now certified on Cloudera's latest releases, Cloudera Enterprise 4.0 and CDH4. Pentaho also announced that its visual design studio capabilities have been extended to the Sqoop and Oozie components of Hadoop. "Hadoop is a very broad ecosystem. It is not a single project," Ian Fyfe, chief technology evangelist at Pentaho, tells DBTA. "Sqoop and Oozie are shipped as part of Cloudera's distribution so that is an important part of our support for Cloudera as well - providing that visual support which nobody else in the market does today."

Posted August 23, 2012

Oracle has announced Oracle Exalogic Elastic Cloud Software 2.0. According to Oracle, customers in 43 countries across 22 industries have already adopted Oracle Exalogic, and it is the fastest growing Oracle engineered system with 3x Y/Y sales bookings based on the last two quarters of FY 2012. The second generation of Exalogic is raising the bar even further, with a single integrated system that addresses the key business goals of application owners - to seize market opportunities, lower business risk and reduce cost and complexity, noted Hasan Rizvi, senior vice president for product development at Oracle, who spoke during a webcast presentation to launch the new release.

Posted August 23, 2012

Data management software vendor Terracotta has released the latest version of its flagship product, BigMemory 3.7, providing performance at any scale through in-memory data management. The latest release offers improved in-memory access, allowing customers to store big data in real time with high application speed, performance, and scale. "We've made optimizations that allow people to put more data and capacity into our product so that they get more value out of the data that they put in," Gary Nakamura, general manager of Terracotta, explains to DBTA.

Posted August 23, 2012

For many organizations with established data warehouse solutions, there comes a point where the solution needs to be extended in an unusual direction. This challenge presented itself when a European mobile virtual network operator (MVNO) wanted to add both compliance and archiving functionality to its existing data warehouse. MVNOs differentiate themselves through very competitive pricing on certain parts of the call package; for example, ethnic MVNOs target ethnic communities by providing inexpensive calls to their home country. Enabling such offerings and tracking costs, especially where subscribers start using the service in unexpected and possibly costly ways, is a key business function, and this is where the data warehouse comes in.

Posted August 09, 2012

Every now and then, the IT industry - vendors and customers alike - takes a common problem, gives it a catchy name, and drives the buzz (and market) rapidly to create new business opportunities for everybody. It's happening again with the "big data" phenomenon. It's here, it's real, and yes - it is probably going to cost you a lot of money over the next few years.

Posted August 09, 2012

Big data is tough for enterprises to handle, and adding to the challenge is the fact that much of it is unstructured data - business documents, presentations, log files, and social media data. Respondents to a survey of 264 data managers and professionals - subscribers to Database Trends and Applications - almost unanimously agree that unstructured data is on the rise and ready to engulf their current data management systems. The trouble is, their management typically does not understand the scope of the challenge and is failing to recognize the significance of unstructured data assets to the business.

Posted August 09, 2012

Flash memory is taking the data center world by storm and creating new and innovative opportunities to challenge the status quo. This is occurring across numerous use cases: turning SQL server databases into fraud detection powerhouses for the world's largest retailers, scaling MySQL to support the infrastructure behind a premier sport league's mobile application, and capturing massive data volumes in real time with new NoSQL stores like MongoDB. Across the board, these organizations are breaking new ground with unprecedented performance, scaling beyond what was previously possible, and slashing infrastructure spending.

Posted August 09, 2012

Oracle has introduced a new migration tool that aims to make it easier for users to migrate data from SQL Server to MySQL. The new migration tool is integrated into MySQL Workbench, which allows the visual design, development and administration of MySQL Databases. According to Oracle, with MySQL Workbench, SQL Server developers and DBAs can easily convert existing applications to run on MySQL, both on Windows and other platforms. In addition, to address the growing demand from business analysts using MySQL for data marts and analytic applications, Oracle has announced a new "MySQL for Excel" application plug-in that allows users to import, export and manipulate MySQL data, without requiring prior MySQL technical knowledge.
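One small but representative piece of any SQL Server-to-MySQL migration is mapping column types between the two dialects. MySQL Workbench automates this; the sketch below is only an illustrative subset of such a mapping, written in Python, and the entries and helper are assumptions for demonstration, not the tool's actual rules.

```python
# Illustrative subset of a SQL Server -> MySQL column-type mapping.
TYPE_MAP = {
    "NVARCHAR": "VARCHAR",          # MySQL handles Unicode via utf8 charsets
    "DATETIME2": "DATETIME",
    "BIT": "TINYINT(1)",
    "UNIQUEIDENTIFIER": "CHAR(36)", # store GUIDs as fixed-length text
    "MONEY": "DECIMAL(19,4)",
}

def convert_column(name, mssql_type, length=None):
    """Render a MySQL column definition from a SQL Server column (a sketch)."""
    target = TYPE_MAP.get(mssql_type.upper(), mssql_type.upper())
    if length and target == "VARCHAR":
        target = f"VARCHAR({length})"
    return f"`{name}` {target}"

print(convert_column("id", "UNIQUEIDENTIFIER"))       # `id` CHAR(36)
print(convert_column("name", "NVARCHAR", length=80))  # `name` VARCHAR(80)
```

Type mapping is only the first step; a real migration also converts T-SQL constructs, indexes, and data, which is what the Workbench wizard packages up end to end.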

Posted July 25, 2012

Information availability software provider Attunity has partnered with EMC to offer a high-performance big data replication solution for the EMC Greenplum Unified Analytics Platform, as well as dedicated optimizations to Attunity Replicate from EMC Greenplum. "Enterprise architects have to pay special attention to data flow. Data is their critical asset and right now enterprise architects are really challenged with a 'data bottleneck,'" Matt Benati, Attunity's vice president of global marketing, tells 5 Minute Briefing.

Posted July 25, 2012

The volume of data now being stored by businesses is at a point where the term "big data" almost feels inadequate to describe it. The size of big data sets is a constantly moving target, ranging from a few dozen terabytes to many petabytes of data in a single data set. And it is estimated that, over the next two years, the total amount of big data stored by business will be four times today's volumes. As business continues its inexorable shift to the cloud, weblogs continue to fuel the big data fire. But there are plenty of other sources as well - RFID, sensor networks, social networks, Internet text and documents, Internet search indexing, call detail records, scientific research, military surveillance, medical records, photography archives, video archives, and large-scale e-commerce transaction records.

Posted July 25, 2012

Big data is one of the most significant industry disruptors in IT today. Even in its infancy, it has shown significant ROI and has almost universal relevance to a wide cross-section of the industry. Why? Big data turns traditional information architecture on its head, putting into question commonly accepted notions of where and how data should be aggregated, processed, analyzed, and stored. Enter Hadoop, the open source data-crunching platform, and NoSQL. Although these technologies are hotter than an Internet IPO, you simply can't ignore your current investments - those investments in SQL which drive everything from your data warehouse to your ERP, CRM, SCM, HCM and custom applications.

Posted July 25, 2012

Big data analytics provider Datameer has announced that its technology is now integrated into Dell's Emerging Solutions Ecosystem, which offers complementary hardware, software, and services in a pre-packaged bundle. Built with Dell infrastructure, Datameer analytics technology is offered on top of Cloudera's distribution of Apache Hadoop. The software is well suited to organizations that need to make quick decisions based on structured and unstructured data, such as financial services, retail, telecommunications, and Web 2.0 companies. "The advantage to the end user is they can buy all of this from one source, it's all integrated together, it runs seamlessly, and it's an easy solution for people to purchase and use," Joe Nicholson, Datameer's vice president of marketing and business development, tells DBTA.

Posted July 25, 2012

Jaspersoft, maker of business intelligence (BI) software, today announced availability of Jaspersoft Business Intelligence 4.7, in which reports generated by JasperReports Server now give casual users the ability to interact with more of their data. According to Jaspersoft, its open source business model and zero-cost per-user licensing fees make interactive reporting affordable for even the largest scale reporting projects. Additional improvements in Jaspersoft 4.7 include direct native connectivity to big data sources and expanded mobile device support. "This is exciting for a couple of reasons," Mike Boyarski, director of product marketing for Jaspersoft, tells DBTA. "One is that no BI tool out there so far has this level of interactivity for what we call casual BI users."

Posted July 25, 2012

Syncsort, a global leader in high-performance data integration solutions, has certified its DMExpress data integration software for high-performance loading of Greenplum Database. Syncsort has also joined the Greenplum Catalyst Developer Program. Syncsort DMExpress software delivers extensive connectivity that makes it easy to extract and transform data from nearly any source, and rapidly load it into the massively parallel processing (MPP) Greenplum Database without the need for manual tuning or custom coding. "IT organizations of all sizes are struggling to keep pace with the spiraling infrastructure demands created by the sheer volume, variety and velocity of big data," says Mitch Seigle, vice president, Marketing and Product Management, Syncsort.
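The extract-transform-load pattern that tools like DMExpress industrialize at scale can be sketched in miniature. The following Python toy reads rows from an in-memory CSV "source," cleans and converts them, and appends them to a stand-in "warehouse" list; the data, field names, and loader are illustrative assumptions, and a real loader would bulk-insert into the target database.

```python
import csv
import io

# Stand-in for an extract source: a small CSV with messy whitespace.
source = io.StringIO("id,amount\n1, 10.50 \n2, 3.25 \n")

def extract(fh):
    # Extract: read raw rows as dictionaries.
    return list(csv.DictReader(fh))

def transform(rows):
    # Transform: strip whitespace and convert dollar amounts to integer cents.
    return [{"id": int(r["id"]),
             "cents": int(float(r["amount"].strip()) * 100)}
            for r in rows]

def load(rows, table):
    # Load: a real loader would bulk-insert into the MPP database.
    table.extend(rows)

warehouse = []
load(transform(extract(source)), warehouse)
print(warehouse[0]["cents"])  # 1050
```

The hard part at big data volumes is doing this continuously, in parallel, without hand tuning, which is precisely the gap the Syncsort-Greenplum certification targets.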

Posted July 25, 2012

Compuware Corporation has launched a free online cloud service enabling organizations to compare the speed of their website's performance against leading competitor sites. SpeedoftheWeb.org provides a series of industry-relevant indexes with categorized collections of 1,000 of the world's most trafficked websites - as ranked by Alexa and Compuware Gomez benchmarks. As end-user expectations become greater, the cost of poor website performance also grows, both in terms of lost revenues and brand loyalty. In addition, end users are showing less patience for slow-performing websites, according to Compuware. In response to these heightening requirements, organizations need to ensure performance along every spot in the web application delivery chain, Alois Reitbauer, Compuware APM's Web 2.0 and Mobile Performance Expert and author of Java Enterprise Performance, tells DBTA.

Posted July 25, 2012
