Database Management Articles
Lectures related to master data bring forth all sorts of taxonomies intended to help clarify master data and its place within an organization. Sliding scales may be presented: at the top, not master data; at the bottom, very much master data; in the middle, increasing degrees of "master data-ness." For the longest time, everyone thought metadata was confusing enough ... oops, we've done it again. And we have established this master data semantic monster in quite a grand fashion.
Posted February 15, 2012
Oracle has announced the availability of Oracle Advanced Analytics, a new option for Oracle Database 11g that combines Oracle R Enterprise with Oracle Data Mining. According to Oracle, Oracle R Enterprise delivers enterprise class performance for users of the R statistical programming language, increasing the scale of data that can be analyzed by orders of magnitude using Oracle Database 11g.
Posted February 14, 2012
Today's organizations must capture, track, analyze and store more information than ever before - everything from mass quantities of transactional, online and mobile data, to growing amounts of "machine-generated data" such as call detail records, gaming data or sensor readings. And just as volumes are expanding into the tens of terabytes, and even the petabyte range and beyond, IT departments are facing increasing demands for real-time analytics. In this era of "big data," the challenges are as varied as the solutions available to address them. How can businesses store all their data? How can they mitigate the impact of data overload on application performance, speed and reliability? How can they manage and analyze large data sets both efficiently and cost effectively?
Posted February 09, 2012
Many types of data change over time, and different users and applications have requirements to access data at different points in time. A traditional DBMS stores data that is implied to be valid at the current point in time; it does not track the past or future states of the data. For some, the current, up-to-date values for the data are sufficient. But for others, accessing earlier versions of the data is needed. Temporal support makes it possible to store different database states and to query the data "as of" those different states.
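The "as of" idea can be sketched in a few lines of code. The following is an illustrative model only, not any particular DBMS's temporal feature; the column names (`valid_from`, `valid_to`) and the closed-open interval convention are assumptions made for the example.

```python
from datetime import date

# Each version of a row carries a validity interval [valid_from, valid_to).
# A valid_to of None means "valid until further notice" (the current version).
rows = [
    {"id": 1, "price": 10, "valid_from": date(2011, 1, 1), "valid_to": date(2011, 7, 1)},
    {"id": 1, "price": 12, "valid_from": date(2011, 7, 1), "valid_to": None},
]

def as_of(rows, key, when):
    """Return the version of row `key` that was valid on date `when`."""
    for r in rows:
        if r["id"] == key and r["valid_from"] <= when and (
            r["valid_to"] is None or when < r["valid_to"]
        ):
            return r
    return None

print(as_of(rows, 1, date(2011, 3, 1))["price"])  # 10 (the earlier version)
print(as_of(rows, 1, date(2011, 8, 1))["price"])  # 12 (the current version)
```

A temporal DBMS performs essentially this filtering for you, typically via an `AS OF` clause, so that applications needing only current data and applications needing history can share one table.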
Posted February 09, 2012
InterSystems Corporation, a provider of advanced database, integration and analytics technologies, announced it has become ISO 9001:2008 certified. ISO 9001:2008 is a quality management standard, and for InterSystems, the certification covers all processes related to the product and service creation associated with the InterSystems CACHÉ high-performance database and InterSystems Ensemble integration and development platform that are performed or managed from InterSystems' Cambridge-based headquarters.
Posted February 09, 2012
Ntirety, Inc. announced that it has been successfully audited and certified under the MSPAlliance's (MSPA) Unified Certification Standard for Cloud and Managed Service Providers (UCS). The certification is specifically designed to provide business consumers of cloud and managed services with the assurance that the service provider they hire will meet or exceed the highest principles of quality in areas such as financial stability, facilities, managed services practices, and customer satisfaction.
Posted February 07, 2012
KXEN, a provider of predictive analytics for business users, has certified its flagship product, InfiniteInsight, for Sybase IQ v.15, the column-based analytics server. The combination of the two solutions allows businesses to gain speed and performance in building predictive models and social network analysis to support business decisions.
Posted January 25, 2012
Star Analytics, Inc., a provider of application process automation and integration software, has released the latest version of its data bridging technology, Star Integration Server, which is intended to make it easy to extract and combine data from Oracle systems with other business intelligence (BI) applications and data warehouses.
Posted January 24, 2012
HiT Software, Inc. has announced a new release of its JDBC/DB2 type 4 SQL middleware, which conforms to the Java JDBC 4.1 specification. With this latest release, application developers can take advantage of added support for additional data types, improved security mechanisms and support for IBM DB2 on a wide range of systems and platforms.
Posted January 23, 2012
"Big data" and analytics have become the rage within the executive suite. The promise is immense - harness all the available information within the enterprise, regardless of data model or source, and mine it for insights that can't be seen any other way. In short, senior managers become more effective at business planning, spotting emerging trends and opportunities and anticipating crises because they have the means to see both the metaphorical trees and the forest at the same time. However, big data technologies don't come without a cost.
Posted January 11, 2012
Let's tie together the last several columns on "2012 Might Really be The End of the World." In this series, I discussed several megatrends in the general IT industry that will have a tremendous impact on the database administration (DBA) profession. The megatrends include both software-related (virtualization and cheap cloud database services) and hardware-related (SSDs and massively multi-core CPUs). These technologies have the potential to obviate many of the core competencies of the DBA, with the first two eliminating or lessening the need for server and hardware configuration and provisioning, and the last two diminishing the need for IO tuning and query tuning, respectively. But those are trends that will take years to reach fruition. What about the near future?
Posted January 11, 2012
Along with thousands of IT professionals, I was in the San Francisco Moscone Center main hall last October listening to Larry Ellison's 2011 Oracle OpenWorld keynote. Larry can always be relied upon to give an entertaining presentation, a unique blend of both technology insights and amusingly disparaging remarks about competitors.
Posted January 11, 2012
Retaining the particulars of change over time is a fairly intricate undertaking. Audit log or shadow tables are sometimes employed, but on occasion the "old" and "new" rows need to exist in a single operational table for application use. Far too often, the implementation of temporal data structures is shoddy, loose, and imprecise, rather than the fairly complex dance such temporal arrangements must actually perform. The sub-optimal result is much like one's performance of the Funky Chicken at a friend's wedding; the desired moves are mimicked, after a fashion, but it is unlikely to earn high marks on "So You Think You Can Dance." The usual temporal implementation simply slaps on start and stop dates, debates a little over default date values versus NULLs, then moves on to the next subject.
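The "complex dance" above has a precise core step: a logical update must close the current version's interval and insert a new version, so that consecutive intervals meet with no gap and no overlap. The sketch below illustrates that step with an in-memory list standing in for the operational table; the field names (`start_ts`, `end_ts`) and the use of `None` for "current" are assumptions for illustration, not a prescription.

```python
from datetime import datetime

# History "table" sketch: each logical key keeps every version, with a
# closed-open interval [start_ts, end_ts); the current row has end_ts = None.
history = []

def update(key, value, now):
    """Close the current version of `key` (if any) and insert a new one."""
    for row in history:
        if row["key"] == key and row["end_ts"] is None:
            row["end_ts"] = now  # old interval ends exactly where the new one begins
    history.append({"key": key, "value": value, "start_ts": now, "end_ts": None})

update("rate", 5.0, datetime(2011, 1, 1))
update("rate", 5.5, datetime(2011, 6, 1))

current = [r for r in history if r["end_ts"] is None]
print(len(history), current[0]["value"])  # 2 5.5
```

The discipline is in the invariant, not the columns: every update touches two rows, the old row's end equals the new row's start, and exactly one row per key is current. Slapping on start and stop dates without enforcing that invariant is how the shoddy implementations arise.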
Posted January 11, 2012
At the outset of each new year, I devote an edition of my column to reviewing the significant data and database-related events of the previous year. Of course, to meet my deadlines, the column is written before the year is over (this column is being written in November 2011), so please excuse any significant news that may have happened late in December.
Posted January 11, 2012
The latest version of expressor software's flagship data integration platform, expressor 3.5, features cloud integration with Melissa Data's Data Quality Tools and Salesforce.com to provide comprehensive BI reporting and CRM integration with on premises applications. The new Salesforce.com and Melissa Data capabilities ship with expressor 3.5 Desktop Edition and Standard Edition.
Posted January 10, 2012
ScaleBase, Inc. has announced the results of its database benchmark test. ScaleBase has achieved 180,000 Transactions per Minute - the highest result for a MySQL database - while running on an Amazon RDS environment. According to the company, the ScaleBase Load Balancer solution proved how well it can scale MySQL, by running a DBT-2 benchmark, which is similar to the standard TPC-C benchmark, on the Amazon EC2 platform with the Amazon RDS database.
Posted January 10, 2012
Oracle has announced the availability of Oracle Solaris Studio 12.3, a C, C++, and Fortran development platform for building fast, scalable enterprise applications for Oracle Solaris systems. According to Oracle, the new release accelerates performance of SPARC T4 and x86-based applications up to 300% by leveraging Oracle's advanced compiler technology.
Posted January 04, 2012
BNP Paribas has implemented Oracle Exadata Database Machine to manage electronic trading floor data. BNP Paribas' data warehouse manages billions of messages in real time, processing a terabyte of raw data daily. A half-rack Oracle Exadata Database Machine has helped BNP Paribas better manage data growth and improve system performance.
Posted January 04, 2012
Sybase has launched the "Mobility Manifesto" site to support and encourage enterprise workers to have access to the devices and applications they want to use. The Mobility Manifesto allows enterprise workers to take a quiz to find out where their company ranks on mobility, share the results with their boss through an auto-fill letter that sets the tone for change in the workplace, and download an iBook including guidance and advice for both end users and IT.
Posted December 21, 2011
SAP AG has announced that since introducing SAP HANA a year ago, customer and partner demand for the technology has surged. According to Dr. Vishal Sikka, member of the SAP executive board, Technology & Innovation, leading independent software vendors are adopting the open SAP HANA platform for their existing products and also building completely new applications. The company also announced at the recent SAP Influencer Summit 2011 in Boston that SAP HANA is at the core of its platform roadmap, powering both renewed applications without disruption as well as new ones.
Posted December 21, 2011
The first calendar year following SAP's acquisition of Sybase is coming to a close. David Jonker, director, product marketing - Data Management & Analytics, Sybase, discusses key product integrations, IT trends that loom large in Sybase's data management strategies, and the emergence of what Sybase describes as DW 2.0. 2011 has been "a foundational year," with effort focused on making Sybase technologies work with SAP and setting the stage for 2012, says Jonker. "We believe 2012 is going to be a big year for us on the database side."
Posted December 21, 2011
LexisNexis, a pioneer of information technology, has selected big data specialist MarkLogic to power components of the new platform behind Lexis Advance, its legal research solution.
Posted December 08, 2011
EMC Corporation has introduced the EMC Greenplum Unified Analytics Platform (UAP), a platform to support big data analytics that combines the co-processing of structured and unstructured data with a productivity engine that enables collaboration among data scientists. The new EMC Greenplum UAP brings together the EMC Greenplum database for structured data, the enterprise Hadoop offering EMC Greenplum HD for the analysis and processing of unstructured data, and EMC Greenplum Chorus, its new productivity engine for data science teams. Greenplum UAP will be available in the first quarter of calendar 2012.
Posted December 08, 2011
A new blog on the SHARE website places the focus on the mainframe as a big data workhorse and the reigning alternative to internal (or external) cloud provision. Pedro Pereira, authoring the blog in SHARE's "President's Corner," makes several astute observations including identifying security and availability as unknowns in a cloud environment.
Posted December 06, 2011
Join Oracle and Unisphere for a live webcast to learn more about common practices that are most vulnerable to fraud and error, and the best practices and technologies used by leading vs. laggard organizations to drive the hidden costs out of operations and enforce process controls. Speakers will include Thomas J. Wilson, president, Unisphere Research; Joseph McKendrick, analyst, Unisphere Research; and Stephanie Maziol, director GRC Applications, Oracle.
Posted December 06, 2011
Three columns ago, I started a series of articles pointing out that tough times are a-comin' for the DBA profession due to major disruptive changes in the wider IT world (see "2012 Might Really Be the End of the World as We Know It"). In previous columns, I have told you about how our lives will change due to major technological changes caused by things such as Solid State Disks (SSD) and massively multicore CPUs.
Posted December 06, 2011
In a world replete with regulations and threats, organizations today have to go well beyond just securing their data. Protecting this most valuable asset means that companies have to perpetually monitor their systems in order to know who did exactly what, when and how - to their data.
Posted December 01, 2011
The cost for new development can often be easily justified. If a new function is needed, staffing a team to create such functionality and supporting data structures can be quantified and voted up or down by those controlling resources. Money can be found to build those things that move the organization forward; often, the expense may be covered by savings or increased revenue derived from providing the new services.
Posted December 01, 2011
expressor software, a provider of data integration software, has launched a product initiative aimed at simplifying the development of expressor and Teradata Express analytical database applications based on a diverse set of operational data sources. One of the biggest challenges people face is figuring out how to get data loaded into their database in order to begin using it for analysis and development without using traditional ETL solutions, which can be very costly, Hugo Sheng, director, field engineering, expressor, tells 5 Minute Briefing.
Posted November 22, 2011
Being a successful database administrator requires far more than technical acumen and database knowledge. DBAs should be armed with a proper attitude as well as sufficient fortitude and personality before attempting to practice database administration. Gaining the technical know-how is important, yes, but there are many sources that offer technical guidance for DBAs. The non-technical aspects of the DBA's job are just as challenging, though. So with that in mind, this month's column will offer 10 "rules of thumb" for DBAs to follow as they improve their soft skills.
Posted November 22, 2011
UC4, an IT automation software vendor, announced a partnership with Basis Technologies International (BTI), a provider of add-on solutions that optimize SAP, intended to deliver targeted automation solutions for SAP customers. The joint offering, dubbed "UC4 MDR powered by BTI," will integrate BTI's Mass Data Runtime - an SAP NetWeaver-based solution that delivers data processing improvements - with UC4's ONE Automation platform. The combination of these two products is designed to help accelerate and orchestrate business processes, applications and infrastructure with SAP environments.
Posted November 16, 2011
Sybase, an SAP company, has unveiled a new guide on big data analytics titled "Intelligence for Everyone: Transforming Business Analytics Across the Enterprise." The objective of the guide is to demonstrate through facts and examples that there are tools and methods to make sense of new and massive data sets, service all users at all levels, and analyze many different data types to provide actionable answers.
Posted November 16, 2011
Oracle has introduced PeopleSoft HCM 9.1 Feature Pack 2 which provides a consumer-like self-service user experience to improve the way employees and managers perform their day-to-day activities. Oracle's Feature Packs enable a quicker response to PeopleSoft customer requests, while enabling customers to choose how and when to deploy the new functionality to their users.
Posted November 16, 2011
Oracle has announced the availability of Oracle Solaris 11, which the company says can meet the security, performance and scalability requirements of cloud-based deployments and enable customers to run their most demanding enterprise applications in private, hybrid, or public clouds.
Posted November 16, 2011
SAP AG has announced the availability of SAP NetWeaver Business Warehouse (SAP NetWeaver BW) 7.3 running on the SAP HANA platform. SAP HANA can enhance query performance and provide faster data loads in SAP NetWeaver BW, and can help customers significantly reduce total cost of ownership by simplifying administration with reduced data layers, according to SAP. The announcement was made at the SAPPHIRE NOW and SAP TechEd co-located event in Madrid.
Posted November 16, 2011
Sybase has unveiled a new version of the Sybase IQ high performance column-based analytics database, due to be generally available by the end of November. "This really is an extension into big data - and big data is characterized by a lot of things - but we see the trends in the market around MapReduce and Hadoop in database analytics and we have added those capabilities into IQ 15.4," Dan Lahl, director of product marketing at Sybase, tells 5 Minute Briefing. With the new release of IQ, he notes, Sybase IQ provides customers a "have it your way" approach.
Posted November 16, 2011
Few things in the world are changing as dramatically as data. Data has tapped out a powerful rhythm to keep time with technology's bleeding edge, leaving many technologies struggling to keep up. It should come as no surprise, then, that many of the data strategies that IT departments developed, and still widely rely upon, are no longer sufficient for today's needs. You can put data marts near the top of that list.
Posted November 10, 2011
It is not magic. Building a successful IT solution takes time. And that time is used in various ways: obtaining an understanding of the goal; mapping out what components are necessary and how those components interact; testing components and their interaction; and finally migrating those components into the production environment - otherwise known as analysis, design, development, testing, and deployment. Regardless of the methodology employed, these functions must always be addressed. Different approaches focus on differing needs and aspects. But any complete methodology must fill in all the blanks for accomplishing each of these tasks.
Posted November 10, 2011
InterSystems Corporation has announced that Kettering Health Network has completed an enterprise-wide transition to the InterSystems Ensemble rapid integration and development platform. Kettering is an integrated delivery network (IDN) comprised of more than 60 state-of-the-art facilities where patients throughout the Dayton, Ohio, area are served by more than 1,200 physicians.
Posted November 10, 2011
Schooner Information Technology, Inc., which develops and markets high-availability (HA) high-performance MySQL software for mission-critical applications, has unveiled SchoonerSQL v5.1, a full build of the MySQL database and its standard InnoDB storage engine, with additional Schooner enhancements.
Posted November 02, 2011
Oracle has entered into an agreement to acquire Endeca Technologies, Inc., a provider of unstructured data management, web commerce and business intelligence solutions. "Together, we will provide best-in-class technology to manage structured and unstructured data together; business intelligence tools to analyze structured and unstructured data together; and a broad suite of packaged applications which extends the value of unstructured data into ERP, Supply Chain, CRM, EPM, Web Commerce, and specialized applications," said Thomas Kurian, executive vice president, Oracle Development, in an announcement issued by Oracle. "This technology will also allow us to integrate more comprehensive unstructured data management into Oracle's engineered systems."
Posted November 02, 2011