Artificial Intelligence Articles

Delta Air Lines is embarking on a multi-year collaborative effort with IBM, including joining the IBM Q Network, to explore the potential of quantum computing to transform experiences for customers and employees. Through the IBM Q Hub at NC State University, Delta will have access to the IBM Q Network's fleet of universal hardware quantum computers for commercial use cases and fundamental research, including the recently announced 53-qubit quantum computer, which, the company says, has the most qubits of any universal quantum computer available for external access in the industry to date.
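
IBM Q systems are typically programmed through the open source Qiskit SDK. As a purely illustrative sketch of what that looks like, and not part of the Delta/IBM announcement, the Python snippet below builds a minimal two-qubit entangling circuit.

# Illustrative only: a minimal two-qubit Bell-state circuit written with
# Qiskit, the open source SDK used to program IBM Q hardware and simulators.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)   # two qubits, two classical bits
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])  # read both qubits out into the classical bits

print(qc.draw())            # text rendering of the circuit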

Posted January 27, 2020

Hitachi Vantara, a wholly owned subsidiary of Hitachi, Ltd., has announced plans to acquire the business of privately held Waterline Data, Inc. Waterline Data's patented "fingerprinting" technology uses AI- and rule-based systems to automate the discovery, classification, and analysis of distributed and diverse data assets to accurately and efficiently tag large volumes of data based on common characteristics.
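
Waterline's patented fingerprinting algorithms are proprietary, but the rule-based side of automated data classification can be roughly illustrated. The hypothetical Python sketch below tags a column when enough of its values match a simple pattern; the rules and threshold are invented for illustration.

# Hypothetical sketch: rule-based tagging of a data column by value patterns.
# Illustrates the general idea of automated classification, not Waterline
# Data's patented fingerprinting technology.
import re

RULES = {
    "email":    re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_phone": re.compile(r"^\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}$"),
    "iso_date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
}

def tag_column(values, threshold=0.8):
    """Return the tags whose pattern matches at least `threshold` of the values."""
    values = [v for v in values if v]
    if not values:
        return set()
    tags = set()
    for tag, pattern in RULES.items():
        hits = sum(1 for v in values if pattern.match(str(v)))
        if hits / len(values) >= threshold:
            tags.add(tag)
    return tags

print(tag_column(["ann@example.com", "bob@example.org", "n/a"], threshold=0.5))
# {'email'}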

Posted January 27, 2020

Circonus, provider of a machine data intelligence platform, is receiving $6.8 million in a Series A1 investment to continue advancing the platform and meet market demand for machine data intelligence across multiple industries.

Posted January 21, 2020

Access Innovations director of business development Bob Kasenchak discusses how to manage the unique challenges of pre-digital source content in this clip from his presentation at Data Summit 2019.

Posted January 21, 2020

Dynatrace, a software intelligence company, is collaborating with Google and Microsoft on the OpenTelemetry project to shape the future of open standards-based observability.

Posted January 14, 2020

We are at the start of a new year and a new decade, and technology is poised for fast change. Recently, Ram Chakravarti, CTO at BMC Software, offered his predictions for the autonomous digital enterprise, edge computing and IoT solutions, ITSM, AIOps, and more.

Posted January 09, 2020

To fit into modern analytics ecosystems, legacy data warehouses must evolve—both architecturally and technologically—to deliver the agility, scalability, and flexibility that businesses need to thrive in today's data-driven economy. Alongside new architectural approaches, a variety of technologies have emerged as key ingredients of modern data warehousing, from data virtualization and cloud services to Hadoop and Spark to machine learning and automation.
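
As a small, hypothetical example of one of those ingredients in action, the PySpark snippet below rolls raw data-lake files up into a warehouse-style summary table; the paths and column names are invented for illustration.

# Hypothetical sketch: using Spark to build a warehouse-style summary table
# from raw data-lake files. Paths and column names are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

orders = spark.read.parquet("s3://example-lake/raw/orders/")  # assumed raw zone

daily_sales = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_sales"),
         F.countDistinct("customer_id").alias("unique_customers"))
)

daily_sales.write.mode("overwrite").parquet("s3://example-warehouse/marts/daily_sales/")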

Posted January 09, 2020

erwin, the data governance company, is releasing an updated version of erwin Data Modeler (erwin DM), introducing new innovations across its enterprise modeling and data governance suites.

Posted January 08, 2020

Today's enterprise clouds are evolving, and businesses need to evolve with them by embracing the next-generation cloud model. As a new year—and a new decade—unfolds, here are 5 predictions from Oracle on how new technologies and business models are changing.

Posted January 08, 2020

Pythian VP Lynda Partner stresses the importance of regarding data projects as software projects with attendant DevOps, DataOps, and MLOps components in this clip from her keynote at Data Summit 2019.

Posted January 06, 2020

This has been a banner year for cybercrime, with hackers targeting consumers, government agencies, and private corporations alike. According to the Ponemon Institute, the average total cost of a data breach is $3.86 million, and 80% of U.S. businesses expect that they will have had a critical breach this year. These numbers are not only sizable; they're alarming.

Posted December 30, 2019

Many organizations are experimenting with AI programs, but most of them face a significant and seemingly intractable problem. Although proof-of-concept (POC) projects and minimum viable products (MVPs) may show value and demonstrate a potential capability, frequently, they are difficult to scale.

Posted December 30, 2019

For many, governance remains a dirty word: It's bureaucratic, restrictive, and slows things down. This perception is diametrically opposed to data governance's true objective, which is to enable rapid yet appropriate exploitation of enterprise data assets.

Posted December 23, 2019

There's no doubt that AI has taken center stage in the enterprise data and analytics world, as evidenced by the mass quantities of related headlines, conferences, and vendor marketing. But hype aside, business executives are now discovering how to leverage AI, whether for improved decision making with augmented or assistive intelligence solutions or for competitive advantage in new products and services. AI is proving its viability in the real world, including enterprise data and analytics.

Posted December 23, 2019

In 2019, artificial intelligence and machine learning continued their upward trajectory in the market, promising to change the future as we know it. To help support data management processes and decision making, artificial and augmented intelligence are being infused into products and services. Machine learning sits at the center of all AI conversations, as combining machine learning with AI and cognitive technologies can make it even more effective in processing large volumes of information. Both technologies can lead to automation of tasks inside and outside the enterprise, another subject that promises to make waves in the future. Here, executives of leading companies offer 10 predictions for what's ahead in 2020.

Posted December 20, 2019

Dotscience, a provider of DevOps for Machine Learning (MLOps) solutions, is forming partnerships with GitLab and Grafana Labs, along with strengthening integrations with several platforms and cloud providers. The company is deepening integrations to include Scikit-learn, H2O.ai and TensorFlow; expanding multi-cloud support with Amazon Web Services (AWS) and Microsoft Azure; and entering a joint collaboration with global enterprises to develop an industry benchmark for helping enterprises get maximum ROI out of their AI initiatives.

Posted December 18, 2019

The next decade is just around the corner and enterprises in and around the big data space are preparing to pounce upon the next set of trends the new year will bring. The cloud is primed to continue making waves, along with other digital disruptions to improve user experiences. Several industry experts from SAP have offered up what they see as the top trends for 2020. 

Posted December 18, 2019

Oracle business strategist Lee Levitt laments the lack of data-driven organizations in today's business world in this clip from his keynote at Data Summit 2019.

Posted December 18, 2019

The big data ecosystem has changed. No longer is it true that people just want to store massive quantities of data. Action is not only needed but must be taken to sustain the viability of an organization. While some say big data is dead, the concept isn't going anywhere. Instead, it is the notion of inaction on big data that is dead. In addition, the technologies that were built to store and process big data are not loved by the industry; they are merely tolerated and are often maligned. They have been difficult to put into production, maintain, and manage, and it has been hard to find people with the skills to do all the work.

Posted December 16, 2019

Accenture is acquiring Clarity Insights, a U.S.-based data consultancy with deep data science, artificial intelligence (AI), and machine learning (ML) expertise. The acquisition will add nearly 350 employees, along with a strong portfolio of accelerators, which can help organizations more quickly realize value from their data, to Accenture's Applied Intelligence business. These additions will further equip clients with leading capabilities to meet the growing demand for enterprise-scale AI, analytics, and automation solutions.

Posted December 13, 2019

As the urgency to compete on analytics continues to revolutionize the business world, more and more organizations are moving their data to the cloud to reduce infrastructure costs, increase efficiencies, and improve time-to-value. At the same time, there are many success factors to consider, from the strengths and weaknesses of different cloud providers to integration hurdles, data latency challenges, and governance problems.

Posted December 12, 2019

dotData, focused on delivering full-cycle data science automation and operationalization for the enterprise, has achieved Advanced Technology Partner status in the Amazon Web Services (AWS) Partner Network (APN). Achieving APN Advanced Technology Partner status is recognition of dotData's ability to deliver data science automation and machine learning (ML) automation on AWS.

Posted December 12, 2019

GigaSpaces, the provider of InsightEdge, is releasing GigaSpaces Version 15.0, including updates to the InsightEdge Platform and XAP, to operationalize and optimize machine learning. GigaSpaces Version 15.0 powers machine learning operations (MLOps) initiatives, helping enterprises maximize the business value derived from big data.

Posted December 11, 2019

Qualitest, a software testing and quality assurance company, has acquired AI and machine learning company AlgoTrace, marking the first step of Qualitest's growth strategy following an investment from Bridgepoint earlier this year.

Posted December 11, 2019

Oracle OpenWorld, which has for many years been held in San Francisco during the month of September or October, is moving to Las Vegas. "Oracle is excited to offer a modern, state-of-the-art experience for attendees at Oracle OpenWorld and Code One 2020 in Las Vegas," an Oracle spokesperson said in a written statement. "The city and its vast amenities are tailor-made for hosting large-scale events, and we look forward to bringing the industry's most comprehensive technology and developer conference to America's premier hospitality destination."

Posted December 11, 2019

A.M. Turing Award Laureate and database technology pioneer Michael Stonebraker delivered a welcome keynote at Data Summit 2019. He discussed the "fly in the ointment to the data warehouse crowd."

Posted December 11, 2019

Processing big data in real time for artificial intelligence, machine learning, and the Internet of Things poses significant infrastructure challenges. Whether it is for autonomous vehicles, connected devices, or scientific research, legacy NoSQL solutions often struggle here: built on top of existing RDBMSs, they tend to strain when asked to analyze and act upon data at hyperscale, petabytes and beyond.

Posted December 09, 2019

Prior to 2008, whatever your database question, the answer was always "Oracle"—or sometimes "MySQL" or "SQL Server." The relational database management system (RDBMS) model—really a triumvirate of technologies combining Codd's relational data model, the ACID transaction model, and the SQL language—dominated database management systems completely.
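
For readers newer to that triumvirate, the short sketch below uses Python's built-in sqlite3 module to show the pieces working together: relational tables, SQL statements, and an atomic transaction that either commits both writes or rolls both back. The schema is invented for illustration.

# Minimal illustration of the RDBMS triumvirate: relational tables, SQL,
# and an ACID transaction, using Python's built-in sqlite3 module.
# The accounts schema is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER NOT NULL)")
conn.executemany("INSERT INTO accounts (id, balance) VALUES (?, ?)", [(1, 100), (2, 50)])
conn.commit()

try:
    with conn:  # one atomic transaction: commit on success, roll back on error
        conn.execute("UPDATE accounts SET balance = balance - 30 WHERE id = 1")
        conn.execute("UPDATE accounts SET balance = balance + 30 WHERE id = 2")
except sqlite3.Error:
    pass  # on failure, neither update is applied

print(conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall())
# [(1, 70), (2, 80)]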

Posted December 09, 2019

Compuware has announced the findings of a global survey of 400 IT leaders, which reveals that manual testing practices, still in widespread use, are one of the biggest challenges large organizations face as they attempt to accelerate digital innovation. The survey examined the processes that organizations have in place to deliver innovation on the mainframe as quickly as in their distributed environments—which are highly reliant on the mainframe.

Posted December 09, 2019

Let's cut right to the chase. Today, "big data" is just data, and the majority of organizations recognize the importance of being data-driven. But data, of whatever size, is only valuable if it's accessible, trustworthy, and usable.

Posted December 02, 2019

Think about the last time you filled out a paper form and contrast that with how many times you've filled out forms online. We live in an era where everything is digital. Forms are online, every click is captured, and even personal lives are documented on social media. The first wave of digitization led to more BI and better data-driven decisions. But, as we head into 2020, the focus has shifted from BI to operational analytics. Traditional BI was focused on enabling executives to make decisions using historical data. It was accelerated by technologies such as Hadoop, which were built for scale but could not deliver results.

Posted December 01, 2019

As we stand at the start of a new year and on the precipice of a new decade, the 2020s, DBTA reached out to industry leaders for their perspectives not only on what's ahead in the year 2020 but also on what they see developing as the next decade unfolds.

Posted December 01, 2019

Whether you are reading the news, going to the store, dealing with customer service or sending a package, it has become apparent that AI is becoming part of our daily lives. We can see this on more of a macro level with the automotive industry and its adoption of AI to improve the overall driving experience, as well as the healthcare industry as it uses the technology to automate the process of identifying and ultimately diagnosing high-risk patient groups. Even the agriculture industry is taking advantage of AI to improve operating efficiency and assist with the automation of essential farming processes.

Posted December 01, 2019

ServerFarm, a data center developer and operator, is joining the NVIDIA DGX-Ready Data Center program, connecting service providers and enterprises with AI-ready facilities.

Posted November 27, 2019

Quest Software's Jason Hall discusses three key trends in database management—multi/hybrid, open-source databases, and DevOps—in this clip from his presentation at Data Summit 2019.

Posted November 26, 2019

SAP is releasing the latest version of S/4HANA, which includes new integrations, better product design, and more. With the 1911 release, an initial integration is delivered between SAP S/4HANA Cloud and Experience Management solutions from SAP (Qualtrics). SAP says this integration makes it an experience management-driven company, allowing selected customers to share their SAP S/4HANA Cloud software experience with SAP.

Posted November 20, 2019

Oracle is announcing a new CDM for B2C Service as well as a new high-velocity digital sales solution. The offerings were introduced in recent Oracle blog posts. Oracle Digital Sales will be Oracle's first CX application featuring Oracle's new Redwood user experience, which was introduced at Oracle OpenWorld.

Posted November 20, 2019

Kinetica's Active Analytics Platform now incorporates NVIDIA RAPIDS, a GPU-acceleration boost for machine learning, to deliver precision business predictions across the entire enterprise. The platform can be deployed on-premises or in the cloud via a broad ecosystem, including strategic partners such as NVIDIA, Dell Technologies, and Oracle Cloud Infrastructure.
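
RAPIDS is an open source suite of GPU-accelerated data science libraries. As a rough, hypothetical illustration of the kind of workload it speeds up, the snippet below uses cuDF's pandas-like API; it assumes an NVIDIA GPU with RAPIDS installed, and the file and column names are invented.

# Hypothetical sketch of GPU-accelerated dataframe work with RAPIDS cuDF.
# Assumes an NVIDIA GPU with RAPIDS installed; file and column names
# are invented for illustration.
import cudf

gdf = cudf.read_csv("transactions.csv")  # loaded straight into GPU memory

summary = (
    gdf.groupby("store_id")
       .agg({"amount": "sum"})
       .reset_index()
       .rename(columns={"amount": "total_amount"})
)

print(summary.head())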

Posted November 20, 2019

Aerospike, a provider of a patented hardware-optimized NoSQL data platform for real-time transactional and AI/ML-based applications that require machine speed and machine scale, has added new funding to expand the company's geographic presence, develop additional data infrastructure integrations, and grow enterprise partnerships.

Posted November 19, 2019

PlanetScale has announced the general availability of PlanetScale CNDb, a fully managed cloud native database designed on Vitess, a Cloud Native Computing Foundation (CNCF)-hosted open source project that serves massive scale production traffic at large web-scale companies such as YouTube, Slack, and Square.

Posted November 18, 2019

Today, more than ever, businesses rely on data to deliver a competitive edge. However, as applications continue to grow in scale and complexity, so does the challenge of supporting them. DBTA recently held a roundtable webinar with Shivani Gupta, principal product manager, Couchbase; Rick Golba, product marketing manager, Percona; and Craig Chaplin, senior product manager, Magnitude Software, who discussed new data management technologies and techniques for meeting the speed and scalability requirements of modern applications.

Posted November 13, 2019

IBM's new AI-powered monitoring solution is designed to help maintenance and operations leaders better understand and improve the performance of their high-value physical assets.

Posted November 11, 2019

Datameer, a provider of data preparation and exploration, is releasing Neebo, a new platform that enables analytics and data science teams to utilize information assets in hybrid landscapes. 

Posted November 06, 2019

Dotscience is offering new platform advancements that make deploying and monitoring machine learning models on Kubernetes clusters simple and accessible to data scientists. New Dotscience Deploy and Monitor features dramatically simplify the act of deploying ML models to Kubernetes and setting up monitoring dashboards for the deployed models with cloud-native tools Prometheus and Grafana.
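
Dotscience's own tooling is proprietary, but the general pattern it automates can be sketched: a model-serving process exposes metrics that Prometheus scrapes and Grafana charts. The hypothetical Python snippet below uses the prometheus_client library for that exposure; the model and metric names are invented.

# Hypothetical sketch of the general pattern: a model-serving process exposes
# metrics for Prometheus to scrape (and Grafana to chart), via prometheus_client.
# The model and metric names are invented for illustration.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

PREDICTIONS = Counter("model_predictions_total", "Number of predictions served")
LATENCY = Histogram("model_prediction_latency_seconds", "Prediction latency in seconds")

def predict(features):
    time.sleep(random.uniform(0.01, 0.05))  # stand-in for a real model call
    return sum(features)

def serve_prediction(features):
    with LATENCY.time():   # record how long the prediction took
        result = predict(features)
    PREDICTIONS.inc()      # count every prediction served
    return result

if __name__ == "__main__":
    start_http_server(8000)  # metrics available at http://localhost:8000/metrics
    while True:
        serve_prediction([random.random() for _ in range(4)])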

Posted November 01, 2019

dotData has raised $23 million in Series A funding, bringing the total amount of funding raised to date to $43 million. The Series A financing round was led by JAFCO with participation from Goldman Sachs; both join existing dotData seed-round investor NEC Corp.

Posted October 31, 2019
