Internet of Things

The phenomenon of connected machines is sometimes called the Internet of Things, the Internet of Anything, the Internet of Everything, or M2M (machine to machine). But whatever the name, the growth of technology that brings network connectivity and computing power to objects that never had them before is projected to have far-reaching technological, societal, and economic impact.

The strongest examples of the impact of the Internet of Things are in the industrial sector. Embedded software, sensors, and network connectivity promise to improve the way factories, data centers, oil wells, cities, airplanes, cars, and even homes are maintained, because data can be collected continuously and alerts issued proactively to prevent failures and outages.
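As a rough, hypothetical illustration of the proactive-alerting pattern described above (not drawn from any specific product), the following Python sketch polls a simulated sensor and raises an alert when a reading crosses a threshold; the read_temperature function, the threshold, and the polling interval are all assumptions made for the example.

```python
import random
import time

# Hypothetical stand-in for a real device or telemetry API (assumption).
def read_temperature():
    """Return the latest temperature reading, in Celsius (simulated)."""
    return 60 + random.gauss(0, 15)

ALERT_THRESHOLD_C = 85  # assumed safe operating limit for this sketch

def monitor(poll_interval_seconds=5, max_polls=10):
    """Collect readings continuously and alert proactively when the limit is exceeded."""
    for _ in range(max_polls):
        reading = read_temperature()
        if reading > ALERT_THRESHOLD_C:
            # A real deployment might page an operator or open a maintenance ticket here.
            print(f"ALERT: temperature {reading:.1f} C exceeds {ALERT_THRESHOLD_C} C")
        else:
            print(f"OK: temperature {reading:.1f} C")
        time.sleep(poll_interval_seconds)

if __name__ == "__main__":
    monitor()
```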

According to Cisco's Internet Business Solutions Group, 50 billion devices will be connected by 2020, up from 12.5 billion in 2010. By 2020, data production will be 44 times greater than it was in 2009, and more than one-third of the data produced will live in or pass through the cloud, according to Computer Sciences Corp.



Internet of Things Articles

Data lakes are an alluring option for users with enormous amounts of information, yet questions remain regarding data accuracy, security, and relevancy. Three experts in the big data space participated in a roundtable discussion at Data Summit 2016 that focused on these questions and more regarding data lakes: Anne Buff, business solutions manager for SAS best practices at the SAS Institute; Abhik Roy, database solution engineer at Experion; and Tassos Sarbanes, data architect at Credit Suisse.

Posted May 25, 2016

With new big data sources constantly exploding onto the scene, traditional data warehouses are being challenged. Satya Bhamidipati, director of business development of big data and advanced analytics at Oracle, discussed data mining and advanced analytics techniques that will enable the monetization of data during a session at Data Summit 2016.

Posted May 25, 2016

Big data represents an enormous shift for IT, said Craig S. Mullins in a presentation at Data Summit 2016 in NYC that looked at what relational database professionals need to know about big data technologies. Mullins, principal of Mullins Consulting and author of the DBA Corner column for DBTA, provided an overview of the changes that have taken place in the data management arena in recent years and the key technologies that are having the greatest impact.

Posted May 25, 2016

Here we go again - yet another world-changing innovation. Billions of dollars are being invested to develop and deploy this next-generation industry. Many existing ways of doing business, and businesses themselves, will be disrupted and replaced by this new wave of technology. What is this wondrous new innovation? Why, it is the Internet of Things (IoT), of course.

Posted May 25, 2016

Oracle is introducing version 4.0 of its NoSQL database. First introduced in 2011, the Oracle NoSQL Database is a key-value database that evolved from the company's acquisition of BerkeleyDB Java Edition, a mature, high-performance embeddable database. Ashok Joshi, senior director of NoSQL, Berkeley Database, and Database Mobile Server at Oracle, outlined the key enhancements in the new release.

Posted May 25, 2016

Data Summit 2016 kicked off at the New York Hilton Midtown earlier this month with keynote presentations by Ben Wellington, the creator of I Quant NY, and Nicholas Chandra, vice president of Cloud Customer Success at Oracle.

Posted May 25, 2016

Data Summit 2016, held in May in NYC, brought together IT managers, data architects, application developers, data analysts, project managers, and business managers to hear industry-leading professionals deliver educational presentations on industry trends and technologies, network with their peers, and participate in hands-on workshops. Here are 10 key takeaways from Data Summit 2016:

Posted May 23, 2016

Companies are facing a growing problem: Data is everywhere, clogging up systems and preventing enterprises from gaining meaningful insights. Data virtualization is a way to reduce data proliferation and ensure that all consumers are working from a single source.

Posted May 23, 2016

COLLABORATE, the annual conference presented by the OAUG, IOUG, and Quest, provides the opportunity to reflect on key changes in the Oracle ecosystem and allows the users groups to engage with their constituents about the areas of greatest importance. With the COLLABORATE 16 conference now behind her, Dr. Patricia Dues, the new president of the OAUG, talked with DBTA about what OAUG members are concerned with now and how the OAUG is helping them address emerging challenges.

Posted May 20, 2016

Syncsort is adding new capabilities to its platform, including native integration with Apache Spark and Apache Kafka. DMX-h v9 allows organizations to access and integrate enterprise-wide data with streams from real-time sources.

Posted May 18, 2016

SnapLogic is unveiling new updates to its SnapLogic Elastic Integration Platform that add the ability to integrate streaming data and power big data analytics in the cloud. The Spring 2016 release adds support for Apache Kafka, Microsoft HDInsight, and Google Cloud Storage, plus multiple enhancements that automate data shaping and management tasks.

Posted May 18, 2016

AtScale, Inc., which provides a self-service BI platform for Hadoop, has raised a Series B round of $11 million, bringing its total funding to date to $20 million. According to Bruno Aziza, chief marketing officer of AtScale, its platform is different from others in three key ways, making it applicable to use cases in an array of industries including healthcare, telecommunications, retail, and financial services.

Posted May 17, 2016

There are many ways to combine structured data with unstructured data, explained Jana Mikovska, senior consultant at Raytion, and Sebastian Klatt, vice president of business development at Raytion. Their presentation at Data Summit 2016 focused on approaches to and advantages of combining the two to uncover knowledge buried in unstructured information.

Posted May 16, 2016

What type of email garners the most attention? How can enterprises hook more customers and be in the know? Matt Laudato, senior manager for Big Data Analytics at Constant Contact, addressed these questions and more during Data Summit 2016, explaining how to use big data analytics to optimize email marketing campaigns.

Posted May 12, 2016

As enterprises search for ways to support modern data applications and keep pace with the revolving door of new technologies and solutions, bottlenecks become more frequent and stall application development.

Posted May 12, 2016

What's ahead for the Internet of Things as far as data privacy, standards for interoperability, and meaningful use cases? John O'Brien, principal advisor of Radiant Advisors; Joe Caserta, president and CEO of Caserta Concepts; and George Corugedo, CTO of RedPoint Global, addressed those questions in a cross-fire panel discussion at Data Summit 2016.

Posted May 12, 2016

Ensuring data is governed properly is a hot topic as more tools and capabilities to analyze and gain insights become available. At the same time, data discovery is becoming more imperative as analysts must be able to move through the process with as little friction as possible.

Posted May 11, 2016

At the center of the new big data movement is the Hadoop framework, which provides an efficient file system and related ecosystem of solutions to store and analyze big datasets. The Hadoop ecosystem was addressed from two points of view in a session at Data Summit 2016. James Casaletto, principal solutions architect, Professional Services at MapR, presented a talk titled "Harnessing the Hadoop Ecosystem," and Tassos Sarbanes, mathematician / data scientist, Investment Banking at Credit Suisse, covered the advantages of HBase in a talk titled "HBase Data Model - The Ultimate Model on Hadoop."
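The HBase data model referenced in the second talk stores cells under a row key and column family rather than in fixed relational columns. As a minimal sketch only (it assumes an HBase Thrift gateway on localhost and a pre-created table named sensor_readings with a column family cf, none of which come from the session itself), the happybase Python client can write and read a row roughly like this:

```python
import happybase  # third-party HBase client; assumes a running Thrift gateway

# Host, port, table name, and column family below are assumptions for this sketch.
connection = happybase.Connection('localhost', port=9090)
table = connection.table('sensor_readings')

# HBase rows are keyed byte strings; columns are addressed as b'family:qualifier'.
# A composite row key (device id + timestamp) is one common design choice.
table.put(b'device42#2016-05-10T12:00:00', {
    b'cf:temperature': b'71.3',
    b'cf:status': b'ok',
})

# Read the row back as a dict mapping column names to values.
row = table.row(b'device42#2016-05-10T12:00:00')
print(row.get(b'cf:temperature'))

connection.close()
```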

Posted May 10, 2016

Despite the increasing focus on offering more access to more users in organizations, ad hoc querying of big data remains a problem for most, according to Jair Aguirre, data scientist at Booz Allen Hamilton, who presented a session at Data Summit 2016 titled "De-Siloing Data Using Apache Drill."
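Part of Drill's appeal for de-siloing is that it exposes files and other stores through ordinary SQL without upfront schema definition. As one hedged illustration (assuming a local Drill instance with its default REST endpoint and built-in dfs storage plugin, plus a hypothetical /data/events.json file; none of these details are from the session), a query can be submitted over HTTP from Python:

```python
import requests  # third-party HTTP client

# Drill accepts SQL over its REST API; the host, port, and file path
# used here are assumptions for this sketch.
DRILL_URL = "http://localhost:8047/query.json"

sql = """
    SELECT event_type, COUNT(*) AS events
    FROM dfs.`/data/events.json`
    GROUP BY event_type
"""

response = requests.post(DRILL_URL, json={"queryType": "SQL", "query": sql})
response.raise_for_status()

for row in response.json().get("rows", []):
    print(row)
```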

Posted May 10, 2016

IT and businesses don't always see eye to eye when it comes to overall goals within an enterprise. To address this glaring issue, Anne Buff, business solutions manager and thought leader for SAS Best Practices, a thought leadership organization at SAS Institute, discussed aligning data strategy goals at Data Summit 2016.

Posted May 10, 2016

The world's data has doubled in 18 months' time, and the industry estimates that global storage will reach 40 ZB by 2020. Historically, storage architectures were built on solutions that could only scale vertically. This legacy approach presents significant challenges to storing the tremendous quantities of data being created today in a way that is cost-effective and maintains high levels of performance. Most of the world's data centers are still using vertical scaling solutions for storage, and as a result, organizations are seeking alternatives that allow them to scale cheaply and efficiently in order to remain competitive. Now, with software-defined storage moving forward, we are seeing more scale-out storage solutions in data centers.

Posted May 04, 2016

Dell is releasing an upgraded version of its Statistica advanced analytics platform, providing tools to address Internet of Things (IoT) analytics requirements and leverage heterogeneous data environments.

Posted May 04, 2016

Oracle database migration can pose a variety of learning curve challenges. However, a platform does exist that can make the transition easier. In a recent DBTA webinar, Bill Brunt, product manager of SharePlex at Dell, discussed how users can reduce downtime, migrate at speed, eliminate risk, and validate success by tapping into SharePlex.

Posted May 03, 2016

The new name for Dell after it merges with EMC later in 2016 will be Dell Technologies. The new name was announced by Michael Dell, chairman and CEO of Dell Inc., at EMC World and in a letter to Dell team members.

Posted May 02, 2016

Qubole is announcing two major changes: it is releasing an open source version of its StreamX tool and forming a partnership with Looker.

Posted May 02, 2016

Enterprises are constantly searching for ways to capture, leverage, and analyze data effectively. However, bottlenecks can wreak havoc on the application development process.

Posted April 29, 2016

Accenture and Splunk have formed an alliance that integrates Splunk products and cloud services into Accenture's application services, security, and digital offerings. The goal is to help customers improve business outcomes by mining vast amounts of application and operational data to identify trends and opportunities for improvement that were previously difficult to detect.

Posted April 28, 2016

Cisco is launching an appliance that includes the MapR Converged Data Platform for SAP HANA, making it easier and faster for users to take advantage of big data. The UCS Integrated Infrastructure for SAP HANA is easy to deploy, speeds time to market, and reduces operational expenses, while giving users the flexibility to choose a scale-up (on-premises) or scale-out (cloud) storage strategy.

Posted April 27, 2016

Voting has opened for the 2016 DBTA Readers' Choice Awards. Cloud, in-memory, real-time, virtualization, SaaS, IoT - today, there are many opportunities for data-driven companies to take advantage of more data in more varieties flowing at greater velocity than ever before.

Posted April 27, 2016

Neo Technology, creator of Neo4j, is releasing an improved version of its signature platform, enhancing its scalability and introducing new language drivers along with a host of other developer-friendly features.

Posted April 26, 2016

Along with an increasing flow of big data that needs to be captured and analyzed, IT departments today also have more solution choices than ever before. However, before making a solution selection, organizations need to understand their requirements and also evaluate the attributes of the possible tools.

Posted April 25, 2016

The COLLABORATE 16 conference for Oracle users kicked off with a presentation by Unisphere Research analyst Joe McKendrick who shared insights from a ground-breaking study that examined future trends and technology among 690 members of three major Oracle users groups.

Posted April 25, 2016

The greatest power in using IoT-derived insights is the ability to respond to opportunities or threats immediately. However, enterprises largely have focused on historical reporting and will need to significantly modernize their analytics capabilities—both in understanding current events and predicting future outcomes—to take advantage of the new insights that IoT data can bring.

Posted April 25, 2016

The need for data integration has never been more intense. The Internet of Things and its muscular sibling, the Industrial Internet of Things, are now being embraced as a way to better understand the status and working order of products, services, partners, and customers. Mobile technology is ubiquitous, pouring in a treasure trove of geolocation and usage data. Analytics has become the only way to compete, and with it comes a need for terabytes upon terabytes of data. The organization of 2016, in essence, has become a data machine, with an insatiable appetite for all the data that can be ingested.

Posted April 25, 2016

Sumo Logic, a provider of cloud-native machine data analytics services, is unveiling a new platform that natively ingests, indexes, and analyzes structured metrics data and unstructured log data together in real time.

Posted April 18, 2016

Teradata, the big data analytics and marketing applications company, is making key investments in the Internet of Things (IoT) and the Analytics of Things (AoT), along with updating its signature platforms.

Posted April 18, 2016

Hortonworks is making several key updates to its platform while furthering its mission as a leading innovator of open and connected data solutions, enhancing partnerships with Pivotal and expanding upon established integrations with Syncsort.

Posted April 15, 2016

Everyone within an enterprise agrees that data is an asset, but what to do with it causes division between business leaders and IT personnel.

Posted April 15, 2016

First created as part of a research project at UC Berkeley AMPLab, Spark is an open source project in the big data space, built for sophisticated analytics, speed, and ease of use. It unifies critical data analytics capabilities such as SQL, advanced analytics, and streaming in a single framework. Databricks is a company that was founded by the team that created and continues to lead both the development and training around Apache Spark.
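As a minimal sketch of that unified model (the file path and column names below are placeholders, not taken from the article), PySpark lets the same DataFrame feed both SQL queries and programmatic analytics:

```python
from pyspark.sql import SparkSession

# Start (or reuse) a local Spark session.
spark = SparkSession.builder.appName("unified-example").getOrCreate()

# Load a hypothetical JSON dataset into a DataFrame; the path is a placeholder.
events = spark.read.json("/data/events.json")

# The same data is queryable with SQL...
events.createOrReplaceTempView("events")
top_types = spark.sql("""
    SELECT event_type, COUNT(*) AS n
    FROM events
    GROUP BY event_type
    ORDER BY n DESC
""")

# ...and available to the DataFrame API for further analysis.
top_types.show()

spark.stop()
```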

Posted April 14, 2016

There are many different definitions of the term "big data," some of them reasonable, others not so much. However, the overriding issue for many data professionals, especially those who use more traditional data management tools, is confusion about what to do with big data and how to get the most out of it.

Posted April 08, 2016
