A microservices architecture is highly conducive to continuous development and deployment. As DevOps practice focuses increasingly on rapid deployment, adaptability, and growth, microservices are an ideal tool for building versatile systems that allow continuous improvement and scaling with little to no downtime for the user. By breaking a large suite of features into discrete functions, each running as its own service without ties to a specific server or set of dependencies, developers can create a loosely coupled system of independent capabilities. In short, each microservice does one thing, or a few things, very well.
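As a minimal sketch of the pattern described above, a single-purpose service can expose one function over HTTP and nothing more. The endpoint, port, and exchange rate below are hypothetical, and a real deployment would add health checks, configuration, and logging; this only illustrates the "one thing done well" idea using the Python standard library.

```python
# Minimal sketch of a single-purpose microservice (standard library only).
# The /convert endpoint, port, and rate are illustrative, not real APIs.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def convert(amount_usd: float, rate: float = 0.92) -> float:
    """The one thing this service does well: a currency conversion."""
    return round(amount_usd * rate, 2)

class ConvertHandler(BaseHTTPRequestHandler):
    """Loosely coupled by design: callers depend only on the HTTP contract."""
    def do_GET(self):
        url = urlparse(self.path)
        if url.path != "/convert":
            self.send_error(404, "unknown endpoint")
            return
        amount = float(parse_qs(url.query).get("usd", ["0"])[0])
        body = json.dumps({"usd": amount, "eur": convert(amount)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run the service in its own process or container:
#   HTTPServer(("", 8080), ConvertHandler).serve_forever()
```

Because the service owns its logic end to end, it can be redeployed or scaled independently of every other service in the system.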
Trends and Applications
Businesses have a great deal of experience developing and implementing data protection strategies that let them recover from attacks on their on-premises IT environments. Increasingly, however, enterprises must also consider a newer threat: malicious actors exploiting "zero-day" vulnerabilities (flaws so new they cannot be patched before they are exploited) to attack and bring down the major cloud providers that organizations increasingly rely on to host critical applications and data.
Database security has always been important, and with the compliance requirements of regulations such as GDPR and the California Consumer Privacy Act (CCPA), it is an issue that reaches across the organization, into the board room, and out to customers. This attention is putting new pressure on DBAs to secure production data as well as development and testing databases. Here are some relatively simple database security best practices and easily executed security checks that can help organizations better understand and strengthen their defensive security posture.
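To illustrate the kind of easily executed check this refers to, the sketch below audits database accounts for two common weaknesses: unlocked accounts with no password, and unexpected superuser grants. The account records and policy rules are invented for demonstration; a real check would query the catalog of the specific DBMS (for example, `pg_roles` in PostgreSQL) under a read-only account.

```python
# Illustrative account audit; the data and rules here are hypothetical.
from dataclasses import dataclass

@dataclass
class DbAccount:
    name: str
    is_superuser: bool
    password_set: bool
    locked: bool

def audit_accounts(accounts):
    """Return human-readable findings for risky accounts."""
    findings = []
    for acct in accounts:
        if not acct.password_set and not acct.locked:
            findings.append(f"{acct.name}: no password and not locked")
        # Allow-list of expected superusers; anything else is flagged.
        if acct.is_superuser and acct.name not in {"postgres", "root"}:
            findings.append(f"{acct.name}: unexpected superuser privilege")
    return findings

accounts = [
    DbAccount("app_rw", is_superuser=False, password_set=True, locked=False),
    DbAccount("legacy_etl", is_superuser=True, password_set=True, locked=False),
    DbAccount("test_user", is_superuser=False, password_set=False, locked=False),
]
```

Run on a schedule, a check like this turns a one-time hardening exercise into an ongoing measure of security posture.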
What if the reason the BI implementation was failing was not the users or their willingness to work together, but that they were using the wrong analytics platform?
Today, data is critical to every organization and every department within every organization. Yet, all the disparate systems for handling it are creating new challenges. Joe Caserta, founder and president of Caserta, a technology consulting and implementation firm focused on data and analytics strategies and solutions, recently discussed the current state of data integration and what is needed to overcome today's problems.
There is a sea change underway in enterprise architecture. Just a few years ago, enterprise administrators were fearful of the security implications of trusting an outside provider to protect their data assets. Although security is still a cloud concern—one which predominates at the time of cloud migration, and even grows stronger post-implementation—the use of cloud platforms has gained widespread acceptance.
The data warehouse and data lake each solve different business problems and impose their own unique challenges. Organizations shouldn't write off data warehouses; as they evolve, they are taking on new roles in digital enterprises. Data lakes may add a great deal of flexibility to an enterprise data strategy, but they are supported by fast-breaking technologies that require constant vigilance.
On June 11, 2019, the National Institute of Standards and Technology (NIST) released an updated white paper detailing several action plans (https://csrc.nist.gov) for reducing software vulnerabilities and cyber-risk. In the paper, titled "Mitigating the Risk of Software Vulnerabilities by Adopting a Secure Software Development Framework (SSDF)," NIST provided organizations with solid guidelines for avoiding the nasty, not to mention expensive, consequences of a data breach.
Semantically enabled machine reasoning is an efficient form of AI that can help with fundamentals such as data quality and completeness, and that can scale to provide automated pattern recognition for decision support in mission-critical applications. AI delivered by semantic technologies opens a wealth of opportunity to improve efficiency in all types of enterprise business applications. In this powerful new era, errors are reduced, data insights are more sophisticated and more quickly gleaned, and staff are freed to focus on excellent service, new product development, and overall business growth.
Columns - Database Elaborations
Every organization needs a data warehouse. A data warehouse has never been a one-size-fits-all kind of solution. Variations exist and should be accepted.
Columns - DBA Corner
The speed and performance of production database systems depend on a wide range of parameters and decisions made well before implementation. DBAs need to understand the options available and the factors that affect performance and development with each DBMS, and they must work to keep the IT organization up to speed and educated on all of the available choices.
Columns - MongoDB Matters
Despite the failed promises of the data lake, the concept retains some resonance in larger enterprises, and so MongoDB has chosen to leverage the term for one of its latest offerings. MongoDB's Atlas Data Lake bears only superficial similarity to Hadoop-powered data lakes. Nevertheless, it's a useful feature that stands to see significant uptake.
Entrinsik is releasing Informer 5.1 with new features including multi-tenancy, Mapping Suites, and business workflow enhancements. This latest release introduces a new multi-tenancy architecture that enables VARs and ISVs who develop and deploy software to quickly serve multiple customers from a single instance of Informer.
The organization's new solution provided database access through a simple client front-end that could be installed on any standard PC. Employees could connect to the system from a laptop or home office whenever they wanted.