Locking in a Data Vault
So, I’m playing a little with words here. I’m certainly not advocating locking anybody or anything in a Data Vault. I want to share how you can lock in success as you design and deliver your new Data Vault. I assume you have your business people fully on board, as discussed in this recent blog. If not, I advise you to go back and do that first. This blog post is aimed specifically at your development team.
Most of us are challenged by change, and developers are no different. They are typically very comfortable with a set of design approaches and tools learned in the past, and that comfort routinely frames their perspective on how to tackle the future. Combining the comfort of old ways with the tight timeframes and pressures of today’s business requests seldom leaves time to explore new options. As a result, it is easy for teams to be weighed down by outdated, limiting approaches to data infrastructure.
What we’ve learned with the evolution of the Data Vault methodology and data warehouse automation (DWA) over the past decade is that some areas within the data warehouse development process are broken. Dan Linstedt and the other contributors to the Data Vault model recognized early on, in the early 2000s, that traditional data models could not meet the quality and agility goals of a data warehouse serving a modern data-focused business. I have provided some of this background in this recent white paper.
The Data Vault is constructed from some very carefully defined primitives, such as hubs, links and satellite tables, that must be defined and populated in specific ways to work as intended. If developers use old approaches or, worse still, make up new ones themselves, disaster will follow.
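To make those primitives concrete, here is a minimal Python sketch of hub, link and satellite rows, including the hash-key derivation commonly used in Data Vault 2.0. The table and column names are hypothetical illustrations for this post, not a prescribed standard and not WhereScape’s implementation.

# Illustrative sketch only: the three core Data Vault primitives, with a
# hub row's hash key derived from its business key. All names are
# hypothetical examples.
import hashlib
from datetime import datetime, timezone

def hash_key(*business_keys: str) -> str:
    """Derive a deterministic hash key from one or more business keys."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Hub: one row per unique business key (for example, a customer number).
hub_customer = {
    "hub_customer_hk": hash_key("C-1001"),
    "customer_number": "C-1001",
    "load_date": datetime.now(timezone.utc),
    "record_source": "crm",
}

# Link: relates two or more hubs via their hash keys.
link_customer_order = {
    "link_customer_order_hk": hash_key("C-1001", "O-9001"),
    "hub_customer_hk": hash_key("C-1001"),
    "hub_order_hk": hash_key("O-9001"),
    "load_date": datetime.now(timezone.utc),
    "record_source": "orders",
}

# Satellite: descriptive attributes tracked over time against a hub or link.
sat_customer_details = {
    "hub_customer_hk": hash_key("C-1001"),
    "load_date": datetime.now(timezone.utc),
    "name": "Acme Ltd",
    "country": "ZA",
    "record_source": "crm",
}

The discipline lies in details like these: every table type has a fixed shape, every load computes keys the same way, and nothing descriptive leaks into a hub or link.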
In Data Vault 2.0, Linstedt has provided a methodology to drive best practice in the design of the data model and in the development of the processes that populate it. Methodologies are great: I rely on a wonderful methodology for manually raising my computer screen to the ideal height as I write this post. But a methodology applied manually, developer by developer, leads to inconsistent approaches to development, delays in future maintenance as other developers struggle to understand differing coding styles, and ultimately a loss of skills for your organization when your cleverest developer dies in a freak coding accident.
WhereScape® Data Vault Express addresses these issues by encoding the templates of the Data Vault components, and by embedding best practices for population processes and development methods within an automated, metadata-driven design and development environment. Starting with the initial design collaboration between IT and business people, design choices are encoded in metadata and used to auto-generate the code and scripts that define Data Vault tables and populate them with the correct data. This ensures design consistency and completeness, and conformity of the code to a single set of standards. Traceability is enforced and maintenance is eased. Additionally, as your developers work, everything is documented automatically, a task few enjoy or have the time to complete.
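To illustrate the metadata-driven idea in general terms, rather than to describe WhereScape’s actual template engine, here is a toy Python sketch in which the design metadata for a single hub drives generation of its table definition. Every name in it is a hypothetical example.

# Illustrative sketch only: design choices captured as metadata drive
# auto-generation of consistent DDL. This is a toy generator, not
# WhereScape's actual templates.

hub_metadata = {
    "name": "hub_customer",               # hypothetical table name
    "business_keys": ["customer_number"],  # hypothetical business key columns
    "record_source": "crm",
}

def generate_hub_ddl(meta: dict) -> str:
    """Generate a CREATE TABLE statement for a hub from its metadata."""
    key_columns = ",\n    ".join(
        f"{col} VARCHAR(100) NOT NULL" for col in meta["business_keys"]
    )
    return (
        f"CREATE TABLE {meta['name']} (\n"
        f"    {meta['name']}_hk CHAR(32) NOT NULL PRIMARY KEY,\n"
        f"    {key_columns},\n"
        f"    load_date TIMESTAMP NOT NULL,\n"
        f"    record_source VARCHAR(50) NOT NULL\n"
        f")"
    )

print(generate_hub_ddl(hub_metadata))

The point is not the toy code but the pattern: because every hub, link and satellite is generated from the same metadata and templates, every one of them follows the same standards, and the metadata itself becomes the documentation and the source of traceability.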
Locking in the Data Vault is all about maintaining consistency, ensuring complete documentation, and auto-generating best-practice model and code assets across design and development. As I discuss in the white paper Meeting the Six Data Vault Challenges and in this recent recorded webcast, data warehouse automation is the logical foundation. And while change is hard, development teams will benefit greatly from an openness to doing things differently.
Coming soon, some thoughts on Living in a Data Vault.
You can find the other blog posts in this series here:
Dr. Barry Devlin is among the foremost authorities on business insight and one of the founders of data warehousing, having published the first architectural paper on the topic in 1988. Barry is founder and principal of 9sight Consulting. A regular blogger, writer and commentator on information and its use, Barry is based in Cape Town, South Africa and operates worldwide.