From Data Warehouse Automation to Data Architecture Automation
For a long time, the data warehouse architecture was the sole ruler of data delivery to decision-making processes, but not anymore. It now has to share the stage with other data architectures, such as the data lake, data hub, and data lakehouse. Because the data in these new architectures is structured, organized, and used differently, a new breed of generators is required: data architecture automation tools.
The benefits of using generators are clear. They accelerate development, ease maintenance, create run-time platform independence, improve performance, and so on.
Not every task is suitable for automation, nor can a suitable generator be developed for every task. The tasks best suited for automation are repetitive by nature and can be expressed as formal algorithms that indicate which steps to perform, what to do in special cases, and how to react if something goes wrong. In other words, these tasks can be formalized.
Many of the tasks involved in designing, developing, and maintaining data warehouse architectures are repetitive and can be formalized, making them highly suited for automation. For example, when an enterprise data warehouse uses a data vault design technique and the physical data marts use star schemas, both can be generated from a central data model, including the ETL code that copies the data from the warehouse to the data marts.
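The idea of generating both the schema and the ETL code from one central model can be illustrated with a minimal sketch. The table model, column types, and naming here are hypothetical, not any particular tool's format; the point is that a single definition drives every generated artifact.

```python
# Minimal sketch of metadata-driven generation (hypothetical model):
# one table definition drives both the data mart DDL and the ETL statement.

from dataclasses import dataclass

@dataclass
class Table:
    name: str            # target table in the data mart
    source: str          # source table in the enterprise data warehouse
    columns: list[str]   # column names shared by source and target

def generate_ddl(t: Table) -> str:
    """Generate CREATE TABLE DDL for the data mart from the model."""
    cols = ",\n  ".join(f"{c} VARCHAR(100)" for c in t.columns)
    return f"CREATE TABLE {t.name} (\n  {cols}\n);"

def generate_etl(t: Table) -> str:
    """Generate the INSERT-SELECT that copies data from the warehouse."""
    cols = ", ".join(t.columns)
    return f"INSERT INTO {t.name} ({cols})\nSELECT {cols} FROM {t.source};"

dim_customer = Table("dim_customer", "dwh.customer_hub", ["customer_id", "name"])
print(generate_ddl(dim_customer))
print(generate_etl(dim_customer))
```

Changing the model, for instance adding a column, regenerates both the DDL and the ETL in one step, which is exactly why repetitive, formalizable tasks like this pay off under automation.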
Data architecture automation tools can be seen as the third generation of generators used to automate the development of data architectures that support decision-making. The first generation consists of tools such as ETL, BI, and data modeling tools. For example, ETL tools transform high-level specifications into lower-level code that does the actual ETL work; many BI tools can be considered generators because they generate the SQL statements that extract data from databases; and some data science tools let data scientists work at a high conceptual level from which code is generated.
All these generators help accelerate development and ease maintenance, but each is limited to generating just one component of an entire data architecture. Therefore, multiple independent generators are required to generate the complete architecture. Since these generators require similar specifications, the specifications are defined multiple times; in other words, they are duplicated. It is a challenge to keep all these scattered specifications consistent, to ensure that they work together optimally, and to guarantee that when one specification changes, all its duplicates are changed accordingly.
The principles that apply to generators of individual platform components also apply to generators of entire data architectures. That is why the first generation was succeeded by a second: data warehouse automation tools, which generate entire data warehouse architectures. They do not generate code for one component of the architecture, but for several. Traditional data warehouse automation tools generate, for example, staging areas, enterprise data warehouses, physical data marts, the ETL solutions that copy data from one database to another, and metadata. Several of these tools have been on the market for years and have proven their worth. They store every metadata specification once and reuse it when generating, for example, the data warehouse tables, the data mart tables, and the ETL logic that copies the data.
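The store-once, reuse-everywhere principle can be sketched as follows. The specification format and the three generators are hypothetical, but they show how a single metadata entry feeds several components of the architecture, so one change propagates consistently instead of having to be repeated in each tool.

```python
# Hypothetical sketch: one metadata specification, reused by several
# generators, so a single change propagates to every generated artifact.

SPEC = {"entity": "order", "columns": ["order_id", "amount"]}

def staging_ddl(spec):
    """Generate the staging area table from the shared specification."""
    cols = ", ".join(f"{c} VARCHAR(100)" for c in spec["columns"])
    return f"CREATE TABLE stg_{spec['entity']} ({cols});"

def warehouse_ddl(spec):
    """Generate the data warehouse table from the same specification."""
    cols = ", ".join(f"{c} VARCHAR(100)" for c in spec["columns"])
    return f"CREATE TABLE dwh_{spec['entity']} ({cols});"

def etl_sql(spec):
    """Generate the ETL statement linking the two, again from the same spec."""
    cols = ", ".join(spec["columns"])
    return (f"INSERT INTO dwh_{spec['entity']} ({cols}) "
            f"SELECT {cols} FROM stg_{spec['entity']};")

# Adding a column to SPEC regenerates all three artifacts consistently,
# avoiding the duplicated, drifting specifications of first-generation tools.
```

This is the design choice that distinguishes the second generation from the first: consistency comes from a single shared specification rather than from discipline in keeping duplicates aligned.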
The main restriction of several data warehouse automation tools is that they generate only traditional data warehouse architectures, which support only a restricted set of data consumption forms.
Today, organizations also want to deploy data hubs, data lakes, and data lakehouses to support new forms of data consumption. For example, in these new data architectures, data is copied to a data hub and from there to a data warehouse, or the architecture consists of a data lake that stores data from a data warehouse, transactional databases, and external data sources.
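What makes these new architectures tractable for generators is that the flows between stores can still be declared formally. The sketch below is a hypothetical illustration, not any tool's actual interface: the data flows between arbitrary store types are declared once, and the generator emits a copy step for each flow regardless of whether the target is a hub, warehouse, or lake.

```python
# Hypothetical sketch: data flows between arbitrary store types declared
# once; a generator emits a copy step for each flow.

FLOWS = [
    {"source": "crm_db.orders",    "target": "data_hub.orders"},
    {"source": "data_hub.orders",  "target": "warehouse.orders"},
    {"source": "warehouse.orders", "target": "data_lake.orders"},
]

def generate_copy_steps(flows):
    """Produce one copy statement per declared flow in the architecture."""
    return [f"COPY {f['source']} TO {f['target']};" for f in flows]

for step in generate_copy_steps(FLOWS):
    print(step)
```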
Supporting other data architectures requires generators that can be adapted to produce architectures composed of other types of data stores than those found in traditional data warehouse architectures. For these tools, the term data warehouse automation is probably a misnomer; it is too restrictive. Data architecture automation tool is more suitable. As organizations increasingly need to become data-driven, or in other words, to use data more widely, effectively, and efficiently, the need for adaptable data architecture automation tools that can generate any kind of data architecture to support any form of data consumption has grown accordingly.