Agile-tecture Data Factory
Defining a Data Architecture is a key pattern when working in the data domain.
It's always tempting to boil the ocean when defining yours. Don't!
And once you have defined your data architecture, find a way to articulate and share it with simplicity.
Here is how we articulate the AgileData Data Agile-tecture.
Layers in the data system
In it goes!
The data is collected from Systems of Capture and loaded into the History layer as it arrives.
The quality, structure and type of the data aren't important at this stage.
The objective is just to get the data into the data system as fast as possible; subsequent data layers will handle quality and integration issues.
As the data arrives into the data system we inspect it and tag it with additional data that will be useful later.
We tag the date it arrived, and we check that it's the data we expected to collect, in the format we expected.
We scan the data and store the profile of the data with it. We look for any obvious anomalies or defects and flag them if we find them.
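As a minimal sketch of that arrival step, the snippet below wraps an incoming record with load metadata, a simple profile, and anomaly flags. The function name, the expected column set and the record shape are all assumptions for illustration, not AgileData's actual API.

```python
from datetime import datetime, timezone

# Hypothetical expected schema for one source feed (an assumption for this sketch).
EXPECTED_COLUMNS = {"customer_id", "name", "signup_date"}

def tag_and_profile(record: dict) -> dict:
    """Wrap an arriving record with load metadata, a simple profile,
    and flags for obvious anomalies. The data values are left untouched."""
    anomalies = []
    missing = EXPECTED_COLUMNS - record.keys()
    if missing:
        anomalies.append(f"missing columns: {sorted(missing)}")
    if any(v is None or v == "" for v in record.values()):
        anomalies.append("empty values present")
    return {
        "payload": record,                                    # original data, unchanged
        "loaded_at": datetime.now(timezone.utc).isoformat(),  # tag the arrival date
        "profile": {k: type(v).__name__ for k, v in record.items()},
        "anomalies": anomalies,                               # flagged, never fixed here
    }

tagged = tag_and_profile({"customer_id": 1, "name": "Ada", "signup_date": ""})
```

Note that anomalies are only flagged, not repaired — fixing values is the Combined layer's job.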
Rack & Stack
The collected data is stored in the History layer.
The History layer is immutable and remembers every change to the data for future reference so that trends can be observed; in effect it serves as the organisation’s memory, retaining a persistent copy of the data and its structure over time.
The data is stored in its original format, just as it was collected from the source applications.
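An append-only log captures the essence of that immutability. This is a sketch under assumed names (`history`, `load_to_history`), not AgileData's implementation: every arriving version is appended with its load timestamp, and nothing is ever updated in place.

```python
from datetime import datetime, timezone

history: list[dict] = []  # the History layer: append-only, never updated in place

def load_to_history(record: dict) -> None:
    """Append every arriving version of a record, so every change is remembered."""
    history.append({
        "payload": dict(record),  # stored as collected, original structure intact
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    })

# The same customer arrives twice with a changed name; both versions are kept,
# so the change can be observed as a trend later.
load_to_history({"customer_id": 1, "name": "Ada"})
load_to_history({"customer_id": 1, "name": "Ada Lovelace"})
```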
Sort it out
In the Combined layer, data is sorted into Business Concepts, Details and Events, applying business context.
This is the first layer in the data system where the data structure is changed.
We are beginning the work to make data understandable and to integrate the raw data that has been collected and stored.
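One way to picture the sorting is to split each raw History-layer row into its Concept, Detail and Event parts. The column-to-category mapping below is purely illustrative — in practice it comes from the business context you apply.

```python
# Illustrative mapping only: which columns mean what is an assumption for this sketch.
CONCEPT_KEYS = {"customer_id"}       # identifies a Business Concept (a Customer)
EVENT_COLUMNS = {"order_placed_at"}  # something that happened at a point in time
# Everything else is treated as Detail describing the concept.

def sort_row(row: dict) -> dict:
    """Split one raw row into its Business Concept, Detail and Event parts."""
    concept = {k: v for k, v in row.items() if k in CONCEPT_KEYS}
    events = {k: v for k, v in row.items() if k in EVENT_COLUMNS}
    detail = {k: v for k, v in row.items() if k not in CONCEPT_KEYS | EVENT_COLUMNS}
    return {"concept": concept, "detail": detail, "event": events}

sorted_row = sort_row({"customer_id": 1, "name": "Ada", "order_placed_at": "2024-05-01"})
```

Only the structure changes here; every value still matches what was collected.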
Apply some colour?
Once the data has been separated into Business Concepts, Details and Events, business logic can be applied in the Combined layer to make the data more useful.
We may clean the data to make it fit for purpose, or augment it to create new inferred data values.
This is the first layer in the data system where the data values are manipulated if required.
We always have the History layer available to audit and view what the data originally looked like when it was captured.
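A small sketch of cleaning and augmenting, with hypothetical field names: the raw value is tidied, and a new inferred value that was never captured at source is derived alongside it. The original stays safe in the History layer.

```python
def clean_and_augment(detail: dict) -> dict:
    """Return a cleaned copy of a Detail record with an inferred value added.
    The History layer still holds the original, so this is fully auditable."""
    cleaned = dict(detail)
    cleaned["name"] = cleaned.get("name", "").strip().title()  # tidy the raw value
    cleaned["name_length"] = len(cleaned["name"])  # augment: an inferred value
    return cleaned

combined = clean_and_augment({"name": "  ada lovelace "})
```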
Make it special
We change the data in the Combined layer to make it fit for our specific needs.
We combine Business Concepts or Events to create a single view of this data.
We create metrics that make it easy to measure a business outcome on an ongoing basis.
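The combining step can be sketched as a join of a Business Concept to its Events, with metrics derived on top. The record shapes and metric names here are assumptions for illustration.

```python
customers = [{"customer_id": 1, "name": "Ada"}, {"customer_id": 2, "name": "Grace"}]
orders = [
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": 1, "amount": 25.0},
    {"customer_id": 2, "amount": 5.0},
]

def customer_order_metrics(customers: list[dict], orders: list[dict]) -> list[dict]:
    """Combine the Customer concept with its order Events into one measurable view."""
    view = []
    for c in customers:
        theirs = [o for o in orders if o["customer_id"] == c["customer_id"]]
        view.append({
            **c,
            "order_count": len(theirs),                       # metric: how many orders
            "total_spend": sum(o["amount"] for o in theirs),  # metric: revenue to date
        })
    return view

single_view = customer_order_metrics(customers, orders)
```

Because the metrics are computed from the combined view, they can be re-measured on an ongoing basis as new events arrive.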
Once collected, validated, cleaned, augmented and combined, the data is delivered in a consumable format in the Consume layer.
The data format is based on patterns that are simple and familiar to the end consumer, typically large wide tables or star schemas.
No further changes are made to the data values at this stage, but the structure is altered to suit the requirements of last mile tools.
This provides the last mile tools the right data, in the right format, in the right way.
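As a final sketch, reshaping a Combined-layer view into a wide table for last mile tools might look like this. The column names are invented for the example; the point is that values pass through untouched while the structure and naming change to suit the consumer.

```python
def to_wide_table(combined_rows: list[dict]) -> list[dict]:
    """Reshape Combined-layer rows into one large wide table for last mile tools.
    Values are untouched; only the structure and column names change."""
    return [
        {
            "Customer Id": r["customer_id"],
            "Customer Name": r["name"],
            "Order Count": r["order_count"],
        }
        for r in combined_rows
    ]

wide = to_wide_table([{"customer_id": 1, "name": "Ada", "order_count": 2}])
```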
At your service
Last Mile tools consume the data and deliver Information Products to consumers the way they prefer.
Some consumers prefer silver service, some prefer self-service, and some need the results of machine-driven models.
The last mile focusses on delivering information the way consumers want it and when they need it.
Keep making data simply magical
AgileData.io provides both a Software as a Service product and a recommended AgileData Way of Working. We believe you need both to deliver data in a simply magical way.
We have designed and implemented a flexible data architecture, so our customers don't have to.