Blogs
Because sharing is caring
Attribution Model Patterns with Yorgos Moschovis
Shane Gibson and Yorgos Moschovis discuss the multifaceted world of marketing attribution. They explore the complexity of tracking customer behaviour across online and offline channels, the impact of privacy regulations, and various attribution models.
Understanding Concepts, Details, and Events: The Fundamental Building Blocks of AgileData Design
Building a vibrant community with Scott Hirleman
In this episode of the AgileData Podcast, Shane Gibson chats with Scott Hirleman, the founder of the data mesh community.
They delve into the nuances of cultivating and sustaining thriving communities.
The duo touch upon the broader patterns that can be applied to both external and internal communities within organisations, and the essence of being agile and responsive to the community’s evolving needs.
Upgrading Python: A Plumbing Adventure in the Google Stack
In the ever-evolving world of AgileData DataOps, it was time to upgrade the Python version that powers the AgileData Platform.
We utilise micro-services patterns throughout the AgileData Platform and a bunch of Google Cloud Services. The upgrade could have gone well, or caused no end of problems.
Read more on our exciting plumbing journey.
AgileData App UX Capability Maturity Model
Reducing the complexity and effort to manage data is at the core of what we do. We love bringing magical UX to the data domain as we do this.
Every time we add a new capability or feature to the AgileData App or AgileData Platform, we ask ourselves: how could we remove the need for a Data Magician to do that task at all?
That magic is not always possible in the first, or even the third iteration of those features.
Our AgileData App UX Capability Maturity Model helps us to keep that “magic sorting hat” goal at the top of our mind, every time we add a new thing.
This post outlines what that maturity model is and how we apply it.
Unveiling the Magic of Change Data Collection Patterns: Exploring Full Snapshot, Delta, CDC, and Event-Based Approaches
Change data collection patterns are like magical lenses that allow you to track data changes. The full snapshot pattern captures complete data at specific intervals for historical analysis. The delta pattern records only changes between snapshots to save storage. CDC captures real-time changes for data integration and synchronization. The event-based pattern tracks data changes triggered by specific events. Each pattern has unique benefits and use cases. Choose the right approach based on your data needs and become a data magician who stays up-to-date with real-time data insights!
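As a rough illustration of the difference between two of these patterns, here is a minimal Python sketch contrasting a full snapshot extract with a delta extract. The run_query callable, the source table, and the updated_at watermark column are assumptions made for the example, not part of any specific platform.

```python
from datetime import datetime, timezone

# Minimal sketch contrasting two of the collection patterns above.
# run_query, the source table and the updated_at column are
# illustrative assumptions, not a specific platform's API.

def full_snapshot(run_query):
    """Full snapshot: re-read the entire source table on every run."""
    return run_query("SELECT * FROM source.customers")

def delta_since(run_query, last_watermark):
    """Delta: read only rows changed since the previous run's watermark."""
    rows = run_query(
        "SELECT * FROM source.customers WHERE updated_at > %(wm)s",
        {"wm": last_watermark},
    )
    return rows, datetime.now(timezone.utc)  # new watermark for the next run
```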
Layered Data Architectures with Veronika Durgin
Shane Gibson and Veronika Durgin discuss layered data architecture, data management, and the challenges of integrating software engineering with data analytics. They advocate for the ELT (Extract, Load, Transform) approach over traditional ETL methods and emphasise the importance of understanding data provenance to increase trust. The hosts also discuss the concept of data lakes and the idea of a “data lakehouse,” merging file storage with cloud compute. The conversation concludes with the importance of defining data layers and their policies, the value of automation in data handling, and the need for clear data governance.
How can data teams use Generative AI with Shaun McGirr
In this episode of the Agile Data Podcast, Shane Gibson and Shaun McGirr delve deep into the transformative capabilities of large language models (LLMs), such as ChatGPT, and their potential to revolutionise data teams. Drawing on Shaun's diverse experiences...
The challenge of parsing files from the wild
In this instalment of the AgileData DataOps series, we’re exploring how we handle the challenges of parsing files from the wild. To ensure clean and well-structured data, each file goes through several checks and processes, similar to a water treatment plant. These steps include checking for previously seen files, looking for matching schema files, queuing the file, and parsing it. If a file fails to load, we have procedures in place to retry loading or notify errors for later resolution. This rigorous data processing ensures smooth and efficient data flow.
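To make those steps concrete, here is a hedged Python sketch of that kind of intake flow (duplicate check, schema match, queue, parse with retries). The function names, the three-retry limit, and the simple CSV parsing are illustrative assumptions rather than the actual AgileData implementation.

```python
import hashlib
import queue

# Illustrative sketch of a file-intake flow like the one described above.
# All names and the retry limit are assumptions, not the AgileData code.

def ingest(name: str, content: bytes, seen: set, schemas: dict,
           work_queue: queue.Queue, retries: int = 3) -> str:
    digest = hashlib.sha256(content).hexdigest()    # previously seen file?
    if digest in seen:
        return "skipped: file already loaded"

    schema = schemas.get(name.rsplit(".", 1)[0])    # matching schema file?
    if schema is None:
        return "held: no matching schema"

    work_queue.put((name, schema))                  # queue the file for parsing

    last_error = None
    for _ in range(retries):                        # parse, retrying on failure
        try:
            lines = content.decode("utf-8").splitlines()
            rows = [dict(zip(schema, line.split(","))) for line in lines]
            seen.add(digest)
            return f"loaded: {len(rows)} rows"
        except UnicodeDecodeError as err:
            last_error = err
    return f"failed: {last_error}"                  # surfaced for later resolution

# Usage example with a tiny in-memory setup
print(ingest("customers.csv", b"1,Ada\n2,Bob",
             seen=set(), schemas={"customers": ["id", "name"]},
             work_queue=queue.Queue()))
```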
AgileData App
Explore AgileData features, updates, and tips
Consulting
Learn about consulting practices and good patterns for data-focused consultancies
DataOps
Learn from our DataOps expertise, covering essential concepts, patterns, and tools
Data and Analytics
Unlock the power of data and analytics with expert guidance
Google Cloud
Imparting knowledge on Google Cloud's capabilities and its role in data-driven workflows
Journey
Explore real-life stories of our challenges, and lessons learned
Product Management
Enrich your product management skills with practical patterns
What Is
Describing data and analytics concepts, terms, and technologies to enable better understanding
Resources
Valuable resources to support your growth in the agile, and data and analytics domains
AgileData Podcast
Discussing combining agile, product and data patterns.
No Nonsense Agile Podcast
Discussing agile and product ways of working.
App Videos
Explore videos to better understand the AgileData App's features and capabilities.
Data Consulting Patterns with Joe Reis
Join Shane Gibson as he chats with Joe Reis on his experience in building and running a successful data and analytics consulting company.
The Enchanting World of Data Modeling: Conceptual, Logical, and Physical Spells Unraveled
Data modeling is a crucial process that involves creating shared understanding of data and its relationships. The three primary data model patterns are conceptual, logical, and physical. The conceptual data model provides a high-level overview of the data landscape, the logical data model delves deeper into data structures and relationships, and the physical data model translates the logical model into a database-specific schema. Understanding and effectively using these data models is essential for business analysts and data analysts to create efficient, well-organised data ecosystems.
Shane Gibson – Making Data Modeling Accessible
TL;DR Early in 2023 I was lucky enough to talk to Joe Reis on the Joe Reis Show to discuss how to make data modeling more accessible, why the world's moved past traditional data modeling, and more. Listen to the episode...
AgileData Cost Comparison
AgileData reduces the cost of your data team and your data platform.
In this article we provide examples of those cost savings.
Cloud Analytics Databases: The Magical Realm for Data
Cloud Analytics Databases provide a flexible, high-performance, cost-effective, and secure solution for storing and analysing large amounts of data. These databases promote collaboration and offer various choices, such as Snowflake, Google BigQuery, Amazon Redshift, and Azure Synapse Analytics, each with its unique features and ecosystem integrations.
Data Warehouse Technology Essentials: The Magical Components Every Data Magician Needs
The key components of a successful data warehouse technology capability include data sources, data integration, data storage, metadata, data marts, data query and reporting tools, data warehouse management, and data security.
Unveiling the Definition of Data Warehouses: Looking into Bill Inmon’s Magicians Top Hat
In a nutshell, a data warehouse, as defined by Bill Inmon, is a subject-oriented, integrated, time-variant, and non-volatile collection of data that supports decision-making processes. It helps data magicians, like business and data analysts, make better-informed decisions, save time, enhance collaboration, and improve business intelligence. To choose the right data warehouse technology, consider your data needs, budget, compatibility with existing tools, scalability, and real-world user experiences.
Martech – The Technologies Behind the Marketing Analytics Stack: A Guide for Data Magicians
Explore the MarTech stack based on two different patterns: marketing application and data platform. The marketing application pattern focuses on tools for content management, email marketing, CRM, social media, and more, while the data platform pattern emphasises data collection, integration, storage, analytics, and advanced technologies. By understanding both perspectives, you can build a comprehensive martech stack that efficiently integrates marketing efforts and harnesses the power of data to drive better results.
Anatomy of a Data Product
A graphical overview of the components required for a Data Product
Unveiling the Magic of Data Clean Rooms: Your Data Privacy Magicians
Data clean rooms are secure environments that enable organisations to process, analyse, and share sensitive data while maintaining privacy and security. They use data anonymization, access control, data usage policies, security measures, and auditing to ensure compliance with privacy regulations, making them indispensable for industries like healthcare, finance, and marketing.
Data Lineage Patterns – Tomas Kratky
Join Shane Gibson as he chats with Tomas Kratky on his experience in defining data lineage and DataOps patterns.
Free Google Analytics 4 (GA4) online courses
TL;DR There is some great free course content to help you upskill in Google Analytics 4 (GA4). Here are the ones we recommend. Discover the Next Generation of Google Analytics: find out how the latest generation of Google...
Observability – Raj Joseph
Join Shane Gibson as he chats with Raj Joseph on his experience in defining data observability patterns.
5E’s
As Data Consultants, your customers are buying an outcome based on one of these patterns – effort, expertise, experience or efficiency.
We outline what each of these is, how they differ from each other, and how to charge for delivering them.
Conceptually Modeling Concepts, Details and Events in AgileData
Join Shane and Nigel as they discuss how and why we define a conceptual model of Concepts, Details and Events in AgileData and how we map these to a physical Data Vault model.
Agile-tecture Information Factory
Defining a Data Architecture is a key pattern when working in the data domain.
It's always tempting to boil the ocean when defining yours – don't!
And once you have defined your data architecture, find a way to articulate and share it with simplicity.
Here is how we articulate the AgileData Data Agile-tecture.
Data Architecture as a Service (DAaaS)
TL;DR Data Architecture as a Service (DAaaS): is it buzzwashing or not? As is often the case, it depends on your point of view. Our point of view? Nope, it's a real thing.
Myth: using the cloud for your data warehouse is expensive
TL;DR Cloud Data Platforms promise you the magic of storing your data and unlimited elastic compute for cents. Is it too good to be true? Yes AND no. You can run a cloud platform for a low, low cost, but it will take...
Agile and Product – Justin Bauer
Join Shane Gibson as he chats with Justin Bauer on his experience combining the worlds of agile and product in a data-driven company.
AgileData WoW Q&A – Hamish Gray and May-Lyn Hu
Join Shane Gibson, Hamish Gray and May-Lyn Hu as they talk through some of the experiences they have had in their organisation applying agile and data together in a new way of working.
Observability, Tick
TL;DR Data observability is not something new; it's a set of features every data platform should have to get the data jobs done. Observability is crucial as you scale. Observability is very on trend right now. It feels...