Blogs

Because sharing is caring

Data Operating Models patterns with Dylan Anderson

Join Shane Gibson as he chats with Dylan Anderson about the patterns required to define a Data Operating Model.

What is a Fractional role and why should you care?

A fractional team or fractional role gives an organisation access to experienced professionals who are hired on a part-time or as-needed basis.

Fractional professionals typically work across multiple organisations at the same time, providing their expertise where and when it is needed most.

This fractional pattern gives organisations access to skills and capabilities they could not otherwise find, or could not afford to hire, as permanent members of their team.

AgileData Feature #03 – Catalog Browse and Search

Data Personas can use the Catalog to quickly search for, find and access the Data Assets (Tiles) they need to complete a data task, such as supporting the development of Information Products or using the data for analysis.

Topic Tags and Dynamic Filtering allow Data Assets to be found easily using the organisation's business terminology.

With the Menu Anywhere capability, users can drill out to any other screen in the AgileData App while retaining the context of the Tile, removing the need to navigate to that screen and manually search for the Data Asset again.

Any time a data user performs a data task in the AgileData App, or an automated data task is executed on the AgileData Platform, the Catalog is automagically updated. The Catalog data is treated as a first class citizen, not as data exhaust.

The Catalog is part of a wider ensemble of features that let users search or browse to locate essential Data Assets quickly, view comprehensive details for each dataset, and track the flow of data much like tracking a parcel.

AgileData Feature #02 – Single Sign-in

With Single Sign-On (SSO), users gain streamlined access to the AgileData App, eliminating the need to log in repeatedly and reducing friction in their data day.

This feature leverages Google Identity and Access Management (IAM) to provide seamless and secure authentication, ensuring that users can easily connect to the platform using their existing credentials.

AgileData’s SSO is not only secure but also scalable, designed to meet the needs of growing teams while maintaining the highest standards of data protection and access control.

AgileData App

Explore AgileData features, updates, and tips

Network

Learn about consulting practises and good patterns for data focused consultancies

DataOps

Learn from our DataOps expertise, covering essential concepts, patterns, and tools

Data and Analytics

Unlock the power of data and analytics with expert guidance

Google Cloud

Imparting knowledge on Google Cloud's capabilities and its role in data-driven workflows

Journey

Explore real-life stories of our challenges, and lessons learned

Product Management

Enrich your product management skills with practical patterns

What Is

Describing data and analytics concepts, terms, and technologies to enable better understanding

Resources

Valuable resources to support your growth in the agile, and data and analytics domains

AgileData Podcast

Discussing combining agile, product and data patterns.

No Nonsense Agile Podcast

Discussing agile and product ways of working.

App Videos

Explore videos to better understand the AgileData App's features and capabilities.

5 core Data Collection Patterns

At AgileData, delivering our Fractional Data Service has revealed the diverse challenges of integrating data from varied organisations, industries, and systems. To scale effectively, we’ve adopted five core data collection patterns based on our “Define it Once, Reuse it Often” (DORO) principle:

1. Push
2. Pull
3. Stream
4. Share
5. File Drop

These patterns are supported by a toolkit of tested technologies like Dataddo, Meltano, and Google services, allowing us to solve new data challenges quickly. Our approach ensures flexibility and scalability, always starting with the question: Push, Pull, Stream, Share, or File Drop?
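As an illustration only (the names and decision logic below are hypothetical, not AgileData's actual code), the "Push, Pull, Stream, Share, or File Drop?" question can be sketched as a simple dispatch over the five patterns:

```python
from enum import Enum

# Hypothetical sketch: the five collection patterns as an enum,
# in the spirit of "Define it Once, Reuse it Often" (DORO).
class CollectionPattern(Enum):
    PUSH = "push"            # the source system sends data to us
    PULL = "pull"            # we query the source API or database on a schedule
    STREAM = "stream"        # we consume a continuous event feed
    SHARE = "share"          # the provider grants direct read access to its tables
    FILE_DROP = "file_drop"  # files land in a watched bucket or folder

def choose_pattern(source_can_push: bool, is_event_feed: bool,
                   provider_shares_tables: bool, delivers_files: bool) -> CollectionPattern:
    """Pick a collection pattern from a few coarse source characteristics."""
    if provider_shares_tables:
        return CollectionPattern.SHARE
    if is_event_feed:
        return CollectionPattern.STREAM
    if delivers_files:
        return CollectionPattern.FILE_DROP
    if source_can_push:
        return CollectionPattern.PUSH
    return CollectionPattern.PULL  # default: we go and get the data ourselves
```

The value of naming the patterns up front is that each new source only requires answering a handful of questions before a tested collector can be reused.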

An Experiment – Top Data Trends for 2025 with Coalesce and Google NotebookLM

Join two LLM-generated guests as they discuss the Top Data Trends for 2025 whitepaper published by Coalesce.

This is a different episode. Instead of a human guest, we have two robot guests.

I decided to try an experiment: could I upload a white paper to an LLM, have it generate a podcast, listen to that podcast on my daily walk, and see whether the summary removes the need for me to actually read the white paper?

So in this case, I grabbed a white paper called Top Data Trends for 2025 from Coalesce, uploaded it to Google NotebookLM and got it to generate a podcast with two hosts chatting about the white paper.

Have a listen, let me know what you think.

AgileData Feature #01 – Marketplace

Information Consumers can quickly search, find and access the Information Products they need to answer their business questions.

This feature enables Information Consumers to search and find all the available dashboards, reports and analytical models in the AgileData App, regardless of which third-party Last Mile tool they were created in.

It also enables the Information Consumer to quickly open that report or dashboard directly from the Marketplace, when those reports are accessible via a web URL.

NZ Scaleup AgileData achieves Google Cloud Ready – BigQuery Designation

AgileData has achieved Google Cloud Ready – BigQuery designation, streamlining data management for customers and partners. This certification confirms the integration’s functionality and reliability, reducing complexity through a low-code interface. By leveraging Google Cloud’s infrastructure and BigQuery, AgileData empowers business leaders to rapidly gain insights and make informed decisions efficiently.

The last (for now) of our #AgileDataDiscover summaries!

Shane Gibson and Nigel Vining completed a 30-day public experiment using a Large Language Model for legacy data warehouse discovery. They confirmed it was feasible, viable, and valuable, securing their first paying customer. They’re showcasing their progress at Big Data London in September under their new product, AgileData Disco.

#AgileDataDiscover weekly wrap No.5

We are in the final phase of building a new product, AgileData Disco, aimed at efficiently discovering and documenting data platforms. We are exploring various Go-to-Market strategies, such as Sales-Led Growth (SLG) and Product-Led Growth (PLG). Pricing options include pay-per-output and subscription models. We are building in public to gather feedback and refine our approach.

#AgileDataDiscover weekly wrap No.4

We review feedback, highlight emerging use cases like legacy data understanding, data governance, and automated data migration. New patterns are needed for moving from prototype to MVP. Challenges include managing tokens, logging responses, and secure data handling. The GTM strategy focuses on Partner/Channel Led Growth.

#AgileDataDiscover weekly wrap No.3

We focus on developing features such as secure sign-in, file upload, data security, and access to Google’s LLM. Challenges include improving the menu system and separating outputs into distinct screens for clarity. Feedback drives our iterative improvements.

#AgileDataDiscover weekly wrap No.2

We discuss the ongoing development of a new product idea, emphasising feasibility and viability through internal research (“McSpikeys”). Initial tests using LLMs have been promising, but strategic decisions lie ahead regarding its integration. We grapple with market validation and with adjusting our workflow for optimal experimentation.

#AgileDataDiscover weekly wrap No.1

We are tackling challenges in migrating legacy data platforms by automating data discovery and migration to reduce costs significantly. Our approach includes using core data patterns and employing tools like Google Gemini for comparative analysis. The aim is to streamline data handling and enable collaborative governance in organisations. Follow our public build journey for updates.

Introducing Hai, AgileData 2024 Data Intern

I’m Hai, a name that intriguingly means “hi” in English. Originally from Vietnam, I now find myself in Australia, studying Data Science and embracing an internship at AgileData.io. This journey is not just about academic growth but also about applying my knowledge in practical, impactful ways. Join me as I explore the blend of technology and community, aiming to make a difference through data.

Defining self-service data

Everybody wants self-service data, but what do they really mean when they say that?

If we gave them access to a set of highly nested JSON data and said “help yourself”, would that be what they expect?
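To make that gap concrete, here is a minimal sketch (with made-up data, not a real payload) of what "helping yourself" to nested JSON involves: flattening it into the tabular, column-per-field shape most consumers actually expect:

```python
def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested dicts into dotted-path keys."""
    flat = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, path))  # descend into the nested object
        else:
            flat[path] = value
    return flat

# A hypothetical nested record, as it might arrive from a source system.
order = {
    "customer": {"name": "Ada", "address": {"city": "Wellington"}},
    "total": 42,
}

# flatten(order) yields column-like keys:
# {"customer.name": "Ada", "customer.address.city": "Wellington", "total": 42}
```

Even this trivial reshaping is work the consumer would have to do themselves, which is usually not what they mean by self-service.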

Or do they expect self-service to mean being able to get information without asking a person to get it for them?

Or are they expecting something in between?

So I ask them which of the five simple self-service patterns they want, to find out which form of self-service they are really after.

There are 3 strategic / macro data use cases

I often ask which of these three macro data use cases an organisation believes are its priorities for achieving its business strategy:

Providing data to Customers
Supporting Internal Processes
Providing data to External Organisations

Each of these three strategic / macro data use cases comes with specific data architectures and data work, and also shapes how you would design your agile data ways of working.

Building the Data Plane while flying it

In the data domain you typically have to balance between building the right thing and building the thing right.

The days of being able to spend 6 months or a year on “Sprint Zero” creating your data platform have gone.

One team I worked with called it “building the airplane as you fly it”.

Here are five patterns I have seen data teams adopt to help them do this.