DataOps 101: How to Focus Your Data Teams on Work that Matters
In data analytics, how do we define “work that matters” and deliver it?
As data analytics has become ubiquitous across industries and organizations have expanded their IT and technology departments, the task of managing critical data projects has fallen to Project Managers and Scrum Masters.
But finding the best way to deliver value to the business while keeping teams focused on work that matters has caused those professionals to throw up their hands in exasperation. In the case of analytics requests in particular, a dashboard is too big to ship all at once, but individual metrics aren’t useful on their own. Instead, project managers should work toward delivering a minimum viable product (MVP).
It’s costly to focus on poorly defined tasks that may not be what the business really needs – just as it’s expensive to execute them inefficiently. But identifying which tasks matter most, and how to prioritize them, is a surprisingly difficult challenge in its own right.
So how do we define the “work that matters” and deliver it?
In this blog, we’ll take a detailed look at how to identify (1) real problems that users need solved with dashboards and (2) methods that data teams can employ to stay focused on impactful tasks and limit extraneous work.
What you'll learn:
- How to identify real problems that users need solved with dashboards and align solutions with business goals by asking the right questions.
- Methods for prioritizing impactful tasks, defining a minimum viable product (MVP), and focusing team efforts effectively while minimizing lower value work.
- The importance of vertical slicing in project management to deliver measurable business value and streamline the dashboard development process.
Typical approaches to program management
Project management professionals generally look to a few potential approaches for managing workload:
- Waterfall (or traditional project management) is not the best fit for anything that should get to end users quickly. It requires lengthy phases with phase gates, and the feedback loop with stakeholders and end users is so long that their input can’t be applied until well after delivery. It is costly to deliver something that doesn’t hit the mark, and shorter feedback loops mean we can pivot earlier to ensure market fit.
- Scrum (the most common Agile framework) is better suited to software development, and many of its principles don’t apply (or aren’t easily applied) to data work.
But neither of these methods fully satisfies the needs of a data team that’s working, for example, on delivering useful dashboards based on accurate data models. And defining the contributions of data work can be, as we explored in a previous blog, difficult to frame in terms of business value.
To answer analytics requests, start by working backwards from the problem
Let’s start with stakeholder requests. Although focusing on valuable work is a complex issue, unclear work orders and oversized tasks can be one of the greatest hurdles to success.
It’s Monday. Finance has sent in a request to have a dashboard built. They’ve laid out the broad concept and listed a lot of charts and graphs they need to see – even attached a series of 5,000-line spreadsheets to show you what they’re working with. It’s overwhelming and the objective of the dashboard is unclear. Oh, and they need it by last week.
Stakeholders often ask for tools or dashboards without fully understanding what they truly need. As data professionals, your job isn’t to deliver every request at face value but to align solutions with business goals by “getting to the bottom of it”. At the very least, your end user should be able to answer three questions:
- What decisions will this data enable?
- What outcomes are you hoping to achieve?
- Are there alternative ways you can address your core problem?
By focusing on the “why” behind a request, you’ll not only save valuable time, but also ensure your solutions drive real impact.
Don’t take “why” at face value. Get to the core of the dashboard request.
That said: Don’t take “why” at face value. Generally, users are effusive about everything they’d like to have, so an experienced interviewer will push them to define what is a “must have” versus a “nice to have”. Anything that is a “nice to have” shouldn’t be included in the MVP – or first iteration – and can wait until future iterations.
Whoever gathers requirements should interview end users to understand which metrics and dashboard elements matter most to them. The team should then build those elements in priority order until the MVP is complete: a dashboard containing every metric that end users have said is absolutely required for it to be usable.
In the example above, your conversation might initially look like a list of metrics and tables that users say they need, but, after further probing, it should be clear what the user is trying to do.
For instance:
Let’s say a user has asked for a count of open stores during a period of time that can be filtered with a date picker. But the purpose of this metric is actually to filter out closed stores so they can compare apples to apples when it comes to profits. Based on the initial requirement, if a store closed 6 months ago and the filter is set to include the last year, that store’s profits would be included in the results. Once we understand the actual use case, it’s clear we should only return stores that were open during the selected date range and are still open today.
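To make the difference concrete, here is a minimal sketch in Python of the corrected filter logic. The function name and record fields (`open_date`, `close_date`) are hypothetical, chosen only to illustrate the rule: keep a store if it opened on or before the end of the selected range and has no close date (i.e., it is still open today).

```python
from datetime import date

def stores_for_profit_comparison(stores, range_start, range_end):
    """Return stores that were open during the selected date range
    AND are still open today (no close_date recorded).

    Each store is a dict with hypothetical keys:
    'name', 'open_date' (date), 'close_date' (date or None).
    """
    result = []
    for store in stores:
        still_open = store["close_date"] is None
        open_during_range = store["open_date"] <= range_end
        if still_open and open_during_range:
            result.append(store)
    return result

# Example: store B closed mid-range, so the naive "open during the
# period" filter would include it, but this filter excludes it.
stores = [
    {"name": "A", "open_date": date(2020, 1, 1), "close_date": None},
    {"name": "B", "open_date": date(2020, 1, 1), "close_date": date(2024, 6, 1)},
    {"name": "C", "open_date": date(2026, 1, 1), "close_date": None},
]
kept = stores_for_profit_comparison(stores, date(2024, 1, 1), date(2024, 12, 31))
```

The same rule translates directly into a SQL `WHERE` clause or a dashboard filter expression; the point is that it comes from the user’s goal (apples-to-apples profit comparison), not from their literal first request.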
Asking the right questions and assigning the appropriate interviewer for requirement gathering can make all the difference between going on a wild goose chase and sprinting straight toward the goal.
Who should conduct user interviews?
Dashboards are a great place to engage a UX professional to work with end users and develop a design. They’re highly visual, easily mocked up, and end users are very aware of the data required to do their jobs. Some organizations struggle with determining where in the development cycle design fits in; we think of design as part of defining the requirements for a dashboard.
In our experience, however, most organizations are not using UX professionals and are instead relying on their product managers, project managers, and developers to define the requirements. This isn’t optimal, because these team members most likely don’t have professional training in interviewing end users effectively and helping them “dive deep” to discover what they actually need. But we can only work with the tools available to us – and that often means engineers end up defining the requirements.
How to manage work on your MVP dashboard
Let’s take our example one step further.
With input from the engineers, you can break down the design of the MVP dashboard into different actionable stories based on the elements that end users have indicated are their highest priority. (One story = one unit of work).
In software development, the general rule is to “vertically slice” your stories. This means that you write your stories so that they include all the functionality needed to deliver a piece of business value.

For a data project, though, the data model alone delivers no business value: an end user cannot use the data model without a front end, and the front end is the element that ultimately delivers measurable value. It makes more sense to accept that the data model represents unrealized data value, complete it, and break up the work of the dashboard into elements that deliver more tangible value. The question is how to do this effectively.
Vertically slice your cake (and eat it too) to deliver tangible business value from data projects
So if we were to apply vertical slicing to data, the data model would be our base, representing a horizontal layer. We would then write stories that break the front end down into slices, each creating an individual component of the dashboard, and build it piece by piece. Users can validate the data for each piece as it’s built or, if preferred, once the MVP is finished. Contrast this with purely horizontal slicing, which decomposes problems into technical layers (rather than including all the functionality needed in each story) and is a slower process.
Why should you care?
When organizations are too prescriptive about how stories are written, a gap opens up between how people actually do their work and how management tells them to do it. By being realistic that the data model will be mostly complete by the time we start on any of the visual elements of a dashboard, we more accurately reflect how the engineers work. This lets us see bottlenecks in our DataOps processes and track more accurate metrics. We will also have more accurate documentation in our work management platform, creating a historical record of what was built and when.
How DI Squared helps with delivery modernization
DI Squared specializes in Delivery Modernization and Data Strategy, and we’ve helped companies to increase productivity by 40% through tailored training, coaching, and Agile implementation. Book your 1:1 with a data product delivery specialist today.