Dashboards Guess. AI Answers the Questions That Matter.

Many Jira reporting tools present information through dashboards.

Dashboards are not answers.
They are assumptions — representations of questions someone decided in advance were important.

Once those assumptions are fixed, every other question becomes harder to ask.

If you’ve ever opened a reporting screen and wondered “What am I supposed to look for here?”, you’ve already experienced this limitation.

Real questions appear after the data is explored

In real teams, understanding work is not a predefined process. Questions emerge naturally:

  • On which days do issues return to a previous status most often?

  • Does rework happen more in the morning or the afternoon?

  • Which issues repeatedly move backwards instead of forwards?

  • How many times does an issue enter a given status?

  • How long does work actually spend in each part of the workflow?

These are not exotic questions.
They are the result of looking at real data and wanting to understand it better.

But questions like these are difficult to express when analysis is constrained to predefined visual structures.

Asking questions is simpler than learning interfaces

Recently, I took a different approach.

I exported raw Jira data and simply asked AI to help me analyze it.

Specifically, I asked AI to work with data exported from two of the applications released by our company:

  • Issue History Dashboard for Jira

  • Time in Status Dashboard for Jira

These tools focus on providing complete, structured exports:

  • Issue change history

  • Time spent in each status

  • Entry counts per status

This kind of data is ideal for interpretation and cross-analysis.

Using these exports, I asked AI to:

  • Create an Excel file listing each issue and the time it spent in each status

  • Add columns showing which user re-entered an issue into a given status

  • Create cycle time columns, after I explicitly provided:

    • the name of each cycle

    • and the workflow steps that should be considered part of that cycle

  • Create charts in the same Excel file highlighting where work was looping or returning

  • Show patterns by day of the week and by time of day

I didn’t write formulas.
I didn’t design dashboards.
I didn’t need to know Excel.

I asked questions — and received a file with tables and charts ready to explore.
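
For readers curious what that workbook actually contains, here is a rough sketch of the equivalent computation in Python with pandas. The file name and the column names (issue_key, from_status, to_status, changed_at, author) are assumptions about the shape of a status-change export, not the actual schema of either app.

    import pandas as pd

    # Hypothetical export: one row per status transition.
    # Assumed columns: issue_key, from_status, to_status, changed_at, author
    changes = pd.read_csv("status_changes.csv", parse_dates=["changed_at"])
    changes = changes.sort_values(["issue_key", "changed_at"])

    # Time in status: the gap between entering a status and the next transition.
    # (The initial status before the first recorded transition is ignored here.)
    changes["left_at"] = changes.groupby("issue_key")["changed_at"].shift(-1)
    changes["hours_in_status"] = (
        changes["left_at"] - changes["changed_at"]
    ).dt.total_seconds() / 3600
    time_in_status = changes.pivot_table(
        index="issue_key", columns="to_status",
        values="hours_in_status", aggfunc="sum",
    )

    # Re-entries: how often an issue enters each status beyond the first time,
    # and which users triggered those re-entries.
    entries = changes.groupby(["issue_key", "to_status"])
    reentry_counts = entries.size().sub(1).clip(lower=0).unstack(fill_value=0)
    reentry_users = entries["author"].apply(
        lambda s: ", ".join(s.iloc[1:].dropna().astype(str).unique())
    )

    # One workbook, ready to explore (requires the openpyxl package).
    with pd.ExcelWriter("jira_analysis.xlsx") as writer:
        time_in_status.to_excel(writer, sheet_name="Time in status")
        reentry_counts.to_excel(writer, sheet_name="Re-entry counts")
        reentry_users.unstack().to_excel(writer, sheet_name="Re-entry users")

The point, of course, is that none of this had to be written by hand; the sketch only shows that the questions map onto straightforward transformations of the exported history.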

Cycles are defined by intent, not by tools

Every team defines cycles differently, because workflows differ by issue type and process.

Instead of forcing a predefined interpretation, I simply described:

  • what I considered a cycle

  • and which workflow steps belonged to it

AI applied that definition consistently across the data.

No rigid assumptions.
No configuration screens.
Just results aligned with how work is actually understood by the team.
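
In data terms, a cycle definition is nothing more than a name mapped to a set of workflow steps. Here is a minimal sketch, reusing the hypothetical changes table from the earlier example; the cycle names and statuses below are purely illustrative.

    # Cycles as I would describe them in plain language: a name plus the
    # workflow steps that belong to it.
    cycles = {
        "Development": ["In Progress", "In Review"],
        "Verification": ["In Testing", "Waiting for QA"],
    }

    # Cycle time per issue: total hours spent in the statuses of each cycle.
    cycle_time = pd.DataFrame({
        name: changes.loc[changes["to_status"].isin(statuses)]
                     .groupby("issue_key")["hours_in_status"].sum()
        for name, statuses in cycles.items()
    }).fillna(0)

Changing what a cycle means is a one-line edit to that mapping, which is exactly the flexibility described above.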

From one question to full automation

The next step is obvious.

I plan to save the exact questions I ask AI, together with the exported CSV data and the last report AI built for me, as a reusable template.
With that, the entire process can be automated (a rough sketch follows the list below):

  • Export structured data

  • Feed it to AI

  • Apply the same set of questions

  • Generate updated Excel files with tables and charts
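
A minimal sketch of that template idea is below. The file names (questions.txt, status_changes.csv, last_report.xlsx) are hypothetical, and the AI call itself is left as a placeholder, since the choice of service and API is yours.

    from pathlib import Path

    # The reusable part: the exact questions, written down once.
    QUESTIONS = Path("questions.txt").read_text()

    def build_prompt(export_csv, previous_report=None):
        """Wrap the same set of questions around a fresh export."""
        prompt = (
            "Here is the latest Jira export (CSV):\n"
            + Path(export_csv).read_text()
            + "\n\nAnswer the following questions and build an Excel file "
            "with tables and charts:\n"
            + QUESTIONS
        )
        if previous_report:
            prompt += (
                f"\n\nUse the previous report ({previous_report}) "
                "as a reference for structure."
            )
        return prompt

    # Each run: export fresh data, rebuild the same prompt, send it to the
    # AI service of your choice, and save the returned workbook.
    prompt = build_prompt("status_changes.csv", previous_report="last_report.xlsx")
    # send_to_ai(prompt)  # placeholder for whichever AI client you use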

No dashboards to rebuild.
No reports to reconfigure.

Once questions are written down, they become reusable.
The analysis becomes repeatable.

At that point, the only limit is curiosity.

Raw data becomes an interface when AI is involved

Raw data used to feel inaccessible.
Now it is conversational.

Excel, in this context, is not something you need to master — it is simply a container. A portable file that holds answers, tables, and charts that can be explored further or shared.

Instead of navigating menus and options, you describe what you want to understand. The data responds.

History explains how things changed. Time explains how long they stayed that way.

Change history explains decisions and reversals.
Time-based data shows where work waits, loops, or quietly stalls.

When both are available in a clean, structured form, they can be combined and queried freely — without predicting the questions in advance.
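
As a concrete illustration of combining the two, the sketch below joins a hypothetical history export to a hypothetical time-in-status export and asks a question neither file answers alone; all file and column names are assumptions.

    # Assumed columns: issue_key, from_status, to_status, changed_at (history)
    # and issue_key, status, hours (time in status).
    history = pd.read_csv("issue_history.csv", parse_dates=["changed_at"])
    durations = pd.read_csv("time_in_status.csv")

    combined = history.merge(
        durations,
        left_on=["issue_key", "to_status"],
        right_on=["issue_key", "status"],
        how="left",
    )

    # An ad-hoc question: reversals from review back into progress
    # on issues that then sat in that status for more than three days.
    reopened_and_stalled = combined[
        (combined["from_status"] == "In Review")
        & (combined["to_status"] == "In Progress")
        & (combined["hours"] > 72)
    ]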

The future of analysis is curiosity-driven

The most valuable insights rarely start with a dashboard.
They start with curiosity.

“I wonder if this happens more often mid-week.”
“I wonder if rework happens later in the day.”

AI makes it possible to follow those questions immediately, using data that is complete, exportable, and easy to interpret.
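
Following those two questions directly, again against the hypothetical changes table from earlier, and treating a move from "In Review" back to "In Progress" as rework:

    # Rework here means a backward move; substitute whatever counts as
    # rework in your own workflow.
    rework = changes[
        (changes["from_status"] == "In Review")
        & (changes["to_status"] == "In Progress")
    ]

    print(rework["changed_at"].dt.day_name().value_counts())         # by weekday
    print(rework["changed_at"].dt.hour.value_counts().sort_index())  # by hour of day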

Dashboards guess.
Raw data exposes the truth.
AI turns questions into answers.



A personal note


I’ve always liked Bob Dylan’s song “The Times They Are a-Changin’”.
It feels particularly relevant right now.

Having access to raw data from Issue History Dashboard for Jira and Time in Status Dashboard for Jira, bringing that data to AI, and asking questions changes the way you think about analysis.

It’s like the difference between choosing something from a restaurant menu and having your own cook.

When you’re limited to a menu, you pick from what’s offered.
When you bring the best ingredients and have a cook, you ask for what you actually want.

Raw data is the ingredient.
AI is the cook.

Once you experience that, it’s hard to go back to predefined options.

The times really are changing.



 
