A little while ago, Atlassian rolled out Audio Briefings. Now there's a new feature: Audio Narrations, which read your page from top to bottom, word for word.
We tested this across a few different page setups and wanted to share our experience, especially from an accessibility point of view.
There’s a lot of potential here, so let’s take a look!
Audio Briefings are meant to summarize a page. When we tested them on a short fictional company overview, the narration jumped straight into the story without context, which made the summary feel more like a snippet than an overview.
It’s a good reminder that clear structure still matters, for humans and AI alike. A short intro or info box at the top can go a long way toward helping any reader, or listener, follow your content.
Audio Narrations take a different approach. They read:
the page title,
the author,
and the entire page, in order.
When a page uses columns, the narration handles them predictably: left column first, then right. It’s simple, but it works surprisingly well.
This alone makes Narrations feel more helpful than Briefings for anyone who prefers listening over reading.
When Narrations reach an image, they currently say only that a file is attached, without identifying it as an image or reading alt text.
That said, the feature is new, and it’s clear where it could grow. Meaningful image descriptions can make or break the experience for people who rely on audio or screen readers.
Tip: Until Narrations support alt text, add it anyway! It helps accessibility tools today, and it future-proofs your content for tomorrow.
For tables, Narrations do quite a nice job: they announce the size of the table, read the headers, and then go row by row.
It’s thorough, sometimes very thorough.
Tip: Keep key insights outside the table, so listeners don’t have to wait through a long readout to get the essentials.
When Narrations encounter a macro, they simply say: “There’s a macro here.”
The good news? It means Narrations recognize that something special is on the page.
The not-as-good news? They don’t yet describe what’s inside, whether it’s a chart, a status, a roadmap, or something from an app.
Tip: Add a short sentence before or after a macro to help listeners understand what they’re about to encounter.
Example: “The chart below shows our Q4 timeline.”
We really appreciate the direction Atlassian is taking here. Features like this can make Confluence meaningfully more accessible, not just for users with visual impairments, but for anyone who prefers to listen while working, commuting, or multitasking.
Are Audio Narrations perfect yet? Not quite. But the foundation is solid, and it’s exciting to see accessibility being built into the product rather than added on top.
What about you? Have you tried Narrations yet?
We share tips like this in the Weekly Dose of Confluence newsletter.
Patricia Modispacher, K15t