Introducing Atlassian‑Hosted LLMs

Hello Rovo community!

On the heels of our Team 25 Europe announcement, our Atlassian-hosted LLMs offering is now generally available! Every aspect of Rovo has been designed responsibly with a security-first mindset. Anchored by our responsible technology principles, we've built robust guardrails for data protection and advanced governance capabilities, and we continually expand Rovo's compliance certifications to support your evolving trust requirements. Beyond the built-in security and granular AI governance controls, we built the Atlassian-hosted LLMs offering for customers with stricter security and compliance needs for their LLM usage. With Atlassian-hosted LLMs, all AI-powered features rely solely on LLMs hosted within the Atlassian Cloud boundary. This keeps your content within our trust boundary while delivering fast, high-quality responses based on leading open models we manage and improve continuously. These models include Llama and GPT-OSS, with potentially more over time to ensure the quality and latency you expect.

What this means for you

  • Data stays within Atlassian’s Virtual Private Cloud (VPC) for LLM processing

  • Consistent quality with curated, fine-tuned open models (e.g., Llama family) and future upgrades without extra work

  • One control path: when enabled at the org level, eligible AI features use Atlassian-hosted models by default

How Atlassian-hosted LLMs work

When your org opts in, supported AI features switch to models hosted entirely within Atlassian’s cloud platform. We operate and secure the inference layer, so your prompts and context don’t leave Atlassian’s trust boundary for model processing.

[Images: AI Trust Overview; AI data flow]

Who should use this

Atlassian-hosted LLMs are a strong fit for security- and compliance-conscious organizations. They are especially useful for teams in regulated industries, or teams with strict internal data-handling policies, that want the advantages of AI without increased data egress or vendor complexity. By keeping inference within Atlassian's trusted cloud boundary, these teams get enterprise-grade safeguards while enabling modern AI use cases across Jira, Confluence, and Rovo.

Note: Today, Atlassian-hosted LLMs are only available for organizations on an Enterprise plan.

Getting started

To start using Atlassian-hosted LLMs, your org admins must opt in by filing a support ticket here. Once the opt-in is complete, Atlassian-hosted LLMs will be enabled for all sites in your organization. Learn more in our documentation here.

We are excited to release our Atlassian-hosted LLMs and look forward to hearing your feedback!
