
Introducing Atlassian‑Hosted LLMs

Hello Rovo community!

Hot on the heels of our Team 25 Europe announcement, our Atlassian-hosted LLMs offering is now generally available!

Every aspect of Rovo has been designed responsibly, with a security-first mindset. Anchored by our responsible technology principles, we've built robust guardrails for data protection and advanced governance capabilities, and we continually expand Rovo's compliance certifications to support your evolving trust requirements. Beyond the built-in security and granular AI governance controls already available, we built the Atlassian-hosted LLMs offering for customers with stricter security and compliance needs for their LLM usage.

With Atlassian-hosted LLMs, all AI-powered features rely solely on LLMs hosted within the Atlassian Cloud boundary. This keeps your content within our trust boundary while delivering fast, high-quality responses based on leading open models that we manage and improve continuously. These models include Llama and GPT-OSS, with more potentially added over time to ensure the quality and latency you expect.

What this means for you

  • Data stays within Atlassian’s Virtual Private Cloud (VPC) for LLM processing

  • Consistent quality with curated, fine-tuned open models (e.g., Llama family) and future upgrades without extra work

  • One control path: when enabled at the org level, eligible AI features use Atlassian-hosted models by default

How Atlassian-hosted LLMs work

When your org opts in, supported AI features switch to models hosted entirely within Atlassian’s cloud platform. We operate and secure the inference layer, so your prompts and context don’t leave Atlassian’s trust boundary for model processing.

[Images: AI Trust Overview, AI data flow]

Who should use this

Atlassian-hosted LLMs are a strong fit for security- and compliance-conscious organizations. They are especially useful for teams in regulated industries, or teams with strict internal data-handling policies, that want the advantages of AI without increased data egress or vendor complexity. By keeping inference within Atlassian's trusted cloud boundary, these teams get enterprise-grade safeguards while enabling modern AI use cases across Jira, Confluence, and Rovo.

*Note: Today, Atlassian-hosted LLMs are only available for organizations on an Enterprise plan.

Getting started

To start using Atlassian-hosted LLMs, your org admins must opt in. Once they do, Atlassian-hosted LLMs are enabled for all sites in your organization. Org admins can opt in by filing a support ticket here. Learn more in our documentation here.

We are excited to release our Atlassian-hosted LLMs and look forward to hearing your feedback!

2 comments

Rebekka Heilmann _viadee_
Community Champion
February 4, 2026

It says in the documentation that Atlassian-hosted LLMs are hosted in a US data center. Is that generally true for Rovo, or just for this Atlassian-hosted-LLMs-only option?

Any roadmap for EU hosting? I feel like most enterprises who need this feature have an issue with US-hosted models, not third-party models per se.

Ashwini Rattihalli
Atlassian Team
Atlassian Team members are employees working across the company in a wide variety of roles.
February 18, 2026

@Rebekka Heilmann _viadee_ thanks for your question.

Today, Atlassian-hosted LLMs are hosted in the US and do not store customer data. Data residency applies to data at rest (i.e., storage), and Rovo supports this today.

We’re actively exploring how best to support EU-hosted models. If you’re open to a conversation, I’d love to better understand the specific needs driving the requirement for EU-hosted models so we can design the right solution.

Please feel free to email me (arattihalli@atlassian.com) and we can set up a chat.
