Hello, we have a Confluence DB that is about 11GB of data and the LINKS table is about 7GB of data. We have over 50 million LINKS but only 53K CONTENT rows. Some content ids have several million links associated with them. I believe this is why our index rebuild is not finishing. Is there any safe links cleanup we can do?
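Before anything else, it may help to see exactly which content ids dominate the table. Here is a minimal sketch of that kind of "top offenders" query, run against a toy in-memory schema (the column names `CONTENTID`/`LINKID` are assumed from a typical Confluence LINKS table; run the equivalent SELECT read-only against your real database):

```python
import sqlite3

# In-memory stand-in for the Confluence DB; the real LINKS table has more
# columns, but CONTENTID is the one that matters for this check.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (linkid INTEGER PRIMARY KEY, contentid INTEGER)")
conn.execute("CREATE TABLE content (contentid INTEGER PRIMARY KEY)")

# Sample data: content id 1 owns far more link rows than content id 2.
conn.executemany("INSERT INTO links (contentid) VALUES (?)",
                 [(1,)] * 500 + [(2,)] * 3)
conn.executemany("INSERT INTO content (contentid) VALUES (?)", [(1,), (2,)])

# Which content ids own the most link rows?
top = conn.execute("""
    SELECT contentid, COUNT(*) AS n
    FROM links
    GROUP BY contentid
    ORDER BY n DESC
    LIMIT 10
""").fetchall()
print(top)  # [(1, 500), (2, 3)]
```

If a handful of ids account for millions of rows each, that at least tells you which pages to inspect before deciding on any cleanup.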
Did it hang at a certain percentage? There are a few possibilities why an index rebuild fails or never completes, but I'm not sure whether LINKS has anything to do with it; it also depends on what was thrown in <confluence-home>/logs/atlassian-confluence.log:
What error can be seen in the logs related to index?
Same problem here!
I'm supporting a Confluence installation that has over 5 million entries in the LINKS table but only about 30,000 entries in CONTENT. I've never seen a ratio like this before.
The re-index works fine on this installation, but the XML export is practically unusable with default settings. I need to give Tomcat 8 GB of memory to prevent a crash, and even then it takes over 2 hours to export to XML (without attachments!).
Now the question is: where are all these links coming from?
And I have the same question as @Dragon Moon: "Is there any safe links cleanup I can do?"
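I can't say what is safe on a supported instance (take a full backup and check with Atlassian Support first), but one common sanity check is to look for orphaned rows: link rows whose content id no longer exists in CONTENT. A rough sketch against a toy schema (column names assumed, not verified against your Confluence version):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE content (contentid INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE links (linkid INTEGER PRIMARY KEY, contentid INTEGER)")
conn.executemany("INSERT INTO content (contentid) VALUES (?)", [(1,)])

# Links for content 1 (still exists) and content 99 (deleted -> orphans).
conn.executemany("INSERT INTO links (contentid) VALUES (?)",
                 [(1,), (99,), (99,)])

# Count link rows pointing at content ids that no longer exist.
orphans = conn.execute("""
    SELECT COUNT(*)
    FROM links l
    LEFT JOIN content c ON c.contentid = l.contentid
    WHERE c.contentid IS NULL
""").fetchone()[0]
print(orphans)  # 2
```

If that count is large, it would at least explain some of the bloat; whether deleting those rows is safe is something I'd confirm with Atlassian before touching the table.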