Our Confluence daily backup is failing intermittently. (Confluence version: 5.9.4)
We back up the Confluence contents, including attachments, every day.
The backup is scheduled to run automatically at 4:00 a.m. every day, and it has recently started failing. (It previously ran normally.)
2018-01-04 04:03:50,112 ERROR [scheduler_Worker-6] [confluence.importexport.impl.BackupJob] executeJob Error while running the scheduled backup
com.atlassian.confluence.importexport.ImportExportException: java.io.FileNotFoundException: /var/atlassian/application-data/confluence/temp/xmlexport-20180104-020000-100/attachments/18583484/18583446/1 (No such file or directory)
    at com.atlassian.confluence.importexport.impl.FileXmlExporter.doExportInternal(FileXmlExporter.java:85)
    at com.atlassian.confluence.importexport.impl.FileXmlExporter.doExport(FileXmlExporter.java:53)
    at com.atlassian.confluence.importexport.DefaultImportExportManager.doExport(DefaultImportExportManager.java:124)
    at com.atlassian.confluence.importexport.DefaultImportExportManager.exportAs(DefaultImportExportManager.java:94)
    at sun.reflect.GeneratedMethodAccessor7580.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
    at com.sun.proxy.$Proxy111.exportAs(Unknown Source)
    at com.atlassian.confluence.importexport.impl.BackupJob.executeJob(BackupJob.java:72)
    at com.atlassian.confluence.setup.quartz.AbstractClusterAwareQuartzJobBean.surroundJobExecutionWithLogging(AbstractClusterAwareQuartzJobBean.java:66)
    at com.atlassian.confluence.setup.quartz.AbstractClusterAwareQuartzJobBean.executeInternal(AbstractClusterAwareQuartzJobBean.java:47)
    at org.springframework.scheduling.quartz.QuartzJobBean.execute(QuartzJobBean.java:86)
    at com.atlassian.scheduler.quartz1.Quartz1JobFactory$ClassLoaderProtectingWrappedJob.execute(Quartz1JobFactory.java:65)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:223)
    at com.atlassian.confluence.schedule.quartz.ConfluenceQuartzThreadPool.lambda$runInThread$185(ConfluenceQuartzThreadPool.java:16)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:549)
Caused by: java.io.FileNotFoundException: /var/atlassian/application-data/confluence/temp/xmlexport-20180104-020000-100/attachments/18583484/18583446/1 (No such file or directory)
    at java.io.FileInputStream.open0(Native Method)
    at java.io.FileInputStream.open(FileInputStream.java:195)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at com.atlassian.core.util.zip.FileArchiver.addToArchive(FileArchiver.java:51)
    at com.atlassian.core.util.zip.FolderArchiver.compressFolder(FolderArchiver.java:82)
    at com.atlassian.core.util.zip.FolderArchiver.compressFolder(FolderArchiver.java:91)
    at com.atlassian.core.util.zip.FolderArchiver.compressFolder(FolderArchiver.java:91)
    at com.atlassian.core.util.zip.FolderArchiver.compressFolder(FolderArchiver.java:91)
    at com.atlassian.core.util.zip.FolderArchiver.compressFolder(FolderArchiver.java:91)
    at com.atlassian.core.util.zip.FolderArchiver.doFolderArchive(FolderArchiver.java:55)
    at com.atlassian.core.util.zip.FolderArchiver.doArchive(FolderArchiver.java:35)
    at com.atlassian.core.util.FileUtils.createZipFile(FileUtils.java:285)
    at com.atlassian.confluence.importexport.impl.FileXmlExporter.doExportInternal(FileXmlExporter.java:79)
    ... 21 more
Whenever this happens, the attachment number that failed to be read is different each time (in the case above, it is 18583484).
When I unzipped the xmlexport-20180104-020000-100.zip file, the attachment (18583484/18583446/1) had a file size of zero. Could this be related?
When the backup succeeds, that file's size is not zero.
On the other hand, I don't believe the attachment file is actually missing.
My assumption is that some part of the backup logic failed to write that attachment file during the backup process.
We would like to find the cause of this problem. If you know anything about this area, we would appreciate your help.
Can you first, as a sanity check, verify the folder permissions on the Confluence home folder (the Confluence user needs full read and write access), and confirm which user starts the Confluence server?
Please also double-check that the backup path in the error above matches the default backup path shown at http://<confluence-url>/admin/dailybackupadmin.action, and look in the Confluence database: check the 'backupPath' entry in the XML stored in the BANDANA table.
select * from BANDANA where BANDANAKEY = 'atlassian.confluence.settings';
Lastly, check your anti-virus and firewall settings to see whether they are blocking access. You may also want to ask your server administrator whether they know of anything that might be causing the permissions error.
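A minimal sketch of those checks, assuming the home path from your stack trace and a service account named confluence (both assumptions to verify on your side):

```shell
# Sanity checks: does the account running Confluence have read access to
# everything under attachments/? CONF_HOME is the path from the stack trace.
CONF_HOME=/var/atlassian/application-data/confluence

# Which user owns the running Confluence JVM? (expected: confluence)
ps -o user= -p "$(pgrep -of org.apache.catalina.startup.Bootstrap)" 2>/dev/null || true

# Ownership and mode of the home directory and the attachments directory
ls -ld "$CONF_HOME" "$CONF_HOME/attachments" 2>/dev/null || true

# List anything under attachments/ that its owner cannot read
find "$CONF_HOME/attachments" ! -perm -u+r 2>/dev/null || true
```

If the last `find` prints any paths, those files would fail exactly as in the error above when the export tries to zip them.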
Thank you for your reply!!
The Confluence backup process creates a zip file once per day that includes the Confluence attachments (default path: /var/atlassian/application-data/confluence/attachments).
Whenever the problem occurs, the failing attachment number is different.
When a backup succeeds, the attachments with the previously failed numbers are backed up normally.
I checked the permissions on the attachment directories, and the permissions differ from directory to directory:
drwxr-xr-x. 3 confluence confluence 24 Oct 24 15:12 0
drwxr-xr-x. 3 confluence confluence 23 Apr 15  2016 1
drwxr-xr-x. 3 confluence confluence 24 Mar 31  2017 100
drwxr-xr-x. 3 confluence confluence 23 Aug  4 18:05 11
drwx------. 3 confluence confluence 16 Feb 16  2015 113
drwx------. 3 confluence confluence 16 Feb 23  2015 114
drwx------. 4 confluence confluence 36 Nov  3  2015 115
By the way, if it were really a permissions problem, wouldn't it fail every time?
The process is running under the confluence account:
conflue+ 153430 1 5 2017 ? 1-17:53:39 /opt/atlassian/confluence/jre//bin/java -Djava.util.logging.config.file=/opt/atlassian/confluence/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Xms2048m -Xmx4096m -XX:+UseG1GC -Djava.awt.headless=true -Xloggc:/opt/atlassian/confluence/logs/gc-2017-12-11_16-57-53.log -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=2M -XX:-PrintGCDetails -XX:+PrintGCTimeStamps -XX:-PrintTenuringDistribution -Djava.endorsed.dirs=/opt/atlassian/confluence/endorsed -classpath /opt/atlassian/confluence/bin/bootstrap.jar:/opt/atlassian/confluence/bin/tomcat-juli.jar -Dcatalina.base=/opt/atlassian/confluence -Dcatalina.home=/opt/atlassian/confluence -Djava.io.tmpdir=/opt/atlassian/confluence/temp org.apache.catalina.startup.Bootstrap start
The backupPath matches the database.
Following your advice, I checked the Linux environment and noticed that SELinux is set to enforcing. Could that be a factor?
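SELinux in enforcing mode can deny file access even when the Unix permissions and owner are correct, which would surface as a FileNotFoundException on a file the confluence user owns. A hedged, read-only diagnostic sketch (ausearch requires auditd and may not be installed; paths are the ones from this thread):

```shell
# Confirm SELinux mode and look for recent denials from the Java process.
command -v getenforce >/dev/null 2>&1 && getenforce || true

# Security labels on the home directory and the temp dir used by the export
ls -Zd /var/atlassian/application-data/confluence \
       /var/atlassian/application-data/confluence/temp 2>/dev/null || true

# Recent AVC denials mentioning java, if auditd is installed
command -v ausearch >/dev/null 2>&1 \
    && ausearch -m avc -ts today 2>/dev/null | grep 'comm="java"' || true
```

If denials show up at the times the backup fails, that points at SELinux policy rather than file permissions; a temporary, admin-approved `setenforce 0` followed by a test backup would confirm it.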
Thanks again for your reply!^^
I meant that you should check the permissions for the Confluence user on the backup directory itself, as mentioned previously:
Please also double-check that the backup path in the error above matches the default backup path from http://<confluence-url>/admin/dailybackupadmin.action, and look at the Confluence database. Check for the entry 'backupPath' in the XML stored in the BANDANA table: select * from BANDANA where BANDANAKEY = 'atlassian.confluence.settings';
However, since the failure is intermittent, it sounds like a different issue.
Let me know if you have any questions.
1. Even with attachments included in the backup, it sometimes succeeds. And if we perform a manual backup on a day the automatic backup failed, it succeeds.
2. The data is approximately 25–26 GB. Could that amount of data be a factor?
3. Thank you for the information.
Our understanding of the recommendation is that the files in the Confluence home directory and a database dump are backed up separately, and that to restore we would import the dump file and replace the home directory. Is that correct?
We want to perform backups every day. In that case, we expect that fully compressing the Confluence home directory and dumping the database each time will also consume a lot of server resources. What do you think about this?
4. We have not changed anything, including the database and configuration, and we are not running behind a proxy.
5. We are using PostgreSQL (version 9.2.15).
6. We have no plans to upgrade yet. If we were to upgrade Confluence, would this issue be resolved?
Thank you for your reply.
The issue is simply that your instance is too large for the automated XML backups; they are not meant to be used with a production instance and are less reliable for recovery.
You will want to follow the Production Backup Strategy that I mentioned.
pg_dump dbname > outfile.
You can speak to your server administrator about automating this on your environment.
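A minimal sketch of such an automated job, assuming a local PostgreSQL database named confluence and the home path from this thread (the database name, paths, and schedule are all examples to adapt, not a definitive setup):

```shell
# Nightly production-style backup: a database dump plus an archive of the
# Confluence home directory. DB name, paths, and schedule are assumptions.
backup_confluence() {
    backup_dir=$1 conf_home=$2 db=$3
    stamp=$(date +%Y%m%d)
    mkdir -p "$backup_dir" || return 1
    # Database dump (run as a user with access to the Confluence database)
    pg_dump "$db" | gzip > "$backup_dir/confluence-db-$stamp.sql.gz"
    # Home directory archive; temp/ is regenerated by Confluence, so skip it
    tar --exclude='temp' -czf "$backup_dir/confluence-home-$stamp.tar.gz" \
        -C "$(dirname "$conf_home")" "$(basename "$conf_home")"
}

# Example crontab entry for a 04:00 run, matching your current schedule:
# 0 4 * * * /usr/local/bin/confluence-backup.sh >> /var/log/confluence-backup.log 2>&1
```

Running this during a low-traffic window keeps the resource impact off your users, which also addresses your question about server load.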
It is true that the backup may use a lot of server resources, and for this reason, I would recommend doing it during your daily maintenance period.
An update will not fix this, as we continue to recommend using an alternate Production Backup Strategy for production instances, especially large ones. However, with such an old version of Confluence, you will continue to run into issues. It's always best practice to use a version that has not yet reached End-of-Life.
I hope that has clarified things for you but do let me know if you have any questions.
Thank you for your reply, Shannon.
We plan to change from the XML backup to the Production Backup Strategy.
I have two questions about your reply.
1. What is the <conf-home>/config directory? We cannot find that directory in <conf-home>.
2. For a new Confluence installation, if we move only a minimal set of data (confluence.cfg.xml, attachments, index, and a Confluence DB dump), can we use Confluence the same as before? The documentation says that everything else (thumbnails, viewfile, ...) is regenerated automatically apart from that minimal set. Is that true?
In step 4 of Migrating Confluence, the phrase "the home directory from your old Confluence server" seems ambiguous to me. Would Confluence still work if we moved only the minimal files (confluence.cfg.xml, attachments, index, DB dump) instead of the whole home directory?
Thank you for your sincere answers, Shannon!
Ui joong Kim.
Once you follow the steps exactly from Migrating Confluence, you should have no problems using Confluence again on the new server.
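The minimal-copy idea discussed above can be sketched as follows; treat the file list as an assumption to verify against the Migrating Confluence documentation for your version, since the database restore (from the pg_dump) happens separately:

```shell
# Sketch: copy only the minimal home-directory items into the new server's
# Confluence home. Whether this set is sufficient should be confirmed
# against the Migrating Confluence docs; index/ is optional because
# Confluence can rebuild the search index.
migrate_minimal_home() {
    src=$1 dst=$2
    mkdir -p "$dst" || return 1
    cp -a "$src/confluence.cfg.xml" "$dst/"
    cp -a "$src/attachments" "$dst/"
    # Copying index/ avoids a lengthy reindex; if omitted, it is rebuilt
    if [ -d "$src/index" ]; then cp -a "$src/index" "$dst/"; fi
}

# Example (old home mounted or copied to the new server first):
# migrate_minimal_home /mnt/old-confluence-home /var/atlassian/application-data/confluence
```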
Let me know if you have any questions!