Hi,
I upgraded Confluence from v6.4.2 to v6.8 last month.
I use the embedded database because our content is quite small.
Since the upgrade, the backup file is no longer created.
[question 1]
Do I have to use an external database, for example PostgreSQL?
[question 2]
Why isn't the backup file created?
I can see some errors in the catalina.out log file:
================================
03-May-2018 00:34:09.059 INFO [http-nio-8090-exec-2] com.sun.jersey.server.impl.application.WebApplicationImpl._initiate Initiating Jersey application, version 'Jersey: 1.19.4 05/24/2017 03:20 PM'
03-May-2018 00:34:40.887 WARNING [ContainerBackgroundProcessor[StandardEngine[Standalone]]] org.apache.catalina.valves.StuckThreadDetectionValve.notifyStuckThreadDetected Thread "http-nio-8090-exec-9" (id=345) has been active for 69,151 milliseconds (since 5/3/18 12:33 AM) to serve the same request for http://doc.brighticsiot.samsungsds.com:8090/admin/dobackup.action?atl_token=30f56ffb88296029a119c9f5eb2882384b6e2ce5&archiveBackup=true&backupAttachments=true&backup=Back+Up and may be stuck (configured threshold for this StuckThreadDetectionValve is 60 seconds). There is/are 2 thread(s) in total that are monitored by this Valve and may be stuck.
java.lang.Throwable
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at java.io.DataOutputStream.flush(DataOutputStream.java:123)
at org.h2.value.Transfer.flush(Transfer.java:97)
at org.h2.engine.SessionRemote.done(SessionRemote.java:598)
at org.h2.command.CommandRemote.prepare(CommandRemote.java:69)
at org.h2.command.CommandRemote.&lt;init&gt;(CommandRemote.java:46)
at org.h2.engine.SessionRemote.prepareCommand(SessionRemote.java:476)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1188)
at org.h2.jdbc.JdbcPreparedStatement.&lt;init&gt;(JdbcPreparedStatement.java:73)
at org.h2.jdbc.JdbcConnection.prepareStatement(JdbcConnection.java:276)
at com.mchange.v2.c3p0.impl.NewProxyConnection.prepareStatement(NewProxyConnection.java:387)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$5.doPrepare(StatementPreparerImpl.java:146)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:172)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.prepareQueryStatement(StatementPreparerImpl.java:148)
at org.hibernate.loader.plan.exec.internal.AbstractLoadPlanBasedLoader.prepareQueryStatement(AbstractLoadPlanBasedLoader.java:241)
at org.hibernate.loader.plan.exec.internal.AbstractLoadPlanBasedLoader.executeQueryStatement(AbstractLoadPlanBasedLoader.java:185)
That does suggest the backup routine is running into problems with the database connection.
The H2 database is not suitable for production use, and you should not be using it. Scale is only one of the reasons it's unsuitable; this sort of problem is another.
I would stand up a production-class server with a supported database back-end, then migrate the spaces over to it one at a time (you can't do it en masse because you can't get a full backup).
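For reference, "external database" here just means pointing Confluence at something like PostgreSQL instead of the bundled H2. On a fresh install you would normally pick PostgreSQL in the setup wizard rather than edit anything by hand, but the resulting datasource settings in confluence.cfg.xml look roughly like this (host, port, database name and credentials below are placeholders, not your values):
================================
<property name="hibernate.connection.driver_class">org.postgresql.Driver</property>
<property name="hibernate.connection.url">jdbc:postgresql://localhost:5432/confluence</property>
<property name="hibernate.connection.username">confluenceuser</property>
<property name="hibernate.connection.password">change-me</property>
================================
Once the new instance is up against PostgreSQL, export each space as XML from the old instance (in 6.x that's under Space Tools > Content Tools > Export, if I recall correctly) and import it on the new one; attachments travel with the space export.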