I have installed Confluence 5.1 standalone on an Amazon EC2 Ubuntu image. Confluence itself works perfectly, but when I try to install the UPM or the Gliffy plugin, whether via the Marketplace or manually, I receive an "unexpected error" message. I use MySQL; I first increased 'max_allowed_packet' to 32M and then to the maximum, but I still receive the error. I don't know whether there is something special to consider with Amazon EC2, but I hope this is a known error for you and that you have experience running Confluence on Amazon EC2. I would appreciate your help very much, as I would like to start using Confluence as soon as possible.
Hello Slaven Imhof,
To get a better idea of what is happening, could you please check your 'atlassian-confluence.log' (in the <confluence-home-directory>/logs folder) and paste here the error stack trace that appears in the log?
Just as a side note, it would also be advisable to check with Amazon EC2 support, because this is not a supported platform and the problem may be on the Amazon side.
Thank you, I will look at Amazon EC2 support as well. I increased 'max_allowed_packet', but I still get the MySQL error. Here are some lines from the log file, as the number of characters is limited. Please tell me if you need more.
2013-04-01 17:04:10,938 WARN [pool-4-thread-2] [rest.resources.install.InstallTask] call Unexpected error in install task
-- referer: http://ec2-54-228-195-193.eu-west-1.compute.amazonaws.com:8090/plugins/servlet/upm/marketplace | url: /rest/plugins/1.0/available/featured | userName: admin
java.lang.RuntimeException: java.lang.RuntimeException: There was a problem evicting or flushing a PluginData object
Caused by: java.lang.RuntimeException: There was a problem evicting or flushing a PluginData object
Caused by: net.sf.hibernate.exception.GenericJDBCException: could not insert: [com.atlassian.confluence.plugin.persistence.PluginData#1245185]
Caused by: com.mysql.jdbc.PacketTooBigException: Packet for query is too large (27296117 > 1048576). You can change this value on the server by setting the max_allowed_packet' variable.
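The figures in that PacketTooBigException are worth a moment's attention: the server is still enforcing the 1 MiB MySQL default, so the earlier change to 32M evidently never took effect. A quick check of the arithmetic:

```python
# Sanity check on the numbers reported in the PacketTooBigException above.
# 27296117 is the size of the INSERT carrying the plugin data; 1048576 is
# the server's effective max_allowed_packet at the time of the error.
payload = 27296117   # bytes, from the exception message
limit = 1048576      # bytes, the MySQL default (1 MiB)

print(f"payload: {payload / 2**20:.1f} MiB")   # about 26.0 MiB
print(f"limit:   {limit / 2**20:.1f} MiB")     # 1.0 MiB

# A 32M setting (33554432 bytes) would be large enough for this insert,
# so the fact that the server still reports the 1 MiB default suggests
# the changed setting never actually took effect.
print(payload < 32 * 2**20)   # True
```

In other words, the question isn't whether 32M is big enough (it is); it's why the running server never picked the new value up.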
Would you please check the currently configured max_allowed_packet by running:
SHOW VARIABLES LIKE 'max_allowed_packet'
What value does it show?
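For reference, here is a sketch of checking and raising the limit on a plain, self-managed MySQL server; 64M is an illustrative value, and note that on Amazon RDS the SUPER privilege needed for SET GLOBAL is typically unavailable, so this may not apply there:

```sql
-- Check the effective limit (value is shown in bytes)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it on the running server (requires the SUPER privilege;
-- lost on restart unless also persisted in the config file)
SET GLOBAL max_allowed_packet = 67108864;  -- 64M, illustrative
```

Existing connections keep the value they got when they connected, so Confluence needs to reconnect (or be restarted) before the new limit applies to it.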
The 'max_allowed_packet' shown is 1048576. As mentioned, I had set the value to the maximum. What I do see is a 'slave_max_allowed_packet' variable, which is 1073741824. Maybe this is an Amazon RDS specialty. I don't know how to change 'max_allowed_packet'. Is this known to you?
Please check that you do not have more than one my.cnf file. I have seen situations before where it was set in one place, such as /etc/mysql, and in another file at /etc. If you like, please open a ticket with us at Gliffy, at support.gliffy.com, and we can work through the problem with you a bit quicker.
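If the server does read a my.cnf, the setting belongs in the [mysqld] section; a minimal sketch (64M is illustrative). Note that on Amazon RDS there is no my.cnf to edit at all: dynamic settings like max_allowed_packet are changed through a DB parameter group attached to the instance instead.

```ini
# /etc/mysql/my.cnf (or /etc/my.cnf) -- illustrative paths; check which
# files your server actually reads, e.g. with: mysqld --help --verbose
[mysqld]
max_allowed_packet = 64M   ; must be under [mysqld], not [client]
```

After editing the file, restart mysqld so the new value takes effect.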
It would be remiss of me not to mention that you can always install PostgreSQL, which is the database server recommended by Atlassian and is not subject to this limitation.