Log4j log monitoring

The Microsoft Monitoring Agent has limitations, which the documentation calls "criteria", that must be met for it to collect custom logs successfully (see https://docs.microsoft.com/en-us/azure/azure-monitor/agents/data-sources-custom-logs). In particular, log rotation is not supported, and rotation is the default behaviour of both Confluence and Jira. (The servers are Azure VMs.)
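
For reference, the rotation that falls foul of those criteria is the size-based appender Jira ships with. From memory, the relevant part of log4j.properties looks roughly like this (appender name, class, and limits may differ between versions, so check your own file):

    # Default Jira file appender: size-based rotation. When the file fills up it is
    # renamed to atlassian-jira.log.1, .2, ... which is exactly what the Monitoring
    # Agent custom log criteria rule out.
    log4j.appender.filelog=com.atlassian.jira.logging.JiraHomeAppender
    log4j.appender.filelog.File=atlassian-jira.log
    log4j.appender.filelog.MaxFileSize=20480KB
    log4j.appender.filelog.MaxBackupIndex=5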

The out-of-the-box log4j configuration in Jira and Confluence does not meet these requirements (or at least I am not experienced enough to figure out how to make it do so).
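
The closest thing I could find in plain log4j 1.x is DailyRollingFileAppender, which keeps a single active file and date-stamps the old one at rollover instead of cycling through numbered backups. A sketch of what I mean (untested; Atlassian's own appender class normally resolves the Jira home directory itself, so with it swapped out the File path would probably need to be absolute):

    # Sketch only: daily rollover, one active file for the agent to watch.
    # Old days roll to atlassian-jira.log.yyyy-MM-dd.
    log4j.appender.filelog=org.apache.log4j.DailyRollingFileAppender
    # Example path - point this at your Jira home log directory.
    log4j.appender.filelog.File=D:/Atlassian/JiraHome/log/atlassian-jira.log
    log4j.appender.filelog.DatePattern='.'yyyy-MM-dd
    log4j.appender.filelog.layout=org.apache.log4j.PatternLayout
    log4j.appender.filelog.layout.ConversionPattern=%d %p [%c{4}] %m%n

I have not been able to confirm whether the agent copes with the daily rename, hence the question.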

Are there any tools or solutions for this? We need to collect our logs, but our workaround of "appending the contents to a single file for each day, from which the Microsoft Monitoring Agent is able to collect and upload logs successfully" causes too much I/O on the system, which leads to performance problems and crashes for Jira and Confluence.

Any suggestions? In short, how can we send these logs to Azure?

Kind regards,

1 answer

@Ozzy Did you manage to find a solution for this?

Well, instead of using Windows, having Linux servers for it more or less solves the issue automatically. When we replaced the servers, we went with Linux this time.


But I see that these constraints still apply on Linux, no? Can you help with the queries below?

1. Which log files are you forwarding to Log Analytics? The audit log, Catalina log, custom script logs, etc.?

 

2. Are you using a file pattern or a directory? I think the directory would contain the older log files as well; does Azure Log Analytics handle them fine?

 

3. Are you by any chance using App Insights & VM Insights as well?

Thank you in advance!
