I am looking for a way to pull more than 100 alerts at a time using the API for a given timeframe.
I know how to do this by setting the offset in my API call, but each call is limited to 100 alerts, and then I have to change the offset for the next call.
Is there a better way to get this data, as I need to pull all alerts during a given timeframe for ingest into Elastic? Sometimes there may be more than 100 alerts.
Does anyone have a script available that might provide an example? Language does not matter.
Another thought... is anyone already doing this type of ingest into Elastic who could provide a working solution to the problem?
Thank you
Yes, the Opsgenie Alert API returns at most 100 alerts per request to https://api.opsgenie.com/v2/alerts, but the Python code below can be used to keep retrieving alerts until all pages have been read. Every API response includes a paging['next'] property, which can be used to point the API URL at the next page of alerts.
import requests

api_key = "<YOUR API KEY>"
api_url = "https://api.opsgenie.com/v2/alerts"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"GenieKey {api_key}"
}
# Specify your desired parameters, such as query, sort, etc.
params = {}

all_alerts = []
while api_url:
    response = requests.get(api_url, headers=headers, params=params)
    if response.status_code == 200:
        alerts = response.json()
        all_alerts.extend(alerts.get("data", []))
        print(f"Retrieved {len(alerts.get('data', []))} alerts. Total alerts: {len(all_alerts)}")
        # The 'next' link already carries the query string, so drop the
        # original params for subsequent requests.
        params = None
        paging = alerts.get("paging", {})
        try:
            api_url = paging["next"]
        except KeyError:
            api_url = None
            print("All alerts read")
    else:
        print(f"Error: {response.status_code}, {response.text}")
        break

print(f"Total number of alerts: {len(all_alerts)}")