Description
What did you do?
I created alerts in Alertmanager via POST /api/v2/alerts, explicitly setting the startsAt and endsAt values. When the alerts are requested via GET /api/v2/alerts, both values are shown correctly. However, when Alertmanager reports those alerts to a custom webhook, the received endsAt is:
- If the alert is not resolved: 0001-01-01T00:00:00Z.
- If the alert is resolved: the resolution date.
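For context, 0001-01-01T00:00:00Z is the serialization of Go's zero time.Time, i.e. "not set". Until this behaves differently, a webhook consumer has to special-case that sentinel itself; a minimal sketch of such a check (a hypothetical helper on the consumer side, not part of Alertmanager):

```python
GO_ZERO_TIME = "0001-01-01T00:00:00Z"  # how Go serializes an unset time.Time

def effective_ends_at(alert: dict) -> str | None:
    """Return the alert's endsAt, or None if the webhook sent the zero value."""
    ends_at = alert.get("endsAt", GO_ZERO_TIME)
    return None if ends_at == GO_ZERO_TIME else ends_at
```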
What did you expect to see?
I expected to receive the same values in the webhook as when requesting the alerts manually via GET /api/v2/alerts, regardless of the status of the alert.
What did you see instead? Under which circumstances?
I'm receiving different values depending on whether the alerts are resolved, while when requesting the alerts via GET /api/v2/alerts, the endsAt date is always present with a non-empty value.
This makes it impossible to retrieve an informative endsAt when using send_resolved: false.
Environment
- System information:

  Linux 5.15.0-69-generic x86_64

- Alertmanager version:

```
alertmanager, version 0.25.0 (branch: HEAD, revision: 258fab7cdd551f2cf251ed0348f0ad7289aee789)
  build user:       root@abe866dd5717
  build date:       20221222-14:51:36
  go version:       go1.19.4
  platform:         linux/amd64
```

Also reproduced using this docker image.
- Prometheus version:

  Not using Prometheus.

- Alertmanager configuration file:

```yaml
global:
  resolve_timeout: 10m
route:
  group_by: ['...']
  group_wait: 1s
  group_interval: 1s
  repeat_interval: 1s
  receiver: "log"
  routes: []
receivers:
  - name: "log"
    webhook_configs:
      - url: http://localhost:10080/notify
        send_resolved: false
```
Steps to reproduce
Alertmanager setup:
- Copy the configuration file to your current directory as alertmanager.yml.
- Run Alertmanager in a container:

```
docker run -d --net=host --name prometheus-alertmanager-container -e TZ=UTC -v $(pwd)/alertmanager.yml:/etc/alertmanager/alertmanager.yml ubuntu/prometheus-alertmanager:0.23-22.04_beta
```

- Verify the configuration file is correctly loaded:

```
curl localhost:9093/api/v2/receivers
```

The output should be: [{"name":"log"}]
Webhook setup:
- Copy the following code to a file in the current directory named notification-logger.py:

```python
from flask import Flask, request, Response
import json

app = Flask(__name__)

@app.route('/notify', methods=['POST'])
def log_notification():
    notification = request.json
    print(json.dumps(notification, indent=2))
    return Response('OK', status=200)

app.run(host='localhost', port=10080)
```

- Run the webhook server:

```
python3 notification-logger.py
```
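Optionally, the logger can be smoke-tested with a hand-built payload before pointing Alertmanager at it. This is a hypothetical sanity check, not part of the original steps; it assumes the requests library is installed:

```python
import requests

# Minimal stand-in for an Alertmanager webhook notification; the real
# payload carries more fields (groupLabels, externalURL, ...).
payload = {"receiver": "log", "status": "firing", "alerts": []}

resp = requests.post("http://localhost:10080/notify", json=payload)
print(resp.status_code, resp.text)  # expect: 200 OK
```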
Create & check alerts:
- Create an alert (you may need to modify the value in endsAt to provide a future date):

```
curl -XPOST localhost:9093/api/v2/alerts -H 'Content-type: application/json' -d '[{"startsAt": "2023-04-27T12:23:33.414Z", "endsAt": "2023-04-28T12:23:32.414Z", "generatorURL": "http://some.domain.com/some/path", "annotations": {}, "labels": {"alertname": "test-alert", "severity": "warning"}}]'
```

- Retrieve the alert via GET /api/v2/alerts:

```
curl -XGET localhost:9093/api/v2/alerts 2>/dev/null | python3 -m json.tool
```

The output should be:

```json
[
  {
    "annotations": {},
    "endsAt": "2023-04-28T12:23:32.414Z",
    "fingerprint": "41f99308f4b244ea",
    "receivers": [
      {
        "name": "log"
      }
    ],
    "startsAt": "2023-04-27T12:23:33.414Z",
    "status": {
      "inhibitedBy": [],
      "silencedBy": [],
      "state": "active"
    },
    "updatedAt": "2023-04-27T10:47:35.242Z",
    "generatorURL": "http://some.domain.com/some/path",
    "labels": {
      "alertname": "test-alert",
      "severity": "warning"
    }
  }
]
```

Notice how the value in endsAt is the same as provided in the POST request.

- Check the log printed by the webhook server. It should be the following:

```json
{
  "receiver": "log",
  "status": "firing",
  "alerts": [
    {
      "status": "firing",
      "labels": {
        "alertname": "test-alert",
        "severity": "warning"
      },
      "annotations": {},
      "startsAt": "2023-04-27T12:23:33.414Z",
      "endsAt": "0001-01-01T00:00:00Z",
      "generatorURL": "http://some.domain.com/some/path",
      "fingerprint": "41f99308f4b244ea"
    }
  ],
  "groupLabels": {
    "alertname": "test-alert",
    "severity": "warning"
  },
  "commonLabels": {
    "alertname": "test-alert",
    "severity": "warning"
  },
  "commonAnnotations": {},
  "externalURL": "http://ubnt:9093",
  "version": "4",
  "groupKey": "{}:{alertname=\"test-alert\", severity=\"warning\"}",
  "truncatedAlerts": 0
}
```

Notice how the endsAt value in the alert is 0001-01-01T00:00:00Z instead of the provided value.