# Alerts

Configure and manage real-time monitoring alerts for your AI systems.

The Cortif.AI alert system provides real-time notifications when anomalies or issues are detected in your AI systems. Configure custom thresholds and receive notifications through multiple channels.
## Alert Object

```typescript
interface Alert {
  id: string;
  projectId: string;
  title: string;
  description?: string;
  severity: 'low' | 'medium' | 'high' | 'critical';
  status: 'open' | 'acknowledged' | 'resolved' | 'closed';
  type?: 'performance' | 'accuracy' | 'drift' | 'anomaly' | 'custom';
  metadata?: {
    threshold?: number;
    actualValue?: number;
    metric?: string;
    source?: string;
    tags?: string[];
  };
  createdAt: string;
  updatedAt: string;
}
```

## List Alerts
Retrieve alerts with filtering and pagination options.

```http
GET /api/alerts?page=1&limit=10&severity=high&status=open&projectId=proj_123
Cookie: session-token=your_session_token
```

**Query Parameters:**

- `page` (optional): Page number (default: 1)
- `limit` (optional): Items per page (default: 10, max: 100)
- `severity` (optional): Filter by severity level
- `status` (optional): Filter by alert status
- `type` (optional): Filter by alert type
- `projectId` (optional): Filter by specific project
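Assuming a Node- or browser-style environment with `URLSearchParams`, the query parameters above can be assembled client-side. `buildAlertsQuery` is a hypothetical helper for illustration, not part of the SDK:

```typescript
// Hypothetical helper: build the query string for GET /api/alerts.
// Property names mirror the documented query parameters.
type AlertListParams = {
  page?: number;
  limit?: number;
  severity?: 'low' | 'medium' | 'high' | 'critical';
  status?: 'open' | 'acknowledged' | 'resolved' | 'closed';
  type?: 'performance' | 'accuracy' | 'drift' | 'anomaly' | 'custom';
  projectId?: string;
};

function buildAlertsQuery(params: AlertListParams): string {
  const qs = new URLSearchParams();
  for (const [key, value] of Object.entries(params)) {
    if (value !== undefined) qs.set(key, String(value)); // skip unset filters
  }
  return `/api/alerts?${qs.toString()}`;
}
```

For example, `buildAlertsQuery({ severity: 'high', status: 'open' })` produces `/api/alerts?severity=high&status=open`.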
**Response:**

```json
{
  "success": true,
  "data": [
    {
      "id": "alert_789",
      "projectId": "proj_123",
      "title": "Model Accuracy Degradation",
      "description": "Model accuracy dropped below 95% threshold",
      "severity": "high",
      "status": "open",
      "type": "accuracy",
      "metadata": {
        "threshold": 0.95,
        "actualValue": 0.89,
        "metric": "accuracy",
        "source": "validation_pipeline",
        "tags": ["production", "critical"]
      },
      "createdAt": "2024-01-15T14:30:00Z",
      "updatedAt": "2024-01-15T14:30:00Z"
    }
  ],
  "pagination": {
    "page": 1,
    "limit": 10,
    "total": 5,
    "totalPages": 1
  }
}
```

**Example:**

```bash
curl -X GET "https://api.cortif.ai/api/alerts?severity=high&status=open" \
  -H "Cookie: session-token=your_session_token"
```

## Create Alert
Create a new alert manually or through automated monitoring.
```http
POST /api/alerts
Content-Type: application/json
Cookie: session-token=your_session_token
```

```json
{
  "projectId": "proj_123",
  "title": "High Latency Detected",
  "description": "Response time exceeded acceptable threshold",
  "severity": "medium",
  "type": "performance",
  "metadata": {
    "threshold": 100,
    "actualValue": 250,
    "metric": "response_time_ms",
    "source": "api_monitoring",
    "tags": ["latency", "performance"]
  }
}
```

**Required Fields:**

- `projectId`: Associated project ID
- `title`: Alert title (string, 1-200 characters)
- `severity`: Alert severity level

**Optional Fields:**

- `description`: Detailed description
- `type`: Alert category
- `metadata`: Additional alert data
**Response:**

```json
{
  "success": true,
  "data": {
    "id": "alert_456",
    "projectId": "proj_123",
    "title": "High Latency Detected",
    "description": "Response time exceeded acceptable threshold",
    "severity": "medium",
    "status": "open",
    "type": "performance",
    "metadata": {
      "threshold": 100,
      "actualValue": 250,
      "metric": "response_time_ms",
      "source": "api_monitoring",
      "tags": ["latency", "performance"]
    },
    "createdAt": "2024-01-15T15:00:00Z",
    "updatedAt": "2024-01-15T15:00:00Z"
  }
}
```

**Example:**

```bash
curl -X POST https://api.cortif.ai/api/alerts \
  -H "Content-Type: application/json" \
  -H "Cookie: session-token=your_session_token" \
  -d '{
    "projectId": "proj_123",
    "title": "High Latency Detected",
    "severity": "medium",
    "type": "performance",
    "metadata": {
      "threshold": 100,
      "actualValue": 250,
      "metric": "response_time_ms"
    }
  }'
```

## Get Alert
Retrieve a specific alert by ID.
```http
GET /api/alerts/alert_789
Cookie: session-token=your_session_token
```

**Response:**

```json
{
  "success": true,
  "data": {
    "id": "alert_789",
    "projectId": "proj_123",
    "title": "Model Accuracy Degradation",
    "description": "Model accuracy dropped below 95% threshold",
    "severity": "high",
    "status": "open",
    "type": "accuracy",
    "metadata": {
      "threshold": 0.95,
      "actualValue": 0.89,
      "metric": "accuracy",
      "source": "validation_pipeline",
      "tags": ["production", "critical"]
    },
    "createdAt": "2024-01-15T14:30:00Z",
    "updatedAt": "2024-01-15T14:30:00Z",
    "history": [
      {
        "id": "hist_123",
        "action": "created",
        "details": "Alert created by automated monitoring",
        "userId": "system",
        "createdAt": "2024-01-15T14:30:00Z"
      }
    ]
  }
}
```

**Example:**

```bash
curl -X GET https://api.cortif.ai/api/alerts/alert_789 \
  -H "Cookie: session-token=your_session_token"
```

## Update Alert Status
Update the status of an existing alert.
```http
PATCH /api/alerts/alert_789
Content-Type: application/json
Cookie: session-token=your_session_token
```

```json
{
  "status": "acknowledged",
  "notes": "Investigating the accuracy drop. Checking recent data changes."
}
```

**Allowed Status Transitions:**

- `open` → `acknowledged`, `resolved`, `closed`
- `acknowledged` → `resolved`, `closed`
- `resolved` → `closed`, `open` (if issue reoccurs)
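The transition rules above can be encoded as a small lookup table. This is a client-side validation sketch; the server enforces the same rules on `PATCH`:

```typescript
// Allowed status transitions, taken directly from the list above.
type AlertStatus = 'open' | 'acknowledged' | 'resolved' | 'closed';

const TRANSITIONS: Record<AlertStatus, AlertStatus[]> = {
  open: ['acknowledged', 'resolved', 'closed'],
  acknowledged: ['resolved', 'closed'],
  resolved: ['closed', 'open'], // 'open' only if the issue reoccurs
  closed: [],                   // closed alerts cannot be reopened
};

function canTransition(from: AlertStatus, to: AlertStatus): boolean {
  return TRANSITIONS[from].includes(to);
}
```

Checking transitions before issuing the `PATCH` avoids a round-trip that would end in a `422` validation error.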
**Response:**

```json
{
  "success": true,
  "data": {
    "id": "alert_789",
    "projectId": "proj_123",
    "title": "Model Accuracy Degradation",
    "description": "Model accuracy dropped below 95% threshold",
    "severity": "high",
    "status": "acknowledged",
    "type": "accuracy",
    "metadata": {
      "threshold": 0.95,
      "actualValue": 0.89,
      "metric": "accuracy",
      "source": "validation_pipeline",
      "tags": ["production", "critical"]
    },
    "createdAt": "2024-01-15T14:30:00Z",
    "updatedAt": "2024-01-15T15:15:00Z"
  }
}
```

**Example:**

```bash
curl -X PATCH https://api.cortif.ai/api/alerts/alert_789 \
  -H "Content-Type: application/json" \
  -H "Cookie: session-token=your_session_token" \
  -d '{
    "status": "acknowledged",
    "notes": "Investigating the accuracy drop"
  }'
```

## Send Alert Notification
Manually trigger alert notifications to configured channels.
```http
POST /api/alerts/send
Content-Type: application/json
Cookie: session-token=your_session_token
```

```json
{
  "alertId": "alert_789",
  "channels": ["email", "slack", "webhook"],
  "message": "Urgent: Model accuracy has degraded significantly",
  "recipients": ["team@company.com", "#alerts-channel"]
}
```

**Available Channels:**

- `email`: Email notifications
- `slack`: Slack channel notifications
- `webhook`: Custom webhook notifications
- `sms`: SMS notifications (if configured)
**Response:**

```json
{
  "success": true,
  "data": {
    "notificationId": "notif_123",
    "alertId": "alert_789",
    "channels": ["email", "slack"],
    "status": "sent",
    "sentAt": "2024-01-15T15:20:00Z",
    "results": {
      "email": {
        "status": "delivered",
        "recipients": ["team@company.com"]
      },
      "slack": {
        "status": "delivered",
        "channel": "#alerts-channel"
      }
    }
  }
}
```

**Example:**

```bash
curl -X POST https://api.cortif.ai/api/alerts/send \
  -H "Content-Type: application/json" \
  -H "Cookie: session-token=your_session_token" \
  -d '{
    "alertId": "alert_789",
    "channels": ["email", "slack"],
    "message": "Urgent: Model accuracy has degraded significantly"
  }'
```

## Alert Rules & Automation
Configure automated alert rules for continuous monitoring.
### Create Alert Rule

```http
POST /api/alerts/rules
Content-Type: application/json
Cookie: session-token=your_session_token
```

```json
{
  "name": "Accuracy Threshold Rule",
  "projectId": "proj_123",
  "conditions": {
    "metric": "accuracy",
    "operator": "less_than",
    "threshold": 0.95,
    "duration": "5m"
  },
  "actions": {
    "severity": "high",
    "channels": ["email", "slack"],
    "template": "Model {{project.name}} accuracy dropped to {{value}} (threshold: {{threshold}})"
  },
  "enabled": true
}
```

**Response:**

```json
{
  "success": true,
  "data": {
    "id": "rule_456",
    "name": "Accuracy Threshold Rule",
    "projectId": "proj_123",
    "conditions": {
      "metric": "accuracy",
      "operator": "less_than",
      "threshold": 0.95,
      "duration": "5m"
    },
    "actions": {
      "severity": "high",
      "channels": ["email", "slack"],
      "template": "Model {{project.name}} accuracy dropped to {{value}} (threshold: {{threshold}})"
    },
    "enabled": true,
    "createdAt": "2024-01-15T16:00:00Z"
  }
}
```
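As a rough sketch of how such a rule fires, the condition can be checked against the observed metric and the `{{…}}` placeholders substituted into the template. `conditionMet` and `renderTemplate` are illustrative helpers, not part of the API, and the set of supported operators beyond `less_than` is an assumption:

```typescript
// Illustrative: evaluate a rule condition against an observed value.
function conditionMet(
  operator: 'less_than' | 'greater_than',
  value: number,
  threshold: number
): boolean {
  return operator === 'less_than' ? value < threshold : value > threshold;
}

// Illustrative: fill {{name}} placeholders (dotted names allowed) in a
// rule's message template from a flat variable map.
function renderTemplate(
  template: string,
  vars: Record<string, string | number>
): string {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (_m, key) =>
    String(vars[key] ?? '')
  );
}
```

With the rule above, `conditionMet('less_than', 0.89, 0.95)` is true, so the alert fires and the rendered template becomes the alert message.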
## Alert Analytics

Get alert statistics and trends.
```http
GET /api/alerts/analytics?projectId=proj_123&period=7d&groupBy=severity
Cookie: session-token=your_session_token
```

**Query Parameters:**

- `projectId` (optional): Filter by project
- `period`: Time period (`1h`, `24h`, `7d`, `30d`)
- `groupBy`: Group results by `severity`, `type`, or `status`
**Response:**

```json
{
  "success": true,
  "data": {
    "period": "7d",
    "totalAlerts": 45,
    "breakdown": {
      "critical": 3,
      "high": 12,
      "medium": 20,
      "low": 10
    },
    "trends": [
      {
        "date": "2024-01-15",
        "count": 8,
        "severity": {
          "critical": 1,
          "high": 3,
          "medium": 4,
          "low": 0
        }
      }
    ],
    "mttr": "2h 15m",
    "resolutionRate": 0.87
  }
}
```

## Error Handling
Common error responses for alert operations:
| Status Code | Error | Description |
|---|---|---|
| 400 | INVALID_INPUT | Invalid request data |
| 401 | UNAUTHORIZED | Authentication required |
| 403 | FORBIDDEN | Insufficient permissions |
| 404 | ALERT_NOT_FOUND | Alert doesn't exist |
| 422 | VALIDATION_ERROR | Input validation failed |
| 429 | RATE_LIMITED | Too many requests |
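These status codes can be mapped to the documented error identifiers and a simple retry decision on the client. A sketch, assuming only `429` is worth an automatic retry:

```typescript
// Status-code-to-error mapping taken from the table above.
const ERROR_CODES: Record<number, string> = {
  400: 'INVALID_INPUT',
  401: 'UNAUTHORIZED',
  403: 'FORBIDDEN',
  404: 'ALERT_NOT_FOUND',
  422: 'VALIDATION_ERROR',
  429: 'RATE_LIMITED',
};

function classifyError(status: number): { code: string; retryable: boolean } {
  return {
    code: ERROR_CODES[status] ?? 'UNKNOWN_ERROR',
    retryable: status === 429, // rate limits are transient; the rest need a fix
  };
}
```

A caller can branch on `retryable` to back off and retry, and surface `code` to the user otherwise.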
## SDK Examples

### JavaScript/TypeScript

```typescript
import { CortifClient } from '@cortif/sdk';

const client = new CortifClient();

// List alerts
const alerts = await client.alerts.list({
  severity: 'high',
  status: 'open',
  projectId: 'proj_123'
});

// Create alert
const newAlert = await client.alerts.create({
  projectId: 'proj_123',
  title: 'High Error Rate',
  severity: 'high',
  type: 'performance',
  metadata: {
    threshold: 0.01,
    actualValue: 0.05,
    metric: 'error_rate'
  }
});

// Update alert status
await client.alerts.updateStatus('alert_789', {
  status: 'acknowledged',
  notes: 'Team is investigating'
});

// Send notification
await client.alerts.sendNotification({
  alertId: 'alert_789',
  channels: ['email', 'slack'],
  message: 'Critical alert requires immediate attention'
});
```

### Python
```python
from cortif import CortifClient

client = CortifClient()

# List alerts
alerts = client.alerts.list(
    severity='high',
    status='open',
    project_id='proj_123'
)

# Create alert
new_alert = client.alerts.create({
    'project_id': 'proj_123',
    'title': 'High Error Rate',
    'severity': 'high',
    'type': 'performance',
    'metadata': {
        'threshold': 0.01,
        'actual_value': 0.05,
        'metric': 'error_rate'
    }
})

# Update alert status
client.alerts.update_status('alert_789', {
    'status': 'acknowledged',
    'notes': 'Team is investigating'
})
```

## Best Practices
### Alert Management Tips
- Severity Levels: Use appropriate severity levels to avoid alert fatigue
- Clear Titles: Write descriptive alert titles that explain the issue
- Actionable Descriptions: Include context and suggested actions
- Proper Routing: Configure notifications to reach the right team members
- Regular Review: Periodically review and tune alert thresholds
- Documentation: Document alert response procedures
## Integrations

### Slack Integration

Configure Slack notifications for alerts:

```json
{
  "type": "slack",
  "webhook_url": "https://hooks.slack.com/services/...",
  "channel": "#alerts",
  "username": "Cortif.AI",
  "icon_emoji": ":warning:"
}
```

### Email Integration
Set up email notifications:
```json
{
  "type": "email",
  "smtp_server": "smtp.company.com",
  "port": 587,
  "username": "alerts@company.com",
  "recipients": ["team@company.com", "oncall@company.com"]
}
```

### Webhook Integration
Configure custom webhook notifications:
```json
{
  "type": "webhook",
  "url": "https://api.company.com/alerts",
  "method": "POST",
  "headers": {
    "Authorization": "Bearer token",
    "Content-Type": "application/json"
  }
}
```

## Next Steps
- Configure monitoring runs to generate alerts
- Set up team notifications for alert routing
- Create evidence packs for alert investigation
- Integrate with external tools for enhanced alerting