Complete guide to using API Gateway Pro
API Gateway Pro is an edge-based API gateway that sits between your applications and your backend services.
All requests go through the proxy endpoint:
curl https://raysep.com/api/proxy/YOUR_ENDPOINT \
-H "Authorization: Bearer YOUR_API_KEY"
Check if the API Gateway is operational.
GET /api/health
Example:
curl https://raysep.com/api/health
Response:
{
"status": "healthy",
"timestamp": "2024-11-01T12:00:00Z",
"version": "1.0.0",
"region": "iad"
}
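If you want to poll this endpoint from code, any HTTP client works. A minimal Python sketch that checks the status field shown above (the 5-second timeout is an arbitrary choice):

import requests

def gateway_is_healthy(base_url="https://raysep.com"):
    """Return True if the gateway reports a healthy status."""
    try:
        resp = requests.get(f"{base_url}/api/health", timeout=5)
        resp.raise_for_status()
        return resp.json().get("status") == "healthy"
    except requests.RequestException:
        # Network errors or non-2xx responses count as unhealthy.
        return False

print(gateway_is_healthy())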
Get current pricing plans.
GET /api/pricing
Example:
curl https://raysep.com/api/pricing
Route requests through the gateway to your backend APIs.
GET /api/proxy/{endpoint}
POST /api/proxy/{endpoint}
PUT /api/proxy/{endpoint}
DELETE /api/proxy/{endpoint}
Headers:
Authorization: Bearer YOUR_API_KEY (required)
Example:
curl https://raysep.com/api/proxy/users/123 \
-H "Authorization: Bearer abc123..." \
-H "Content-Type: application/json"
Get your usage statistics and analytics.
GET /api/admin/stats
Headers:
Authorization: Bearer YOUR_API_KEY
Response:
{
"user": {
"id": "user_123",
"tier": "premium"
},
"usage": {
"today": {
"requests": 1234,
"cached": 456,
"cost": 0.12
},
"thisMonth": {
"requests": 45678,
"cost": 4.56
}
},
"performance": {
"uptime": 99.98,
"p50Latency": 142,
"cacheHitRate": 37.2
}
}
All API requests require authentication using your API key in the Authorization header:
Authorization: Bearer YOUR_API_KEY
Alternatively, you can pass it as a query parameter:
?apiKey=YOUR_API_KEY
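For clients that cannot set headers, the query-parameter form looks like this in Python (header auth is still preferable, since query strings tend to end up in logs):

import os
import requests

API_KEY = os.environ["API_KEY"]

# Same proxy call, but authenticating via the apiKey query parameter.
resp = requests.get(
    "https://raysep.com/api/proxy/users/123",
    params={"apiKey": API_KEY},
)
print(resp.status_code)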
Rate limits are applied based on your plan:
| Plan | Requests/Month | Rate Limit |
|---|---|---|
| Free | 10,000 | 60/minute |
| Basic | 100,000 | 100/minute |
| Premium | 500,000 | 200/minute |
When you exceed the rate limit, you'll receive a 429 response:
{
"error": "Rate limit exceeded",
"limit": 100,
"remaining": 0,
"resetAt": 1699000000
}
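One way to react to this response is to sleep until the advertised reset time instead of retrying blindly. A sketch, assuming resetAt is a Unix timestamp in seconds as the example above suggests:

import time
import requests

def call_with_rate_limit_wait(url, headers):
    """Call the gateway once; if rate limited, wait for resetAt and retry once."""
    resp = requests.get(url, headers=headers)
    if resp.status_code == 429:
        body = resp.json()
        # Assumption: resetAt is a Unix timestamp in seconds.
        wait = max(body.get("resetAt", 0) - time.time(), 1)
        time.sleep(wait)
        resp = requests.get(url, headers=headers)
    return resp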
The gateway automatically caches GET requests based on the response's Cache-Control headers, which reduces latency and backend load for repeated reads.
Control caching behavior with headers:
# Request fresh data (bypass the cache)
curl ... -H "Cache-Control: no-cache"
# Prevent the response from being stored in the cache
curl ... -H "Cache-Control: no-store"
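The same request headers can be sent from any client. For example, forcing a fresh read past the cache in Python:

import os
import requests

API_KEY = os.environ["API_KEY"]

# Bypass the gateway cache for this one request and fetch fresh data.
resp = requests.get(
    "https://raysep.com/api/proxy/users/123",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Cache-Control": "no-cache",
    },
)
print(resp.json())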
The API uses standard HTTP status codes:
- 200 - Success
- 400 - Bad Request (missing parameters)
- 401 - Unauthorized (invalid API key)
- 429 - Too Many Requests (rate limit exceeded)
- 500 - Internal Server Error
- 503 - Service Unavailable (backend down)

All errors return JSON:
{
"error": "Error message here",
"code": "ERROR_CODE",
"details": "Additional information"
}
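Because every error comes back as JSON with the same shape, a thin wrapper can turn gateway errors into exceptions. A sketch in Python (the GatewayError class is illustrative, not part of any SDK):

import requests

class GatewayError(Exception):
    """Raised when the gateway returns a non-2xx JSON error body."""

def proxy_get(url, headers):
    resp = requests.get(url, headers=headers)
    if resp.ok:
        return resp.json()
    # Gateway errors carry error / code / details fields.
    body = resp.json()
    raise GatewayError(f"{resp.status_code} {body.get('code')}: {body.get('error')}")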
Set appropriate Cache-Control headers on your backend responses:
Cache-Control: public, max-age=300 # Cache for 5 minutes
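For example, if your backend happens to be a Flask app, the header can be set on each cacheable response (Flask is only an illustration; any framework that lets you set response headers works):

from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/users/<user_id>")
def get_user(user_id):
    resp = jsonify({"id": user_id})
    # Let the gateway cache this response for 5 minutes.
    resp.headers["Cache-Control"] = "public, max-age=300"
    return resp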
Implement exponential backoff when you receive 429 responses:
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

async function callAPI(url, options) {
  let retries = 0;
  while (retries < 5) {
    const response = await fetch(url, options);
    if (response.status === 429) {
      // Wait 1s, 2s, 4s, ... before retrying
      await sleep(Math.pow(2, retries) * 1000);
      retries++;
      continue;
    }
    return response;
  }
  throw new Error('Still rate limited after 5 retries');
}
Regularly check your stats endpoint to avoid surprises:
# Get current usage
curl https://raysep.com/api/admin/stats \
-H "Authorization: Bearer YOUR_API_KEY"
Need help? We're here for you.

The quick-start examples below show the same proxy call from Node.js (axios), Python (requests), and plain fetch:
Node.js (axios):

const axios = require('axios');
const client = axios.create({
baseURL: 'https://raysep.com/api/proxy',
headers: {
'Authorization': `Bearer ${process.env.API_KEY}`
}
});
// Make a request
client.get('/users/123').then(response => console.log(response.data));
Python (requests):

import requests
import os
API_KEY = os.environ['API_KEY']
BASE_URL = 'https://raysep.com/api/proxy'
headers = {
'Authorization': f'Bearer {API_KEY}'
}
response = requests.get(f'{BASE_URL}/users/123', headers=headers)
print(response.json())
JavaScript (fetch):

const API_KEY = 'your-api-key';
const BASE_URL = 'https://raysep.com/api/proxy';
async function getUser(id) {
const response = await fetch(`${BASE_URL}/users/${id}`, {
headers: {
'Authorization': `Bearer ${API_KEY}`
}
});
return response.json();
}
getUser(123).then(data => console.log(data));