Rate Limits
The Transfer API implements rate limiting to ensure fair usage and maintain service quality for all clients.
Overview
Rate limits control the number of API requests you can make within a specific time window. Exceeding these limits results in temporary request rejection until the limit resets.
Why Rate Limits?
- Prevent abuse - Protect against excessive API usage
- Ensure availability - Maintain service quality for all clients
- Fair resource allocation - Distribute API capacity equitably
- System stability - Prevent overload and service degradation
Good Practice
Design your integration to stay well below rate limits with efficient request patterns and caching.
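As a sketch of the caching half of this advice, a small in-memory TTL cache can absorb repeated reads of the same transfer. The `fetchTransfer` parameter stands in for a hypothetical wrapper around GET /transfer/{id}, and the 30-second TTL is an assumption, not a documented value:

```javascript
// Minimal in-memory TTL cache for single-transfer reads (sketch).
const cache = new Map();
const TTL_MS = 30_000; // assumption: 30s staleness is acceptable

async function getTransferCached(id, fetchTransfer) {
  const hit = cache.get(id);
  if (hit && Date.now() - hit.at < TTL_MS) {
    return hit.value; // served from cache: no API request consumed
  }
  const value = await fetchTransfer(id); // one API request
  cache.set(id, { at: Date.now(), value });
  return value;
}
```

Repeated reads within the TTL cost zero requests against the limit; only the first read per window hits the API.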
Rate Limit Values
Standard Endpoints
| Endpoint Category | Limit | Window | Per |
|---|---|---|---|
| Standard Operations | 100 requests | 1 minute | Client ID |
| Batch Operations | 10 requests | 1 minute | Client ID |
| Transfer Queries | 100 results | 1 page | Request |
Detailed Breakdown
- Standard Endpoints
- Batch Operations
- Query Limits
Limit: 100 requests per minute
Applies to:
- GET /transfer/{id} - Get single transfer
- PUT /transfer/{id}/reserve - Reserve transfer
- PUT /transfer/{id}/provide-password - Provide password
- PUT /transfer/{id}/sign - Sign transfer
- PUT /transfer/{id}/register - Register transfer
- DELETE /transfer/{id} - Revoke transfer
- POST /transfer - Create single transfer
Example:
Window: 12:00:00 - 12:00:59
Allowed: 100 requests maximum
Reset: 12:01:00 (new window starts)
Calculation:
- Per client ID
- Rolling 60-second window
- Resets every minute
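The rolling window above can be mirrored client-side so your integration waits locally instead of getting rejected. A minimal sketch; `makeThrottle` and its defaults are illustrative, not part of the API:

```javascript
// Client-side sliding-window throttle (sketch): tracks timestamps of
// recent requests and waits until a slot frees up. Defaults mirror the
// standard-endpoint limit of 100 requests per rolling 60 seconds.
function makeThrottle(maxRequests = 100, windowMs = 60_000) {
  const timestamps = [];
  return async function acquire() {
    const now = Date.now();
    // Drop timestamps that have fallen out of the rolling window
    while (timestamps.length && now - timestamps[0] >= windowMs) {
      timestamps.shift();
    }
    if (timestamps.length >= maxRequests) {
      // Window is full: wait until the oldest request ages out
      const waitMs = windowMs - (now - timestamps[0]);
      await new Promise((resolve) => setTimeout(resolve, waitMs));
      return acquire();
    }
    timestamps.push(Date.now());
  };
}
```

Call `await acquire()` before each standard-endpoint request; the 101st call within any 60-second span blocks until a slot opens.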
Limit: 10 requests per minute
Applies to:
- POST /transfers - Create multiple transfers
- GET /transfers - List/query transfers (with filters)
Why lower limit?
- Batch operations are more resource-intensive
- Each request can affect multiple transfers
- Query operations can return large datasets
Example:
Window: 12:00:00 - 12:00:59
Allowed: 10 batch requests maximum
Each request can create up to 100 transfers
Total transfers per minute: up to 1,000
Best practice:
// ✅ Good: Create 500 transfers with 5 batch requests
const batches = chunkArray(transfers, 100);
for (const batch of batches) {
  await createBatchTransfer(batch);
  await sleep(6000); // 6 seconds between batches keeps under 10 requests/minute
}

// ❌ Bad: Create 500 transfers individually
for (const transfer of transfers) {
  await createTransfer(transfer); // Will hit the 100 requests/minute limit!
}
Limit: 100 results per page
Applies to:
GET /transfers with pagination
Pagination parameters:
- limit (max: 100)
- offset (starting position)
Example request:
GET /transfer/rest/v1/transfers?limit=100&offset=0 HTTP/1.1
Example response:
{
  "_links": {
    "self": { "href": "/transfers?limit=100&offset=0" },
    "next": { "href": "/transfers?limit=100&offset=100" }
  },
  "_embedded": {
    "transfers": [
      // ... 100 transfers
    ]
  },
  "total": 250,
  "count": 100
}
Retrieving all results:
async function getAllTransfers(filters) {
  const allTransfers = [];
  let offset = 0;
  const limit = 100;
  let hasMore = true;

  while (hasMore) {
    const response = await getTransfers({ ...filters, limit, offset });
    allTransfers.push(...response.transfers);
    offset += limit;
    hasMore = response.transfers.length === limit;

    // GET /transfers counts against the batch limit (10 requests/minute),
    // so pace pages at one request every 6 seconds
    if (hasMore) await sleep(6000);
  }

  return allTransfers;
}