Data Quality
Send Complete Transaction Data
Include the full, original transaction string:
- Good: the complete, unmodified description exactly as it appears in the bank feed
- Bad: a truncated or pre-cleaned version with merchant, location, or reference details stripped out
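A minimal illustration of the difference, assuming a JSON payload with a `description` field (the field names here are placeholders, not taken from this guide):

```python
# Good: send the raw statement line exactly as received from the bank feed.
good_transaction = {
    "description": "POS PURCHASE 1234 STARBUCKS #5521 SEATTLE WA CARD 9876",
    "amount": -4.85,
    "type": "expense",
    "country": "US",
}

# Bad: pre-cleaned or truncated text removes the signals enrichment relies on
# (store number, location, payment channel).
bad_transaction = {
    "description": "STARBUCKS",
    "amount": -4.85,
    "type": "expense",
    "country": "US",
}
```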
Use Accurate Country Codes
The country code significantly affects matching accuracy (a small sketch follows this list):
- Use the transaction's origin country, not the user's country
- Use ISO 3166-1 alpha-2 codes (US, NL, GB)
- Default to account country if unknown
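A small helper sketch of that fallback order; the argument names and validation shown are assumptions, not a documented API contract:

```python
import re

ISO_ALPHA2 = re.compile(r"[A-Z]{2}")

def resolve_country(transaction_country: str | None, account_country: str) -> str:
    """Prefer the transaction's origin country; fall back to the account country."""
    candidate = (transaction_country or "").strip().upper()
    if ISO_ALPHA2.fullmatch(candidate):
        return candidate            # e.g. "US", "NL", "GB"
    return account_country.upper()  # default when the origin country is unknown

# resolve_country("nl", "US")  -> "NL"
# resolve_country(None, "US")  -> "US"
```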
Set Correct Transaction Type
The `type` field affects category selection (see the sketch after the table):
| Transaction | Type |
|---|---|
| Purchases, payments, fees | expense |
| Salary, refunds, deposits | income |
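A minimal sketch of the mapping, assuming your source data uses signed amounts (negative for money out); adapt it to however your system flags debits and credits:

```python
def transaction_type(amount: float) -> str:
    """Map a signed amount to the type expected in the enrichment request."""
    # Purchases, payments, and fees reduce the balance -> expense.
    # Salary, refunds, and deposits increase it -> income.
    return "expense" if amount < 0 else "income"
```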
Performance Optimization
Deduplicate Similar Transactions
Group identical transactions before enriching them, so one API call covers every duplicate:
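A sketch of that approach, grouping on the raw description string (an assumption; group on whichever fields make two transactions identical in your data) and using a hypothetical `enrich_fn`:

```python
from collections import defaultdict

def enrich_deduplicated(transactions: list[dict], enrich_fn) -> list[dict]:
    """Call enrich_fn once per unique description and fan the result back out."""
    groups = defaultdict(list)
    for tx in transactions:
        groups[tx["description"]].append(tx)

    # One enrichment call per unique description instead of per transaction.
    results = {description: enrich_fn(description) for description in groups}

    # Attach the shared result to every duplicate.
    for description, txs in groups.items():
        for tx in txs:
            tx["enrichment"] = results[description]
    return transactions
```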
Rate Limit Management
Control request flow to stay within limits:
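One common client-side approach is a token bucket; the 10 requests/second figure below is a placeholder, not a documented quota:

```python
import time

class TokenBucket:
    """Client-side throttle: allow at most `rate` requests per second."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def acquire(self) -> None:
        """Block until a token is available, then consume it."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
            self.updated = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            time.sleep((1 - self.tokens) / self.rate)

# Example: stay under a hypothetical 10 requests/second limit.
bucket = TokenBucket(rate=10, capacity=10)
```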
Error Handling
Implement Robust Retry Logic
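A sketch of exponential backoff with jitter, retrying only rate-limit (429) and transient server errors and honoring `Retry-After` when present. The endpoint URL and the bearer-token auth header are placeholders, not documented values:

```python
import random
import time

import requests

RETRYABLE = {429, 500, 502, 503, 504}

def enrich_with_retry(payload: dict, api_key: str, max_attempts: int = 5) -> dict:
    """POST with exponential backoff and jitter on retryable status codes."""
    url = "https://api.example.com/v1/enrich"  # hypothetical endpoint
    for attempt in range(max_attempts):
        response = requests.post(
            url,
            json=payload,
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=30,
        )
        if response.status_code not in RETRYABLE:
            response.raise_for_status()  # surface non-retryable 4xx errors
            return response.json()

        # Honor Retry-After if the server sent one, otherwise back off exponentially.
        retry_after = response.headers.get("Retry-After")
        delay = float(retry_after) if retry_after else (2 ** attempt) + random.random()
        time.sleep(delay)
    raise RuntimeError(f"Enrichment failed after {max_attempts} attempts")
```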
Handle Partial Results
Don't discard partial results; use the data that is available:
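A sketch of merging whatever enrichment came back onto the original transactions; the response shape and field names (`category`, `merchant`) are assumptions for illustration:

```python
def merge_partial_results(transactions: list[dict], results: list[dict]) -> list[dict]:
    """Attach enrichment where it exists; keep the raw transaction otherwise."""
    merged = []
    for tx, result in zip(transactions, results):
        if result and result.get("category"):
            tx["category"] = result["category"]
            tx["merchant"] = result.get("merchant")  # may still be missing
            tx["enrichment_status"] = "complete" if result.get("merchant") else "partial"
        else:
            tx["enrichment_status"] = "failed"       # flag for later reprocessing
        merged.append(tx)
    return merged
```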
Security
Protect API Keys
- Store keys in environment variables
- Never commit keys to version control
- Use separate keys for dev/staging/prod
- Rotate keys periodically
Make Requests Server-Side
Never expose your API key in client-side code; route requests through your own backend instead:
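A minimal proxy sketch using Flask: the browser calls your backend, and only the backend, which reads the key from an environment variable, calls the enrichment API. The endpoint URL, environment variable name, and auth scheme are placeholders:

```python
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
API_KEY = os.environ["ENRICHMENT_API_KEY"]  # never shipped to the client

@app.post("/api/enrich")
def enrich_proxy():
    """Forward the client's transactions to the enrichment API server-side."""
    response = requests.post(
        "https://api.example.com/v1/enrich",  # hypothetical endpoint
        json=request.get_json(),
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    return jsonify(response.json()), response.status_code
```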
Monitoring and Observability
Track Key Metrics
Monitor these metrics for your integration (a minimal tracking sketch follows the table):
| Metric | Why It Matters |
|---|---|
| Success rate | Detect issues early |
| Latency (p50, p95, p99) | Identify performance problems |
| Partial result rate | Track data quality |
| Error rate by code | Understand failure patterns |
| Credit consumption | Manage costs |
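A minimal in-process tracker covering these metrics; in practice you would export them to your monitoring system rather than compute them by hand:

```python
import statistics
import time
from collections import Counter

class EnrichmentMetrics:
    """Track success rate, latency percentiles, error codes, and credit use."""

    def __init__(self):
        self.latencies_ms: list[float] = []
        self.outcomes = Counter()   # "success", "partial", "error:<code>"
        self.credits_used = 0

    def record(self, started_at: float, outcome: str, credits: int = 0) -> None:
        self.latencies_ms.append((time.monotonic() - started_at) * 1000)
        self.outcomes[outcome] += 1
        self.credits_used += credits

    def summary(self) -> dict:
        cuts = statistics.quantiles(self.latencies_ms, n=100) if len(self.latencies_ms) > 1 else []
        total = sum(self.outcomes.values())
        return {
            "success_rate": self.outcomes["success"] / total if total else None,
            "p50_ms": cuts[49] if cuts else None,
            "p95_ms": cuts[94] if cuts else None,
            "p99_ms": cuts[98] if cuts else None,
            "errors": {k: v for k, v in self.outcomes.items() if k.startswith("error:")},
            "credits_used": self.credits_used,
        }
```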
Structured Logging
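A sketch of one-JSON-object-per-call logging that carries the requestId so individual calls can be traced; exactly where the requestId appears in the API response is an assumption here:

```python
import json
import logging

logger = logging.getLogger("enrichment")

def log_enrichment(request_id: str, status: str, latency_ms: float, error: str | None = None) -> None:
    """Emit one JSON object per enrichment call for easy querying in log tooling."""
    logger.info(json.dumps({
        "event": "enrichment_request",
        "requestId": request_id,   # assumed to come back with each API response
        "status": status,          # "success", "partial", "error"
        "latency_ms": round(latency_ms, 1),
        "error": error,
    }))
```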
Architecture Patterns
Async Processing for Bulk Data
For large volumes, process asynchronously:
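A sketch using asyncio with batching and a concurrency cap; `enrich_batch` is a stand-in for your real async client call, and the batch size and concurrency values are placeholders:

```python
import asyncio

async def enrich_batch(batch: list[dict]) -> list[dict]:
    """Placeholder for an async call that enriches one batch of transactions."""
    await asyncio.sleep(0)  # replace with a real HTTP call
    return batch

async def enrich_all(transactions: list[dict], batch_size: int = 100, concurrency: int = 5) -> list[dict]:
    """Split into batches and enrich them concurrently, bounded by a semaphore."""
    semaphore = asyncio.Semaphore(concurrency)

    async def run(batch: list[dict]) -> list[dict]:
        async with semaphore:   # never more than `concurrency` batches in flight
            return await enrich_batch(batch)

    batches = [transactions[i:i + batch_size] for i in range(0, len(transactions), batch_size)]
    results = await asyncio.gather(*(run(b) for b in batches))
    return [tx for batch in results for tx in batch]

# asyncio.run(enrich_all(transactions))
```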
Graceful Degradation
Design for partial failures:
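A sketch of a fallback wrapper: if enrichment fails, the raw transaction flows through with safe defaults and a flag for later reprocessing; the default values are illustrative:

```python
def enrich_or_fallback(transaction: dict, enrich_fn) -> dict:
    """Return the enriched transaction, or the original with safe defaults on failure."""
    try:
        return enrich_fn(transaction)
    except Exception as exc:                        # network errors, rate limits, timeouts
        transaction.setdefault("category", "uncategorized")
        transaction["enrichment_status"] = "failed"
        transaction["enrichment_error"] = str(exc)  # keep for retry and analysis
        return transaction
```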
Checklist
Use this checklist when building your integration:
- API key stored securely in environment
- Full transaction strings sent, not truncated
- Caching implemented for duplicate transactions
- Retry logic with exponential backoff
- Rate limiting handled gracefully
- Partial results processed correctly
- Errors logged with requestId
- Metrics and monitoring in place
- Credit usage tracked