The Cobi Data API is the foundation for all data integration with the Cobi platform.

API Structure

All Data API endpoints follow the same URL pattern:
POST /v1/data/:type
Where :type is one of:
1. Customers: Upload customer profile information
2. Transactions: Upload transaction records with savings and discounts
3. Offers: Upload offer details for promotions and discounts
4. Shops: Upload shop information
5. Branches: Upload branch information (requires shops to exist first)
6. Categories: Upload category information for product classification
7. Shop Categories: Upload shop-category associations (requires shops and categories to exist first)
8. Institutes: Upload educational institution information
9. Courses: Upload course information
10. Tags: Upload tag information
11. Shop Tags: Upload shop-tag associations (requires shops and tags to exist first)
12. Keywords: Upload keyword information
13. Shop Keywords: Upload shop-keyword associations (requires shops and keywords to exist first)
14. Student Favourites: Upload student-favourite associations (requires customers and shops to exist first)
15. Interactions: Record user interactions with items (TAG, OFFER, SHOP) for analytics
Data Dependencies: Some data types depend on others. For example, branches require valid shop_id references, so it's important to upload data in the correct order (see the sketch below):
1. Upload shops first
2. Then upload branches that reference those shops
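A minimal sketch of this ordering, assuming Python with the requests library; the host and record fields here are placeholders, and the real field names are defined by each endpoint's schema:

import requests

BASE_URL = "https://api.example.com"  # placeholder; use your Cobi API host
HEADERS = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json",
}

# 1. Upload shops first
shops = [{"shop_id": "shop-001", "name": "Example Shop"}]  # illustrative fields
requests.post(f"{BASE_URL}/v1/data/shops", headers=HEADERS,
              json={"records": shops}).raise_for_status()

# 2. Then upload branches that reference those shops
branches = [{"branch_id": "branch-001", "shop_id": "shop-001"}]  # illustrative fields
requests.post(f"{BASE_URL}/v1/data/branches", headers=HEADERS,
              json={"records": branches}).raise_for_status()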

Common Features

Each endpoint accepts a batch of records (up to 5000) in a single request:
{
  "records": [
    { /* record 1 */ },
    { /* record 2 */ },
    // ...up to 5000 records
  ]
}
Batching records this way keeps uploads efficient and reduces the number of API calls needed.
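If you have more than 5000 records, split them client-side before sending. A sketch in Python (the 5000 limit comes from this page; the chunk_records helper is illustrative):

def chunk_records(records, batch_size=5000):
    # Yield request bodies containing at most batch_size records each.
    for start in range(0, len(records), batch_size):
        yield {"records": records[start:start + batch_size]}

# e.g. 12,000 records become three request bodies: 5000 + 5000 + 2000
bodies = list(chunk_records([{"id": i} for i in range(12_000)]))
assert [len(b["records"]) for b in bodies] == [5000, 5000, 2000]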
All endpoints require API key authentication via the Authorization header:
Authorization: Bearer YOUR_API_KEY
Content-Type: application/json
Contact our support team to obtain your API key.
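As a concrete example, an authenticated upload might look like this in Python with the requests library (the host and record fields are placeholders):

import requests

response = requests.post(
    "https://api.example.com/v1/data/customers",  # placeholder host
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    # passing json= sets the Content-Type: application/json header automatically
    json={"records": [{"customer_id": "cust-001"}]},  # illustrative field name
)
print(response.status_code)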
All endpoints return a consistent response format:
{
  "request_id": "string", // Unique ID for the overall request
  "type": "branches", // Type of data being processed
  "batches": [
    {
      "batch": 1, // Sequential batch number
      "status": 200, // HTTP-like status code (200, 207, 400, 500)
      "data": {
        "status": "success", // "success", "partial", or "error"
        "type": "branches", // Matches the top-level type
        "message": "Records uploaded successfully", // Summary message
        "request_id": "string",
        "batch_id": "string",
        "error": "" // Optional — empty or null on success; contains error details otherwise
      }
    }
  ]
}
Field                     | Type   | Description
--------------------------|--------|------------------------------------------------
request_id                | string | Unique ID for the overall request
type                      | string | Type of data being processed (e.g., "branches")
batches                   | array  | Array of batch results
batches[].batch           | int    | Sequential batch number
batches[].status          | int    | HTTP-like status code (200, 207, 400, 500)
batches[].data            | object | Batch result details
batches[].data.status     | string | "success", "partial", or "error"
batches[].data.type       | string | Matches the top-level type
batches[].data.message    | string | Summary message of the batch result
batches[].data.request_id | string | Unique request ID for the batch
batches[].data.batch_id   | string | Unique batch ID
batches[].data.error      | string | Error details (empty or null on success)
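Because each batch reports its own result, check the per-batch status rather than relying only on the HTTP status of the overall request. A sketch, assuming response is the requests Response returned by the upload example above:

result = response.json()
for batch in result["batches"]:
    data = batch["data"]
    if data["status"] == "success":
        print(f"Batch {batch['batch']}: {data['message']}")
    else:  # "partial" or "error"
        print(f"Batch {batch['batch']} failed ({batch['status']}): {data.get('error')}")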

Best Practices

Performance:
  • Upload up to 5000 records at a time for optimal performance
  • Schedule regular batch uploads rather than making individual API calls
  • Implement retry logic with exponential backoff for failed requests (see the sketch below)
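A minimal retry sketch with exponential backoff (illustrative; tune the delays, timeout, and retry conditions for your workload):

import time
import requests

def post_with_retries(url, headers, body, max_retries=5):
    # POST a batch, retrying transient failures with exponential backoff.
    for attempt in range(max_retries):
        try:
            response = requests.post(url, headers=headers, json=body, timeout=30)
            if response.status_code < 500:  # don't retry client errors (4xx)
                return response
        except requests.RequestException:
            pass  # network error: fall through to the backoff below
        if attempt < max_retries - 1:
            time.sleep(2 ** attempt)  # 1s, 2s, 4s, 8s between attempts
    raise RuntimeError(f"Upload failed after {max_retries} attempts: {url}")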
Need help? Contact our support team at [email protected]