Rate Limiting

When communicating with MetaDefender Cloud APIs, you need to use the authentication mechanism for the given API endpoint and provide your apikey. Each apikey has daily limits, and you can check yours by logging in to MetaDefender Cloud with your OPSWAT account credentials. Additionally, the MetaDefender Cloud server returns custom headers in each response that help you track your current API usage.

If you don't have an apikey, see our guide: Onboarding Process for MetaDefender Cloud API Users

Each MetaDefender Cloud apikey has limits for each family of APIs (Prevention, Reputation, etc.), and every response from MetaDefender Cloud contains custom headers that inform clients about the current limits.

Description of Custom Headers

  • X-RateLimit-Limit - Your current limit for a given family of APIs.
  • X-RateLimit-Remaining - The number of requests remaining in the current time window (usually 24 hours).
  • X-RateLimit-Reset-In - The number of seconds remaining in the current time window.
  • X-RateLimit-Used - The number of requests used in the current time window.

Custom Header Example

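A response carrying these headers might look like the following (the values shown are illustrative and will differ per apikey and API family):

    HTTP/1.1 200 OK
    Content-Type: application/json
    X-RateLimit-Limit: 10000
    X-RateLimit-Remaining: 9954
    X-RateLimit-Reset-In: 82750
    X-RateLimit-Used: 46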

Exceeding Allowed Rate Limit

When the rate limit is exceeded (the user performs more requests per day than their license allows), an HTTP 429 status code is returned together with an error body.

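An illustrative shape for that body, assuming a JSON payload (the field names here are hypothetical, not the verbatim response):

    {
        "error": {
            "code": 429,
            "messages": [
                "Rate limit exceeded"
            ]
        }
    }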

The limit resets 24 hours after the first request. For example, if an apikey starts calling the API at 11:00 AM and uses up its rate limit by 10:00 PM, the limit will reset at 11:00 AM the next day.
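A client can use X-RateLimit-Reset-In to wait out the window. A minimal sketch in Python, assuming the requests library and a placeholder apikey (the retry logic is illustrative, not prescribed by the API):

    import time
    import requests

    API_KEY = "your_apikey_here"  # placeholder, not a real key

    def get_with_rate_limit_wait(url):
        """GET an endpoint; on HTTP 429, sleep until the 24-hour
        window resets, then retry once."""
        response = requests.get(url, headers={"apikey": API_KEY})
        if response.status_code == 429:
            # X-RateLimit-Reset-In holds the seconds left in the window;
            # split() guards against a trailing unit such as "seconds".
            reset_in = response.headers.get("X-RateLimit-Reset-In", "86400")
            time.sleep(int(reset_in.split()[0]))
            response = requests.get(url, headers={"apikey": API_KEY})
        return response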

Prevention API Rate Limiting

Multiscanning Rate Limit

  • When a file is uploaded for multiscanning, the limit is reduced by 1.
  • When an archive is uploaded for multiscanning and the user requests unarchiving, every file inside the archive is extracted and scanned, up to the license limit (see the licensing page for details).
  • If the uploaded archive contains embedded archives and unarchiving was requested, each embedded archive is also extracted and the files inside it are scanned.
  • Every file extracted from an archive is counted as a separate file, so the daily limit is reduced by the total number of files inside the archive plus one, because the archive itself is scanned and counted as an individual file. This rule also applies to embedded archives.
  • If the uploaded file is not an archive but the unarchiving header is sent (rule: unarchive), the file is not unarchived and the limit is reduced by 1.

For Example:

  • When uploading an archive without the unarchiving header, only the archive itself is scanned as an individual file, and the rate limit is reduced by 1.
  • When uploading an archive containing 40 files with the unarchiving header, the rate limit is reduced by 41 (40 files inside + the archive itself).
  • When uploading an archive containing 10 files with the unarchiving header, where one of the 10 files is itself an archive containing 5 files, the rate limit is reduced by 16 (10 files + 5 files in the embedded archive + the archive itself).
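These numbers follow a simple recursion: an archive costs 1 for itself plus the cost of everything inside it. A minimal sketch (the helper and its input format are hypothetical, purely to illustrate the arithmetic):

    def multiscan_cost(upload):
        """Scans deducted for one upload with unarchiving requested.

        `upload` describes the content: a string for a plain file, or a
        list of child entries for an archive. The archive itself costs 1,
        and every extracted file, including files inside embedded
        archives, costs 1 as well."""
        if not isinstance(upload, list):  # plain file
            return 1
        return 1 + sum(multiscan_cost(child) for child in upload)

    print(multiscan_cost("document.pdf"))           # 1
    print(multiscan_cost(["f"] * 40))               # 41 (second example)
    print(multiscan_cost(["f"] * 9 + [["f"] * 5]))  # 16 (third example)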

If there is at least one infected file inside the archive, the scan results of the archive will be marked as infected, even if the archive itself is not detected as infected by any engine.

Deep CDR Rate Limit

  • When a file is uploaded and the Deep CDR header is sent (rule: sanitize), and the file format is supported by Deep CDR, the limit is reduced by 2: 1 for multiscanning and 1 for the Deep CDR analysis (see the sketch after this list).
  • When an archive is uploaded with the header rule multiscan_sanitize_unarchive, every sanitizable file inside the archive is sanitized. In addition to the scanned files, the limit is reduced by 1 for every sanitized file inside the archive.
  • If the original file is infected, the sanitized version of the file is scanned again with multiscanning free of charge to verify that no infection remains.
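As a sketch of how the sanitize rule might be sent with an upload, assuming the v4 file-upload endpoint and the apikey header (both are assumptions here; consult the API reference for the exact endpoint):

    import requests

    API_KEY = "your_apikey_here"                         # placeholder
    UPLOAD_URL = "https://api.metadefender.com/v4/file"  # assumed endpoint

    def upload_with_deep_cdr(path):
        """Upload one file for multiscanning plus Deep CDR.

        For a supported file format this deducts 2 from the limit:
        1 for the scan and 1 for the Deep CDR analysis."""
        with open(path, "rb") as f:
            response = requests.post(
                UPLOAD_URL,
                headers={"apikey": API_KEY, "rule": "sanitize"},
                data=f,
            )
        # The custom headers described earlier report the remaining quota.
        print(response.headers.get("X-RateLimit-Remaining"))
        return response.json()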

Reputation API Rate Limiting

Not Found Results

  • When a hash lookup returns not found, it counts against the limit at a ratio of 5:1 (for every 5 not-found hashes, 1 is subtracted from your rate limit).

For Example:

  • If doing 20 hash lookups where only 10 return results (known hashes), the limit will be reduced by 12 (10 found hashes + 2 for the 10 not-found hashes at the 5:1 ratio).
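That arithmetic as a short sketch (the 5:1 ratio is from the text; rounding a partial group of fewer than 5 not-found hashes up is an assumption):

    import math

    def lookup_cost(found, not_found):
        """Limit deduction: 1 per found hash, 1 per 5 not-found hashes."""
        return found + math.ceil(not_found / 5)

    print(lookup_cost(10, 10))  # 12, matching the example above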

Bulk Lookups

  • When doing a bulk hash lookup request, the limit is reduced for every successful response.
  • Before doing the lookup on our backend, we eliminate duplicate hashes, so the limit is not reduced multiple times for the same hash.
  • Not-found hashes are counted at the 5:1 ratio.

For Example:

  • When doing a bulk hash lookup for 20 hashes where only 10 are found in our database, the limit will be reduced by 12 (10 found hashes + 2 for the 10 not-found hashes at the 5:1 ratio).
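The same calculation for a bulk request, with the deduplication step included. A minimal sketch (rounding of partial not-found groups is again an assumption):

    import math

    def bulk_lookup_cost(hashes, known_hashes):
        """Limit deduction for one bulk hash lookup.

        Duplicates are dropped first, mirroring the server-side
        deduplication described above; found hashes cost 1 each and
        not-found hashes cost 1 per 5."""
        unique = set(hashes)
        found = sum(1 for h in unique if h in known_hashes)
        return found + math.ceil((len(unique) - found) / 5)

    # The example above: 10 found out of 20 unique -> 10 + 2 = 12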