This section of the user guide describes how you can programmatically interact with the MetaDefender Storage Security REST API. Below are some common tasks that can be done using the available REST APIs:
- Authenticate to obtain a JSON Web Token (JWT)
- Start or stop a process (scan)
- Add or remove storage units
About this REST API
The exposed endpoint is located by default at http(s)://md-storage-server/api/ (for example, the authentication endpoint is available at http(s)://md-storage-server/api/user/authenticate). All requests are handled by the NGINX web server before being proxied to the backend API Gateway service.
All endpoints perform authentication and authorization checks. For these checks to succeed, a valid token must be presented in the Authorization header in the form Bearer <token>.
Please note that all issued tokens carry an associated timestamp and signature to prevent long-term use without re-authentication. The token lifespan is currently set to 60 minutes, so you will have to request a new token before it expires in order to avoid error responses.
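As a sketch, the authentication flow can be wrapped in a small helper. The /api/user/authenticate path comes from the text above; the request and response field names (username, password) are assumptions and may differ in your deployment.

```python
import json
import time
import urllib.request

TOKEN_LIFETIME_S = 60 * 60  # issued tokens expire after 60 minutes


def authenticate(base_url, username, password):
    """POST credentials to /api/user/authenticate and return the parsed response.

    The body field names used here are assumptions; check your deployment's
    API reference before relying on them.
    """
    body = json.dumps({"username": username, "password": password}).encode()
    req = urllib.request.Request(
        f"{base_url}/api/user/authenticate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def auth_header(token):
    """Build the Authorization header expected by all protected endpoints."""
    return {"Authorization": f"Bearer {token}"}


def needs_refresh(issued_at, now=None, margin_s=300):
    """True when the token is within margin_s seconds of its 60-minute expiry."""
    now = time.time() if now is None else now
    return now >= issued_at + TOKEN_LIFETIME_S - margin_s
```

Tracking the issue time locally and refreshing a few minutes early avoids racing the server-side expiry check.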
Useful links
Fetch all accounts
Retrieve a list of all configured accounts
Add an account
Add a new storage account to the system
Update an account
Update an existing account's configuration
Fetch account by ID
Retrieve a single account's details by ID
Delete an account
Permanently delete an account and its storage references
Fetch available storage units for an account
List all storage units linked to a particular account
Fetch the number of accounts
Get the total number of accounts in the system
Fetch account sources
Fetch account sources
Audit
List audit information
Fetch audit logs
Configuration
Import or export configuration file
Export configuration file
Export the current configuration settings to an archive. The file will be encrypted using the provided password.
Import configuration file
Get enabled modules
Connector
Connector
Retrieve external loggers
Update external logger state
Update a Syslog server configuration
Add a new Syslog server configuration
Update a Kafka server configuration
Add a new Kafka server configuration
Delete external logger
File
Retrieve processed files information
Enumerate processed files
Rescan a file on demand
This request is used to update a scanned file with passwords, in case it is an encrypted archive and it could not be scanned because the passwords to decrypt it were not provided. It can also be used without providing any passwords to simply rescan a specific file from a finished scan.
Rescan multiple files on demand
This request is used to rescan and update existing files from a scan based on a filter.
Fetch processing results for a file
File processing is done asynchronously and each analysis request is tracked by a file ID. Because processing a file is a potentially time-consuming operation, scheduling a file for processing and retrieving the results needs to be done using two separate API calls.
This request needs to be made multiple times until the analysis is complete. Analysis completion can be tracked using the processingState and progress values from the response.
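A minimal polling sketch, assuming a fetch_result(file_id) callable that performs the GET and returns the parsed response. The processingState and progress keys come from the text above; the terminal state value ("Processed") is an assumption and should be checked against your deployment.

```python
import time


def wait_for_result(file_id, fetch_result, poll_interval_s=2.0,
                    timeout_s=600.0, sleep=time.sleep):
    """Poll until processing finishes, using processingState/progress from
    each response.

    fetch_result is any callable performing the GET for this file ID; the
    terminal state name "Processed" is an assumption -- adjust as needed.
    """
    deadline = time.monotonic() + timeout_s
    while True:
        result = fetch_result(file_id)
        if result.get("processingState") == "Processed" or result.get("progress") == 100:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError(f"analysis of {file_id} not finished after {timeout_s}s")
        sleep(poll_interval_s)
```

Injecting the fetch and sleep functions keeps the loop testable and lets callers add backoff without changing the polling logic.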
Retrieve archive scan results
Retrieve the archive scan results using either the scanResultId or the parentId
Cancel a file in an ongoing scan
Group
Manage your groups
Fetch all groups
Add a group
Update a group
Fetch group by ID
Delete a group
Fetch the number of groups
Health Status
API that responds with 200 OK if the application is running.
Get health status
Onboarding
Manage onboarding
Fetch onboarding configuration
Finish onboarding
Accept EULA
Remediations
Remediations information
Get Remediations by workflow ID
Get Remediation by ID
Delete Remediation
Add Remediation
Update Remediation
Report
Generate reports
Get scans report
Download PDF Report
Get scan by ID
Start a scan
To scan a specific folder, use the optional Folder parameter and provide the absolute folder path in the format {"Folder":"PATH_TO_SCAN"}:
- For Amazon S3 / S3 Compatible types, Azure Blob, Azure Files, Google Cloud, Alibaba Cloud and Oracle Native: include the Folder Location integrated in MDSS and exclude the Bucket Name, Container, etc.
- For Box: with or without the "All Files" folder
- For SharePoint and OneDrive: exclude the Document Library, Site, or Group
- For OPSWAT MFT Storage: the desired Folder Path when integrating with user
- For NFS / SMB / SFTP / FTP / SharePoint OnPrem: only the folder path beyond your configured storage root (do not include the base path set during integration)
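As a sketch, the Folder parameter can be serialized exactly in the documented {"Folder":"PATH_TO_SCAN"} shape; how the resulting string is attached to the start-scan request is deployment-specific and not covered here.

```python
import json


def folder_scan_body(folder_path):
    """Serialize the optional Folder parameter in the documented
    {"Folder":"PATH_TO_SCAN"} shape (compact separators, no whitespace)."""
    return json.dumps({"Folder": folder_path}, separators=(",", ":"))
```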
Stop a scan
Delete scans
Fetch last completed scan
Get an active scan by scan ID
Get Real-Time scan by storage ID
Scan Instance
List, add, update and delete Scan Instances
Get Scan Instances
Update an existing Scan Instance
Add a new Scan Instance
Get Scan Instance by ID
Delete a Scan Instance
Test
Scan Pool
List, add, update and delete Scan Pools
Get Scan Pools
Update an existing Scan Pool
Add a new Scan Pool
Get Scan Pool by ID
Delete a Scan Pool
Scan Workflow Snapshot
List scan workflow snapshots
Get all scan workflow snapshots by scan ID
Settings
List or update your settings
Get notifications configuration
Update notifications configuration
Fetch SMTP configuration
Update SMTP configuration
Online license activation
Offline license activation
Get license details
Deactivate license
Fetch retention configuration
Update retention configuration
Generate encryption key
Generate a new encryption key, replacing the old one if it exists.
Get encryption key creation date
Get the creation date of the encryption key.
Get SSL configuration
Get current SSL configuration status.
Update SSL configuration
Update SSL configuration of MDSS by uploading a certificate and a key to enable SSL.
Get Single User Session
Get Simultaneous Sessions allowed for a Single User.
Update Single User Session
Update Simultaneous Sessions allowed for a Single User.
Get Telemetry configuration
Get the MDSS OpenTelemetry configuration.
Update Telemetry configuration
Update the MDSS OpenTelemetry configuration.
SSO
SSO authentication and SSO configuration update
Get SSO configuration
Allows retrieval of the current SSO configuration.
Update SSO configuration
Allows updating the current SSO configuration.
Storage
Manage your storage units
Fetch all storage units
Fetch storage by ID
Delete a storage
Update a storage
Note! The following are templates for what is expected in the Credentials, CredentialsFile and Source fields. Please provide the correct values for your storage integration instead of null/false.
Alibaba Cloud storage units:
Credentials: "{"Endpoint":null,"AccessKeyId":null,"AccessKeySecret":null,"UseRamRole":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Amazon S3 / S3 Compatible storage units:
Credentials: "{"ServiceUrl":null,"AccessKeyId":null,"SecretAccessKey":null,"RegionEndpoint":null,"AssumeRoleArn":null,"UseIamRole":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Azure Blob storage units:
Credentials: "{"TenantId":null,"ClientId":null,"ClientSecret":null,"StorageAccount":null}"
Source: "{"Container":null}"
Azure Files storage units:
Credentials: "{"AccountName":null,"AccountKey":null,"ShareName":null}"
Source: "{"FolderLocation":null}"
Box storage units:
CredentialsFile: upload the credentials file
Source: "{"FolderLocation":null}"
Dell Isilon / SMB Compatible storage units:
Credentials: "{"User":null,"Password":null,"Server":null}"
Source: "{"SharePath":null}"
Google Cloud storage units:
CredentialsFile: upload the credentials file
Credentials: "{"UseAdc":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Graph storage units:
Credentials: "{"TenantId":null,"ClientId":null,"ClientSecret":null}"
Source: "{"Group":null}"
Fetch storage users
Add a storage
Note! The following are templates for what is expected in the Credentials, CredentialsFile and Source fields. Please provide the correct values for your storage integration instead of null/false.
Alibaba Cloud storage units:
Credentials: "{"Endpoint":null,"AccessKeyId":null,"AccessKeySecret":null,"UseRamRole":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Amazon S3 / S3 Compatible storage units:
Credentials: "{"ServiceUrl":null,"AccessKeyId":null,"SecretAccessKey":null,"RegionEndpoint":null,"AssumeRoleArn":null,"UseIamRole":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Azure Blob storage units:
Credentials: "{"TenantId":null,"ClientId":null,"ClientSecret":null,"StorageAccount":null}"
Source: "{"Container":null}"
Azure Files storage units:
Credentials: "{"AccountName":null,"AccountKey":null,"ShareName":null}"
Source: "{"FolderLocation":null}"
Box storage units:
CredentialsFile: upload the credentials file
Source: "{"FolderLocation":null}"
Dell Isilon / SMB Compatible storage units:
Credentials: "{"User":null,"Password":null,"Server":null}"
Source: "{"SharePath":null}"
Google Cloud storage units:
CredentialsFile: upload the credentials file
Credentials: "{"UseAdc":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Graph storage units:
Credentials: "{"TenantId":null,"ClientId":null,"ClientSecret":null}"
Source: "{"Group":null}"
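The templates above show Credentials and Source as JSON serialized into strings inside the request body. A sketch for an Amazon S3 / S3 Compatible unit follows; the inner field names come from the template, while the outer keys (Name, StorageType) and the type identifier are assumptions to be verified against your deployment.

```python
import json


def s3_storage_body(name, bucket, access_key_id, secret_access_key,
                    region, folder=None):
    """Build an add-storage request body where Credentials and Source are
    themselves JSON strings, matching the templates above.

    The outer keys (Name, StorageType) are assumptions -- verify them
    against your deployment's API reference.
    """
    credentials = {
        "ServiceUrl": None,
        "AccessKeyId": access_key_id,
        "SecretAccessKey": secret_access_key,
        "RegionEndpoint": region,
        "AssumeRoleArn": None,
        "UseIamRole": False,
    }
    source = {"BucketName": bucket, "FolderLocation": folder}
    return {
        "Name": name,               # assumed outer field
        "StorageType": "AmazonS3",  # assumed type identifier
        "Credentials": json.dumps(credentials),
        "Source": json.dumps(source),
    }
```

Serializing the inner objects separately (rather than nesting them) mirrors the quoted-JSON form shown in the templates.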
Add batch of storages
Note! The following are templates for what is expected in the Credentials, CredentialsFile and Source fields. Please provide the correct values for your storage integration instead of null/false.
Alibaba Cloud storage units:
Credentials: "{"Endpoint":null,"AccessKeyId":null,"AccessKeySecret":null,"UseRamRole":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Amazon S3 / S3 Compatible storage units:
Credentials: "{"ServiceUrl":null,"AccessKeyId":null,"SecretAccessKey":null,"RegionEndpoint":null,"AssumeRoleArn":null,"UseIamRole":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Azure Blob storage units:
Credentials: "{"TenantId":null,"ClientId":null,"ClientSecret":null,"StorageAccount":null}"
Source: "{"Container":null}"
Azure Files storage units:
Credentials: "{"AccountName":null,"AccountKey":null,"ShareName":null}"
Source: "{"FolderLocation":null}"
Box storage units:
CredentialsFile: upload the credentials file
Source: "{"FolderLocation":null}"
Dell Isilon / SMB Compatible storage units:
Credentials: "{"User":null,"Password":null,"Server":null}"
Source: "{"SharePath":null}"
Google Cloud storage units:
CredentialsFile: upload the credentials file
Credentials: "{"UseAdc":false}"
Source: "{"BucketName":null,"FolderLocation":null}"
Graph storage units:
Credentials: "{"TenantId":null,"ClientId":null,"ClientSecret":null}"
Source: "{"Group":null}"
Get batch of add storages results
Fetch all storage units by filters (storage type or status)
Add a scan schedule
Update a scan schedule
Fetch scan schedules
Delete scan schedules
Retrieve storage user name
Retrieve storage user name
User
Create, list and update user accounts
Fetch active users
Register a user
Create a user
Authenticate with a username and password
Authenticate with a username and password to obtain a JWT. Most of the APIs require authentication in the form of a JWT. Call this API to receive a token, but please note that it is only valid for an hour, so it should be refreshed periodically.
Fetch current user information
Retrieve information about the currently authenticated user.
Refresh user token
Refresh an expired accessToken using a valid refreshToken
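A sketch of the refresh request body. The accessToken and refreshToken field names follow the description above, but the exact JSON shape expected by the refresh endpoint is an assumption.

```python
import json


def refresh_body(access_token, refresh_token):
    """Serialize the expired accessToken and valid refreshToken into a
    request body for the token-refresh endpoint (field names assumed)."""
    return json.dumps({"accessToken": access_token, "refreshToken": refresh_token})
```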
Logout
Update user role
Fetch users
Delete a user
Update current user
Request password reset
Reset user password
User Tour
Finalize and list user tours
Fetch tours completed by a user
Mark a specific Tour as completed by a user
Webhook
File event trigger of real-time processing
This endpoint takes the parameters, tries to find the file in the specified storage, and applies the workflow to that file
File event trigger of real-time processing by storage client ID
This endpoint takes the parameter, tries to find the file in the specified storage, and applies the workflow to that file
File event trigger of real-time processing by storage client ID
This endpoint takes the parameter, tries to find the file in the specified storage, and applies the workflow to that file
File event trigger of real-time processing for Box
This endpoint takes the parameter, tries to find the file in the specified Box storage, and applies the workflow to that file
File event trigger of real-time processing for Box
This endpoint takes the parameter, tries to find the file in the specified Box storage, and applies the workflow to that file