This document describes how to use the SAFR REST APIs to queue two or more video files to be processed in sequence.


To start, create a feed per this guide:


https://support.safr.com/en/support/solutions/articles/69000283233-process-files-with-virgo-video-feeds

 

Note: Create one feed for each file you want to process in parallel.  If you are on a slow machine, create only one feed; that way the Task API will run only one feed at a time.


Resource Type

In Virgo, you can add a "resource-type" value to the processor.  A processor is the machine running Virgo.  The resource type can be used in the task assignment API to limit tasks to certain machines.


The Video Feeds window GUI only allows the resource type to be set to camera, live_stream, or video_file.  If you wish to use another value, you will need to set it with the PUT /worker/config API.
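
For example, a minimal command-line sketch of setting a custom resource type is shown below.  It assumes the API is served from https://virga.real.com, that it accepts HTTP Basic authentication with your SAFR USER:PASSWORD, and that PUT /worker/config accepts the same JSON document that GET /config/worker returns; none of these details are confirmed here, so adjust them to match your deployment.

    # Fetch the current worker configuration and save it to a local file.
    curl -u USER:PASSWORD -o worker-config.json \
         https://virga.real.com/config/worker

    # Edit worker-config.json so "resource-type" holds the custom value you want,
    # then push the modified configuration back with PUT /worker/config.
    curl -u USER:PASSWORD -X PUT \
         -H "Content-Type: application/json" \
         -d @worker-config.json \
         https://virga.real.com/worker/config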

 

Once you have added a feed, proceed below to set up the POST /task API.

 

Set up the Processor and Feed

  1. Make sure the processor has its 'resource-type' set to 'video_file' (the task you submit must also be of this type).
  2. Make sure the feed is in either 'EOS' or 'FAILED' status.  Tasks will only be added to feeds in either of those statuses.

Get Information needed by Task API


Go here: https://virga.real.com/docs/index.html

 

First, sign in to the Swagger page so that you can run API commands, as follows:

  1. Click “Authorize” near the top of the page, enter “USER:PASSWORD” in the dialog, and click “Authorize”.


Second, we will use the GET /config/worker API to get the JSON needed for the POST /task API.

  1. Scroll down to the GET /config/worker API under the "Configuration Query Interfaces" section.
  2. Click the "Try it out" button
  3. Scroll down and click "Execute" (no need to change any arguments)
  4. You should see output as shown below:
  5. Click the "Download" button to save the text to a local file, or scroll down and copy just the feed section along with the tenant and resource-type values.
  6. With the above ready, proceed to the next step.  (A curl equivalent of this call is sketched below.)
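
If you prefer the command line to the Swagger page, the same call can be made with curl.  The sketch below assumes the API is served from https://virga.real.com and accepts HTTP Basic authentication with the same USER:PASSWORD credentials entered in the Authorize dialog; if your deployment differs, copy the exact URL from the curl command the Swagger page prints.

    # Retrieve the worker configuration; the response contains the feed,
    # tenant, and resource-type values needed by the POST /task API.
    curl -u USER:PASSWORD \
         -H "Accept: application/json" \
         -o worker-config.json \
         https://virga.real.com/config/worker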



Use the POST /task API to submit a file to be processed

Note: Only feeds in EOS or FAILED status will be given new tasks.  Feeds in INACTIVE, OK, or ERROR status are not given new tasks; a feed in ERROR status is considered to be in a retry state and thus is not assigned tasks.


  1. Scroll down to the POST /task API
  2. Click "Try it out"
  3. You should see the following:
  4. Insert the tenant and resource-type.  The resource type is optional and lets you limit which processors can be assigned the task.
  5. Also copy the feed properties you obtained above.  Paste in only the properties, not the surrounding curly braces.

    It should look something like this when you are done (yours may vary depending on your feed settings, and the resource-type value should match the resource-type you obtained above):
    {
      "feed": {
          "mode": "Enrolled and Stranger Monitoring",
          "name": "Queue1",
          "directory": "main",
          "source": "Server1",
          "site": "VOD Farm",
          "enabled": true,
          "input.type": "file",
          "input.video-clock.enabled": false
          "input.stream.url": "C:\\Files\\vidoefile2.mp4",
      },
      "matching": {
          "resource-type": "video_file",
          "tenant": "yoursafraccountname"
      }
    }
  6. Click "Execute" button

If it ran successfully, you should see output like the following:



The Code "200" means the task was accepted.  The returned "task-id" could be used to check status of the job in the GET /task/<taskid> API.



This queues a task to be run in the Video Feeds window.  The task will be run by the next available feed on the account that matches the same "tenant" and resource-type.  The task may be assigned to any processor (computer) connected to the SAFR account, so the files must be accessible using the same path on every computer connected to the account.


You can run the command multiple times with different filenames.  Each time it is run, the file is queued to be run on the next available feed.


⚠️ If the same file is submitted a 2nd time while the first is still processing, the second request will be ignored.


To queue files to be processed via a curl script, copy the curl command that is printed out when you click "Execute" and modify the filename as desired.
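
For example, a small shell script could submit several task bodies in a row.  The sketch below assumes the same base URL and Basic authentication as above, and that task1.json, task2.json, and task3.json are copies of the POST /task body shown earlier, each with a different "input.stream.url" value.

    # Queue several video files, one task per body file.
    for body in task1.json task2.json task3.json; do
        curl -u USER:PASSWORD -X POST \
             -H "Content-Type: application/json" \
             -d @"$body" \
             https://virga.real.com/task
    done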