ARES Kudo Script jobs


We provide an endpoint to run a script on a drawing and an endpoint to request the status of a script job.

Run a script job

Endpoint

GET /startScriptJob

Query params

Source Data (one of the following is required)

  • file - identifier of the file in the backend system

  • fileUrl - URL of the file to open, with read-only access (alternative to the file parameter)

Optional for “file” Source

  • sessionId - session id (optional; passed in the headers of requests to the API server)

  • token - token (optional; passed in the headers of requests to the API server)

See also: API Server: Connecting ARES Kudo to a data backend.

Partner Authentication (SaaS version only)

  • auth - application id associated with the partner (not required if wrapped into the apitoken)

  • userId - partner-defined user id of the user calling the editor URL (mandatory; used to facilitate counting of distinct users)

File Conversion / Export (On-premise version only)
Note: file conversion can also be achieved with the SaaS version by running the appropriate commands from a script; see https://graebert.atlassian.net/wiki/spaces/KUDOPARTNER/pages/1180107318 and https://graebert.atlassian.net/wiki/spaces/KUDOPARTNER/pages/1180729641

  • outputPath - optional path where the resulting files are stored (the script output and all files generated by the script, e.g. an exported PDF); if not specified, the files are written to the current working directory

  • fileType - PDF; since version 1.151 also DWG, DXF, PNG, SVG, BMP, JPG

  • dwgVersion - 2018/R32, 2013/R27, 2010/R24, 2007/R21, 2004/R18, 2000/R15 (DWG and DXF only)

Scripts

  • script - http(s) URL of the script file (SaaS: mandatory; On-premise: if specified, fileType is ignored)

  • scriptOutputFile - filename the script output is written to (optional; SaaS: the file is also uploaded to the logs folder of the partner's S3 content bucket)

Resources (optional)

  • resourceBundleURL - URL of a zip bundle that is downloaded and extracted to a specific location in the application's session folder

Output bucket (SaaS version only - optional)

  • storageBucket - designated output S3 bucket to which the resulting files (the script output and all files generated by the script) are uploaded; if not specified, the default S3 content bucket associated with your application identifier is used

  • storageBucketRegion - region the storageBucket is located in

 

Note for SaaS: Instead of passing all parameters individually, you can enclose the parameters in a generated apitoken that will be passed as a single “apitoken” query parameter (see https://graebert.atlassian.net/wiki/spaces/KUDOPARTNER/pages/1930526874).
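
For illustration, here is a minimal Python sketch of a /startScriptJob request. The base URL, file identifier, application id, user id, and script URL below are placeholders, not real values; adjust them to your deployment and partner setup.

Example (Python):

import requests

# Placeholder values - substitute your own deployment URL and identifiers.
BASE_URL = "https://your-kudo-server.example.com"

params = {
    "file": "your-file-id",                        # or use "fileUrl" instead
    "auth": "your-application-id",                 # SaaS version only
    "userId": "partner-user-42",                   # SaaS version only
    "script": "https://example.com/scripts/circles.scr",
    "scriptOutputFile": "circles-output.txt",      # optional
}
# SaaS alternative: wrap all parameters into a generated apitoken and send
# params = {"apitoken": apitoken} as the single query parameter instead.

resp = requests.get(BASE_URL + "/startScriptJob", params=params)
resp.raise_for_status()
print(resp.text)  # e.g. JobID::1234abcd-...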

Response

Text: JobID::{jobId}

jobId - unique job id that can be used to request job status.
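
Since the response is plain text rather than JSON, the job id has to be split off the JobID:: prefix. Continuing the sketch above:

prefix = "JobID::"
job_text = resp.text.strip()          # e.g. "JobID::1234abcd-..."
if not job_text.startswith(prefix):
    raise ValueError("unexpected response: " + job_text)
job_id = job_text[len(prefix):]       # use this id with /scriptJobStatus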

Script file

A script file is a plain-text file with the extension .scr that contains the commands to execute.

Important: The script should always end with the _EXIT command to make sure the server session is exited properly.

Example:

_CIRCLE 15,15 5
_CIRCLE 25,15 5
_EXIT

Get script job status

Endpoint

GET /scriptJobStatus

Query parameters

  • jobId - job id returned by the start script job request

  • auth - application id associated with the partner (SaaS version only)

Response

Text: Job Status::{status}

status - job status (Started, Failed, or Successful)
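
Continuing the Python sketch from above, a simple polling loop that waits until the job leaves the Started state (the five-second interval is an arbitrary choice, not a documented requirement):

import time

status = "Started"
while status == "Started":
    time.sleep(5)
    r = requests.get(BASE_URL + "/scriptJobStatus",
                     params={"jobId": job_id, "auth": "your-application-id"})
    r.raise_for_status()
    # The response text has the form "Job Status::{status}"
    status = r.text.strip().split("::", 1)[1]

print("Job finished with status:", status)  # Failed or Successful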

Script job output

If the scriptOutputFile parameter was provided with the /startScriptJob request, a file with this name is uploaded to the logs folder of the partner’s S3 content bucket (or of the optional storageBucket).

If the parameter was not passed, a default output file named {file}.txt will be created.

 

Considerations for the self-hosted version

In wt_config.xml, configure the job storage table (DynamoDB is assumed by default). Set scriptDb to empty to use local storage, and disable AWS access:

<property name="scriptDb"></property>
<property name="aws-access">false</property>



 
