GET Request Output

Once a POST Compute Request has been submitted for one or more models, the computed metadata is stored and can be accessed using the requestId. The time taken to compute metadata depends on the duration of the conversation and the number of models enabled.

A 60-minute conversation should take around 15-20 minutes to reach the processed status.

To track the status of requested models, use the Get Processing State route. This route can be integrated into your code using long-polling to check the state periodically until every requestId is in the processed state. For more information, refer to the example given below.

Get Processing State

The state of a particular request ID can take one of the following values: uploaded (the file has been received and processing has not yet completed), processed (the metadata is ready to be fetched), or error (processing failed).

Get Processing State

GET https://api.marsview.ai/cb/v1/conversation/get_txn/:txnId

This method fetches the processing state of all requests on a transaction. The processing state of each request is found under the enableModels[i].state.status field of the JSON response body; for more information, refer to the sample response below. This route additionally provides file information for the uploaded conversation.

Path Parameters

Headers

{
    "status": true,
    "data": {
        "txnId": "txn-bxllq268b0kppew42m-1623239507613",
        "userId": "venkatesh.prasad@marsview.ai",
        "file": {
            "title": "Title",
            "description": "Description",
            "userId": "venkatesh.prasad@marsview.ai",
            "type": "link",
            "deleted": true,
            "s3Availability": true,
            "fileMetadata": {
                "duration": 29
            }
        },
        "enableModels": [
            {
                "requestId": "req-bxllq268cpkppf3jnf-1623239854394",
                "state": {
                    "status": "processed"
                },
                "processId": "process-bxllq268cpkppf3jne-1623239854394",
                "requestDate": "2021-06-09T11:57:34.394Z",
                "modelType": "speech_to_text",
                "modelConfig": {
                    "automaticPunctuation": true,
                    "customVocabulary": [
                        "Marsview"
                    ],
                    "piiDetection": true
                }
            }
        ]
    }
}
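Given a response like the one above, the per-model status can be read out of the enableModels array with a few lines of Python. In this sketch the response is embedded as a string purely for illustration; in practice it would come from the HTTP call:

```python
import json

# Trimmed version of the sample Get Processing State response shown above.
sample_response = json.loads("""
{
    "status": true,
    "data": {
        "txnId": "txn-bxllq268b0kppew42m-1623239507613",
        "enableModels": [
            {
                "requestId": "req-bxllq268cpkppf3jnf-1623239854394",
                "state": {"status": "processed"},
                "modelType": "speech_to_text"
            }
        ]
    }
}
""")

# enableModels is a list, so collect the status of every requested model.
statuses = {
    model["modelType"]: model["state"]["status"]
    for model in sample_response["data"]["enableModels"]
}
print(statuses)  # {'speech_to_text': 'processed'}
```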

Get Request Metadata

GET https://api.marsview.ai/cb/v1/conversation/fetch_metadata/:txnId

Path Parameters

Headers

{
    "status": true,
    "data": {
        "transcript": [
            {
                "sentence": " What is great communication well very simply it is making your data and your theory understandable.",
                "startTime": 590,
                "endTime": 6100.000000000001,
                "speakers": [
                    "1"
                ],
                "keywords": [
                    {
                        "keyword": "great communication",
                        "metadata": [],
                        "type": "DNN"
                    },
                    {
                        "keyword": "theory",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "DNN"
                    },
                    {
                        "keyword": "data",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "Techphrase"
                    }
                ],
                "keySentence": "What is great communication well very simply it is making your data and your theory understandable."
            },
            {
                "sentence": " How do you do that?",
                "startTime": 6150,
                "endTime": 6840,
                "speakers": [
                    "1"
                ],
                "keywords": [],
                "keySentence": "How do you do that?"
            },
            {
                "sentence": " Well first of all you let the audience know why they should listen then you put your data together in a meaningful way and then finally you need to make sure the audience know what information is important and that they need to take away a great presenter makes their data.",
                "startTime": 7410,
                "endTime": 22780,
                "speakers": [
                    "1"
                ],
                "keywords": [
                    {
                        "keyword": "meaningful way",
                        "metadata": [],
                        "type": "DNN"
                    },
                    {
                        "keyword": "data",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "DNN"
                    },
                    {
                        "keyword": "information",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "DNN"
                    },
                    {
                        "keyword": "great presenter",
                        "metadata": [],
                        "type": "DNN"
                    },
                    {
                        "keyword": "audience",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "DNN"
                    },
                    {
                        "keyword": "make",
                        "metadata": [
                            "Development"
                        ],
                        "type": "Techphrase"
                    },
                    {
                        "keyword": "data",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "Techphrase"
                    }
                ],
                "keySentence": ""
            },
            {
                "sentence": " Easy tomorrow what is the one question your audience really wants an answer to",
                "startTime": 23170,
                "endTime": 28680,
                "speakers": [
                    "1"
                ],
                "keywords": [
                    {
                        "keyword": "audience",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "DNN"
                    },
                    {
                        "keyword": "question",
                        "metadata": [],
                        "type": "DNN"
                    },
                    {
                        "keyword": "answer",
                        "metadata": [
                            "BUSINESS_PHRASE"
                        ],
                        "type": "DNN"
                    }
                ],
                "keySentence": "Easy tomorrow what is the one question your audience really wants an answer to"
            }
        ]
    }
}
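Each transcript entry carries the sentence text, millisecond timestamps, speaker labels, and extracted keywords. As an illustration (the helper below is our own, not part of the API), a response like the one above can be flattened into a readable transcript:

```python
# Transcript entries shaped like the sample response above (trimmed for brevity).
transcript = [
    {"sentence": " How do you do that?", "startTime": 6150, "endTime": 6840,
     "speakers": ["1"], "keywords": []},
    {"sentence": " Easy tomorrow what is the one question your audience really wants an answer to",
     "startTime": 23170, "endTime": 28680, "speakers": ["1"],
     "keywords": [{"keyword": "audience", "metadata": ["BUSINESS_PHRASE"], "type": "DNN"}]},
]

def format_transcript(entries):
    """Render one '[start-end] Speaker N: sentence' line per entry (times in seconds)."""
    lines = []
    for entry in entries:
        start = entry["startTime"] / 1000.0  # startTime/endTime are in milliseconds
        end = entry["endTime"] / 1000.0
        speakers = ", ".join("Speaker " + s for s in entry["speakers"])
        lines.append("[{:.1f}s-{:.1f}s] {}: {}".format(
            start, end, speakers, entry["sentence"].strip()))
    return "\n".join(lines)

print(format_transcript(transcript))
```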

Example: How to fetch speech to text output for a Transaction ID using Long-polling

This example demonstrates how to long-poll until Speech-to-Text processing is complete, and then fetch the Speech-to-Text metadata once state.status reaches the processed state.

For more information on how to send a compute request for a particular Transaction ID, refer to the Compute Metadata section of the documentation.

Step 1: Get the authentication token.

Using your 'API Key' and 'API Secret' you can generate the token as shown below.

curl --location --request POST 'https://api.marsview.ai/cb/v1/auth/create_access_token' \
--header 'Content-Type: application/json' \
--data-raw '{
    "apiKey":    "{{Insert API Key}}",
    "apiSecret": "{{Insert API Secret}}",
    "userId":    "demo@marsview.ai"
}'

Step 2: Long-poll on the processing state

Using the Get Processing State method, the processing status can be fetched periodically (every 300 seconds in this example).

  • If the process is in the uploaded state, we continue to poll until it reaches either the processed or error state.

  • If the process is in the processed or error state, we can move to Step 3.
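These rules amount to a small state machine. A hypothetical helper (the function and return values are our own naming, not part of the API) might map each state to the client's next action:

```python
def next_action(state):
    """Map a processing state to the client's next step, following the rules above."""
    if state == "processed":
        return "fetch_metadata"  # Step 3: the metadata is ready
    if state == "error":
        return "handle_error"    # Step 3: inspect or report the failure
    if state == "uploaded":
        return "wait_and_poll"   # keep long-polling
    raise ValueError("unexpected state: {}".format(state))
```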

Step 3: Fetching metadata

Once a particular request is in the processed state, metadata for that model can be fetched using the Get Request Metadata method.

Shown below is an example Python code snippet for Steps 2 and 3.

import time

import requests

txn_id = "replace this with your Transaction ID"
auth_token = "Bearer <API TOKEN>"

processing_state_url = "https://api.marsview.ai/cb/v1/conversation/get_txn/{txn_id}"
metadata_url = "https://api.marsview.ai/cb/v1/conversation/fetch_metadata/{txn_id}"

headers = {"authorization": auth_token}


def get_speech_to_text_state():
    """Return state.status for the speech_to_text request on this transaction."""
    response = requests.get(processing_state_url.format(txn_id=txn_id), headers=headers)
    if response.status_code == 200 and response.json()["status"]:
        # enableModels is a list of requests; find the speech_to_text entry.
        for model in response.json()["data"]["enableModels"]:
            if model["modelType"] == "speech_to_text":
                return model["state"]["status"]
        raise Exception("No speech_to_text request found on this transaction")
    raise Exception("Failed to fetch processing state: {}".format(response.text))


def get_speech_to_text_metadata():
    """Fetch the transcript metadata once processing is complete."""
    response = requests.get(metadata_url.format(txn_id=txn_id), headers=headers)
    if response.status_code == 200 and response.json()["status"]:
        return response.json()["data"]["transcript"]
    raise Exception("Failed to fetch metadata: {}".format(response.text))


def long_poll_results():
    while True:
        state = get_speech_to_text_state()
        if state == "processed":
            return get_speech_to_text_metadata()
        elif state == "uploaded":
            time.sleep(300)  # wait 5 minutes before polling again
        elif state == "error":
            raise Exception("An error occurred during the processing of this request")


if __name__ == "__main__":
    long_poll_results()
