Stream API

QuickStart

Uploading data to a DataSet through a Stream requires only these steps:

  1. Create a stream
  2. Create a stream execution
  3. Upload your data
  4. Finalize your execution

After creating a stream, you can create multiple executions to upload batches of data.

NOTE: To use this QuickStart, you will need to obtain an access token, or you can use any of Domo’s SDKs, which also handle authentication for you.
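
If you plan to call the API directly rather than through an SDK, you can obtain a token with the OAuth client-credentials flow. The following is a minimal Python sketch using the requests library; the client ID, client secret, and "data" scope are placeholders you should adjust to match your own Domo client configuration.

import requests

# Exchange your API client ID and secret for an access token
# (OAuth client-credentials flow). The credentials below are placeholders.
CLIENT_ID = "<your-client-id>"
CLIENT_SECRET = "<your-client-secret>"

resp = requests.get(
    "https://api.domo.com/oauth/token",
    params={"grant_type": "client_credentials", "scope": "data"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
resp.raise_for_status()
access_token = resp.json()["access_token"]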

Step 1: Create a stream

The first step required to upload data to your Domo instance is to create a Stream. Include a DataSet and an update method in your request body.

The supported update methods are "APPEND" and "REPLACE".

Sample Request

$ curl 'https://api.domo.com/v1/streams' -i -X POST -H 'Content-Type: application/json' -H 'Accept: application/json' -H 'Authorization: bearer <your-valid-oauth-access-token>' -d '{
  "dataSet" : {
    "name" : "Leonhard Euler Party",
    "description" : "Mathematician Guest List",
    "rows" : 0,
    "schema" : {
      "columns" : [ {
        "type" : "STRING",
        "name" : "Friend"
      }, {
        "type" : "STRING",
        "name" : "Attending"
      } ]
    }
  },
  "updateMethod" : "APPEND"
}'

The response contains a Stream ID. Keep track of this ID, as you will use it for all uploads. If you ever need to find the Stream ID associated with a DataSet, there is a search API; see the detailed Stream API documentation.

Sample Response

HTTP/1.1 201 Created
Location: https://api.local.domo.com/v1/streams/42
Content-Type: application/json;charset=UTF-8
Content-Length: 470

{
  "id" : 42,
  "dataSet" : {
    "id" : "0c1e0dbe-9f71-4625-9b50-b79e6e4266f2",
    "name" : "Leonhard Euler Party",
    "description" : "Mathematician Guest List",
    "rows" : 0,
    "columns" : 0,
    "owner" : {
      "id" : 27,
      "name" : "DomoSupport"
    },
    "createdAt" : "2016-05-27T17:53:04Z",
    "updatedAt" : "2016-05-27T17:53:10Z"
  },
  "updateMethod" : "APPEND",
  "createdAt" : "2016-05-27T17:53:05Z",
  "modifiedAt" : "2016-05-27T17:53:05Z"
}
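
If you prefer to make the request from code without an SDK, a rough Python equivalent of the curl call above (using the requests library) looks like this; the access token is assumed to come from the authentication step.

import requests

access_token = "<your-valid-oauth-access-token>"

# Same body as the curl example: a two-column DataSet and the APPEND update method.
stream_body = {
    "dataSet": {
        "name": "Leonhard Euler Party",
        "description": "Mathematician Guest List",
        "rows": 0,
        "schema": {
            "columns": [
                {"type": "STRING", "name": "Friend"},
                {"type": "STRING", "name": "Attending"},
            ]
        },
    },
    "updateMethod": "APPEND",
}

resp = requests.post(
    "https://api.domo.com/v1/streams",
    json=stream_body,
    headers={"Authorization": f"Bearer {access_token}", "Accept": "application/json"},
)
resp.raise_for_status()
stream_id = resp.json()["id"]  # keep this; every upload references it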

Step 2: Create a stream execution

After creating a Stream, you need to create an Execution. An Execution does a few things: it creates a line item in the DataSet history, it triggers the animations in the product (for example, the data importing arc in the Data Warehouse UI), and it tells Domo that you're about to start sending data.

The primary reason to tell Domo that you're starting to upload data (and one of the key benefits of Streams) is that it allows you to break your data into smaller chunks and upload those chunks in parallel. This can result in huge performance increases and protects you from potential HTTP network issues. For example, if you break a million rows into one hundred 10,000-row chunks and one upload fails, you only need to re-upload that one failed chunk.

Sample Request

$ curl 'https://api.domo.com/v1/streams/42/executions' -i -X POST -H 'Content-Type: application/json' -H 'Accept: application/json' -H 'Authorization: bearer <your-valid-oauth-access-token>'

The response contains the Execution ID. Each Stream has its own incrementing Execution sequence. The Execution ID is important to keep track of, as you'll use it as a path parameter with each part (or chunk) upload you perform.

Sample Response

HTTP/1.1 200 OK
Content-Type: application/json;charset=UTF-8
Content-Length: 227

{
  "id" : 1,
  "startedAt" : "2016-05-26T22:20:21Z",
  "currentState" : "ACTIVE",
  "createdAt" : "2016-05-26T22:20:21Z",
  "modifiedAt" : "2016-05-26T22:20:21Z"
}
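
The equivalent call in Python (a sketch with the requests library; the Stream ID 42 matches the example above):

import requests

access_token = "<your-valid-oauth-access-token>"
stream_id = 42  # returned when the Stream was created

resp = requests.post(
    f"https://api.domo.com/v1/streams/{stream_id}/executions",
    headers={"Authorization": f"Bearer {access_token}", "Accept": "application/json"},
)
resp.raise_for_status()
execution_id = resp.json()["id"]  # path parameter for every part upload and the commit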

Step 3: Upload your data

As previously mentioned, this step can be parallelized. How small should you make each part? That depends on a couple of factors. Generally, the smaller the part, the more likely it is to make it to your Domo instance without a network disruption. However, creating too many requests incurs a lot of overhead. We've found that for narrow DataSets (those with around 100 columns or fewer), somewhere between 10,000 and 100,000 rows per part works well.

The part ID specified on the request is used to track the ordering of the parts as they're reassembled on your Domo instance. The client is responsible for these IDs: start at 1 for each Execution and increment with each subsequent part.

Please note: we recommend compressing these requests with gzip.

Sample Request

$ echo '"Pythagoras","FALSE"
"Alan Turing","TRUE"
"George Boole","TRUE"
' | gzip -c | curl 'https://api.domo.com/v1/streams/42/executions/1/part/1' -i -X PUT -H 'Content-Type: text/csv' -H 'Accept: application/json' -H 'Authorization: bearer <your-valid-oauth-access-token>' --data-binary @-

Sample Response

HTTP/1.1 200 OK
Content-Type: application/json;charset=UTF-8
Content-Length: 227

{
  "id" : 1,
  "startedAt" : "2016-05-27T15:16:01Z",
  "currentState" : "ACTIVE",
  "createdAt" : "2016-05-27T15:15:59Z",
  "modifiedAt" : "2016-05-27T15:15:59Z"
}

If you receive a 200 status code, your data has been received; it is stored until you finish the execution, at which point it is reassembled prior to indexing. If you receive a non-200 response, something went wrong and you'll need to upload that part again.
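
To make the chunking and parallelization advice concrete, here is a minimal Python sketch (using the requests library rather than an SDK) that gzips each part and uploads the parts in parallel, retrying any part that does not return a 200. The chunk size, worker count, and retry limit are illustrative choices, not API requirements.

import gzip
import requests
from concurrent.futures import ThreadPoolExecutor

access_token = "<your-valid-oauth-access-token>"
stream_id = 42
execution_id = 1

# In practice these rows would come from a file or database; this reuses the sample data.
rows = ['"Pythagoras","FALSE"', '"Alan Turing","TRUE"', '"George Boole","TRUE"']

CHUNK_SIZE = 10_000  # rows per part; tune for your column count

def upload_part(part_id, chunk, attempts=3):
    """PUT one gzipped CSV part, retrying a few times on a non-200 response."""
    body = gzip.compress(("\n".join(chunk) + "\n").encode("utf-8"))
    url = (f"https://api.domo.com/v1/streams/{stream_id}"
           f"/executions/{execution_id}/part/{part_id}")
    for _ in range(attempts):
        resp = requests.put(
            url,
            data=body,
            headers={
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "text/csv",
                "Accept": "application/json",
            },
        )
        if resp.status_code == 200:
            return part_id
    raise RuntimeError(f"part {part_id} failed after {attempts} attempts")

# Part IDs start at 1 and increment; the parts themselves can be uploaded in parallel.
chunks = [rows[i:i + CHUNK_SIZE] for i in range(0, len(rows), CHUNK_SIZE)]
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(upload_part, range(1, len(chunks) + 1), chunks))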

Step 4: Finalize your execution

When you are done uploading all your parts (or chunks of data), send a final request that tells your Domo instance that you're done.

Sample Request

$ curl 'https://api.domo.com/v1/streams/42/executions/1/commit' -i -X PUT -H 'Accept: application/json' -H 'Authorization: bearer <your-valid-oauth-access-token>'

Sample Response

HTTP/1.1 200 OK
Content-Type: application/json;charset=UTF-8
Content-Length: 227

{
  "id" : 1,
  "startedAt" : "2016-05-27T15:16:01Z",
  "currentState" : "ACTIVE",
  "createdAt" : "2016-05-27T15:15:59Z",
  "modifiedAt" : "2016-05-27T15:15:59Z"
}
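
The same commit in Python (a sketch; the Stream and Execution IDs come from the earlier steps):

import requests

access_token = "<your-valid-oauth-access-token>"
stream_id, execution_id = 42, 1

resp = requests.put(
    f"https://api.domo.com/v1/streams/{stream_id}/executions/{execution_id}/commit",
    headers={"Authorization": f"Bearer {access_token}", "Accept": "application/json"},
)
resp.raise_for_status()  # the Execution is now finalized and the data will be indexed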

Next steps

Congrats! You now have a DataSet with data that you’ve uploaded through a Stream.

You may want to learn how to manage Streams in more detail, or explore the rest of the Stream API documentation.

Need additional help?

No problem, we'd love to help. Explore our documentation and answers to frequently asked questions, or join other developers in Domo's Developer Forum. For further help, feel free to email us or contact our sales team.