A few notes on integrating with Salesforce. Note: some of the data has been omitted.

SOAP Requests

username: <omitted username> pwd: <omitted password>

Get your security token and append it to your requests (it appears in plain text here, which is bad practice).

To get your security token (if you don’t have it), go to your user settings > Reset My Security Token, and a new one will be emailed to you.

<omitted security token>

Making requests

First, generate the WSDL file. To do this, in the Salesforce environment go to:

  • Setup > API > Generate Enterprise WSDL (self-explanatory).
  • You can use Wizdler to create the SOAP bodies.

Create a login SOAP request, mostly matching the following:

  • The password needs to have the security token suffixed after it.
<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">
    <Body>
        <login xmlns="urn:enterprise.soap.sforce.com">
            <username>`omitted username`</username>
            <password>`omitted password``omitted security token`</password>
        </login>
    </Body>
</Envelope>

This request returns what you need for making future requests. The session appears to be valid for 7200 seconds (2 hours).

The response would be as follows:

<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:enterprise.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <soapenv:Body>
        <loginResponse>
            <result>
                <!-- serverUrl, sessionId, and most other result fields omitted -->
                <userInfo>
                    <roleId xsi:nil="true"/>
                    <userDefaultCurrencyIsoCode xsi:nil="true"/>
                    <userFullName>Michael Da Costa</userFullName>
                </userInfo>
            </result>
        </loginResponse>
    </soapenv:Body>
</soapenv:Envelope>

Important items from the Login request

  • When you get a successful response from the login request, take note of the following response items:
    • <serverUrl>
      • This is the endpoint to use for future requests; it already includes the service path, so future requests go to, for example:
        • https://********************-dev-ed.my.salesforce.com/services/Soap/c/47.0
        • which is simply the value of <serverUrl>
    • <sessionId>
      • Additional requests require a session id. If a request requires one, use the value in this element.
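
The login exchange above can be sketched in Python using only the standard library. This is an illustrative sketch, not production code: the helper names are mine, and the sample response below is hand-written for demonstration (real responses carry many more fields).

```python
import xml.etree.ElementTree as ET

ENTERPRISE_NS = "urn:enterprise.soap.sforce.com"

def build_login_body(username: str, password: str, token: str) -> str:
    """Build the login envelope; note the security token suffixed to the password."""
    return (
        '<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">'
        f'<Body><login xmlns="{ENTERPRISE_NS}">'
        f"<username>{username}</username>"
        f"<password>{password}{token}</password>"
        "</login></Body></Envelope>"
    )

def parse_login_response(xml_text: str) -> dict:
    """Pull serverUrl and sessionId out of a login response."""
    root = ET.fromstring(xml_text)
    ns = {"urn": ENTERPRISE_NS}
    return {
        "serverUrl": root.find(".//urn:serverUrl", ns).text,
        "sessionId": root.find(".//urn:sessionId", ns).text,
    }

# Hand-written response fragment, for illustration only:
sample = """<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns="urn:enterprise.soap.sforce.com">
  <soapenv:Body><loginResponse><result>
    <serverUrl>https://example-dev-ed.my.salesforce.com/services/Soap/c/47.0</serverUrl>
    <sessionId>00Dexamplesession</sessionId>
  </result></loginResponse></soapenv:Body>
</soapenv:Envelope>"""

info = parse_login_response(sample)
```

POSTing the body built here to https://login.salesforce.com/services/Soap/c/47.0 with a Content-Type of text/xml and a SOAPAction header should return the real response to parse.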
The create request

Now that you’ve logged in, you can make other requests using the information from the ‘Important items from the Login request’ section. An example create request:

Endpoint: https://********************-dev-ed.my.salesforce.com/services/Soap/c/47.0
SOAP message:

<!-- Type: POST request -->
<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">
    <Header>
        <SessionHeader xmlns="urn:enterprise.soap.sforce.com">
            <sessionId>`omitted session id`</sessionId>
        </SessionHeader>
    </Header>
    <Body>
        <create xmlns="urn:enterprise.soap.sforce.com">
            <sObjects xsi:type="urn1:Account" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:urn1="urn:sobject.enterprise.soap.sforce.com">
                <Name>Bluebeards Grog House</Name>
                <Description>It is better than Blackbeards.</Description>
            </sObjects>
        </create>
    </Body>
</Envelope>
  • To create an element of a specific sObject type, add the following attributes to the element:
    • xsi:type="urn1:Account" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    • So that it looks like the following:
      <sObjects xsi:type="urn1:Account" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  • You have to specify the fields as child elements of the sObjects element.
    • Refer to the above example request.
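
The create envelope can be assembled from the session id returned by login. The helper below is a hypothetical sketch: the urn1 namespace binding and the Name/Description field elements are my assumptions based on the example request above.

```python
def build_create_body(session_id: str, name: str, description: str) -> str:
    """Assemble a create envelope for an Account sObject (sketch)."""
    return (
        '<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">'
        '<Header><SessionHeader xmlns="urn:enterprise.soap.sforce.com">'
        f"<sessionId>{session_id}</sessionId>"
        "</SessionHeader></Header>"
        '<Body><create xmlns="urn:enterprise.soap.sforce.com">'
        '<sObjects xsi:type="urn1:Account" '
        'xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" '
        'xmlns:urn1="urn:sobject.enterprise.soap.sforce.com">'
        f"<Name>{name}</Name>"
        f"<Description>{description}</Description>"
        "</sObjects></create></Body></Envelope>"
    )

body = build_create_body("examplesessionid", "Bluebeards Grog House",
                         "It is better than Blackbeards.")
```

The resulting body would be POSTed to the <serverUrl> endpoint noted earlier.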

Bulk API

  • To get started, you have to log in to the Workbench.
    • Go to Utilities > REST Explorer

Bulk workflow

  • Create a job

    • Create a POST request to the endpoint /services/data/v46.0/jobs/ingest
    • You can view current jobs by making a GET request to the same endpoint.
      • For the POST body, use the following to create a simple insert operation (that is, to insert the records as uploaded):
            "operation" : "insert",
            "object" : "Account",
            "contentType" : "CSV",
            "lineEnding" : "CRLF"
      • To confirm creation, besides a 200 status code, you can make a GET request against the endpoint and check the resulting records object for the job you’ve created.
        • You would need the job id, which can be retrieved from the GET request or from the response of the POST that created the job.
  • Uploading data using job

    • To upload data, you need a job (defined above) with a job id. Using that job id, create a PUT request to the following endpoint:
      • /services/data/v46.0/jobs/ingest/7504J0000035mIpQAI/batches
      • Which is /services/data/v46.0/jobs/ingest/<JobId>/batches
    • Set your request’s Content-Type header to the content type that the job accepts:
      • In this case:
        • Content-Type: text/csv
    • The request body is then the CSV data that you want to upload. The first line must be a header row of field names.
      • As an example (a single Name column here; multiple columns would be comma-delimited, and the expected columns can be seen in the GET request for the job):
          Name
          "Sample Bulk API Account 1"
          "Sample Bulk API Account 2"
          "Sample Bulk API Account 3"
          "Sample Bulk API Account 4"
  • Processing and closing job

    • Once the data is uploaded, you need to process the data and close the job.
    • To start processing, make a PATCH request to /services/data/v46.0/jobs/ingest/7504J0000035mIpQAI, where the last segment is the job id.
      • Set the header to reflect Content-Type: application/json
      • Set the request body to reflect:
          { "state": "UploadComplete" }
      • Executing this request queues the job for processing.
  • Verifying the upload.

    • Making a GET request against the endpoint /services/data/v46.0/jobs/ingest/7504J0000035mIpQAI, where the last segment is the job id, will show the state of the job. A state of JobComplete reflects that the job has finished processing.
      • As a side note, you can get the results of the job using: /services/data/v46.0/jobs/ingest/7504J0000035mIpQAI/successfulResults
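
The whole bulk workflow above (create job, upload CSV, set UploadComplete, poll state) can be sketched as request builders using only the standard library. The instance URL, session id, and helper names are my assumptions; each function only builds the HTTP request for the corresponding step, so sending and error handling are left out.

```python
import json
import urllib.request

API = "/services/data/v46.0/jobs/ingest"

def _request(instance_url, session_id, path, body=None, content_type=None, method="GET"):
    headers = {"Authorization": "Bearer " + session_id}
    if content_type:
        headers["Content-Type"] = content_type
    return urllib.request.Request(instance_url + API + path,
                                  data=body, headers=headers, method=method)

def create_job(instance_url, session_id):
    # Step 1: POST the job definition shown above
    payload = {"operation": "insert", "object": "Account",
               "contentType": "CSV", "lineEnding": "CRLF"}
    return _request(instance_url, session_id, "",
                    json.dumps(payload).encode(), "application/json", "POST")

def upload_batch(instance_url, session_id, job_id, csv_text):
    # Step 2: PUT the CSV; it must start with a header row of field names
    return _request(instance_url, session_id, f"/{job_id}/batches",
                    csv_text.encode(), "text/csv", "PUT")

def close_job(instance_url, session_id, job_id):
    # Step 3: PATCH the state to UploadComplete to queue processing
    return _request(instance_url, session_id, f"/{job_id}",
                    json.dumps({"state": "UploadComplete"}).encode(),
                    "application/json", "PATCH")

def job_status(instance_url, session_id, job_id):
    # Step 4: GET the job; its "state" reads JobComplete once processing ends
    return _request(instance_url, session_id, f"/{job_id}")

base, sid = "https://example-dev-ed.my.salesforce.com", "00Dexamplesession"
csv_text = 'Name\r\n"Sample Bulk API Account 1"\r\n"Sample Bulk API Account 2"\r\n'
reqs = [create_job(base, sid),
        upload_batch(base, sid, "exampleJobId", csv_text),
        close_job(base, sid, "exampleJobId"),
        job_status(base, sid, "exampleJobId")]
```

Each request would then be sent with urllib.request.urlopen; the session id from the SOAP login can be used as the bearer token.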

Streaming API

  • You will have to use POST requests to the resource to create events.
    • Use the resource list at the end of this page for reference:
      • https://trailhead.salesforce.com/content/learn/modules/api_basics/api_basics_streaming
    • For your usage: https://developer.salesforce.com/docs/atlas.en-us.platform_events.meta/platform_events/platform_events_publish_api.htm
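
Per the publish-API docs linked above, a platform event can be published with a POST to the event’s sObject endpoint. The sketch below assumes a hypothetical event named Order_Update__e with a Status__c field; only the request is built, not sent.

```python
import json
import urllib.request

def make_publish_request(instance_url, session_id, event_name, payload):
    """Build a POST to the sObject endpoint for a platform event (sketch)."""
    return urllib.request.Request(
        f"{instance_url}/services/data/v46.0/sobjects/{event_name}/",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": "Bearer " + session_id,
                 "Content-Type": "application/json"},
        method="POST",
    )

# Order_Update__e and Status__c are invented names for illustration
req = make_publish_request("https://example-dev-ed.my.salesforce.com",
                           "00Dexamplesession", "Order_Update__e",
                           {"Status__c": "Shipped"})
```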

Note: future articles will cover the Streaming API and implementations that work with it.