pozil / sf-docs-to-s3

Export Salesforce Documents to Amazon S3


About

This project is an integration between a Salesforce org, Salesforce Functions, and Amazon S3.

The project is complemented by this Node app, which allows Salesforce users to download Amazon S3 documents.

The goal of the integration is to export documents from Salesforce to S3 to reduce file storage consumption on the Salesforce side.

Thanks to Salesforce Functions, documents are transferred to S3 in the following scenario:

  1. User uploads and attaches a document to a record in Salesforce.
  2. Apex trigger kicks in after the document is saved and calls a handler class with the document metadata (not the document binary content to avoid Apex limits).
  3. Apex trigger handler class invokes a Salesforce Function asynchronously with the document metadata.
  4. Function retrieves the document content using the Salesforce REST API.
  5. Function uploads the document content to an Amazon S3 bucket.
  6. Once the document is uploaded, the function creates a Salesforce record that links the document stored in S3 to the original record, then calls an Apex callback method on the trigger handler class.
  7. Apex callback method removes the original document from Salesforce.
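Steps 4 through 6 of the scenario can be sketched in Node.js. Everything below is illustrative, not the project's actual code: the payload field names, helper names, and S3 key format are assumptions, and the REST fetch, S3 upload, and record creation are injustrated as injected dependencies so the flow itself can be exercised with stubs.

```javascript
// Illustrative sketch of steps 4-6 (not the project's actual code).
// Payload fields and helper names are assumptions; the REST fetch,
// S3 upload, and record creation are injected so they can be stubbed.
async function exportDocument(payload, { fetchContent, uploadToS3, createS3DocumentRecord }) {
  // Step 4: retrieve the binary content via the Salesforce REST API.
  const body = await fetchContent(payload.contentVersionId);

  // Step 5: upload the content to the S3 bucket under a predictable key.
  const key = `${payload.recordId}/${payload.fileName}`;
  await uploadToS3(key, body);

  // Step 6: create the Salesforce record that links the S3 document to
  // the original record (the Apex callback then deletes the original).
  return createS3DocumentRecord(payload.recordId, key);
}

// Example run with stubbed dependencies.
exportDocument(
  { contentVersionId: '068...', recordId: '001...', fileName: 'quote.pdf' },
  {
    fetchContent: async (id) => Buffer.from('pdf bytes'),
    uploadToS3: async (key, body) => {},
    createS3DocumentRecord: async (recordId, key) => key,
  }
).then((key) => console.log(key)); // logs "001.../quote.pdf"
```

Injecting the Salesforce and AWS calls keeps the orchestration logic separate from the SDK-specific plumbing.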

Integration architecture

Installation

Prerequisites

Salesforce Resources

  1. Sign up for a Salesforce Functions trial org.
  2. Enable Dev Hub in your org.
  3. Install the Salesforce CLI.
  4. Authorize your Dev Hub in the Salesforce CLI.

AWS Resources

  1. Sign up for an AWS free-tier account.
  2. Create an S3 bucket.
  3. Complete these steps in the Identity and Access Management (IAM) console:
    1. Create a policy that grants write access on your S3 bucket.
    2. Create a user.
    3. Assign your policy to the user.
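The write-access policy from step 3.1 could look like the following. This is a minimal sketch: the bucket name and statement ID are placeholders, and s3:PutObject is the one action the upload strictly needs.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDocumentUploads",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```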

Step 1: Deploy and configure the Salesforce Org

  1. Install the project in a Scratch org by running this script:

    MacOS or Linux

    ./install-dev.sh

    Windows

    install-dev.bat

    You can install the project on other types of Salesforce orgs by reviewing the scripts and adapting the commands.

  2. For each object for which you would like to export documents (Account in this example), create a record for the "S3 Document Setting" custom metadata type:

    1. Navigate to Custom Code > Custom Metadata Types in Salesforce Setup

    2. Click Manage Records for "S3 Document Setting"

    3. Click New

    4. Assuming that we want to work with the Account object, enter these values, then click Save:

      • Label: S3 Account Document
      • S3 Document Setting Name: S3_Account_Document
      • Object API Name: Account
  3. For each object for which you would like to export documents, create a junction object between S3_Document__c and your object (Account, based on the previous example):

    Junction object

    1. Navigate to Object Manager in Salesforce Setup

    2. Click Create and select Custom Object

    3. Enter these values, then click Save:

      • Label: S3 Account Document
      • Plural Label: S3 Account Documents
      • Object Name: S3_Account_Document__c
      • Record Name: S3 Account Document ID
      • Data Type: Auto Number
      • Display Format: S3-ACC-DOC-{0000}
      • Starting Number: 0

    Note: The object name is automatically derived by the Function, so naming must follow this convention: S3_OBJECT_Document__c, where OBJECT is the API name of the object without the trailing __c for custom objects. For example, if you have a My_Custom_Object__c object, you should enter S3_My_Custom_Object_Document__c.

  4. Optional: for each object for which you would like to export documents, configure the related list layout to display relevant fields.

    Related list layout configuration
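The naming convention from step 3 can be expressed as a small helper. This is an illustrative sketch, not code from the project:

```javascript
// Illustrative helper (not part of the project): derive the junction
// object API name for a given object, per the documented convention.
function junctionObjectName(objectApiName) {
  // Drop the trailing __c from custom object names, then wrap the base
  // name in the S3_..._Document__c pattern.
  const base = objectApiName.replace(/__c$/, '');
  return `S3_${base}_Document__c`;
}

console.log(junctionObjectName('Account'));             // S3_Account_Document__c
console.log(junctionObjectName('My_Custom_Object__c')); // S3_My_Custom_Object_Document__c
```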

Step 2: Deploy and configure the Salesforce Function

You have two options for this step: deploy to a compute environment or run locally. Refer to the relevant section below and check the environment variables reference for the appropriate configuration.

Deploy to compute environment

Follow these steps to deploy your function to a compute environment:

  1. Log in to Salesforce Functions (you may need to repeat this command later, as the login eventually times out):

    sf login functions
  2. Create a compute environment:

    sf env create compute --alias s3env --connected-org s3
  3. Deploy the Salesforce Function:

    cd functions/s3import
    sf deploy functions -o s3
  4. Configure the Salesforce Function with the following command (see environment variables reference):

    sf env var set AWS_ACCESS_KEY_ID=XXXXXXXXXX -e s3env
    sf env var set AWS_SECRET_ACCESS_KEY=XXXXXXXXXX -e s3env
    sf env var set AWS_REGION=XXXXXXXXXX -e s3env
    sf env var set AWS_S3_BUCKET=XXXXXXXXXX -e s3env
    sf env var set DOWNLOAD_URL_PREFIX='XXXXXXXXXX' -e s3env

Run locally

Follow these steps to test your function locally:

  1. Create a .env file in the functions/s3import directory. Use the following template and make sure to replace values accordingly (see environment variables reference):

    AWS_ACCESS_KEY_ID=XXXXXXXXXX
    AWS_SECRET_ACCESS_KEY=XXXXXXXXXX
    AWS_REGION=XXXXXXXXXX
    AWS_S3_BUCKET=XXXXXXXXXX
    DOWNLOAD_URL_PREFIX=XXXXXXXXXX
  2. Run these commands to start the function locally:

    cd functions/s3import
    sf run function start

Environment variables reference

| Variable Name | Description | Example |
| --- | --- | --- |
| AWS_ACCESS_KEY_ID | The access key ID for your AWS IAM user. | secret |
| AWS_SECRET_ACCESS_KEY | The secret access key for your AWS IAM user. | secret |
| AWS_REGION | The region of your S3 bucket. | eu-west-3 |
| AWS_S3_BUCKET | The name of your S3 bucket. | poz-sf-demo |
| DOWNLOAD_URL_PREFIX | An optional prefix prepended to the S3 download URL. Useful for redirecting users to a proxy that checks Salesforce auth before downloading the file. | https://my-proxy.herokuapp.com/download?url= |
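To illustrate how DOWNLOAD_URL_PREFIX combines with the other variables, a download URL could be assembled as follows. This is an assumed sketch: the helper name and the exact URL format the function produces are not taken from the project, though the virtual-hosted-style S3 URL shape is standard.

```javascript
// Assumed sketch: build the download link for an exported document.
// With no prefix, the link points straight at the S3 object; with a
// prefix such as https://my-proxy.herokuapp.com/download?url=, users
// hit the proxy first, which checks Salesforce auth before serving it.
function buildDownloadUrl({ bucket, region, key, urlPrefix = '' }) {
  const s3Url = `https://${bucket}.s3.${region}.amazonaws.com/${encodeURIComponent(key)}`;
  return urlPrefix ? `${urlPrefix}${encodeURIComponent(s3Url)}` : s3Url;
}

console.log(buildDownloadUrl({ bucket: 'poz-sf-demo', region: 'eu-west-3', key: 'doc.pdf' }));
// https://poz-sf-demo.s3.eu-west-3.amazonaws.com/doc.pdf
```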

Troubleshooting

Monitor the Salesforce Function's logs by running:

sf env log tail -e s3env

Monitor Salesforce logs by running:

sfdx force:apex:log:tail -c


License: Creative Commons Zero v1.0 Universal


Languages

Apex 50.3%, JavaScript 41.9%, Batchfile 4.1%, Shell 3.7%