Upload files in Amazon S3 from Salesforce

By Subrat in Amazon S3, Apex, Integration, Salesforce


Salesforce provides a storage limit per user license purchased, and it varies from edition to edition. So, when an organisation has to upload many files or attach many files to records, it may decide to use an external storage service such as the Amazon S3 cloud.

Users can be given the option to upload files to Amazon S3 via Salesforce and access them later through the upload URLs. The REST protocol is used in this scenario.

Files will be uploaded securely from Salesforce to the Amazon server. After you create your AWS (Amazon Web Services) account, Amazon shares an access key ID and a secret key with you. These credentials are used to authenticate to the S3 cloud from Salesforce.

After logging in to AWS, go to the console screen and click on S3 under the Storage & Content Delivery section.

You can create a bucket where the files will be uploaded.

You cannot create real folders inside a bucket, but a logical folder can be created by using a ‘/’ slash in the object key.
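Since the object key is just a string, the “folder” is only the part of the key before the ‘/’. A minimal sketch in Python to illustrate (the helper name and file names are mine, not from the post):

```python
def build_s3_key(folder_name, file_name):
    """Build an S3 object key; the '/' makes the S3 console display
    folder_name as a logical folder, even though no real folder exists."""
    return folder_name.strip("/") + "/" + file_name

# Keys sharing the prefix 'invoices/' appear under one logical folder.
print(build_s3_key("invoices", "jan.pdf"))    # invoices/jan.pdf
print(build_s3_key("/invoices/", "feb.pdf"))  # invoices/feb.pdf
```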

We will see everything in action:

[sourcecode language="java"]
public void uploadToAmazonS3(Attachment attach, String folderName) {

    String filename = folderName + '/' + attach.Name;
    String attachmentBody = EncodingUtil.base64Encode(attach.Body);
    String formattedDateString = DateTime.now().formatGMT('EEE, dd MMM yyyy HH:mm:ss z');
    String bucketname = 'your-bucket-name'; // the bucket where files should be uploaded
    String host = 's3.amazonaws.com';       // AWS server base URL

    HttpRequest req = new HttpRequest();
    req.setMethod('PUT');
    req.setEndpoint('https://' + bucketname + '.' + host + '/' + filename);
    req.setHeader('Host', bucketname + '.' + host);
    req.setHeader('Content-Length', String.valueOf(attach.Body.size()));
    req.setHeader('Content-Type', attach.ContentType);
    req.setHeader('Connection', 'keep-alive');
    req.setHeader('Date', formattedDateString);
    req.setHeader('x-amz-acl', 'public-read-write');
    Blob blobBody = EncodingUtil.base64Decode(attachmentBody);
    req.setBodyAsBlob(blobBody);
}
[/sourcecode]

Create a REST request and set the headers as shown above. Note that Content-Length must be the size of the raw body (the decoded blob), not the length of the base64 string, and the ACL is set through the x-amz-acl header.

host can be a region-specific server such as ‘s3-ap-southeast-1.amazonaws.com’ or the generic ‘s3.amazonaws.com’.

The request needs to be equipped with proper authentication so that it securely reaches the correct endpoint. To achieve this, the access key ID and secret provided by Amazon are used to build an authorization string that contains an HMAC signature. In this scheme (AWS Signature Version 2), the string to sign must include the Date header value and any x-amz-* headers; omitting them results in a 403 SignatureDoesNotMatch response.

[sourcecode language="java"]
String key = 'XXXXXXXXXXXXXXXXXXXX';             // access key ID
String secret = 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXX'; // secret access key

String stringToSign = 'PUT\n\n' + attach.ContentType + '\n' + formattedDateString + '\n' +
    'x-amz-acl:public-read-write\n' +
    '/' + bucketname + '/' + filename;

Blob mac = Crypto.generateMac('HMACSHA1', Blob.valueOf(stringToSign), Blob.valueOf(secret));
String signed = EncodingUtil.base64Encode(mac);
String authHeader = 'AWS' + ' ' + key + ':' + signed;
[/sourcecode]

The above authorization string needs to be passed as a header on the HTTP request, and then the request is sent.

[sourcecode language="java"]
req.setHeader('Authorization', authHeader);
Http http = new Http();
HTTPResponse resp;

resp = http.send(req);
[/sourcecode]

A response status code of 200 means a successful upload.
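If you get a 403 SignatureDoesNotMatch instead, it helps to recompute the signature outside Salesforce and compare. The same HMAC-SHA1 construction can be sketched in Python (the key, secret, bucket and file names here are illustrative, not from the post):

```python
import base64
import hashlib
import hmac

def sign_v2(secret, verb, content_md5, content_type, date, resource, amz_headers=""):
    """AWS Signature Version 2: base64(HMAC-SHA1(secret, string-to-sign))."""
    string_to_sign = f"{verb}\n{content_md5}\n{content_type}\n{date}\n{amz_headers}{resource}"
    mac = hmac.new(secret.encode(), string_to_sign.encode(), hashlib.sha1)
    return base64.b64encode(mac.digest()).decode()

sig = sign_v2(
    "examplesecret",                  # made-up secret access key
    "PUT", "", "application/pdf",
    "Tue, 27 Mar 2007 21:15:45 +0000",
    "/my-demo-bucket/attachments/report.pdf",
    amz_headers="x-amz-acl:public-read-write\n",
)
print("AWS AKIDEXAMPLE:" + sig)       # shape of the Authorization header value
```

A base64-encoded HMAC-SHA1 signature is always 28 characters, so a differently sized signature in your debug logs is an immediate red flag.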

Now, the bucket needs to be configured as a website. The objects (files uploaded) should be made publicly readable, so that the same URL used to upload a file can be used to access it publicly. To do so, you need to write a bucket policy that grants everyone the “s3:GetObject” permission.
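Once an object is public, its URL is simply the upload endpoint plus the key. A quick sketch (bucket and key names are made up):

```python
def public_object_url(bucket, key, host="s3.amazonaws.com"):
    """URL at which a publicly readable S3 object can be fetched;
    this is the same virtual-hosted-style address used for the PUT upload."""
    return f"https://{bucket}.{host}/{key}"

print(public_object_url("my-demo-bucket", "attachments/report.pdf"))
# https://my-demo-bucket.s3.amazonaws.com/attachments/report.pdf
```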

You can go to http://awspolicygen.s3.amazonaws.com/policygen.html and create a policy. Follow the steps below:

  • Set Principal to *.

  • Set the Amazon Resource Name (ARN) to arn:aws:s3:::<bucket_name>/<key_name>, where <bucket_name> is your bucket and <key_name> is set to *.

  • Click on Add Statement and then Generate Policy, and copy the JSON script generated.

To give you an example of what the policy looks like, I have created a bucket policy script.

[sourcecode language="java"]
{
    "Version": "2012-10-17",
    "Id": "Policy1463490894535",
    "Statement": [
        {
            "Sid": "Stmt1463490882740",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::bucket_name/*"
        }
    ]
}
[/sourcecode]
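If you prefer not to use the policy generator, the same document can be built programmatically. A small Python sketch that rebuilds a policy of the shape shown above (the Sid value is arbitrary):

```python
import json

def public_read_policy(bucket):
    """Bucket policy granting everyone s3:GetObject on all keys in the bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",  # arbitrary statement id
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }

print(json.dumps(public_read_policy("bucket_name"), indent=2))
```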

Then open the bucket you created and go to Properties. Click on Add Bucket Policy, paste the generated script into the popup, and save. This makes the files uploaded to the bucket publicly accessible.
7 Comments

  1. Aditya p
    3 January 2017

    Hi,

    I tried exactly the same way, but I am getting a “400 Error – Bad Request”. What could be the issue?

    Regards,
    Aditya.

    • Subrat
      8 January 2017

      Hi Aditya,

      The first thing I would suggest is to check the error code in the response. It can be something like ‘CredentialsNotSupported’, ‘ExpiredToken’, ‘IncompleteBody’, etc. Then, based on that, modify your request body or parameters as needed.
      For example, if you don’t provide the number of bytes specified by the Content-Length HTTP header, you will get a 400 Bad Request with the IncompleteBody code.

      Thanks

  2. rishikush2
    2 November 2017

    Hi,
    I got this type of error:
    RESPONSE STRING: System.HttpResponse[Status=Forbidden, StatusCode=403]

    Error code AccessDenied (“Access Denied”), RequestId A6F24AA2A978CDF8, HostId K7F4xK71TB6xo0/tTcdDEXRMpEMbaM0od0BbfVO7bPAsPRKeZOWVbm/2QQLfOHH5Y5bi0KoRUJk=
    What should I do?

  3. mayukh
    6 February 2018

    Hi,
    I am trying to upload a JSON file to an Amazon S3 bucket, but I am getting an exception:
    Callout Exception: Unexpected end of file from server

    My code:

    public class ProductAmazon_RestClass {
        public void ProductAmazon_RestMethod(String folderName) {
            String binaryString = ProductAmazonIntegration.ProductAmazonIntegration();
            String key = '***********************';
            String secret = '*********************************************************************';
            String formattedDateString = Datetime.now().formatGMT('EEE, dd MMM yyyy HH:mm:ss z');
            String bucketname = '';
            String host = 's3-website-us-east-1.amazonaws.com';
            String method = 'PUT';
            String filename = 'Product/Product.json';

            //Request starts
            HttpRequest req = new HttpRequest();
            req.setMethod(method);
            req.setEndpoint('https://' + bucketname + '.' + host + '/' + bucketname + '/' + filename);
            req.setHeader('Host', bucketname + '.' + host);
            req.setTimeout(120000);
            req.setHeader('Content-Length', String.valueOf(binaryString.length()));
            req.setHeader('Content-Encoding', 'UTF-8');
            req.setHeader('Content-type', 'application/json');
            req.setHeader('Connection', 'keep-alive');
            req.setHeader('Date', formattedDateString);
            req.setHeader('ACL', 'public-read');
            req.setBody(binaryString);

            String stringToSign = 'PUT\n\n\n' + formattedDateString + '\n\n/' + bucketname + '/' + filename;
            String signed = createSignature(stringToSign, secret);
            String authHeader = 'AWS' + ' ' + key + ':' + signed;
            req.setHeader('Authorization', authHeader);
            Http http = new Http();
            try {
                //Execute web service call
                HTTPResponse res = http.send(req);
                System.debug('RESPONSE STRING: ' + res.toString());
                System.debug('RESPONSE STATUS: ' + res.getStatus());
                System.debug('STATUS_CODE: ' + res.getStatusCode());
            } catch (System.CalloutException e) {
                System.debug('AWS Service Callout Exception: ' + e.getMessage());
            }
        }

        public String createSignature(String canonicalBuffer, String secret) {
            Blob mac = Crypto.generateMac('HMACSHA1', Blob.valueOf(canonicalBuffer), Blob.valueOf(secret));
            return EncodingUtil.base64Encode(mac);
        }
    }

  4. Shubham Gupta
    22 February 2018

    Hi,
    Apart from integrating Amazon AWS S3 with Salesforce, we have a requirement to gzip files before placing them in the S3 bucket. Is there any way to gzip TXT files? If yes, then the only thing I need is to integrate S3 with Salesforce through RESTful services.
    Thanks in advance,
    Shubham

    • Subrat
      22 February 2018

      Yes, you can do that by setting setCompressed to true on the HTTP request. It is also recommended to set the Content-Encoding header to gzip in this case.
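      For reference, the compression itself is straightforward; a quick Python sketch of gzipping a text payload before upload, outside Salesforce (the file content is made up):

```python
import gzip

text = "some,txt,content\n" * 3  # made-up TXT payload
body = gzip.compress(text.encode("utf-8"))

# The compressed bytes would be PUT with a Content-Encoding: gzip header;
# round-trip check that the payload survives compression intact.
assert gzip.decompress(body).decode("utf-8") == text
print(len(body), "compressed bytes")
```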

  5. Himanshu Gupta
    15 March 2018

    What’s wrong here?
    I am getting one error code or another. The AWS S3 bucket is open to the public.

    public class echo2 {
        public void uploadToAmazonS3(Attachment attach) {
            String filename = attach.Name;
            String attachmentBody = EncodingUtil.base64Encode(attach.Body);
            String formattedDateString = DateTime.now().formatGMT('EEE, dd MMM yyyy HH:mm:ss z');
            String bucketname = 'testing071';
            String host = 's3.us-east-1.amazonaws.com';

            HttpRequest req = new HttpRequest();
            req.setMethod('PUT');
            req.setEndpoint('https://' + bucketname + '.' + host + '/' + filename);
            req.setHeader('Host', bucketname + '.' + host);
            req.setHeader('Content-Length', String.valueOf(attachmentBody.length()));
            req.setHeader('Content-Type', attach.ContentType);
            req.setHeader('Connection', 'keep-alive');
            req.setHeader('Date', formattedDateString);
            req.setHeader('ACL', 'public-read-write');
            Blob blobBody = EncodingUtil.base64Decode(attachmentBody);
            req.setBodyAsBlob(blobBody);

            String key = '**';
            String secret = '**';

            String stringToSign = 'PUT\n\n' + attach.ContentType + '\n' + '/' + bucketname + '/' + filename;

            Blob mac = Crypto.generateMac('HMACSHA1', Blob.valueOf(stringToSign), Blob.valueOf(secret));
            String signed = EncodingUtil.base64Encode(mac);
            String authHeader = 'AWS' + ' ' + key + ':' + signed;

            req.setHeader('Authorization', authHeader);
            System.debug('rq body: ' + req.getBody());
            Http http = new Http();
            HTTPResponse resp;
            System.debug('req' + req);
            resp = http.send(req);
            System.debug('resp' + resp);
            System.debug(resp.getBody());
        }
    }

