Amazon S3 with Spring Boot & Java – Sample Code

Amazon Simple Storage Service (S3) is the AWS object storage service. It stores files/data as objects, each identified by a key, and lets you store and retrieve any amount of data from anywhere – websites and mobile apps, corporate applications, and data from IoT sensors or devices. Given how widely S3 is used, it is important to understand how to interact with the Amazon S3 API using the AWS SDKs, including the AWS Java SDK. The following are some of the use cases where S3 is commonly used:

  • Backup & recovery
  • Data archiving
  • Big data analytics
  • Disaster recovery
  • Cloud-native application data
  • Hybrid cloud storage

The AWS Java SDK supports various APIs of the Amazon S3 service for working with files/objects stored in S3.
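
To get a feel for these APIs before wiring them into Spring Boot, here is a minimal standalone sketch (assuming the AWS SDK for Java v1, which is also used later in this post; the bucket name "some-bucket" and key "some-key" are hypothetical placeholders):

// Minimal sketch of object-level S3 APIs using the AWS SDK for Java v1.
// "some-bucket" and "some-key" are hypothetical placeholders.
import java.io.IOException;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;

public class S3QuickTour {
    public static void main(String[] args) throws IOException {
        // Uses the default region and credentials provider chain
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Store string content as an object identified by a key
        s3.putObject("some-bucket", "some-key", "hello from S3");

        // Retrieve the object and inspect its metadata
        try (S3Object object = s3.getObject("some-bucket", "some-key")) {
            System.out.println("Size: " + object.getObjectMetadata().getContentLength());
        }

        // Delete the object
        s3.deleteObject("some-bucket", "some-key");
    }
}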

In this post, you will learn how to get started with the Amazon S3 APIs of the AWS Java SDK using Spring Boot and Java. The following topics are covered:

  • Spring Boot CommandLineRunner invoking Custom S3 Class
  • Custom Class representing Amazon S3 Operations
  • Configuration file reading properties from application.properties
  • application.properties file consisting of configuration information

Spring Boot CommandLineRunner invoking Custom S3 Class

The following Spring Boot application implements CommandLineRunner and, on startup, invokes the custom S3Storage class (described in the next section) to upload a file to S3:

package com.vflux.rbot;

import java.io.IOException;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

import com.vflux.rbot.storage.S3Storage;

@SpringBootApplication
public class RecruiterbotApplication implements CommandLineRunner {

    @Autowired S3Storage s3Storage;

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(RecruiterbotApplication.class);
        app.run(args);
    }

    @Override
    public void run(String... arg0) throws IOException {
        // Uploads the local file to S3 under the key "btc123.md"
        this.s3Storage.uploadFile("btc123.md", "/home/support/Documents/btc.md");
    }

}
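
As a small variation (not in the original post), the object key and the local file path could also be taken from the program arguments that CommandLineRunner already receives, instead of being hard-coded:

package com.vflux.rbot;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

import com.vflux.rbot.storage.S3Storage;

@SpringBootApplication
public class RecruiterbotApplication implements CommandLineRunner {

    @Autowired S3Storage s3Storage;

    public static void main(String[] args) {
        SpringApplication.run(RecruiterbotApplication.class, args);
    }

    // Hypothetical variation: read the S3 key and the local file path from the
    // command line, e.g. java -jar recruiterbot.jar btc123.md /home/support/Documents/btc.md
    @Override
    public void run(String... args) {
        if (args.length >= 2) {
            this.s3Storage.uploadFile(args[0], args[1]);
        }
    }
}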

Custom Class representing Amazon S3 Operations

Pay attention to some of the following in the code given below:

  • The constructor of the S3Storage class creates an AmazonS3 client instance using the beans representing the AWS region and the AWS credentials provider.
  • The uploadFile method invokes the putObject API to upload the file (passed as a java.io.File) to S3 storage.

package com.vflux.rbot.storage;

import java.io.File;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import com.amazonaws.AmazonServiceException;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.regions.Region;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

@Component
public class S3Storage {

    // Bucket name injected from the awsS3AudioBucket String bean defined in AppConfig
    @Autowired String awsS3AudioBucket;

    private AmazonS3 amazonS3;

    @Autowired
    public S3Storage(Region awsRegion, AWSCredentialsProvider awsCredentialsProvider) {
        // Build the S3 client once using the configured region and credentials
        this.amazonS3 = AmazonS3ClientBuilder.standard()
                .withCredentials(awsCredentialsProvider)
                .withRegion(awsRegion.getName())
                .build();
    }

    public void uploadFile(String keyName, String filePath) {
        try {
            // Upload the local file as an object identified by keyName
            this.amazonS3.putObject(this.awsS3AudioBucket, keyName, new File(filePath));
        } catch (AmazonServiceException e) {
            System.err.println(e.getErrorMessage());
        }
    }
}
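
The original class only covers uploads. As a hedged sketch (these methods are not part of the original post), a complementary download and delete could be added to the same S3Storage class using the getObject and deleteObject APIs:

    // Hypothetical additions to the S3Storage class above (not in the original post).
    // Requires: import com.amazonaws.services.s3.model.GetObjectRequest;

    public void downloadFile(String keyName, String targetFilePath) {
        try {
            // Stream the object identified by keyName from the bucket into a local file
            this.amazonS3.getObject(new GetObjectRequest(this.awsS3AudioBucket, keyName),
                    new File(targetFilePath));
        } catch (AmazonServiceException e) {
            System.err.println(e.getErrorMessage());
        }
    }

    public void deleteFile(String keyName) {
        try {
            // Remove the object identified by keyName from the bucket
            this.amazonS3.deleteObject(this.awsS3AudioBucket, keyName);
        } catch (AmazonServiceException e) {
            System.err.println(e.getErrorMessage());
        }
    }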

Configuration file reading properties from application.properties

The AppConfig class instantiates several beans based on properties related to some of the following:

  • Access key id
  • Access key secret
  • AWS region
  • Bucket
  • AWS credentials provider

package com.vflux.rbot.config;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;

import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;

@Configuration
@PropertySource("classpath:application.properties")
public class AppConfig {
    @Value("${aws.access.key.id}") String awsKeyId;
    @Value("${aws.access.key.secret}") String awsKeySecret;
    @Value("${aws.region}") String awsRegion;
    @Value("${aws.s3.audio.bucket}") String awsS3AudioBucket; 

    @Bean(name = "awsKeyId") 
    public String getAWSKeyId() {
        return awsKeyId;
    }

    @Bean(name = "awsKeySecret") 
    public String getAWSKeySecret() {
        return awsKeySecret;
    }

    @Bean(name = "awsRegion") 
    public Region getAWSPollyRegion() {
        return Region.getRegion(Regions.fromName(awsRegion));
    }

    @Bean(name = "awsCredentialsProvider") 
    public AWSCredentialsProvider getAWSCredentials() {
        BasicAWSCredentials awsCredentials = new BasicAWSCredentials(this.awsKeyId, this.awsKeySecret);
        return new AWSStaticCredentialsProvider(awsCredentials);
    }

    @Bean(name = "awsS3AudioBucket") 
    public String getAWSS3AudioBucket() {
        return awsS3AudioBucket;
    }
}
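
Hard-coding an access key and secret in application.properties works for getting started, but as an alternative (not shown in the original post), the credentials provider bean could delegate to the SDK's default provider chain, which looks up credentials from environment variables, the ~/.aws/credentials file, or an EC2/ECS instance profile:

    // Hypothetical alternative bean for AppConfig: rely on the SDK's default
    // credential lookup instead of keys defined in application.properties.
    // Requires: import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;

    @Bean(name = "awsCredentialsProvider")
    public AWSCredentialsProvider getAWSCredentials() {
        return DefaultAWSCredentialsProviderChain.getInstance();
    }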

application.properties file consisting of configuration information

The following configuration can be placed in the application.properties file, which is found in the src/main/resources folder. These properties are loaded when the Spring Boot app starts.

aws.access.key.id = DKBV6sjP55jshfdBBFD45a
aws.access.key.secret = vcI9NXav0PNIAB16zohi9ccvMUg1z12
aws.region = ap-south-1

aws.s3.audio.bucket = someBucketName

Summary

In this post, you learned how to get started with Amazon S3 using Spring Boot, Java, and the AWS Java SDK.

Did you find this article useful? Do you have any questions or suggestions about this article in relation to getting started with Amazon S3 using Spring Boot and Java? Leave a comment and ask your questions and I shall do my best to address your queries.

Ajitesh Kumar

I have been recently working in the area of data analytics, including data science and machine learning / deep learning. I am also passionate about different technologies, including programming languages such as Java/JEE, JavaScript, Python, R, Julia, etc., and technologies such as blockchain, mobile computing, cloud-native technologies, application security, cloud computing platforms, big data, etc. I would love to connect with you on LinkedIn. Check out my latest book titled First Principles Thinking: Building winning products using first principles thinking.