
Are you working with Amazon S3 and looking for an easy way to perform your batch operations? We have what you are looking for!

The Amazon S3 plug-in is available on Automation Hub; download it to empower your Workload Automation environment.

The Amazon S3 plug-in helps you upload, download, or delete objects and monitor their progress directly from the Dynamic Workload Console. You can also schedule an Amazon S3 job simply by creating a job definition.

The following are the prerequisites you need to use the plug-in:

  1. An AWS account
  2. AWS IAM credentials (access key ID and secret access key). If you don’t know how to retrieve them, see the AWS IAM documentation.
  3. Proper permissions for the user to access the resource
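
As a quick sanity check before configuring the plug-in, you can verify that your IAM credentials are in place outside Workload Automation, for example with the AWS SDK for Python (boto3). This is an illustrative sketch, not part of the plug-in; the environment variable names are the standard AWS ones.

```python
# Illustrative sketch: check that AWS IAM credentials are available (not plug-in code).
import os

def credentials_present(env):
    """Return True if the standard AWS credential variables are both set."""
    return bool(env.get("AWS_ACCESS_KEY_ID")) and bool(env.get("AWS_SECRET_ACCESS_KEY"))

def caller_arn():
    """Ask AWS STS who the credentials belong to (requires network access)."""
    import boto3  # AWS SDK for Python
    return boto3.client("sts").get_caller_identity()["Arn"]

# e.g. credentials_present(os.environ) tells you whether the key pair is exported
```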

Now let us see how easy it is to upload, download or delete objects and monitor their progress on AWS.

Create a new job and select “Amazon S3” in the Cloud section.

Figure 1: Job Definition Page


First, establish a connection to the AWS server by entering the required details.

Then, you can test the connection to the AWS server by clicking Test Connection.

Figure 2: Connection Page
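
Under the hood, a connection test amounts to a lightweight call against the S3 API. As a hedged sketch (this is not the plug-in's actual implementation), the same check can be done with boto3:

```python
# Sketch of a connection test: try a cheap S3 call and report success or failure
# (illustrative only; not the plug-in's code).
def can_connect(s3_client):
    """Return True if the client can reach S3 and list buckets."""
    try:
        s3_client.list_buckets()  # lightweight S3 API call
        return True
    except Exception:
        return False

# e.g. with the real SDK:
#   import boto3
#   can_connect(boto3.client("s3"))
```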


After successfully testing the connection, go to the Action tab and specify the operation you need to perform (upload, download, or delete) and the bucket on which to perform it.

Figure 3: Action Page

a. Upload:
Use this option to upload a list of files and folders to the specified Amazon S3 bucket; a prefix lets you store them anywhere within the bucket.
You can also use the search files option to browse all the files and folders under a given path. This path is taken as the parent path, and all files are resolved relative to it.
Optionally, you can delete the local files after the upload completes.

Figure 4: Action Page > Upload section
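
For comparison, the same upload behavior can be sketched with boto3. The bucket name, prefix, and paths below are placeholders, and the key layout (prefix plus the file's path relative to the parent path) is an assumption based on the description above, not the plug-in's documented internals.

```python
# Hedged sketch of the upload action (placeholder names; not the plug-in's code).
import os

def object_key(prefix, parent_path, file_path):
    """Destination key: optional prefix plus the file's path relative to the parent path."""
    rel = os.path.relpath(file_path, parent_path).replace(os.sep, "/")
    return f"{prefix.rstrip('/')}/{rel}" if prefix else rel

def upload_files(bucket, prefix, parent_path, files, delete_after=False):
    """Upload each file under its computed key; optionally delete the local copies."""
    import boto3
    s3 = boto3.client("s3")
    for path in files:
        s3.upload_file(path, bucket, object_key(prefix, parent_path, path))
        if delete_after:
            os.remove(path)  # mimics the delete-after-upload option

# e.g. upload_files("my-bucket", "backups", "/data/reports",
#                   ["/data/reports/2023/sales.csv"])
```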

b. Download:
Use this operation to download the files or folders that are specified in the keys section from the specified bucket.
You can specify the location where you want the downloaded files to be stored.

Figure 5: Action Page > Download section
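
The download action maps naturally onto boto3's download_file. In this sketch (placeholder bucket and keys), each object key is mirrored as a path under the destination directory, which is an assumption about the layout rather than the plug-in's documented behavior.

```python
# Hedged sketch of the download action (placeholder names; not the plug-in's code).
import os

def local_path(dest_dir, key):
    """Where a downloaded object lands: the destination dir plus the key's path."""
    return os.path.join(dest_dir, *key.split("/"))

def download_keys(bucket, keys, dest_dir):
    """Download each listed object into the destination directory."""
    import boto3
    s3 = boto3.client("s3")
    for key in keys:
        target = local_path(dest_dir, key)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        s3.download_file(bucket, key, target)

# e.g. download_keys("my-bucket", ["backups/2023/sales.csv"], "/tmp/downloads")
```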

c. Delete:
Use this operation to delete objects in the specified Amazon S3 bucket.
You can use the load objects option to list the objects present in the bucket you have chosen.

Figure 6: Action Page > Delete section
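
Listing and then deleting objects, as the load objects helper and the delete action do, can be sketched with boto3's paginated list_objects_v2 and delete_object calls (placeholder bucket name; this is an illustration, not the plug-in's code):

```python
# Hedged sketch of listing and deleting objects (not the plug-in's code).
def keys_under(s3_client, bucket, prefix=""):
    """Return the object keys in a bucket, like a 'load objects' style listing."""
    paginator = s3_client.get_paginator("list_objects_v2")
    keys = []
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
    return keys

def delete_keys(s3_client, bucket, keys):
    """Delete each listed object from the bucket."""
    for key in keys:
        s3_client.delete_object(Bucket=bucket, Key=key)

# e.g. with the real SDK:
#   import boto3
#   s3 = boto3.client("s3")
#   delete_keys(s3, "my-bucket", keys_under(s3, "my-bucket", "backups/"))
```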

Submitting your job:
Once the job definition is ready, save it and submit it to the plan.
The job will start its execution and perform the desired operation.

Monitoring your job:
While Amazon S3 executes the process, the plug-in lets you monitor the execution in real time. The monitoring page, called Workflow details, is accessible from the Monitor Jobs view and contains all the details about the job. Refresh the page to see updates.

Figure 7: Workflow Details Page

Thus, thanks to the Amazon S3 plug-in, you can automate your operations on Amazon S3 and monitor them, all from one place.

On Automation Hub you can find many other integrations that will enable you to automate everything you want.
Automate more, automate better!
