diff --git a/awstransfer-s3-sam/README.md b/awstransfer-s3-sam/README.md
new file mode 100644
index 000000000..194cbac47
--- /dev/null
+++ b/awstransfer-s3-sam/README.md
@@ -0,0 +1,165 @@
+# Bidirectional selective file transfer between remote SFTP server and Amazon S3 using AWS Transfer Family Connector
+
+This pattern shows how to set up an AWS Transfer Family SFTP connector to list files on a remote SFTP server and transfer specific files to an Amazon S3 bucket. You can also transfer specific files from the Amazon S3 bucket to the remote SFTP server.
+
+Learn more about this pattern at Serverless Land Patterns: https://serverlessland.com/patterns/awstransfer-s3-sam.
+
+Important: this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the [AWS Pricing page](https://aws.amazon.com/pricing/) for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.
+
+## Requirements
+
+* [Create an AWS account](https://portal.aws.amazon.com/gp/aws/developer/registration/index.html) if you do not already have one and log in. The IAM user that you use must have sufficient permissions to make necessary AWS service calls and manage AWS resources.
+* [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) installed and configured
+* [Git Installed](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
+* [AWS Serverless Application Model](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html) (AWS SAM) installed
+* [jq](https://docs.aws.amazon.com/solutions/latest/dynamic-object-and-rule-extensions-for-aws-network-firewall/operation-and-customization.html#install-jq) installed
+
+
+
+## Deployment Instructions
+
+1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:
+ ```
+ git clone https://github.com/aws-samples/serverless-patterns
+ ```
+
+2. Change directory to the pattern directory:
+ ```
+ cd serverless-patterns/awstransfer-s3-sam
+ ```
+
+3. From the command line, run the below command to deploy the pattern:
+ ```
+ bash deploy.sh
+ ```
+
+4. During the prompts:
+ * Enter a stack name
+ * Enter the desired AWS Region (e.g. us-east-1)
+
+5. The deployment script deploys both `template-sftp-server.yaml` and `template-sftp-connector.yaml`. Make a note of the outputs of both deployments, as they are used during testing.
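The deployment script reads these outputs with the AWS CLI and `jq`. As a sketch of that pattern (the sample payload below is made up), any single output value can be extracted by key:

```shell
# Hypothetical payload in the shape returned by:
#   aws cloudformation describe-stacks --stack-name <stack> --query "Stacks[0].Outputs" --output json
OUTPUTS='[
  {"OutputKey": "TransferServerId", "OutputValue": "s-1234567890abcdef0"},
  {"OutputKey": "MyLocalS3Bucket", "OutputValue": "mystack-123456789012-us-east-1-local"}
]'

# Extract one output value by key, as deploy.sh does
TRANSFER_SERVER_ID=$(echo "$OUTPUTS" | jq -r '.[] | select(.OutputKey == "TransferServerId") | .OutputValue')
echo "$TRANSFER_SERVER_ID"
```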
+
+
+## How it works
+
+Please refer to the architecture diagram below:
+
+![Architecture diagram](images/architecture.png)
+
+* The remote SFTP server is simulated using AWS Transfer Family SFTP Server for this pattern. In a real use case, this can be any remote SFTP server outside of AWS.
+* The SFTP connector connects the Amazon S3 bucket to the remote server over the SFTP protocol. Authentication uses an SSH key-based handshake.
+* An Amazon S3 bucket provides file storage on the AWS side.
+* Users can list files on the remote server and selectively transfer files from the remote server to the Amazon S3 bucket using AWS Transfer Family API or CLI commands.
+* Users can also transfer files from Amazon S3 to the remote server using the AWS Transfer Family API or CLI commands.
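The "selective" part of the workflow can be scripted: filter the directory-listing JSON for the files you want, then pass only those paths to `start-file-transfer`. A minimal sketch using `jq` on a made-up listing (the actual `aws transfer` call is shown as a comment, not executed):

```shell
# Hypothetical listing in the shape produced by start-directory-listing
LISTING='{"files":[{"filePath":"/Remote/sample1.txt","size":146},{"filePath":"/Remote/report.csv","size":512}],"paths":[],"truncated":false}'

# Keep only the .txt files
TXT_FILES=$(echo "$LISTING" | jq -r '.files[].filePath | select(endswith(".txt"))')
echo "$TXT_FILES"

# Those paths would then be passed on, e.g.:
# aws transfer start-file-transfer --connector-id {SFTPTransferConnector} \
#   --retrieve-file-paths $TXT_FILES --local-directory-path /{MyLocalS3Bucket}/FromRemoteSFTPServer
```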
+
+## Testing
+
+1. Use the server endpoint to test the SFTP server by transferring a file with a client. The rest of these test steps use OpenSSH. Refer to [Transferring files over a server endpoint using a client](https://docs.aws.amazon.com/transfer/latest/userguide/transfer-file.html) for other options.
+
+2. Test the connection using the below command from your command line. Replace `SFTPTransferConnector` with the value from the deployment output:
+ ```bash
+ aws transfer test-connection --region {your-region} --connector-id {SFTPTransferConnector}
+ ```
+ It should give an output similar to below:
+ ```json
+ {
+ "Status": "OK",
+ "StatusMessage": "Connection succeeded"
+ }
+ ```
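In automation you may want to fail fast when the connector test does not succeed. A small sketch that checks the `Status` field of a captured response (a sample response is hard-coded here instead of calling AWS):

```shell
# Hypothetical captured output of `aws transfer test-connection`
RESULT='{"Status": "OK", "StatusMessage": "Connection succeeded"}'

STATUS=$(echo "$RESULT" | jq -r '.Status')
if [ "$STATUS" != "OK" ]; then
  echo "Connector test failed: $(echo "$RESULT" | jq -r '.StatusMessage')" >&2
  # exit 1  # enable in a real script
fi
echo "$STATUS"
```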
+
+3. Transfer the `sample1.txt` and `sample2.txt` files to the remote SFTP server (simulated) using the below commands. In this sample project, replace both `SSHPrivateKeyFileName` and `TransferServerUser` with `sftpuser`. Replace `TransferServerEndpoint` with the value from the deployment output:
+ ```bash
+ sftp -i {SSHPrivateKeyFileName} {TransferServerUser}@{TransferServerEndpoint}
+ pwd
+ mkdir Remote
+ cd Remote
+ put sample1.txt
+ put sample2.txt
+ ls
+ ```
+   When prompted after the first command, confirm the connection to proceed.
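The interactive session above can also be scripted with OpenSSH's batch mode (`sftp -b`). A sketch that writes the same commands to a batch file (the `sftp` invocation itself is shown as a comment, since it needs the live endpoint):

```shell
# Write the interactive steps to a batch file; the leading "-" tells
# batch mode to ignore an error if the directory already exists
cat > sftp-batch.txt <<'EOF'
-mkdir Remote
cd Remote
put sample1.txt
put sample2.txt
ls
EOF

# Then run them non-interactively:
# sftp -i {SSHPrivateKeyFileName} -b sftp-batch.txt {TransferServerUser}@{TransferServerEndpoint}
grep -c '^put ' sftp-batch.txt
```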
+
+4. List the files on the remote SFTP server using the below command. Replace `SFTPTransferConnector` and `MyLocalS3Bucket` with values from the deployment output:
+
+ ```bash
+ aws transfer start-directory-listing --region {your-region} --connector-id {SFTPTransferConnector} --remote-directory-path /Remote --output-directory-path /{MyLocalS3Bucket}/FromRemoteSFTPServer
+ ```
+
+ The command invokes an asynchronous API. The output of the command will be as follows:
+ ```json
+ {
+ "ListingId": "273e5b33-xxxx-xxxx-xxxx-xxxxx9a507f53",
+ "OutputFileName": "c-cxxxxxxxx-xxxxxx-xxxx-xxxx-xxxx-xxxxxa507f53.json"
+ }
+ ```
+
+5. Log into the [Amazon S3 console](https://console.aws.amazon.com/s3). Open `MyLocalS3Bucket` and navigate to the `FromRemoteSFTPServer` folder. Check the content of the JSON file. It should look similar to the following:
+ ```json
+ {
+ "files": [
+ {
+ "filePath": "/Remote/sample1.txt",
+ "modifiedTimestamp": "2024-04-28T07:47:27Z",
+ "size": 146
+ },
+ {
+ "filePath": "/Remote/sample2.txt",
+ "modifiedTimestamp": "2024-04-28T07:47:46Z",
+ "size": 146
+ }
+ ],
+ "paths": [],
+ "truncated": false
+ }
+ ```
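Once downloaded from the bucket, the listing file can be inspected locally with `jq`. A sketch against a copy of the JSON shown above (the local file name is assumed):

```shell
# Local copy of the directory-listing output (content as shown above)
cat > listing.json <<'EOF'
{"files":[{"filePath":"/Remote/sample1.txt","size":146},{"filePath":"/Remote/sample2.txt","size":146}],"paths":[],"truncated":false}
EOF

# Remote paths found by the listing
jq -r '.files[].filePath' listing.json

# Total bytes; also check .truncated -- if true, repeat the listing call
TOTAL=$(jq '[.files[].size] | add' listing.json)
echo "Total bytes: $TOTAL"
```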
+
+6. Transfer one of the files from the remote SFTP server to the Amazon S3 bucket using the following command:
+ ```bash
+ aws transfer start-file-transfer --region {your-region} --connector-id {SFTPTransferConnector} --retrieve-file-paths /Remote/sample1.txt --local-directory-path /{MyLocalS3Bucket}/FromRemoteSFTPServer
+ ```
+
+   The output of the command should look similar to the following:
+ ```json
+ {
+ "TransferId": "e863xxxx-xxxx-xxxx-xxxx-xxxxa40c5ff9"
+ }
+ ```
+
+7. Log into the [Amazon S3 console](https://console.aws.amazon.com/s3). Open `MyLocalS3Bucket` and navigate to the `FromRemoteSFTPServer` folder. You should find the transferred `sample1.txt` file.
+
+8. Upload a file into the `MyLocalS3Bucket` bucket using the following command. Replace `MyLocalS3Bucket` with the value from the deployment output:
+ ```bash
+ aws s3 cp sample3.txt s3://{MyLocalS3Bucket}/local/sample3.txt
+ ```
+
+9. Transfer the `sample3.txt` file from the Amazon S3 `MyLocalS3Bucket` bucket to the remote SFTP server using the following command:
+ ```bash
+ aws transfer start-file-transfer --region {your-region} --connector-id {SFTPTransferConnector} --send-file-paths /{MyLocalS3Bucket}/local/sample3.txt --remote-directory-path /FromAmazonS3
+ ```
+
+10. Validate the file transfer by logging into the remote SFTP server using the below commands:
+ ```bash
+ sftp -i {SSHPrivateKeyFileName} {TransferServerUser}@{TransferServerEndpoint}
+ ls
+ cd FromAmazonS3
+ ls
+ ```
+
+## Cleanup
+
+1. Delete the content in the Amazon S3 bucket using the following command. Please *ensure* that the correct bucket name is provided to avoid accidental data loss:
+ ```bash
+ aws s3 rm s3://{MySFTPServerS3Bucket} --recursive --region {my-region}
+ aws s3 rm s3://{MyLocalS3Bucket} --recursive --region {my-region}
+ ```
+
+2. Delete the stacks:
+ ```bash
+ bash undeploy.sh
+ ```
+
+----
+Copyright 2024 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+
+SPDX-License-Identifier: MIT-0
diff --git a/awstransfer-s3-sam/awstransfer-s3-sam.json b/awstransfer-s3-sam/awstransfer-s3-sam.json
new file mode 100644
index 000000000..53e3171c5
--- /dev/null
+++ b/awstransfer-s3-sam/awstransfer-s3-sam.json
@@ -0,0 +1,82 @@
+{
+ "title": "Selective file transfer between SFTP server & Amazon S3 using AWS Transfer Family",
+ "description": "This pattern shows how to use AWS Transfer Family to list and transfer specific files between an SFTP server and Amazon S3 bucket.",
+ "language": "YAML",
+ "level": "200",
+ "framework": "SAM",
+ "introBox": {
+ "headline": "How it works",
+ "text": [
+ "The remote SFTP server is simulated using AWS Transfer Family SFTP Server for this pattern. In a real use case, this can be any remote SFTP server outside of AWS.",
+      "The SFTP connector connects the Amazon S3 bucket to the remote server over the SFTP protocol. Authentication uses an SSH key-based handshake.",
+ "Amazon S3 bucket is used for file storage on the AWS side.",
+ "User can list files on the remote server and selectively transfer files from the remote server to the Amazon S3 bucket using AWS Transfer Family API or CLI commands.",
+ "User can also transfer files from Amazon S3 to the remote server using the AWS Transfer Family API or CLI commands."
+ ]
+ },
+ "gitHub": {
+ "template": {
+ "repoURL": "https://github.com/aws-samples/serverless-patterns/tree/main/awstransfer-s3-sam",
+ "templateURL": "serverless-patterns/awstransfer-s3-sam",
+ "projectFolder": "awstransfer-s3-sam",
+ "templateFile": "template-sftp-server.yaml"
+ }
+ },
+ "resources": {
+ "bullets": [
+ {
+ "text": "Getting started with AWS Transfer Family server endpoints",
+ "link": "https://docs.aws.amazon.com/transfer/latest/userguide/getting-started.html"
+ },
+ {
+ "text": "Configure SFTP connectors",
+ "link": "https://docs.aws.amazon.com/transfer/latest/userguide/configure-sftp-connector.html"
+ }
+ ]
+ },
+ "deploy": {
+ "text": [
+ "See the GitHub repo for detailed deployment instructions.",
+ "bash deploy.sh"
+ ]
+ },
+ "testing": {
+ "text": [
+ "See the GitHub repo for detailed testing instructions."
+ ]
+ },
+ "cleanup": {
+ "text": [
+      "Delete the Amazon S3 input bucket content: aws s3 rm s3://{MySFTPServerS3Bucket} --recursive --region {my-region}",
+      "Delete the Amazon S3 output bucket content: aws s3 rm s3://{MyLocalS3Bucket} --recursive --region {my-region}",
+      "bash undeploy.sh"
+ ]
+ },
+ "authors": [
+ {
+ "name": "Biswanath Mukherjee",
+ "image": "https://d1rwvjey2iif32.cloudfront.net",
+ "bio": "I am a Sr. Solutions Architect working at AWS India.",
+ "linkedin": "biswanathmukherjee"
+ }
+ ],
+ "patternArch": {
+ "icon1": {
+ "x": 20,
+ "y": 50,
+ "service": "transfer",
+ "label": "Transfer Family server"
+ },
+ "icon2": {
+ "x": 80,
+ "y": 50,
+ "service": "s3",
+ "label": "S3 bucket"
+ },
+ "line1": {
+ "from": "icon1",
+ "to": "icon2",
+ "label": ""
+ }
+ }
+}
diff --git a/awstransfer-s3-sam/deploy.sh b/awstransfer-s3-sam/deploy.sh
new file mode 100644
index 000000000..81d68f13f
--- /dev/null
+++ b/awstransfer-s3-sam/deploy.sh
@@ -0,0 +1,90 @@
+#!/bin/bash
+
+# Take the stack name
+echo "Enter a stack name"
+read -r STACK_NAME
+
+# Take the desired AWS Region
+echo "Enter the desired AWS Region:"
+read -r AWS_REGION
+
+
+USER_NAME="sftpuser"
+
+# Generate key-pair
+# AWS Documentation: https://docs.aws.amazon.com/transfer/latest/userguide/configure-sftp-connector.html#format-sftp-connector-key
+ssh-keygen -t rsa -b 4096 -m PEM -f "$USER_NAME" -N ""
+
+# Check if the public key file exists
+if [ -f "$USER_NAME.pub" ]; then
+
+ # Store the content of the public key in a variable
+ PUBLIC_KEY=$(cat "$USER_NAME.pub")
+
+ # Deploy template-sftp-server.yaml
+ sam deploy \
+ --template-file template-sftp-server.yaml \
+ --stack-name "$STACK_NAME-1" \
+ --parameter-overrides "UserName=\"$USER_NAME\"" "SSHPublicKey=\"$PUBLIC_KEY\"" \
+ --capabilities CAPABILITY_IAM \
+ --region $AWS_REGION
+
+ # Get the stack ID
+ STACK_ID=$(aws cloudformation list-stacks --stack-status-filter CREATE_COMPLETE --query "StackSummaries[?contains(StackName, '$STACK_NAME-1')].StackId" --output text --region $AWS_REGION)
+
+ # Check if the stack ID is empty
+ if [ -z "$STACK_ID" ]; then
+ echo "Stack not found. Exiting..."
+ exit 1
+ fi
+
+ # Get the stack outputs
+ OUTPUTS=$(aws cloudformation describe-stacks --stack-name "$STACK_ID" --query "Stacks[0].Outputs" --output json --region $AWS_REGION)
+
+ # Get a TransferServerId output value
+ TRANSFER_SERVER_ID=$(echo "$OUTPUTS" | jq -r '.[] | select(.OutputKey == "TransferServerId") | .OutputValue')
+
+ # Get a TransferServerEndpoint output value
+ TRANSFER_SERVER_ENDPOINT=$(echo "$OUTPUTS" | jq -r '.[] | select(.OutputKey == "TransferServerEndpoint") | .OutputValue')
+
+ # Get a TransferLoggingRoleArn output value
+ TRANSFER_LOGGING_ROLE_ARN=$(echo "$OUTPUTS" | jq -r '.[] | select(.OutputKey == "TransferLoggingRoleArn") | .OutputValue')
+
+ # Get a SSHPrivateKey in single line without double quotes
+ # AWS Documentation: https://docs.aws.amazon.com/transfer/latest/userguide/sftp-connectors-tutorial.html
+ FORMATTED_PK=$(jq -sR . < "$USER_NAME"| sed 's/^"//;s/"$//')
+
+ # Wait for the server to be ready
+ STATE="NOT_AVAILABLE"
+
+ # Loop until the server is available
+ while [ "$STATE" != "ONLINE" ]; do
+ # Get the server state using the AWS CLI
+    STATE=$(aws transfer describe-server --server-id "$TRANSFER_SERVER_ID" --region "$AWS_REGION" --query "Server.State" --output text)
+
+ # Print the server state
+ echo "Server state: $STATE"
+
+ # Wait for 1 minute before checking again
+ sleep 60
+ done
+
+ # Print a message when the server is available
+  echo "Server is online! Proceeding with the next steps..."
+
+ # Get the TrustedHostKey from the TransferServer
+ # AWS Documentation: https://docs.aws.amazon.com/transfer/latest/userguide/API_SftpConnectorConfig.html
+  TRUSTED_HOST_KEY=$(ssh-keyscan "$TRANSFER_SERVER_ENDPOINT")
+
+ # Deploy template-sftp-connector.yaml
+ sam deploy \
+ --template-file template-sftp-connector.yaml \
+ --stack-name "$STACK_NAME-2" \
+ --parameter-overrides "TransferServerEndpoint=\"sftp://$TRANSFER_SERVER_ENDPOINT\"" "UserName=\"$USER_NAME\"" "TransferLoggingRoleArn=\"$TRANSFER_LOGGING_ROLE_ARN\"" "SSHPrivateKey=\"$FORMATTED_PK\"" "TrustedHostKeys=\"$TRUSTED_HOST_KEY\"" \
+ --capabilities CAPABILITY_IAM \
+ --region $AWS_REGION
+
+else
+ echo "Public key file not found. Exiting..."
+ exit 1
+fi
diff --git a/awstransfer-s3-sam/example-pattern.json b/awstransfer-s3-sam/example-pattern.json
new file mode 100644
index 000000000..b2b56ab1a
--- /dev/null
+++ b/awstransfer-s3-sam/example-pattern.json
@@ -0,0 +1,63 @@
+{
+ "title": "Selective file transfer between SFTP server & Amazon S3 using AWS Transfer Family",
+ "description": "This pattern shows how to use AWS Transfer Family to list and transfer specific files between an SFTP server and Amazon S3 bucket.",
+ "language": "YAML",
+ "level": "200",
+ "framework": "SAM",
+ "introBox": {
+ "headline": "How it works",
+ "text": [
+ "The remote SFTP server is simulated using AWS Transfer Family SFTP Server for this pattern. In a real use case, this can be any remote SFTP server outside of AWS.",
+      "The SFTP connector connects the Amazon S3 bucket to the remote server over the SFTP protocol. Authentication uses an SSH key-based handshake.",
+ "Amazon S3 bucket is used for file storage on the AWS side.",
+ "User can list files on the remote server and selectively transfer files from the remote server to the Amazon S3 bucket using AWS Transfer Family API or CLI commands.",
+ "User can also transfer files from Amazon S3 to the remote server using the AWS Transfer Family API or CLI commands."
+ ]
+ },
+ "gitHub": {
+ "template": {
+ "repoURL": "https://github.com/aws-samples/serverless-patterns/tree/main/awstransfer-s3-sam",
+ "templateURL": "serverless-patterns/awstransfer-s3-sam",
+ "projectFolder": "awstransfer-s3-sam",
+ "templateFile": "template-sftp-server.yaml"
+ }
+ },
+ "resources": {
+ "bullets": [
+ {
+ "text": "Getting started with AWS Transfer Family server endpoints",
+ "link": "https://docs.aws.amazon.com/transfer/latest/userguide/getting-started.html"
+ },
+ {
+ "text": "Configure SFTP connectors",
+ "link": "https://docs.aws.amazon.com/transfer/latest/userguide/configure-sftp-connector.html"
+ }
+ ]
+ },
+ "deploy": {
+ "text": [
+ "See the GitHub repo for detailed deployment instructions.",
+ "bash deploy.sh"
+ ]
+ },
+ "testing": {
+ "text": [
+ "See the GitHub repo for detailed testing instructions."
+ ]
+ },
+ "cleanup": {
+ "text": [
+      "Delete the Amazon S3 input bucket content: aws s3 rm s3://{MySFTPServerS3Bucket} --recursive --region {my-region}",
+      "Delete the Amazon S3 output bucket content: aws s3 rm s3://{MyLocalS3Bucket} --recursive --region {my-region}",
+ "bash undeploy.sh"
+ ]
+ },
+ "authors": [
+ {
+ "name": "Biswanath Mukherjee",
+ "image": "https://d1rwvjey2iif32.cloudfront.net",
+ "bio": "I am a Sr. Solutions Architect working at AWS India.",
+ "linkedin": "biswanathmukherjee"
+ }
+ ]
+}
diff --git a/awstransfer-s3-sam/images/architecture.png b/awstransfer-s3-sam/images/architecture.png
new file mode 100644
index 000000000..450cf33c6
Binary files /dev/null and b/awstransfer-s3-sam/images/architecture.png differ
diff --git a/awstransfer-s3-sam/sample1.txt b/awstransfer-s3-sam/sample1.txt
new file mode 100644
index 000000000..67026244f
--- /dev/null
+++ b/awstransfer-s3-sam/sample1.txt
@@ -0,0 +1 @@
+This is a sample file 1. This file will be used to test transfer of file from remote SFTP server to Amazon S3 using AWS Transfer Family Connector.
\ No newline at end of file
diff --git a/awstransfer-s3-sam/sample2.txt b/awstransfer-s3-sam/sample2.txt
new file mode 100644
index 000000000..72fecf107
--- /dev/null
+++ b/awstransfer-s3-sam/sample2.txt
@@ -0,0 +1 @@
+This is a sample file 2. This file will be used to test transfer of file from remote SFTP server to Amazon S3 using AWS Transfer Family Connector.
\ No newline at end of file
diff --git a/awstransfer-s3-sam/sample3.txt b/awstransfer-s3-sam/sample3.txt
new file mode 100644
index 000000000..1dc0124e1
--- /dev/null
+++ b/awstransfer-s3-sam/sample3.txt
@@ -0,0 +1 @@
+This is a sample file 3. This file will be used to test transfer of a file from Amazon S3 bucket to remote SFTP server using AWS Transfer Family Connector.
\ No newline at end of file
diff --git a/awstransfer-s3-sam/template-sftp-connector.yaml b/awstransfer-s3-sam/template-sftp-connector.yaml
new file mode 100644
index 000000000..af5f12108
--- /dev/null
+++ b/awstransfer-s3-sam/template-sftp-connector.yaml
@@ -0,0 +1,120 @@
+AWSTemplateFormatVersion: 2010-09-09
+Transform: AWS::Serverless-2016-10-31
+Description: AWS SAM template for creating an AWS Transfer Family SFTP Connector.
+Parameters:
+ TransferServerEndpoint:
+ Type: String
+ Description: The endpoint of the Transfer Server
+ UserName:
+ Type: String
+ AllowedPattern: "^[a-zA-Z0-9_][a-zA-Z0-9_.@-]{1,98}[a-zA-Z0-9_@.-]$"
+ Description: Username for AWS Transfer Family service managed user
+ TransferLoggingRoleArn:
+ Type: String
+ Description: The ARN of the IAM role to use for logging
+ SSHPrivateKey:
+ Type: String
+ Description: SSH Key for AWS Transfer Family service managed user
+ TrustedHostKeys:
+ Type: String
+ AllowedPattern: ".+"
+ Description: The Trusted Host Keys for the Transfer Server
+
+Resources:
+
+# Create S3 bucket to store files locally in AWS
+ MyLocalS3Bucket:
+ Type: AWS::S3::Bucket
+ Properties:
+ BucketName: !Sub ${AWS::StackName}-${AWS::AccountId}-${AWS::Region}-local
+
+# Create IAM role for Transfer Family CloudWatch logging
+ TransferLoggingRole:
+ Type: AWS::IAM::Role
+ Properties:
+ AssumeRolePolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Principal:
+ Service: transfer.amazonaws.com
+ Action: sts:AssumeRole
+ Policies:
+ - PolicyName: !Sub ${AWS::StackName}-TransferLoggingPolicy
+ PolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Action:
+ - logs:CreateLogGroup
+ - logs:CreateLogStream
+ - logs:PutLogEvents
+ Resource: "*"
+
+# Create an SFTP Transfer Connector
+ SFTPTransferConnector:
+ Type: AWS::Transfer::Connector
+ Properties:
+ Url: !Ref TransferServerEndpoint
+ AccessRole: !GetAtt TransferSFTPConnectorRole.Arn
+ LoggingRole: !Ref TransferLoggingRoleArn
+ SftpConfig:
+ TrustedHostKeys:
+ - !Ref TrustedHostKeys
+ UserSecretId: !Ref MySecret
+
+# Create an IAM role for the Transfer Server user with access to their home directory
+ TransferSFTPConnectorRole:
+ Type: AWS::IAM::Role
+ Properties:
+ AssumeRolePolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Principal:
+ Service: transfer.amazonaws.com
+ Action: sts:AssumeRole
+ Policies:
+ - PolicyName: TransferSFTPConnectorPolicy
+ PolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Action:
+ - s3:ListBucket
+ - s3:GetBucketLocation
+ Resource:
+ - Fn::GetAtt: MyLocalS3Bucket.Arn
+ - Effect: Allow
+ Action:
+ - s3:GetObject
+ - s3:PutObject
+ - s3:DeleteObject
+ - s3:DeleteObjectVersion
+ - s3:GetObjectVersion
+ - s3:GetObjectACL
+ - s3:PutObjectACL
+ Resource:
+ - Fn::Sub: ${MyLocalS3Bucket.Arn}/*
+ - Effect: Allow
+ Action:
+ - secretsmanager:GetSecretValue
+ Resource:
+ - !Ref MySecret
+
+ # Create a secret to store the private key
+ MySecret:
+ Type: AWS::SecretsManager::Secret
+ Properties:
+ Description: A secret to store the private key
+ SecretString: !Sub '{"Username": "${UserName}", "PrivateKey": "${SSHPrivateKey}"}'
+
+Outputs:
+
+ MyLocalS3Bucket:
+ Description: The name of the S3 Bucket
+ Value: !Ref MyLocalS3Bucket
+
+ SFTPTransferConnector:
+ Description: The SFTP Transfer Connector
+ Value: !Ref SFTPTransferConnector
\ No newline at end of file
diff --git a/awstransfer-s3-sam/template-sftp-server.yaml b/awstransfer-s3-sam/template-sftp-server.yaml
new file mode 100644
index 000000000..6832d1a3f
--- /dev/null
+++ b/awstransfer-s3-sam/template-sftp-server.yaml
@@ -0,0 +1,159 @@
+AWSTemplateFormatVersion: 2010-09-09
+Transform: AWS::Serverless-2016-10-31
+Description: AWS SAM template for creating an AWS Transfer Family SFTP Server.
+Parameters:
+ UserName:
+ Type: String
+ AllowedPattern: "^[a-zA-Z0-9_][a-zA-Z0-9_.@-]{1,98}[a-zA-Z0-9_@.-]$"
+ Description: Username for AWS Transfer Family service managed user
+ SSHPublicKey:
+ Type: String
+ AllowedPattern: ".+"
+ Description: SSH Key for AWS Transfer Family service managed user.
+
+
+Resources:
+# Create S3 bucket to store files uploaded to AWS Transfer Family server (simulated remote SFTP Server)
+ MySFTPServerS3Bucket:
+ Type: AWS::S3::Bucket
+ Properties:
+ BucketName: !Sub ${AWS::StackName}-${AWS::AccountId}-${AWS::Region}-sftp-server
+
+
+# Create Transfer Family server
+ TransferServer:
+ Type: AWS::Transfer::Server
+ Properties:
+ EndpointType: PUBLIC
+ IdentityProviderType: SERVICE_MANAGED
+ LoggingRole:
+ Fn::GetAtt: TransferLoggingRole.Arn
+ SecurityPolicyName: TransferSecurityPolicy-2024-01
+
+
+
+# Create IAM role for Transfer Family CloudWatch logging
+ TransferLoggingRole:
+ Type: AWS::IAM::Role
+ Properties:
+ AssumeRolePolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Principal:
+ Service: transfer.amazonaws.com
+ Action: sts:AssumeRole
+ Policies:
+ - PolicyName: !Sub ${AWS::StackName}-TransferLoggingPolicy
+ PolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Action:
+ - logs:CreateLogGroup
+ - logs:CreateLogStream
+ - logs:PutLogEvents
+ Resource: "*"
+
+
+# Create an IAM role to allow Transfer Family to access S3
+ TransferS3AccessRole:
+ Type: AWS::IAM::Role
+ Properties:
+ AssumeRolePolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Principal:
+ Service: transfer.amazonaws.com
+ Action: sts:AssumeRole
+ Policies:
+ - PolicyName: !Sub ${AWS::StackName}-TransferS3AccessPolicy
+ PolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Action:
+ - s3:GetBucketLocation
+ - s3:GetObjectTagging
+ - s3:ListBucket
+ - s3:*Object
+ - s3:PutObjectTagging
+ Resource:
+ - Fn::GetAtt: MySFTPServerS3Bucket.Arn
+ - !Sub ${MySFTPServerS3Bucket.Arn}/*
+
+
+# Create Transfer Family Server User
+ TransferServerUser:
+ Type: AWS::Transfer::User
+ Properties:
+ Role:
+ Fn::GetAtt: TransferServerUserRole.Arn
+ ServerId:
+ Fn::GetAtt: TransferServer.ServerId
+ UserName: !Ref UserName
+ HomeDirectoryMappings:
+ - Entry: /
+ Target:
+ Fn::Sub: /${MySFTPServerS3Bucket}/${UserName}
+ HomeDirectoryType: LOGICAL
+ SshPublicKeys:
+ - !Ref SSHPublicKey
+
+
+# Create an IAM role for the Transfer Server user with access to their home directory
+ TransferServerUserRole:
+ Type: AWS::IAM::Role
+ Properties:
+ AssumeRolePolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Principal:
+ Service: transfer.amazonaws.com
+ Action: sts:AssumeRole
+ Policies:
+ - PolicyName: TransferServerUserPolicy
+ PolicyDocument:
+ Version: 2012-10-17
+ Statement:
+ - Effect: Allow
+ Action:
+ - s3:ListBucket
+ - s3:GetBucketLocation
+ Resource:
+ - Fn::GetAtt: MySFTPServerS3Bucket.Arn
+ - Effect: Allow
+ Action:
+ - s3:GetObject
+ - s3:PutObject
+ - s3:DeleteObject
+ - s3:DeleteObjectVersion
+ - s3:GetObjectVersion
+ - s3:GetObjectACL
+ - s3:PutObjectACL
+ Resource:
+ - Fn::Sub: ${MySFTPServerS3Bucket.Arn}/*
+
+Outputs:
+
+ TransferServerId:
+ Description: The ID of the Transfer Server
+ Value: !GetAtt TransferServer.ServerId
+
+ TransferServerEndpoint:
+ Description: The endpoint of the Transfer Server
+ Value: !Sub ${TransferServer.ServerId}.server.transfer.${AWS::Region}.amazonaws.com
+
+ TransferServerUser:
+ Description: The username of the Transfer Server User
+ Value: !Ref UserName
+
+ MySFTPServerS3Bucket:
+ Description: The name of the S3 Bucket
+ Value: !Ref MySFTPServerS3Bucket
+
+ TransferLoggingRoleArn:
+ Description: The name of the Transfer Logging Role
+ Value: !GetAtt TransferLoggingRole.Arn
\ No newline at end of file
diff --git a/awstransfer-s3-sam/undeploy.sh b/awstransfer-s3-sam/undeploy.sh
new file mode 100644
index 000000000..36bd69bb6
--- /dev/null
+++ b/awstransfer-s3-sam/undeploy.sh
@@ -0,0 +1,14 @@
+#!/bin/bash
+
+echo "Enter a stack name"
+read -r STACK_NAME
+
+echo "Enter the desired AWS Region:"
+read -r AWS_REGION
+
+
+# delete the stack2
+sam delete --stack-name "$STACK_NAME-2" --region $AWS_REGION
+
+# delete the stack1
+sam delete --stack-name "$STACK_NAME-1" --region $AWS_REGION
\ No newline at end of file