
Creating a LocalStack S3 repository fails: Postman works, but NEST client calls do not #8192

Open
firstsaofan opened this issue May 21, 2024 · 0 comments
Labels
7.x Relates to a 7.x client version Category: Bug


NEST/Elasticsearch.Net version:

<PackageReference Include="NEST" Version="7.17.5" />

Elasticsearch version:

elasticsearch:7.17.7

localstack/localstack:latest

.NET runtime version:

<TargetFramework>net8.0</TargetFramework>

Operating system version:

Ubuntu 20; IDE: Rider

Description of the problem including expected versus actual behavior:

7.17 API docs: https://www.elastic.co/guide/en/elasticsearch/plugins/7.17/repository-s3.html

8.13 API docs: https://www.elastic.co/guide/en/elasticsearch/reference/8.13/repository-s3.html

Creating an S3-type repository backed by LocalStack fails from code, while the same requests work from Postman.

The error message says the bucket does not exist. In fact, the bucket does exist; I verified this by uploading an image file to it directly (see the S3 test code below).

I'm not sure what else I need to configure. Postman can successfully create the repository (although at that point it may not actually contact LocalStack S3). When I make the same call from code, it fails with NoSuchBucket. When I then get the repository, verify it, and try to create a snapshot against it, it likewise reports that the bucket does not exist.

I confirmed that the requests do reach LocalStack S3; its log output is shown in the last screenshot below.

Is there a compatibility requirement between the Elasticsearch version and the LocalStack S3 version? I provide the detailed code and docker-compose.yml files below to reproduce the problem.

Invalid NEST response built from a unsuccessful (500) low level call on PUT: /_snapshot/unit-test-s3-reponamea7fa5e24-9843-4ce7-ba15-89489c86e98a
# Audit trail of this API call:
 - [1] BadResponse: Node: http://localhost:9200/ Took: 00:00:00.0914049
# OriginalException: Elasticsearch.Net.ElasticsearchClientException: Request failed to execute. Call: Status code 500 from: PUT /_snapshot/unit-test-s3-reponamea7fa5e24-9843-4ce7-ba15-89489c86e98a. ServerError: Type: repository_verification_exception Reason: "[unit-test-s3-reponamea7fa5e24-9843-4ce7-ba15-89489c86e98a] path  is not accessible on master node" CausedBy: "Type: i_o_exception Reason: "Unable to upload object [tests-EfaP_MESRUy_X28dSyF4Lg/master.dat] using a single upload" CausedBy: "Type: amazon_s3_exception Reason: "The specified bucket does not exist (Service: Amazon S3; Status Code: 404; Error Code: NoSuchBucket; Request ID: 5f9b4920-13a0-4917-9441-5c4c87b4a238; S3 Extended Request ID: null)"""
# Request:
<Request stream not captured or already read to completion by serializer. Set DisableDirectStreaming() on ConnectionSettings to force it to be set on the response.>
# Response:
<Response stream not captured or already read to completion by serializer. Set DisableDirectStreaming() on ConnectionSettings to force it to be set on the response.>
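As the truncated request/response above suggests, DisableDirectStreaming can be enabled on the connection settings so the audit trail captures the actual bodies; a minimal sketch, reusing the connection details from the test code below:

```csharp
using Nest;

var node = new Uri("http://localhost:9200");
var settings = new ConnectionSettings(node)
    .BasicAuthentication("elastic", "password")
    // Buffer the request/response streams so DebugInformation can print them
    .DisableDirectStreaming();
var client = new ElasticClient(settings);
```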

S3-1

S3-2
S3-3

postman test

create repository
http://localhost:9200/_snapshot/unit-test-s3-reponameb5ad0268-3012-4677-af46-780c8776f97455
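The exact Postman request body is not visible in the screenshots; for an S3 repository it would presumably be along these lines (bucket name taken from the test code below):

```json
{
  "type": "s3",
  "settings": {
    "bucket": "my-bucket"
  }
}
```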

S3-4

verify local stack s3 repository
http://localhost:9200/_snapshot/unit-test-s3-reponameb5ad0268-3012-4677-af46-780c8776f97455/_verify

S3-5

get local stack s3 repository
http://localhost:9200/_snapshot/unit-test-s3-reponameb5ad0268-3012-4677-af46-780c8776f97455

S3-6

test create snapshot in local stack s3 repository
http://localhost:9200/_snapshot/unit-test-s3-reponameb5ad0268-3012-4677-af46-780c8776f97455/test-snapshot?wait_for_completion=true

S3-7

local stack s3 container logs

S3-8

Steps to reproduce:

Because my requirement is to use both fs and s3 repositories, I initially did not put the two services in a single docker-compose.yml file.

1. Set up the ES environment

1.1 docker-compose.yml file

version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.7
    container_name: elasticsearchdemo
    network_mode: host
    environment:
      - node.name=elasticsearch
      - cluster.name=docker-cluster
      - discovery.type=single-node
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - xpack.security.enabled=true
      - ELASTIC_PASSWORD=password
      - "xpack.security.http.ssl.enabled=false"
    ports:
      - "9200:9200"
    volumes:
      - ./config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml  

1.2 If you mount the config from your local machine, Windows may treat elasticsearch.yml as a folder and complain about mapping a folder to a file. Use the `docker cp` command to copy the config directory out of the container first and then mount it. Note that Elasticsearch also requires the jvm.options file, so keep jvm.options on the host as well; mounting only the single config file will cause errors.

docker cp elasticsearchdemo:/usr/share/elasticsearch/config ./config

ES elasticsearch.yml file

path.repo is for the fs repository; you may not need it.

cluster.name: "docker-cluster"


xpack.security.http.ssl.enabled: false
s3.client.default.endpoint: "http://localhost:4566"
s3.client.default.protocol: "http"

path:
  repo:
    - /usr/share/elasticsearch/data/backups
    - /usr/share/elasticsearch/data/long_term_backups

Summary: after the ES container starts normally, run the following inside it:

# 1 Enter the ES container shell
docker exec -it elasticsearchdemo sh
# 2 Install the s3 plugin
bin/elasticsearch-plugin install repository-s3
# 3 Add the S3 access_key and secret_key (the values must match those in the LocalStack docker-compose.yml below)

bin/elasticsearch-keystore add s3.client.default.access_key

bin/elasticsearch-keystore add s3.client.default.secret_key

# If you need to remove s3 access_key, secret_key, please use the following command
# bin/elasticsearch-keystore remove s3.client.default.access_key
# bin/elasticsearch-keystore remove s3.client.default.secret_key
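Note that the newly installed plugin is only loaded after a node (container) restart, and keystore changes are picked up on restart or via the reload secure settings API; for example (credentials as configured above):

```shell
# Restart the node so the repository-s3 plugin is loaded
docker restart elasticsearchdemo

# Or, for keystore changes only, reload secure settings without a restart
curl -u elastic:password -X POST "http://localhost:9200/_nodes/reload_secure_settings"
```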

2. LocalStack S3 docker-compose.yml file:

ES_ENDPOINT_STRATEGY=port (this parameter is very important)

Official docs: https://docs.localstack.cloud/user-guide/aws/es/#endpoints

Use the host network, or merge the two docker-compose.yml files above into one and use the same network.

By default, localhost:4566 is the S3 endpoint:

http://localhost:4566/

version: '3'
services:

  localstack:
    image: localstack/localstack:latest
    container_name: localstack
    network_mode: host
    ports:
      - "4566:4566"
      - "4571:4571"
      - "8055-8080:8055-8080"
    environment:
      - SERVICES=s3
      - ES_ENDPOINT_STRATEGY=port
      - DEFAULT_REGION=us-east-1
      - LOCALSTACK_HOST=localhost
      - USE_SSL=false
      - PERSISTENCE=/tmp/localstack/data
      - AWS_ACCESS_KEY_ID=test
      - AWS_SECRET_ACCESS_KEY=test

3. Test code

S3 test code:

You need to install the following NuGet packages:

      <PackageReference Include="AWSSDK.S3" Version="3.7.307.29" />
      <PackageReference Include="NEST" Version="7.17.5" />
      <PackageReference Include="FluentAssertions" Version="6.12.0" />
using System.Net;
using System.Net.Http.Json;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.Util;
using FluentAssertions;
using Xunit; // required for [Fact]

namespace DataMaskingTest;

public class LocalStackS3Test
{
    [Fact]
    public async Task Check_LocalStack_Valid()
    {
        string bucketName = "my-bucket"; // bucket name
        string serviceUrl = "http://localhost:4566"; // Localstack s3 service URL

        var awsCredentials = new BasicAWSCredentials("test", "test");
        var config = new AmazonS3Config
        {
            ServiceURL = serviceUrl,
            ForcePathStyle = true,
        };

        var client = new AmazonS3Client(awsCredentials, config);

        var putBucketRequest = new PutBucketRequest
        {
            BucketName = bucketName,
            BucketRegion = S3Region.US,
            UseClientRegion = false
        };

        var policy = @"{
  ""Version"": ""2012-10-17"",
  ""Statement"": [
    {
      ""Effect"": ""Allow"",
      ""Principal"": ""*"",
      ""Action"": [
        ""s3:GetObject"",
        ""s3:PutObject"",
        ""s3:DeleteObject"",
        ""s3:AbortMultipartUpload"",
        ""s3:ListMultipartUploadParts""
      ],
      ""Resource"": ""arn:aws:s3:::my-bucket/*""
    }
  ]}";
        
        var putBucketPolicyRequest = new PutBucketPolicyRequest
        {
            
            BucketName = bucketName,
            Policy = policy
        };
        var res = await client.PutBucketAsync(putBucketRequest);
        
        //update policy
        await client.PutBucketPolicyAsync(putBucketPolicyRequest);
        
        //put a file into bucket,
        // Generate an image file
        byte[] imageBytes = GenerateImageBytes();
        string keyName = "example_image.jpg";

        var putObjectRequest = new PutObjectRequest
        {
            BucketName = bucketName,
            Key = keyName,
            InputStream = new MemoryStream(imageBytes)
        };

        var putObjectResponse = await client.PutObjectAsync(putObjectRequest);

        putObjectResponse.HttpStatusCode.Should().Be(HttpStatusCode.OK);
        
        
        res.HttpStatusCode.Should().Be(HttpStatusCode.OK);
    }
    
    private byte[] GenerateImageBytes()
    {
        byte[] fakeImageBytes = new byte[1024];
        return fakeImageBytes;
    }
    
}
ES test code

using Nest;
using Xunit;

[Fact]
public async Task Test_Es_S3Repository()
{
    var node = new Uri("http://localhost:9200");
    var settings = new ConnectionSettings(node)
        .BasicAuthentication("elastic", "password");

    var client = new ElasticClient(settings);

    // Provide the necessary parameters
    string repoName = "unit-test-s3-reponame" + Guid.NewGuid();
    string bucketName = "my-bucket"; // bucket name

    var response = await client.Snapshot.CreateRepositoryAsync(repoName,
        c => c.S3(s3 => s3.Settings(bucketName)));

    Assert.True(response.IsValid);
}
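To reproduce the verification failure from code as well, the repository can be verified explicitly after creation; a sketch, assuming NEST 7.x's VerifyRepositoryAsync (the client-side counterpart of the /_verify call used in Postman above):

```csharp
// Mirrors POST /_snapshot/<repo>/_verify
var verify = await client.Snapshot.VerifyRepositoryAsync(repoName);

// DebugInformation includes the audit trail and server error on failure
Assert.True(verify.IsValid, verify.DebugInformation);
```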

Expected behavior

The code should create the repository successfully, just as Postman does. Also, is there a version coupling between Elasticsearch and LocalStack S3? If you know the solution, please share the detailed ES and LocalStack configuration. Thank you very much.

Actual behavior

Creation fails: S3 reports that the bucket does not exist.

@firstsaofan firstsaofan added 7.x Relates to a 7.x client version Category: Bug labels May 21, 2024