
Azure Cosmos DB Docker container stopping #15865

Open
soenneker opened this issue Feb 2, 2022 · 14 comments

Comments

@soenneker

soenneker commented Feb 2, 2022

Copying from: Azure/azure-cosmos-dotnet-v3#3010

Describe the bug
Using Azure DevOps and a ubuntu-latest hosted agent, the container exits right away when started, with the following output:

This is an evaluation version.  There are [165] days left in the evaluation period.
Shutting Down
Shut Down

To Reproduce

Create an Azure DevOps pipeline with the following:

pool:
  name: Azure Pipelines
  vmImage: 'ubuntu-latest'
jobs:
- job: TestJob
  displayName: 'My Job'
  steps:
  - task: PowerShell@2
    name: showNetAdapters
    displayName: 'Show NetAdapters'
    inputs:
      pwsh: true
      targetType: inline
      script: |
        ifconfig
        $ipAddress = (hostname -I | awk '{print $1}')
        Write-Output "IpAddress = $ipAddress"
  - task: PowerShell@2
    name: startCosmosDb
    displayName: 'Start Azure Cosmos DB emulator'
    inputs:
      pwsh: true
      targetType: inline
      script: |
        $ipAddress = (hostname -I | awk '{print $1}')
        $containerId = (docker create -p 8081:8081 -p 10251:10251 -p 10252:10252 -p 10253:10253 -p 10254:10254 -m 3g --cpus=2.0 -e AZURE_COSMOS_EMULATOR_PARTITION_COUNT=10 -e AZURE_COSMOS_EMULATOR_ENABLE_DATA_PERSISTENCE=false -e AZURE_COSMOS_EMULATOR_IP_ADDRESS_OVERRIDE=$ipAddress mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator)
        Write-Host "##vso[task.setvariable variable=cosmosDbContainerId]$containerId"
        docker start $containerId
        Start-Sleep -Seconds 5
        $isStarted = $false
        while ($isStarted -eq $false) {
            $logs = (docker logs $containerId)
            if (($logs | Out-String).Contains('Started')) {
                Write-Output "Container $containerId started."
                $isStarted = $true
                break;
            }
            Write-Output "Waiting for container $containerId to start"
            Write-Output ($logs | Out-String)
            Start-Sleep -Seconds 5
        }
  - script: |
      echo 'ContainerId = $(cosmosDbContainerId)'
      docker logs $(cosmosDbContainerId)
    displayName: Diagnostics
  - script: |
      ipAddress=$(hostname -I | awk '{print $1}')
      curl -k https://$ipAddress:8081/_explorer/emulator.pem > $(Agent.TempDirectory)/emulatorcert.crt
      cp $(Agent.TempDirectory)/emulatorcert.crt /usr/local/share/ca-certificates/
      update-ca-certificates
      echo "##vso[task.setvariable variable=cosmosDbEndpoint]https://$ipAddress:8081"
    displayName: 'Prepare emulator'
  - script: |
      if [ ! -z "$(cosmosDbContainerId)" ];
      then
        docker rm -f $(cosmosDbContainerId)
        rm -f /usr/local/share/ca-certificates/emulatorcert.crt
      fi
    displayName: 'Clean Azure Cosmos DB emulator'
    condition: always()

Expected behavior
The container should create the partitions and eventually log "Started".

Actual behavior
The container exits and the logs show:

This is an evaluation version.  There are [165] days left in the evaluation period.
Shutting Down
Shut Down

Additional context
I can't use older images (tags) to check whether the behavior has changed since, because they fail the evaluation-period check.
Is there something I can add as an environment variable to get more logs out to diagnose the problem?
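
In the meantime, one cheap diagnostic is to dump the container's exit state right after startup. This is a sketch only: the inspectCosmosDb step name is illustrative, and it assumes cosmosDbContainerId was set by the start step as in the pipeline above.

```yaml
- task: PowerShell@2
  name: inspectCosmosDb
  displayName: 'Inspect emulator container state'
  inputs:
    pwsh: true
    targetType: inline
    script: |
      # Show why the container stopped: status, exit code, and whether it was OOM-killed
      docker inspect --format 'status={{.State.Status}} exit={{.State.ExitCode}} oom={{.State.OOMKilled}}' $(cosmosDbContainerId)
      docker logs $(cosmosDbContainerId)
```

An OOMKilled=true or a nonzero exit code here would narrow down whether the shutdown is resource-related or internal to the emulator.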

@DOMZE

DOMZE commented Feb 2, 2022

After discussion, it seems there's a problem with the ubuntu-latest (ubuntu-20.04) vmImage; ubuntu-18.04 works.

Is there any way to understand what changed from 18.04 to 20.04?
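
Until that's understood, pinning the pool image is a workaround sketch (assuming ubuntu-18.04 is still offered for your organization):

```yaml
pool:
  name: Azure Pipelines
  vmImage: 'ubuntu-18.04'  # ubuntu-latest (20.04) reproduces the shutdown; 18.04 works
```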

@anatolybolshakov
Contributor

Hi @soenneker @DOMZE, for questions about changes to the hosted images, I would suggest opening a ticket in https://github.com/actions/virtual-environments to get the right eyes on it.

@github-actions

github-actions bot commented Aug 2, 2022

This issue is stale because it has been open for 180 days with no activity. Remove the stale label or comment on the issue; otherwise it will be closed in 5 days.

@github-actions github-actions bot added the stale label Aug 2, 2022
@DOMZE

DOMZE commented Aug 2, 2022

Should this be closed while Azure/azure-cosmos-db-emulator-docker#45 is still not fixed?

@github-actions

This issue is stale because it has been open for 180 days with no activity. Remove the stale label or comment on the issue; otherwise it will be closed in 5 days.

@github-actions github-actions bot added the stale label Jan 30, 2023
@DOMZE

DOMZE commented Feb 2, 2023

Bump. See my last comment.

@github-actions github-actions bot removed the stale label Feb 2, 2023
@iarovyi

iarovyi commented Apr 6, 2023

I got the same behavior on GitHub Actions (Ubuntu 22.04.2 LTS) with this step:

      - name: Experiment
        run: |
          docker run `
          --publish 8081:8081 `
          --detach `
          --memory 3g --cpus=2.0 `
          --name=test-linux-emulator `
          --env AZURE_COSMOS_EMULATOR_PARTITION_COUNT=1 `
          --env AZURE_COSMOS_EMULATOR_ENABLE_DATA_PERSISTENCE=false `
          --env AZURE_COSMOS_EMULATOR_IP_ADDRESS_OVERRIDE=127.0.0.1 `
          mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator
          Write-Host "Waiting"
          Start-Sleep -Seconds 60
          docker logs test-linux-emulator
          Write-Host "Testing"
        shell: pwsh

with the following output:

Unable to find image 'mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator:latest' locally
latest: Pulling from cosmosdb/linux/azure-cosmos-emulator
[... image layer pull and extract progress omitted ...]
Digest: sha256:c4275ab4dc15f6472ba05ed19ed729c4580986ac27c6542a3320ab0dfea85fde
Status: Downloaded newer image for mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator:latest
d8cec6485f0338932ddf941bba271b37e1aa3e7212912c104b6777a68b837d3e
Waiting
This is an evaluation version.  There are [37] days left in the evaluation period.
Shutting Down
Shut Down
Testing

@Robinlievrouw

Also having this issue; it's currently blocking us from running Cosmos DB on Azure DevOps pipelines.

@adamjones353

I am also seeing this issue and have been fighting it for two days now. I tried to use an older version of Ubuntu, but 18.04 has been retired by Microsoft Azure DevOps. Has anyone got a workaround for this?

@soenneker
Author

Right now we're using the Cosmos emulator script that's included on the Windows host. Another workaround might be using a real Cosmos instance and clearing out its data before running tests against it.
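
A sketch of that Windows-host approach (the module path and the Start-CosmosDbEmulator cmdlet assume the emulator is preinstalled on the windows-latest hosted image; verify against your agent before relying on it):

```yaml
pool:
  vmImage: 'windows-latest'
steps:
- task: PowerShell@2
  displayName: 'Start Azure Cosmos DB emulator (Windows host)'
  inputs:
    targetType: inline
    script: |
      # Assumes the hosted Windows image ships the emulator and its PowerShell module
      Import-Module "$env:ProgramFiles\Azure Cosmos DB Emulator\PSModules\Microsoft.Azure.CosmosDB.Emulator"
      Start-CosmosDbEmulator -NoUI -Timeout 300
```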

@github-actions

This issue is stale because it has been open for 180 days with no activity. Remove the stale label or comment on the issue; otherwise it will be closed in 5 days.

@github-actions github-actions bot added the stale label Nov 14, 2023
@DOMZE

DOMZE commented Nov 14, 2023

Bump. Any updates?

@github-actions github-actions bot removed the stale label Nov 14, 2023
@zakkatkk

I'm running into this issue on an Apple M1 chipset, but my colleague seems to be able to run the image just fine from a Windows machine. Nothing shows up in the verbose logs; the image quietly exits with code 0 like it ain't no thang.

verticalslice-cosmos-1  | This is an evaluation version.  There are [152] days left in the evaluation period.
verticalslice-cosmos-1  | Starting
verticalslice-cosmos-1 exited with code 0
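
For the Apple Silicon case specifically: the emulator image is published only for linux/amd64, so on an M1 Docker has to run it under emulation. Forcing the platform makes that explicit; this is a sketch, not a confirmed fix, and the container name is illustrative:

```shell
# The image has no arm64 variant; request amd64 emulation explicitly
docker run --platform=linux/amd64 --detach \
  --publish 8081:8081 \
  --name test-linux-emulator \
  mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator
```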

@CedNZ

CedNZ commented Jan 23, 2024

I've just hit this issue, any updates for 2024?
