
Cache server not working #92

Closed
NorkzYT opened this issue Dec 24, 2022 · 20 comments
Labels
wontfix This will not be worked on

Comments


NorkzYT commented Dec 24, 2022

Hope anyone that reads this is having a great day!

I only see the following line in the Docker container logs; no other entry appears when I use the remote cache.

Server listening at http://0.0.0.0:3000 | severity=INFO pid=6 hostname=turborepo-remote-cache

When I run the npm build script, how do I know whether the action is importing and/or exporting the cache?

Is the STORAGE_PATH environmental variable correctly used along with volumes?

The following are the contents of the TurboRepo Remote Cache docker-compose.yml file.

version: '3.9'
services:

  turborepo-remote-cache:
    image: fox1t/turborepo-remote-cache:1.8.0
    container_name: turborepo-remote-cache
    hostname: turborepo-remote-cache
    environment:
      - NODE_ENV=production
      - TURBO_TOKEN='xxx'
      - LOG_LEVEL=debug
      - STORAGE_PROVIDER=local
      - STORAGE_PATH=/tmp
    volumes:
      - /mnt/appdata/repo-team/turborepo-remote-cache/tmp:/tmp
    ports:
      - 3535:3000
    networks:
      - proxy

networks:
  proxy:
    driver: bridge
    external: true
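
As a side note, two details in the compose file above may matter: in list-form `environment:` entries the quotes are not stripped, so `TURBO_TOKEN='xxx'` sets the token to the literal string `'xxx'` including the quotes; and mounting a volume over `/tmp` while `STORAGE_PATH=/tmp` mixes cache artifacts with the container's own temporary files. A minimal sketch of an alternative layout (paths are illustrative, not from the issue):

```yaml
services:
  turborepo-remote-cache:
    image: fox1t/turborepo-remote-cache:1.8.0
    environment:
      - NODE_ENV=production
      - TURBO_TOKEN=xxx        # unquoted: list-form values keep quotes literally
      - STORAGE_PROVIDER=local
      - STORAGE_PATH=/cache    # dedicated directory instead of /tmp
    volumes:
      - /mnt/appdata/repo-team/turborepo-remote-cache/cache:/cache
    ports:
      - 3535:3000
```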

Originally posted by @NorkzYT in #84


NorkzYT commented Dec 24, 2022

Update:

For some reason the server is not being used when running the following from a Dockerfile.

"build": "turbo run build --team=\"team_time\" --token=\"xxx\" --api=\"https://cache.domain.com\"",


fox1t commented Dec 28, 2022

Hey! What is your local turborepo client saying? Is the remote cache enabled? Can you double-check if you followed all steps
here?


NorkzYT commented Dec 28, 2022

According to what the terminal indicates when I run "build", the remote cache is enabled. Additionally, I followed every instruction in the documentation. The issue is that every time I build from a Dockerfile, the build always takes around 60 seconds. That suggests the cache is not being used, since I can run the build command outside a Dockerfile and get much lower build times.
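
Since the question above is how to tell whether the cache is being used: the turbo client prints a "cache hit, replaying output …" or "cache miss, executing …" line per task, so one low-tech check is to capture the build output and grep it. The sample log below is illustrative, not taken from this repository:

```shell
# Capture real output with: pnpm run build 2>&1 | tee build.log
# Here a sample log stands in for a real run.
cat > build.log <<'EOF'
• Running build
• Remote caching enabled
build: cache hit, replaying output c4eabec879c0c995
EOF

if grep -q 'cache hit' build.log; then
  echo "cache was used"
else
  echo "cache was NOT used"
fi
```

A hit confirms an artifact was replayed, but it does not by itself distinguish local from remote cache; the server-side logs (with LOG_LEVEL=debug) remain the authoritative check for remote traffic.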


NorkzYT commented Dec 28, 2022

The following are the contents of the Dockerfile.

# Dockerfile

# ------------------------------------------------------------------------------ #

# ---- Dependencies ----
FROM node:lts-buster-slim AS pnpm
ENV CI=1

#ARG PNPM_VERSION=7.19.0

RUN apt-get update && apt-get install libssl-dev ca-certificates -y \
 && apt-get install --no-install-recommends -y openssl \
 && apt-get install -y wget \
 && apt-get install -y openssl git \
 && apt-get install -y rsync
 
RUN npm --no-update-notifier --no-fund --global install pnpm@latest

ENV PRISMA_SKIP_POSTINSTALL_GENERATE=1

# ------------------------------------------------------------------------------ #

# ---- Build ----
FROM pnpm AS builder
WORKDIR /app

COPY . .

# Get the git head
RUN git rev-parse HEAD | cut -c1-7 >hash.txt

# Install everything
RUN pnpm install --no-frozen-lockfile

# TinyMCE postinstall
RUN pnpm run tinymcepostinstall-pnpm

# Run prisma generate
RUN npx prisma generate --schema ./prisma/first-schema.prisma
RUN npx prisma generate --schema ./prisma/second-schema.prisma

# Build the package
RUN pnpm run build-turbo

# ------------------------------------------------------------------------------ #

# ---- Production ----
FROM pnpm as production
WORKDIR /app
ENV NODE_ENV=production

RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

# Copy from the build artifacts and the locks and package.json.
COPY --chown=nextjs:nodejs --from=builder /app/build ./build
COPY --chown=nextjs:nodejs --from=builder /app/lib ./lib
COPY --chown=nextjs:nodejs --from=builder /app/prisma ./prisma
COPY --chown=nextjs:nodejs --from=builder /app/public ./public
COPY --chown=nextjs:nodejs --from=builder /app/next.config.js ./next.config.js
COPY --chown=nextjs:nodejs --from=builder /app/package.json ./package.json
COPY --chown=nextjs:nodejs --from=builder /app/node_modules ./node_modules

# Hash identity
COPY --chown=nextjs:nodejs --from=builder /app/hash.txt ./hash.txt

USER nextjs

# Expose the Port
EXPOSE 7537

# Expose the Prisma Studio Ports
EXPOSE 5555
EXPOSE 5556

# Start the app
CMD [ "pnpm", "start", "--", "--port", "7537" ]

# --------------------------------------------------------------------------------------


fox1t commented Dec 29, 2022

Another question: inside the docker container, during the build steps, do you see any "remote cache enabled" message logged by the turborepo client?


NorkzYT commented Dec 29, 2022

Yes, here is a screenshot.

[screenshot]


NorkzYT commented Dec 29, 2022

turbo.json

{
  "$schema": "https://turbo.build/schema.json",
  "pipeline": {
    "build-turbo": {
      "outputs": [".next/**"]
    },
    "lint": {
      "outputs": []
    },
    "dev": {
      "cache": false
    },
    "test": {
      "dependsOn": ["^build"],
      "cache": false
    }
  }
}

.turbo/config.json

{
  "teamId": "team_name",
  "apiUrl": "https://sub-turbo.domain.com"
}

From package.json

"build-turbo": "turbo run build --team=\"team_name\" --token=\"xxx\" --api=\"https://sub-turbo.domain.com\""


NorkzYT commented Dec 29, 2022

Interesting, I just ran pnpm run build-turbo, and I get the following.

 WARNING  failed to contact turbod. Continuing in standalone mode: connection to turbo daemon process failed. Please ensure the following:
        - the process identified by the pid in the file at /tmp/turbod/e75b3c585e827d40/turbod.pid is not running, 
and remove /tmp/turbod/e75b3c585e827d40/turbod.pid
        - check the logs at /root/.local/share/turborepo/logs/e75b3c585e827d40-Brainstormr.log
        - the unix domain socket at /tmp/turbod/e75b3c585e827d40/turbod.sock has been removed
        You can also run without the daemon process by passing --no-daemon
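
The warning above suggests its own remediation; in a container or CI context the daemon rarely helps anyway, so forcing standalone mode is a reasonable workaround. The paths below are the ones reported in the warning itself:

```shell
# Skip the background daemon entirely (flag suggested by the warning)
turbo run build --no-daemon

# Or remove the stale daemon state the warning points at, then retry
rm -f /tmp/turbod/e75b3c585e827d40/turbod.pid \
      /tmp/turbod/e75b3c585e827d40/turbod.sock
```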


fox1t commented Dec 30, 2022

OK, so the issue is on the client's side. This is the first time I have seen it. Could a search on the turborepo repository help?


NorkzYT commented Dec 30, 2022

When I looked into the warning, there was just one suggestion that claimed to fix the problem: installing `"turbo": "1.4.4-canary.0"`. However, this did not work.

https://stackoverflow.com/a/73512113/19395252

Given that the user in the conversation is using a MacBook Pro M1, and I am using an ARM64-based Linux computer, it appears from the discussion that this solely affects 'ARM' architectures.

https://pullanswer.com/questions/warning-failed-to-contact-turbod

I'll keep looking for a solution to this. For development, it will probably be necessary for me to use an earlier version of turbo.


fox1t commented Dec 31, 2022

Should we add this to the documentation on our side?


NorkzYT commented Dec 31, 2022

Not yet. I am going to test on an AMD64 machine to check whether I receive working results with the latest turbo version. I will advise ASAP on what I find.


NorkzYT commented Dec 31, 2022

I discovered that my turbo.json was wrong, since I had used build-turbo rather than just build. On the most recent turbo version I no longer receive any daemon warnings. However, the following problem remains unresolved.

vercel/turbo#2034

I tried to run the following:

"build-turbo": "turbo run build --team=\"team_time\" --token=\"xxx\" --api=\"https://cache.domain.com\"",

Even though the TURBO_TOKEN is the same as in the docker-compose.yml file, turborepo-remote-cache tells me that the authorization token is wrong. Note that the api field is using https.

I followed up by using http as follows: --api="http://turborepo-remote-cache:3000". The cache was still not reached, nothing was recorded in the docker container's logs, and I received the following message when running the Dockerfile build.

cache miss, executing c4eabec879c0c995

When I run pnpm run build-turbo in the terminal, it runs perfectly, beginning with the following output.

• Running build
• Remote caching enabled
build: Skipping cache check for //#build, outputs have not changed since previous run.
build: cache hit, replaying output c4eabec879c0c995

End output:

 Tasks:    1 successful, 1 total
Cached:    1 cached, 1 total
  Time:    49ms >>> FULL TURBO

I am not sure what the issue could be, but I do know that the Dockerfile could be the one causing the problem.

It appears to me that the remote cache is not functioning.

Do you have a working setup that I could copy and test? Since I wish to use an ARM64 machine rather than an AMD64 one, I have not yet tested this on an AMD64 machine; this might be an ARM64-only issue.

@matteovivona

Hi @NorkzYT. Any update about your issue?


NorkzYT commented Feb 3, 2023

@matteovivona

Unfortunately, none.

@matteovivona


I just tried with version 1.12. First package in my Dockerfile:

=> [build 6/10] RUN pnpm turbo --filter=api build
=> # @flow/sentry:build:
=> # @flow/feature-flag:build: cache hit, replaying output 2065d6fa18024458
=> # @flow/feature-flag:build: > @flow/feature-flag@0.3.0 build /home/node/packages/feature-flag
=> # @flow/feature-flag:build: > rimraf dist && tsc

Second package:

=> [build 6/10] RUN pnpm turbo --filter=api build
=> # @flow/filters:build:
=> # @flow/automations:build: cache miss, executing 393b11febee45d47
=> # @flow/automations:build: > @flow/automations@1.1.1 build /home/node/packages/automations
=> # @flow/automations:build: > rimraf dist && tsc


NorkzYT commented Feb 3, 2023

Interesting. Does it work for you without any failures? I ask since the second package says "cache miss".

@matteovivona


The second package, in this case, had to be re-built.

@matteovivona added the wontfix label Feb 5, 2023
@rahnarsson

@NorkzYT

Had the same issue, and did some Wireshark sniffing during the docker build process. I noticed a TLS handshake error when connecting to the turbo-cache server. I was also using a node-slim base image, and after installing the ca-certificates package in the container, the remote cache started to work:

RUN apt-get update && apt-get install -y ca-certificates

@matteovivona

RUN apt-get update && apt-get install -y ca-certificates

@rahnarsson good call. We had noticed that in some cases with node-alpine it was necessary to install g++, make, and libc6-compat.
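
For node:alpine images, the packages mentioned above would be installed via apk; a minimal sketch combining them with the ca-certificates fix from the previous comment:

```dockerfile
FROM node:lts-alpine
# Build toolchain and glibc compat noted above, plus CA certs for TLS
RUN apk add --no-cache g++ make libc6-compat ca-certificates
```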
