turbo prune --scope=web doesn't include turbo.json in out/ or out/json folder #631

Closed
miszo opened this issue Jan 28, 2022 · 1 comment · Fixed by #633

Comments


miszo commented Jan 28, 2022

What version of Turborepo are you using?

1.1.1

What package manager are you using / does the bug impact?

Yarn v1

What operating system are you using?

Mac

Describe the Bug

When running turbo prune --scope=web, the out/ folder has the following structure:

  • apps
  • full
  • json
  • packages
  • .gitignore
  • package.json
  • yarn.lock

Since running the npx @turbo/codemod create-turbo-config codemod, I have a turbo.json at the root of the repository.
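
For reference, the root turbo.json produced by the codemod looks roughly like this (the pipeline below is a hypothetical sketch, not my exact config):

{
  "baseBranch": "origin/main",
  "pipeline": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**", ".next/**"]
    },
    "start": {
      "cache": false
    }
  }
}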

This caused some issues when building the app with Docker.

Here's my Dockerfile:

FROM node:16-alpine AS builder
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
# Set working directory
WORKDIR /app
RUN yarn global add turbo
COPY . .
RUN turbo prune --scope=${APP_SCOPE} --docker

# Add lockfile and package.json's of isolated subworkspace
FROM node:16-alpine AS installer
RUN apk update && apk add --no-cache git
WORKDIR /app
#set CI variable to true
ENV CI=true
COPY --from=builder /app/out/json/ .
COPY --from=builder /app/out/yarn.lock ./yarn.lock
RUN yarn install --immutable

FROM node:16-alpine AS sourcer
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
WORKDIR /app
COPY --from=installer /app/ .
COPY --from=builder /app/out/full/ .
COPY .gitignore .gitignore
RUN NODE_ENV=production yarn turbo run build --scope=${APP_SCOPE} --include-dependencies --no-deps

EXPOSE 3000
CMD yarn turbo run start --scope=${APP_SCOPE}

I've fixed it with:

FROM node:16-alpine AS builder
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
# Set working directory
WORKDIR /app
RUN yarn global add turbo
COPY . .
RUN turbo prune --scope=${APP_SCOPE} --docker

# Add lockfile and package.json's of isolated subworkspace
FROM node:16-alpine AS installer
RUN apk update && apk add --no-cache git
WORKDIR /app
#set CI variable to true
ENV CI=true
COPY --from=builder /app/out/json/ .
COPY --from=builder /app/out/yarn.lock ./yarn.lock
# Add turbo.json from the root
COPY --from=builder /app/turbo.json ./turbo.json
RUN yarn install --immutable

FROM node:16-alpine AS sourcer
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
WORKDIR /app
COPY --from=installer /app/ .
COPY --from=builder /app/out/full/ .
COPY .gitignore .gitignore
RUN NODE_ENV=production yarn turbo run build --scope=${APP_SCOPE} --include-dependencies --no-deps

EXPOSE 3000
CMD yarn turbo run start --scope=${APP_SCOPE}

Expected Behavior

The turbo.json file should be included in the out/ or out/json/ folder.

Then my Dockerfile would look like this:

FROM node:16-alpine AS builder
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
# Set working directory
WORKDIR /app
RUN yarn global add turbo
COPY . .
RUN turbo prune --scope=${APP_SCOPE} --docker

# Add lockfile and package.json's of isolated subworkspace
FROM node:16-alpine AS installer
RUN apk update && apk add --no-cache git
WORKDIR /app
#set CI variable to true
ENV CI=true
COPY --from=builder /app/out/json/ .
COPY --from=builder /app/out/turbo.json ./turbo.json
COPY --from=builder /app/out/yarn.lock ./yarn.lock
RUN yarn install --immutable

FROM node:16-alpine AS sourcer
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
WORKDIR /app
COPY --from=installer /app/ .
COPY --from=builder /app/out/full/ .
COPY .gitignore .gitignore
RUN NODE_ENV=production yarn turbo run build --scope=${APP_SCOPE} --include-dependencies --no-deps

EXPOSE 3000
CMD yarn turbo run start --scope=${APP_SCOPE}

Or the Dockerfile could stay just as it was already:

FROM node:16-alpine AS builder
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
# Set working directory
WORKDIR /app
RUN yarn global add turbo
COPY . .
RUN turbo prune --scope=${APP_SCOPE} --docker

# Add lockfile, turbo.json and package.json's of isolated subworkspace
FROM node:16-alpine AS installer
RUN apk update && apk add --no-cache git
WORKDIR /app
#set CI variable to true
ENV CI=true
COPY --from=builder /app/out/json/ .
COPY --from=builder /app/out/yarn.lock ./yarn.lock
RUN yarn install --immutable

FROM node:16-alpine AS sourcer
ARG APP_SCOPE
ENV APP_SCOPE ${APP_SCOPE}
RUN apk update && apk add --no-cache git
WORKDIR /app
COPY --from=installer /app/ .
COPY --from=builder /app/out/full/ .
COPY .gitignore .gitignore
RUN NODE_ENV=production yarn turbo run build --scope=${APP_SCOPE} --include-dependencies --no-deps

EXPOSE 3000
CMD yarn turbo run start --scope=${APP_SCOPE}

To Reproduce

  1. Init a new Turborepo project
  2. Run turbo prune --scope=web
  3. There's no turbo.json file in the out/ or out/json/ folder
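
For example (a rough transcript, assuming the default create-turbo starter, which ships a web app):

npx create-turbo@latest my-turborepo
cd my-turborepo
npx @turbo/codemod create-turbo-config   # only if the turbo config still lives in package.json
npx turbo prune --scope=web
ls out        # no turbo.json here
ls out/json   # ...and none here either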

jakst commented Jan 28, 2022

To add to this issue, for Yarn v2/v3 the following also need to be included (a sketch of the extra Dockerfile lines follows the list):

  • The .yarnrc.yml file and the .yarn folder (not sure whether the cache files and install-state.gz inside the .yarn folder should be included, though)
  • The workspace entries in the lockfile: the actual workspaces are pruned away, but they're needed, or Yarn will complain that your workspace is missing. They look like "my-app@workspace:apps/my-app": in the lockfile, and with them present you can run commands like yarn workspace my-app build.
  • The monorepo deps in the root package.json. Otherwise yarn install won't run.
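
In the installer stage of the Dockerfiles above, that would mean something like the following extra COPY lines. This is only a sketch assuming a default Yarn Berry layout; drop any line whose path you don't actually have:

# Hypothetical additions for Yarn v2/v3 (Berry); adjust paths to your repo
COPY --from=builder /app/.yarnrc.yml ./.yarnrc.yml
COPY --from=builder /app/.yarn/releases ./.yarn/releases
COPY --from=builder /app/.yarn/plugins ./.yarn/plugins
# The root package.json is already part of out/json/, but the workspace
# entries in yarn.lock still have to survive the prune for install to work.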
