[Bug] Print PDF in local self-hosted version not working #1623

Open
Doreapp opened this issue Nov 29, 2023 · 17 comments

Labels: bug (Something isn't working), needs triage (Issues that need to be triaged), v4 (Issues related to the latest version)

Comments


Doreapp commented Nov 29, 2023

Is there an existing issue for this?

  • Yes, I have searched the existing issues and none of them match my problem.

Product Variant

Self-Hosted

Current Behavior

I have a resume; when I press the "Download PDF" button, it opens a new tab at about:blank and nothing is downloaded.

Expected Behavior

The resume should be downloaded as a PDF.

Steps To Reproduce

Here is the full docker-compose configuration that I use to test the app:

docker-compose.yml
version: "3.8"

# In this Docker Compose example, it assumes that you maintain a reverse proxy externally (or chose not to).
# The only two exposed ports here are from minio (:9000) and the app itself (:3000).
# If these ports are changed, ensure that the env vars passed to the app are also changed accordingly.

services:
  # Database (Postgres)
  postgres:
    image: postgres:15-alpine
    restart: unless-stopped
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d postgres"]
      interval: 10s
      timeout: 5s
      retries: 5

  # Storage (for image uploads)
  minio:
    image: minio/minio
    restart: unless-stopped
    command: server --console-address ":9001" /data
    ports:
      - 9000:9000
      - 9001:9001
    volumes:
      - minio_data:/data
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin

  # Chrome Browser (for printing and previews)
  chrome:
    image: browserless/chrome:1.61.0-puppeteer-21.4.1
    restart: unless-stopped
    environment:
      TOKEN: chrome_token
      EXIT_ON_HEALTH_FAILURE: true
      PRE_REQUEST_HEALTH_CHECK: true

  # Redis (for cache & server session management)
  redis:
    image: redis:alpine
    restart: unless-stopped
    command: redis-server --requirepass password

  app:
    image: amruthpillai/reactive-resume:latest
    restart: unless-stopped
    ports:
      - 3000:3000
    depends_on:
      - postgres
      - minio
      - redis
      - chrome
    environment:
      # -- Environment Variables --
      PORT: 3000
      NODE_ENV: production

      # -- URLs --
      PUBLIC_URL: http://localhost:3000/
      PUBLIC_SERVER_URL: https://localhost:3000/api
      STORAGE_URL: http://localhost:9000/default

      # -- Printer (Chrome) --
      CHROME_TOKEN: chrome_token
      CHROME_URL: ws://chrome:3000

      # -- Database (Postgres) --
      DATABASE_URL: postgresql://postgres:postgres@postgres:5432/postgres

      # -- Auth --
      ACCESS_TOKEN_SECRET: access_token_secret
      REFRESH_TOKEN_SECRET: refresh_token_secret

      # -- Emails --
      MAIL_FROM: noreply@localhost
      # SMTP_URL: smtp://user:pass@smtp:587 # Optional

      # -- Storage (Minio) --
      STORAGE_ENDPOINT: minio
      STORAGE_PORT: 9000
      STORAGE_REGION: us-east-1 # Optional
      STORAGE_BUCKET: default
      STORAGE_ACCESS_KEY: minioadmin
      STORAGE_SECRET_KEY: minioadmin
      STORAGE_USE_SSL: false

      # -- Cache (Redis) --
      REDIS_URL: redis://default:password@redis:6379

      # -- Sentry --
      # VITE_SENTRY_DSN: https://id.sentry.io # Optional

      # -- Crowdin (Optional) --
      # CROWDIN_PROJECT_ID:
      # CROWDIN_PERSONAL_TOKEN:

      # -- Email (Optional) --
      # DISABLE_EMAIL_AUTH: true
      # VITE_DISABLE_SIGNUPS: true

      # -- GitHub (Optional) --
      GITHUB_CLIENT_ID: github_client_id
      GITHUB_CLIENT_SECRET: github_client_secret
      GITHUB_CALLBACK_URL: http://localhost:3000/api/auth/github/callback

      # -- Google (Optional) --
      GOOGLE_CLIENT_ID: google_client_id
      GOOGLE_CLIENT_SECRET: google_client_secret
      GOOGLE_CALLBACK_URL: http://localhost:3000/api/auth/google/callback

volumes:
  minio_data:
  postgres_data:
  • Run docker-compose up -d
  • Create a resume
  • Press the "Download PDF" button

What browsers are you seeing the problem on?

Firefox

What template are you using?

None

Anything else?

In the logs, I get this when pressing the "Download PDF" button:

reactive-resume-app-1       | Trace: Error: net::ERR_NAME_NOT_RESOLVED at http://host.docker.internal:3000//artboard/preview
reactive-resume-app-1       |     at navigate (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/cdp/Frame.js:176:27)
reactive-resume-app-1       |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
reactive-resume-app-1       |     at async Deferred.race (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/util/Deferred.js:83:20)
reactive-resume-app-1       |     at async CdpFrame.goto (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/cdp/Frame.js:142:25)
reactive-resume-app-1       |     at async CdpPage.goto (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/api/Page.js:595:20)
reactive-resume-app-1       |     at async PrinterService.generateResume (/app/dist/apps/server/main.js:13176:13)
reactive-resume-app-1       |     at async /app/dist/apps/server/main.js:13120:25
reactive-resume-app-1       |     at async UtilsService.getCachedOrSet (/app/dist/apps/server/main.js:11886:23)
reactive-resume-app-1       |     at async ResumeService.printResume (/app/dist/apps/server/main.js:13965:21)
reactive-resume-app-1       |     at async ResumeController.printResume (/app/dist/apps/server/main.js:13580:25)
reactive-resume-app-1       |     at PrinterService.generateResume (/app/dist/apps/server/main.js:13226:21)
reactive-resume-app-1       |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
reactive-resume-app-1       |     at async /app/dist/apps/server/main.js:13120:25
reactive-resume-app-1       |     at async UtilsService.getCachedOrSet (/app/dist/apps/server/main.js:11886:23)
reactive-resume-app-1       |     at async ResumeService.printResume (/app/dist/apps/server/main.js:13965:21)
reactive-resume-app-1       |     at async ResumeController.printResume (/app/dist/apps/server/main.js:13580:25)

After a quick search in the code, I think this is due to this line, where localhost is replaced by host.docker.internal.

IMPORTANT NOTE: I tried changing SERVER_URL to http://localhost:3000 (i.e. without the trailing slash), but I still get Error: net::ERR_NAME_NOT_RESOLVED at http://host.docker.internal:3000/artboard/preview

Edit

After changing the localhost occurrences in docker-compose.yml to 127.0.0.1, I don't get the error anymore (I think it just bypassed the issue), but I get a new one... with the exact same behavior for the end user. Here are the logs:

reactive-resume-chrome-1    | 2023-11-29T18:05:39.172Z browserless:chrome-helper Setting up file:// protocol request rejection
reactive-resume-app-1       | Trace: Error [TypeError]: Cannot read properties of null (reading 'cloneNode')
reactive-resume-app-1       |     at evaluate (evaluate at processPage (/app/dist/apps/server/main.js:13182:45), <anonymous>:1:50)
reactive-resume-app-1       |     at #evaluate (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/cdp/ExecutionContext.js:229:55)
reactive-resume-app-1       |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
reactive-resume-app-1       |     at async ExecutionContext.evaluate (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/cdp/ExecutionContext.js:126:16)
reactive-resume-app-1       |     at async IsolatedWorld.evaluate (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/cdp/IsolatedWorld.js:128:16)
reactive-resume-app-1       |     at async CdpFrame.evaluate (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/api/Frame.js:363:20)
reactive-resume-app-1       |     at async CdpPage.evaluate (/app/node_modules/.pnpm/puppeteer-core@21.5.2/node_modules/puppeteer-core/lib/cjs/puppeteer/api/Page.js:744:20)
reactive-resume-app-1       |     at async processPage (/app/dist/apps/server/main.js:13182:34)
reactive-resume-app-1       |     at async PrinterService.generateResume (/app/dist/apps/server/main.js:13195:17)
reactive-resume-app-1       |     at async /app/dist/apps/server/main.js:13120:25
reactive-resume-app-1       |     at PrinterService.generateResume (/app/dist/apps/server/main.js:13226:21)
reactive-resume-app-1       |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
reactive-resume-app-1       |     at async /app/dist/apps/server/main.js:13120:25
reactive-resume-app-1       |     at async UtilsService.getCachedOrSet (/app/dist/apps/server/main.js:11886:23)
reactive-resume-app-1       |     at async ResumeService.printResume (/app/dist/apps/server/main.js:13965:21)
reactive-resume-app-1       |     at async ResumeController.printResume (/app/dist/apps/server/main.js:13580:25)
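
For reference, the localhost → 127.0.0.1 change described in the edit above amounts to roughly this in the compose file (a sketch of two of the URL-bearing variables; the same swap applies to the others):

    environment:
      # 127.0.0.1 instead of localhost, so the printer no longer rewrites the
      # host to host.docker.internal (see the code reference above)
      PUBLIC_URL: http://127.0.0.1:3000/
      STORAGE_URL: http://127.0.0.1:9000/default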
Doreapp added the bug, needs triage, and v4 labels on Nov 29, 2023

Doreapp (Author) commented Nov 29, 2023

Note:
I managed to reverse-engineer a workaround to get a PDF of my resume. Here are the steps for anyone with the same issue:

  1. Save your resume in JSON

  2. Open http://localhost:3000/artboard/preview

  3. Execute

    localStorage.setItem('resume', JSON.stringify(<THE-JSON-DATA-OF-YOUR-RESUME>))

    in the browser console

  4. Print the page, with no margins and background printing enabled

Note: Don't forget to clear the local storage afterwards (localStorage.clear()) to avoid bugs in the resume builder app


dal3ir commented Dec 1, 2023

You need to put the bucket name in the storage URL.

example: STORAGE_URL: http://localhost:9000/default

source : https://www.reddit.com/r/selfhosted/comments/182n5th/comment/kapf8wg/

Doreapp (Author) commented Dec 1, 2023

You need to put the bucket name in the storage URL.

example: STORAGE_URL: http://localhost:9000/default

source : https://www.reddit.com/r/selfhosted/comments/182n5th/comment/kapf8wg/

This is already the case; check the docker-compose.yml file above.


dal3ir commented Dec 2, 2023

This is my compose file:

version: "3.8"

# In this Docker Compose example, it assumes that you maintain a reverse proxy externally (or chose not to).
# The only two exposed ports here are from minio (:9000) and the app itself (:3000).
# If these ports are changed, ensure that the env vars passed to the app are also changed accordingly.

services:
  # Database (Postgres)
  postgres:
    image: postgres:15-alpine
    restart: unless-stopped
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d postgres"]
      interval: 10s
      timeout: 5s
      retries: 5

  # Storage (for image uploads)
  minio:
    image: minio/minio
    restart: unless-stopped
    command: server /data
    ports:
      - 9000:9000
    volumes:
      - minio_data:/data
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin

  # Chrome Browser (for printing and previews)
  chrome:
    image: browserless/chrome:1.61.0-puppeteer-21.4.1
    restart: unless-stopped
    environment:
      TOKEN: chrome_token
      EXIT_ON_HEALTH_FAILURE: true
      PRE_REQUEST_HEALTH_CHECK: true

  # Redis (for cache & server session management)
  redis:
    image: redis:alpine
    restart: unless-stopped
    command: redis-server --requirepass password

  app:
    image: amruthpillai/reactive-resume:latest
    restart: unless-stopped
    ports:
      - 80:80
    depends_on:
      - postgres
      - minio
      - redis
      - chrome
    environment:
      # -- Environment Variables --
      PORT: 80
      NODE_ENV: production

      # -- URLs --
      PUBLIC_URL: http://localhost
      STORAGE_URL: http://localhost:9000/default

      # -- Printer (Chrome) --
      CHROME_TOKEN: chrome_token
      CHROME_URL: ws://chrome:3000

      # -- Database (Postgres) --
      DATABASE_URL: postgresql://postgres:postgres@postgres:5432/postgres
      # URL format: postgresql://username:password@host:port/database

      # -- Auth --
      ACCESS_TOKEN_SECRET: access_token_secret
      REFRESH_TOKEN_SECRET: refresh_token_secret

      # -- Emails --
      MAIL_FROM: noreply@localhost
      # SMTP_URL: smtp://user:pass@smtp:587 # Optional

      # -- Storage (Minio) --
      STORAGE_ENDPOINT: minio
      STORAGE_PORT: 9000
      STORAGE_REGION: us-east-1 # Optional
      STORAGE_BUCKET: default
      STORAGE_ACCESS_KEY: minioadmin
      STORAGE_SECRET_KEY: minioadmin
      STORAGE_USE_SSL: false

      # -- Cache (Redis) --
      REDIS_URL: redis://default:password@redis:6379

      GITHUB_CLIENT_ID: github_client_id
      GITHUB_CLIENT_SECRET: github_client_secret
      GITHUB_CALLBACK_URL: http://localhost:3000/api/auth/github/callback

      # -- Google (Optional) --
      GOOGLE_CLIENT_ID: google_client_id
      GOOGLE_CLIENT_SECRET: google_client_secret
      GOOGLE_CALLBACK_URL: http://localhost:3000/api/auth/google/callback

volumes:
  minio_data:
  postgres_data:

It works perfectly for me, but one thing I noticed was that whenever I was connected to a VPN, the site on http://localhost seemed to work but wasn't able to download the PDFs for some reason; turning the VPN off fixes this for me.

Doreapp (Author) commented Dec 2, 2023

OK, thanks for your answer; it looks like the same behavior for me. I guess something is wrong with the use of browserless when running locally while not in dev mode.

I'll soon retry behind a reverse proxy on a remote server and get back here.


fulsiram commented Dec 15, 2023

I have faced exactly the same issue today, adding extra_hosts: host.docker.internal:host-gateway to the chrome service seems to resolve the issue:

   chrome:
     image: browserless/chrome:1.61.0-puppeteer-21.4.1
     restart: unless-stopped
+    extra_hosts:
+      - host.docker.internal:host-gateway
     environment:
       TOKEN: chrome_token
       EXIT_ON_HEALTH_FAILURE: true
       PRE_REQUEST_HEALTH_CHECK: true
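
For reference, applying that change to the chrome service from the compose file at the top of this issue would look roughly like this (a sketch; everything except extra_hosts is unchanged):

  chrome:
    image: browserless/chrome:1.61.0-puppeteer-21.4.1
    restart: unless-stopped
    extra_hosts:
      # map host.docker.internal to the Docker host's gateway address so the
      # headless Chrome container can reach the app exposed on the host
      - host.docker.internal:host-gateway
    environment:
      TOKEN: chrome_token
      EXIT_ON_HEALTH_FAILURE: true
      PRE_REQUEST_HEALTH_CHECK: true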


RaidOpe commented Dec 17, 2023

I have faced exactly the same issue today, adding extra_hosts: host.docker.internal:host-gateway to the chrome service seems to resolve the issue:

   chrome:
     image: browserless/chrome:1.61.0-puppeteer-21.4.1
     restart: unless-stopped
+    extra_hosts:
+      - host.docker.internal:host-gateway
     environment:
       TOKEN: chrome_token
       EXIT_ON_HEALTH_FAILURE: true
       PRE_REQUEST_HEALTH_CHECK: true

Well, my environment is self-hosted, headless, with LAN access. The app opens normally and I can get into the dashboard. If the code you mentioned is not added, "about:blank" is opened. If it is added, the tab turns black and displays {"info":"Invalid request!"}. Exporting the JSON file is OK.

chrome-1 log:

2023-12-17T03:16:07.426Z browserless:chrome-helper Chrome PID: 266
2023-12-17T03:16:07.427Z browserless:chrome-helper Finding prior pages
2023-12-17T03:16:07.984Z browserless:chrome-helper Found 1 pages
2023-12-17T03:16:07.984Z browserless:chrome-helper Setting up page Unknown
2023-12-17T03:16:07.984Z browserless:chrome-helper Injecting download dir "/usr/src/app/workspace"
2023-12-17T03:16:07.986Z browserless:system Chrome launched 2135ms
2023-12-17T03:16:07.986Z browserless:system Got chrome instance
2023-12-17T03:16:07.986Z browserless:job VXO6FLYWTWXM3Y1XMAFWO61BGPAE0CHW: Starting session.
2023-12-17T03:16:07.987Z browserless:job VXO6FLYWTWXM3Y1XMAFWO61BGPAE0CHW: Proxying request to /devtools/browser route: ws://127.0.0.1:33463/devtools/browser/158430b7-0af6-451f-9cf2-5dc6bbb99817.
2023-12-17T03:16:07.997Z browserless:chrome-helper Setting up file:// protocol request rejection
2023-12-17T03:16:08.328Z browserless:chrome-helper Setting up page Unknown
2023-12-17T03:16:08.329Z browserless:chrome-helper Injecting download dir "/usr/src/app/workspace"

The pop-up URL is http://localhost:10000/default/clq88uc3q000010wd1s05hdjl/resumes/clq88xnkp000310wd07cxohpb.pdf

Any thoughts?

..... Is that a bug? If I manually change 'localhost' to the IP of the self-hosted machine running the service, it starts downloading.

Hope this helps some people who are confused. Now I've got to figure out why the pictures disappear.

@fulsiram

The pop-up URL is http://localhost:10000/default/clq88uc3q000010wd1s05hdjl/resumes/clq88xnkp000310wd07cxohpb.pdf
If I manually change 'localhost' to the IP of the self-hosted machine running the service, it starts downloading.

Change STORAGE_URL in the app's environment variables to your remote server address.
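
For example (a sketch; 192.168.1.50 is a placeholder for the address of the machine running MinIO):

    environment:
      # placeholder address; use the LAN IP or hostname of the host running MinIO
      STORAGE_URL: http://192.168.1.50:9000/default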

@octavian-negru

Here's a proposed fix: #1676


Weav3r commented Feb 6, 2024

Had the same issue; try this: #1754 (comment)

@IsitoloMfaona

I have faced exactly the same issue today, adding extra_hosts: host.docker.internal:host-gateway to the chrome service seems to resolve the issue:

   chrome:
     image: browserless/chrome:1.61.0-puppeteer-21.4.1
     restart: unless-stopped
+    extra_hosts:
+      - host.docker.internal:host-gateway
     environment:
       TOKEN: chrome_token
       EXIT_ON_HEALTH_FAILURE: true
       PRE_REQUEST_HEALTH_CHECK: true

It worked for me.
But I had to change the address of docker0 from 192.168.1.1 to 172.0.0.1, the same range as the app container.
Info here: https://stackoverflow.com/questions/52225493/change-default-docker0-bridge-ip-address
thx =)

@BloodyIron

172.0.0.1

Are you SURE that is the IP you used????

@BloodyIron

I cannot fathom why, but switching from localhost to 127.0.0.1 in my Kubernetes YAML declaration of the env variables for the reactive-resume container seems to have solved PDF generation and download for me... The lack of documentation for self-hosting just blows me away, and I've been scraping the internet hard for about half a week trying to fix this. Thanks to the original poster for that insight, because for the life of me that solution seems to be nowhere else on the internet. Thanks eStranger!
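
For anyone else on Kubernetes, the change is just the env values on the container spec, roughly like this (a sketch; variable names taken from the compose examples above, ports assumed to match them):

        env:
          - name: PUBLIC_URL
            value: "http://127.0.0.1:3000"
          - name: STORAGE_URL
            value: "http://127.0.0.1:9000/default"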

AashishSinghal (Contributor) commented May 12, 2024

I have found that PDF generation starts to work once the application has been built. When I had not built the application even once, the resume thumbnails were not rendering; instead they showed a JSON output saying there was no such directory at "dist/apps/client/index.html".
Can someone give it a try after building the application?
Once the application is built, the preview/generation of the resume works as expected.

@friishon

Any solution?
I'm having the same problem: the PDF is not generated and I only get about:blank.

I'm running it inside Portainer on a NAS, without opening a port on my router; I just want to use it locally.


dal3ir commented May 24, 2024

I was able to find the solution. It will work well if you are trying to access it locally, because

- PUBLIC_URL: http://localhost:3000 
- STORAGE_URL: http://localhost:9000/default

point to localhost only. If you are hosting it on a server and it can be resolved locally, you can change localhost to your server's IP.

If you are trying to host it on a VPS, as was my case, you will have to use a reverse proxy to point the domains to the internal IP and port.
For example:

- PUBLIC_URL: http://mydomain points to IP:3000 
- STORAGE_URL: http://minio.mydomain points to IP:9000/default

Or you can install a vpn like tailscale to only allow personal access to the site

- PUBLIC_URL: http://TailscaleIP:3000 
- STORAGE_URL: http://TailscaleIP:9000/default
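
Concretely, the app's environment for either option would look roughly like this (a sketch; mydomain, minio.mydomain and TailscaleIP are placeholders, and whether the bucket path sits in the URL or in the proxy rule depends on your setup):

    environment:
      # reverse-proxy variant: the proxy forwards mydomain -> IP:3000
      # and minio.mydomain -> IP:9000
      PUBLIC_URL: http://mydomain
      STORAGE_URL: http://minio.mydomain/default
      # Tailscale variant (instead of the two lines above):
      # PUBLIC_URL: http://TailscaleIP:3000
      # STORAGE_URL: http://TailscaleIP:9000/default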

I'm using the simple.yml file and I just gave the containers a name so that they always keep the same name (it might not work if you already have containers with the same name running, so change the values in the file accordingly). I also created two externally managed networks with 'docker network create NetworkName'.

docker-compose.yml
version: "3.8"

# In this Docker Compose example, it assumes that you maintain a reverse proxy externally (or chose not to).
# The only two exposed ports here are from minio (:9000) and the app itself (:3000).
# If these ports are changed, ensure that the env vars passed to the app are also changed accordingly.

services:
  # Database (Postgres)
  postgres:
    image: postgres:16-alpine
    container_name: postgres
    restart: unless-stopped
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d postgres"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - backend

  # Storage (for image uploads)
  minio:
    image: minio/minio
    container_name: minio
    restart: unless-stopped
    command: server /data
    ports:
      - "9000:9000"
    volumes:
      - minio_data:/data
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    networks:
      - backend

  # Chrome Browser (for printing and previews)
  chrome:
    image: ghcr.io/browserless/chromium:latest
    container_name: chrome
    restart: unless-stopped
    environment:
      TIMEOUT: 10000
      CONCURRENT: 10
      TOKEN: chrome_token
      EXIT_ON_HEALTH_FAILURE: true
      PRE_REQUEST_HEALTH_CHECK: true
    networks:
      - backend

  app:
    image: amruthpillai/reactive-resume:latest
    container_name: reactive-resume
    restart: unless-stopped
    ports:
      - "3000:3000"
    depends_on:
      - postgres
      - minio
      - chrome
    networks:
      - backend
      - frontend
    environment:
      # -- Environment Variables --
      PORT: 3000
      NODE_ENV: production

      # -- URLs --
      PUBLIC_URL: http://localhost:3000
      STORAGE_URL: http://localhost:9000/default

      # -- Printer (Chrome) --
      CHROME_TOKEN: chrome_token
      CHROME_URL: ws://chrome:3000

      # -- Database (Postgres) --
      DATABASE_URL: postgresql://postgres:postgres@postgres:5432/postgres

      # -- Auth --
      ACCESS_TOKEN_SECRET: access_token_secret
      REFRESH_TOKEN_SECRET: refresh_token_secret

      # -- Emails --
      MAIL_FROM: noreply@localhost
      # SMTP_URL: smtp://user:pass@smtp:587 # Optional

      # -- Storage (Minio) --
      STORAGE_ENDPOINT: minio
      STORAGE_PORT: 9000
      STORAGE_REGION: us-east-1 # Optional
      STORAGE_BUCKET: default
      STORAGE_ACCESS_KEY: minioadmin
      STORAGE_SECRET_KEY: minioadmin
      STORAGE_USE_SSL: false

      # -- Crowdin (Optional) --
      # CROWDIN_PROJECT_ID:
      # CROWDIN_PERSONAL_TOKEN:

      # -- Email (Optional) --
      # DISABLE_EMAIL_AUTH: true
      # VITE_DISABLE_SIGNUPS: true
      # SKIP_STORAGE_BUCKET_CHECK: false

      # -- GitHub (Optional) --
      # GITHUB_CLIENT_ID: github_client_id
      # GITHUB_CLIENT_SECRET: github_client_secret
      # GITHUB_CALLBACK_URL: http://localhost:3000/api/auth/github/callback

      # -- Google (Optional) --
      # GOOGLE_CLIENT_ID: google_client_id
      # GOOGLE_CLIENT_SECRET: google_client_secret
      # GOOGLE_CALLBACK_URL: http://localhost:3000/api/auth/google/callback

volumes:
  minio_data:
  postgres_data:

networks:
  frontend:
    external: true
  backend:
    external: true


friishon commented May 29, 2024

I was able to find the solution. It will work well if you are trying to access it locally, because

- PUBLIC_URL: http://localhost:3000 
- STORAGE_URL: http://localhost:9000/default

point to localhost only. If you are hosting it on a server and it can be resolved locally, you can change localhost to your server's IP.

If you are trying to host it on a VPS, as was my case, you will have to use a reverse proxy to point the domains to the internal IP and port. For example:

- PUBLIC_URL: http://mydomain points to IP:3000 
- STORAGE_URL: http://minio.mydomain points to IP:9000/default

Or you can install a vpn like tailscale to only allow personal access to the site

- PUBLIC_URL: http://TailscaleIP:3000 
- STORAGE_URL: http://TailscaleIP:9000/default

I'm using the simple.yml file and I just gave the containers a name so that they always keep the same name (it might not work if you already have containers with the same name running, so change the values in the file accordingly). I also created two externally managed networks with 'docker network create NetworkName'.

docker-compose.yml

Didn't work for me

This is the docker compose I'm using

version: "3.9"
services:
  db:
    image: postgres:16
    container_name: Resume-DB
    hostname: resume-db
    security_opt:
      - no-new-privileges:true
    healthcheck:
      test: ["CMD", "pg_isready", "-q", "-d", "resume", "-U", "resumeuser"]
      timeout: 45s
      interval: 10s
      retries: 10
    volumes:
      - /volume1/docker/rxv4/db:/var/lib/postgresql/data:rw
    environment:
      POSTGRES_DB: resume
      POSTGRES_USER: resumeuser
      POSTGRES_PASSWORD: resumepass
    restart: on-failure:5


  minio:
    image: minio/minio:latest
    command: server /data
    container_name: Resume-MINIO
    hostname: resume-minio
    mem_limit: 4g
    cpu_shares: 768
    security_opt:
      - no-new-privileges:true
    user: UID:GID
    healthcheck:
      test: ["CMD", "mc", "ready", "local"]
      interval: 5s
      timeout: 5s
      retries: 5
    ports:
      - 9000:9000
    volumes:
      - /volume1/docker/rxv4/data:/data:rw
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: miniopass
    restart: on-failure:5

  chrome:
    image: ghcr.io/browserless/chromium:latest
    container_name: Resume-CHROME
    hostname: chrome
    mem_limit: 4g
    cpu_shares: 768
    user: UID:GID
    environment:
      TOKEN: chrome_token
      EXIT_ON_HEALTH_FAILURE: false
      PRE_REQUEST_HEALTH_CHECK: false
    restart: on-failure:5
    
  resume:
    image: amruthpillai/reactive-resume:latest
    container_name: Resume-ACCESS
    hostname: resume
    restart: on-failure:5
    mem_limit: 6g
    cpu_shares: 768
    security_opt:
      - no-new-privileges:true
    ports:
      - 3000:3000
    environment:
      PORT: 3000
      NODE_ENV: production
      ACCESS_TOKEN_SECRET: access_token_secret
      REFRESH_TOKEN_SECRET: refresh_token_secret
      PUBLIC_URL: http://localhost:3000 
      STORAGE_URL: http://localhost:9000/default 
      CHROME_TOKEN: chrome_token
      CHROME_URL: ws://chrome:3000
      DATABASE_URL: postgresql://resumeuser:resumepass@resume-db:5432/resume
      STORAGE_ENDPOINT: minio
      STORAGE_PORT: 9000
      STORAGE_REGION: us-east-1
      STORAGE_BUCKET: default
      STORAGE_ACCESS_KEY: minioadmin
      STORAGE_SECRET_KEY: miniopass
      STORAGE_USE_SSL: false
      DISABLE_EMAIL_AUTH: false
      VITE_DISABLE_SIGNUPS: false
    depends_on:
      db:
        condition: service_healthy
      minio:
        condition: service_healthy
      chrome:
        condition: service_started
