
404 page not found #416

Closed
sylven opened this issue Jan 20, 2020 · 1 comment

Comments


sylven commented Jan 20, 2020

Codebase
mounted codebase

Describe your issue
I left my containers running on Thursday, and my computer forced a restart because of a failure.
Today I started the containers again, and every URL returns '404 page not found'.

I've tried changing PROJECT_BASE_URL, updating /etc/hosts, and using different ports for Traefik, with no luck.
I also tried updating the .env versions from the latest commit, but that didn't help either.
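With Traefik v2, a blanket '404 page not found' usually means no router matched the request's Host header, so a useful first check is whether Traefik registered any routers from the container labels at all. A minimal sketch, assuming the traefik service from the docker-compose.yml pasted below (the dashboard port is commented out there and would need to be uncommented):

```yaml
  traefik:
    image: traefik:v2.0
    container_name: "${PROJECT_NAME}_traefik"
    # --api.insecure=true enables the API/dashboard on port 8080
    command: --api.insecure=true --providers.docker
    ports:
      - '8000:80'
      - '8080:8080' # API: http://localhost:8080/api/http/routers
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```

With that port exposed, requesting http://localhost:8080/api/http/routers should list a router for the apache service. If the list is empty, Traefik is not seeing the labels (or cannot reach the Docker socket); if the router is present, the 404 points at the Host rule not matching the hostname being requested — e.g. `curl -H 'Host: mysite.docker.localhost' http://localhost:8000/` should return the site.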

Output of docker info

Client:
 Debug Mode: false

Server:
 Containers: 17
  Running: 8
  Paused: 0
  Stopped: 9
 Images: 50
 Server Version: 19.03.5
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Native Overlay Diff: true
 Logging Driver: json-file
 Cgroup Driver: cgroupfs
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: b34a5c8af56e510852c35414db4c1f4fa6172339
 runc version: 3e425f80a8c931f88e6d94a8c831b9d5aa481657
 init version: fec3683
 Security Options:
  seccomp
   Profile: default
 Kernel Version: 4.9.184-linuxkit
 Operating System: Docker Desktop
 OSType: linux
 Architecture: x86_64
 CPUs: 2
 Total Memory: 1.952GiB
 Name: docker-desktop
 ID: 7RQO:YSJA:66BV:5KXB:JZTH:BRUV:76BP:36GT:XLTI:AAG6:WGAE:XDTE
 Docker Root Dir: /var/lib/docker
 Debug Mode: true
  File Descriptors: 104
  Goroutines: 139
  System Time: 2020-01-20T12:49:20.4953816Z
  EventsListeners: 4
 HTTP Proxy: gateway.docker.internal:3128
 HTTPS Proxy: gateway.docker.internal:3129
 Registry: https://index.docker.io/v1/
 Labels:
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false
 Product License: Community Engine

Contents of your docker-compose.yml

version: "3"

services:
  mariadb:
    image: wodby/mariadb:$MARIADB_TAG
    container_name: "${PROJECT_NAME}_mariadb"
    stop_grace_period: 30s
    environment:
      MYSQL_ROOT_PASSWORD: $DB_ROOT_PASSWORD
      MYSQL_DATABASE: $DB_NAME
      MYSQL_USER: $DB_USER
      MYSQL_PASSWORD: $DB_PASSWORD
#    volumes:
#      - ./mariadb-init:/docker-entrypoint-initdb.d # Place init .sql file(s) here.
#      - /path/to/mariadb/data/on/host:/var/lib/mysql # Use bind mount

  php:
    image: wodby/drupal-php:$PHP_TAG
    container_name: "${PROJECT_NAME}_php"
    environment:
      PHP_SENDMAIL_PATH: /usr/sbin/sendmail -t -i -S mailhog:1025
#      PHP_SENDMAIL_PATH: /usr/sbin/sendmail -t -i -S opensmtpd:25
      DB_HOST: $DB_HOST
      DB_PORT: $DB_PORT
      DB_USER: $DB_USER
      DB_PASSWORD: $DB_PASSWORD
      DB_NAME: $DB_NAME
      DB_DRIVER: $DB_DRIVER
      PHP_FPM_USER: wodby
      PHP_FPM_GROUP: wodby
      COLUMNS: 80 # Set 80 columns for docker exec -it.
## Read instructions at https://wodby.com/docs/stacks/php/local/#xdebug
#      PHP_XDEBUG: 1
#      PHP_XDEBUG_DEFAULT_ENABLE: 1
#      PHP_XDEBUG_REMOTE_CONNECT_BACK: 0
#      PHP_IDE_CONFIG: serverName=my-ide
#      PHP_XDEBUG_IDEKEY: "my-ide"
#      PHP_XDEBUG_REMOTE_HOST: host.docker.internal # Docker 18.03+ Mac/Win
#      PHP_XDEBUG_REMOTE_HOST: 172.17.0.1 # Linux
#      PHP_XDEBUG_REMOTE_HOST: 10.254.254.254 # macOS, Docker < 18.03
#      PHP_XDEBUG_REMOTE_HOST: 10.0.75.1 # Windows, Docker < 18.03
#      PHP_XDEBUG_REMOTE_LOG: /tmp/php-xdebug.log
## PHPUnit Drupal testing configurations
#      SIMPLETEST_BASE_URL: "http://nginx"
#      SIMPLETEST_DB: "${DB_DRIVER}://${DB_USER}:${DB_PASSWORD}@${DB_HOST}/${DB_NAME}#tests_"
#      MINK_DRIVER_ARGS_WEBDRIVER: '["chrome", {"browserName":"chrome","goog:chromeOptions":{"args":["--disable-gpu","--headless"]}}, "http://chrome:9515"]'

    volumes:
#      - ./:/var/www/html
## For macOS users (https://wodby.com/docs/stacks/drupal/local#docker-for-mac)
     - ./:/var/www/html:cached # User-guided caching
#      - docker-sync:/var/www/html # Docker-sync
## For XHProf and Xdebug profiler traces
#      - files:/mnt/files

#  nginx:
#    image: wodby/nginx:$NGINX_TAG
#    container_name: "${PROJECT_NAME}_nginx"
#    depends_on:
#      - php
#    environment:
#      NGINX_STATIC_OPEN_FILE_CACHE: "off"
#      NGINX_ERROR_LOG_LEVEL: debug
#      NGINX_BACKEND_HOST: php
#      NGINX_SERVER_ROOT: /var/www/html/web
#      NGINX_VHOST_PRESET: $NGINX_VHOST_PRESET
#
## proxy_pass for drupal extension s3fs -> rewriting compressed css and js files
##      NGINX_VHOST_PRESET: http-proxy
##      NGINX_SERVER_EXTRA_CONF_FILEPATH: /etc/nginx/extra.conf
#
#    #      NGINX_DRUPAL_FILE_PROXY_URL: http://example.com
#    volumes:
##      - ./:/var/www/html
## For macOS users (https://wodby.com/docs/stacks/drupal/local#docker-for-mac)
#     - ./:/var/www/html:cached # User-guided caching
#
## proxy_pass for drupal extension s3fs -> rewriting compressed css and js files
##     - ./extra-nginx.conf:/etc/nginx/extra.conf
#
##      - docker-sync:/var/www/html # Docker-sync
#    labels:
#      - "traefik.http.routers.${PROJECT_NAME}_nginx.rule=Host(`${PROJECT_BASE_URL}`)"

  mailhog:
    image: mailhog/mailhog
    container_name: "${PROJECT_NAME}_mailhog"
    labels:
      - "traefik.http.services.${PROJECT_NAME}_mailhog.loadbalancer.server.port=8025"
      - "traefik.http.routers.${PROJECT_NAME}_mailhog.rule=Host(`mailhog.${PROJECT_BASE_URL}`)"

#  postgres:
#    image: wodby/postgres:$POSTGRES_TAG
#    container_name: "${PROJECT_NAME}_postgres"
#    stop_grace_period: 30s
#    environment:
#      POSTGRES_PASSWORD: $DB_PASSWORD
#      POSTGRES_DB: $DB_NAME
#      POSTGRES_USER: $DB_USER
#    volumes:
#      - ./postgres-init:/docker-entrypoint-initdb.d # Place init file(s) here.
#      - /path/to/postgres/data/on/host:/var/lib/postgresql/data # Use bind mount

  apache:
    image: wodby/apache:$APACHE_TAG
    container_name: "${PROJECT_NAME}_apache"
    depends_on:
      - php
    environment:
      APACHE_LOG_LEVEL: debug
      APACHE_BACKEND_HOST: php
      APACHE_VHOST_PRESET: php
      APACHE_DOCUMENT_ROOT: /var/www/html/web
#      APACHE_INCLUDE_CONF: /etc/apache2/extra.conf
#      APACHE_INCLUDE_CONF: /usr/local/apache2/conf/custom-extra.conf # Use custom apache config
    volumes:
      #      - ./:/var/www/html
      # For macOS users (https://wodby.com/docs/stacks/drupal/local#docker-for-mac)
#      - ./extra-apache2.conf:/etc/apache2/extra.conf
#      - ./extra-apache2.conf:/usr/local/apache2/conf/extra.conf
#      - ./extra-apache2.conf:/usr/local/apache2/conf/custom-extra.conf # Map custom apache config
      - ./:/var/www/html:cached # User-guided caching
    #      - docker-sync:/var/www/html # Docker-sync
    labels:
      - "traefik.http.routers.${PROJECT_NAME}_apache.rule=Host(`${PROJECT_BASE_URL}`)"

#  varnish:
#    image: wodby/varnish:$VARNISH_TAG
#    container_name: "${PROJECT_NAME}_varnish"
#    depends_on:
#      - nginx
#    environment:
#      VARNISH_SECRET: secret
#      VARNISH_BACKEND_HOST: nginx
#      VARNISH_BACKEND_PORT: 80
#      VARNISH_CONFIG_PRESET: drupal
#      VARNISH_ALLOW_UNRESTRICTED_PURGE: 1
#    labels:
#      - "traefik.http.services.${PROJECT_NAME}_varnish.loadbalancer.server.port=6081"
#      - "traefik.http.routers.${PROJECT_NAME}_varnish.rule=Host(`varnish.${PROJECT_BASE_URL}`)"

#  redis:
#    container_name: "${PROJECT_NAME}_redis"
#    image: wodby/redis:$REDIS_TAG

#  adminer:
#    container_name: "${PROJECT_NAME}_adminer"
#    image: wodby/adminer:$ADMINER_TAG
#    environment:
## For PostgreSQL:
##      ADMINER_DEFAULT_DB_DRIVER: pgsql
#      ADMINER_DEFAULT_DB_HOST: $DB_HOST
#      ADMINER_DEFAULT_DB_NAME: $DB_NAME
#    labels:
#      - "traefik.http.routers.${PROJECT_NAME}_adminer.rule=Host(`adminer.${PROJECT_BASE_URL}`)"

  pma:
    image: phpmyadmin/phpmyadmin
    container_name: "${PROJECT_NAME}_pma"
    environment:
      PMA_HOST: $DB_HOST
      PMA_USER: $DB_USER
      PMA_PASSWORD: $DB_PASSWORD
      PHP_UPLOAD_MAX_FILESIZE: 1G
      PHP_MAX_INPUT_VARS: 1G
    labels:
      - "traefik.http.routers.${PROJECT_NAME}_pma.rule=Host(`pma.${PROJECT_BASE_URL}`)"
    volumes:
      - ./pma.config.ini:/usr/local/etc/php/conf.d/pma.config.ini

#  solr:
#    image: wodby/solr:$SOLR_TAG
#    container_name: "${PROJECT_NAME}_solr"
#    environment:
#      SOLR_DEFAULT_CONFIG_SET: $SOLR_CONFIG_SET
#      SOLR_HEAP: 1024m
#    labels:
#      - "traefik.http.routers.${PROJECT_NAME}_solr.rule=Host(`solr.${PROJECT_BASE_URL}`)"

#  drupal-node:
#    image: wodby/drupal-node:$DRUPAL_NODE_TAG
#    container_name: "${PROJECT_NAME}_drupal_nodejs"
#    environment:
#       NODE_SERVICE_KEY: node-service-key
#    labels:
#      - "traefik.http.routers.${PROJECT_NAME}_drupal_node.rule=Host(`drupal_node.${PROJECT_BASE_URL}`)"
#    volumes:
#      - ./path/to/your/single-page-app:/app
#    command: sh -c 'npm install && npm run start'

#  memcached:
#    container_name: "${PROJECT_NAME}_memcached"
#    image: wodby/memcached:$MEMCACHED_TAG

#  rsyslog:
#    container_name: "${PROJECT_NAME}_rsyslog"
#    image: wodby/rsyslog:$RSYSLOG_TAG

#  athenapdf:
#    image: arachnysdocker/athenapdf-service:$ATHENAPDF_TAG
#    container_name: "${PROJECT_NAME}_athenapdf"
#    environment:
#      WEAVER_AUTH_KEY: weaver-auth-key
#      WEAVER_ATHENA_CMD: "athenapdf -S"
#      WEAVER_MAX_WORKERS: 10
#      WEAVER_MAX_CONVERSION_QUEUE: 50
#      WEAVER_WORKER_TIMEOUT: 90
#      WEAVER_CONVERSION_FALLBACK: "false"

#  node:
#    image: wodby/node:$NODE_TAG
#    container_name: "${PROJECT_NAME}_node"
#    working_dir: /var/www/html/path/to/theme/to/build
#    labels:
#      - "traefik.http.services.${PROJECT_NAME}_node.loadbalancer.server.port=3000"
#      - "traefik.http.routers.${PROJECT_NAME}_node.rule=Host(`node.${PROJECT_BASE_URL}`)"
#    expose:
#      - "3000"
#    volumes:
#      - ./:/var/www/html
#    command: sh -c 'yarn install && yarn run start'

#  blackfire:
#    image: blackfire/blackfire
#    container_name: "${PROJECT_NAME}_blackfire"
#    environment:
#      BLACKFIRE_SERVER_ID: XXXXX
#      BLACKFIRE_SERVER_TOKEN: YYYYY

#  webgrind:
#    image: wodby/webgrind:$WEBGRIND_TAG
#    container_name: "${PROJECT_NAME}_webgrind"
#    environment:
#      WEBGRIND_PROFILER_DIR: /mnt/files/xdebug/profiler
#    labels:
#      - "traefik.http.routers.${PROJECT_NAME}_webgrind.rule=Host(`webgrind.${PROJECT_BASE_URL}`)"
#    volumes:
#      - files:/mnt/files

#  elasticsearch:
#    image: wodby/elasticsearch:$ELASTICSEARCH_TAG
#    environment:
#      ES_JAVA_OPTS: "-Xms500m -Xmx500m"
#    ulimits:
#      memlock:
#        soft: -1
#        hard: -1

#  kibana:
#    image: wodby/kibana:$KIBANA_TAG
#    depends_on:
#      - elasticsearch
#    labels:
#      - "traefik.http.services.${PROJECT_NAME}_kibana.loadbalancer.server.port=5601"
#      - "traefik.http.routers.${PROJECT_NAME}_kibana.rule=Host(`kibana.${PROJECT_BASE_URL}`)"

#  opensmtpd:
#    container_name: "${PROJECT_NAME}_opensmtpd"
#    image: wodby/opensmtpd:$OPENSMTPD_TAG

#  xhprof:
#    image: wodby/xhprof:$XHPROF_TAG
#    restart: always
#    volumes:
#      - files:/mnt/files
#    labels:
#      - "traefik.http.routers.${PROJECT_NAME}_xhprof.rule=Host(`xhprof.${PROJECT_BASE_URL}`)"
#  chrome:
#    image: selenium/standalone-chrome:$SELENIUM_CHROME_TAG
#    container_name: "${PROJECT_NAME}_chrome"
#    volumes:
#      - /dev/shm:/dev/shm
#    entrypoint:
#      - chromedriver
#      - "--no-sandbox"
#      - "--disable-dev-shm-usage"
#      - "--log-path=/tmp/chromedriver.log"
#      - "--verbose"
#      - "--whitelisted-ips="

  portainer:
    image: portainer/portainer
    container_name: "${PROJECT_NAME}_portainer"
    command: --no-auth -H unix:///var/run/docker.sock
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    labels:
      - "traefik.http.routers.${PROJECT_NAME}_portainer.rule=Host(`portainer.${PROJECT_BASE_URL}`)"

  traefik:
    #image: traefik:v1.7.16-alpine #ISSUE: Pin Traefik to 1.7.16 to prevent backwards compatibility break: https://github.com/wodby/docker4drupal/issues/401
    image: traefik:v2.0
    container_name: "${PROJECT_NAME}_traefik"
    command: --api.insecure=true --providers.docker
    ports:
      - '8000:80'
#      - '8080:8080' # Dashboard
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

#volumes:
## Docker-sync for macOS users
#  docker-sync:
#    external: true
## For Xdebug profiler
#  files:

Contents of your .env

### Documentation available at https://wodby.com/docs/stacks/drupal/local
### Changelog can be found at https://github.com/wodby/docker4drupal/releases
### Images tags format explained at https://github.com/wodby/docker4drupal#images-tags

## Current https://github.com/wodby/docker4drupal version: 5.4.14

# Uncomment this to use settings.local.php
APP_ENV=local

# Env vars can be set here:
# /etc/httpd/conf/vhosts/website.conf
# You can also set env var here:
#SetEnv APP_ENV production

### PROJECT SETTINGS

PROJECT_NAME=mysite
PROJECT_BASE_URL=mysite.docker.localhost # If you change this, remember updating trusted_host_patterns

DB_NAME=mysite
DB_USER=mysite
DB_PASSWORD=mysite
DB_ROOT_PASSWORD=password
DB_HOST=mariadb
DB_PORT=3306
DB_DRIVER=mysql

### --- MARIADB ----

MARIADB_TAG=10.4-3.6.8
#MARIADB_TAG=10.3-3.6.8
#MARIADB_TAG=10.2-3.6.8
#MARIADB_TAG=10.1-3.6.8

### --- VANILLA DRUPAL ----

DRUPAL_TAG=8-4.16.3
#DRUPAL_TAG=7-4.16.3

### --- PHP ----

# Linux (uid 1000 gid 1000)

#PHP_TAG=7.3-dev-4.14.2
#PHP_TAG=7.2-dev-4.14.2
#PHP_TAG=7.1-dev-4.14.2
#PHP_TAG=5.6-dev-4.14.2

# macOS (uid 501 gid 20)

PHP_TAG=7.3-dev-macos-4.14.2
#PHP_TAG=7.2-dev-macos-4.14.2
#PHP_TAG=7.1-dev-macos-4.14.2
#PHP_TAG=5.6-dev-macos-4.14.2

### --- NGINX ----

NGINX_TAG=1.17-5.7.4
#NGINX_TAG=1.16-5.7.4

NGINX_VHOST_PRESET=drupal8
#NGINX_VHOST_PRESET=drupal7
#NGINX_VHOST_PRESET=drupal6

### --- SOLR ---

SOLR_CONFIG_SET="search_api_solr_8.x-3.2"
#SOLR_CONFIG_SET="search_api_solr_8.x-2.7"
#SOLR_CONFIG_SET="search_api_solr_8.x-1.2"
#SOLR_CONFIG_SET="search_api_solr_7.x-1.14"

SOLR_TAG=8-4.1.2
#SOLR_TAG=7-4.1.2
#SOLR_TAG=6-4.1.2
#SOLR_TAG=5-4.1.2

### --- ELASTICSEARCH ---

ELASTICSEARCH_TAG=7-5.3.0
#ELASTICSEARCH_TAG=6-5.3.0

### --- KIBANA ---

KIBANA_TAG=7-5.3.0
#KIBANA_TAG=6-5.3.0

### --- REDIS ---

REDIS_TAG=4-3.1.4
#REDIS_TAG=5-3.1.4

### --- NODE ---

NODE_TAG=12-0.29.0
#NODE_TAG=10-0.29.0
#NODE_TAG=8-0.29.0

### --- VARNISH ---

VARNISH_TAG=6.0-4.3.6
#VARNISH_TAG=4.1-4.3.6

### --- POSTGRESQL ----

POSTGRES_TAG=12-1.8.0
#POSTGRES_TAG=11-1.8.0
#POSTGRES_TAG=10-1.8.0
#POSTGRES_TAG=9.6-1.8.0
#POSTGRES_TAG=9.5-1.8.0
#POSTGRES_TAG=9.4-1.8.0

### OTHERS

ADMINER_TAG=4-3.8.1
APACHE_TAG=2.4-4.1.5
ATHENAPDF_TAG=2.10.0
DRUPAL_NODE_TAG=1.0-2.0.0
MEMCACHED_TAG=1-2.3.6
OPENSMTPD_TAG=6.0-1.5.4
RSYSLOG_TAG=latest
SELENIUM_CHROME_TAG=3.141
WEBGRIND_TAG=1-1.13.1
XHPROF_TAG=2.0.0

Output of docker-compose logs

Attaching to mysite_apache, mysite_php, mysite_mariadb, mysite_traefik, mysite_pma, mysite_mailhog, mysite_portainer
mysite_pma | phpMyAdmin not found in /var/www/html - copying now...
mysite_pma | Complete! phpMyAdmin has been successfully copied to /var/www/html
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.7. Set the 'ServerName' directive globally to suppress this message
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.7. Set the 'ServerName' directive globally to suppress this message
mysite_pma | [Mon Jan 20 12:12:06.691674 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
mysite_pma | [Mon Jan 20 12:12:06.691780 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.4. Set the 'ServerName' directive globally to suppress this message
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.4. Set the 'ServerName' directive globally to suppress this message
mysite_pma | [Mon Jan 20 12:15:54.974016 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
mysite_pma | [Mon Jan 20 12:15:54.974112 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.4. Set the 'ServerName' directive globally to suppress this message
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.4. Set the 'ServerName' directive globally to suppress this message
mysite_pma | [Mon Jan 20 12:29:54.783328 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
mysite_pma | [Mon Jan 20 12:29:54.783469 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
mysite_pma | [Mon Jan 20 12:41:58.597274 2020] [mpm_prefork:notice] [pid 1] AH00169: caught SIGTERM, shutting down
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.2. Set the 'ServerName' directive globally to suppress this message
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.2. Set the 'ServerName' directive globally to suppress this message
mysite_pma | [Mon Jan 20 12:42:10.252021 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
mysite_pma | [Mon Jan 20 12:42:10.252111 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.3. Set the 'ServerName' directive globally to suppress this message
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.3. Set the 'ServerName' directive globally to suppress this message
mysite_pma | [Mon Jan 20 12:42:57.569371 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
mysite_pma | [Mon Jan 20 12:42:57.569525 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.4. Set the 'ServerName' directive globally to suppress this message
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.4. Set the 'ServerName' directive globally to suppress this message
mysite_pma | [Mon Jan 20 12:44:30.902085 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
mysite_pma | [Mon Jan 20 12:44:30.902205 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
mysite_pma | [Mon Jan 20 12:48:32.162038 2020] [mpm_prefork:notice] [pid 1] AH00169: caught SIGTERM, shutting down
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.5. Set the 'ServerName' directive globally to suppress this message
mysite_pma | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.25.0.5. Set the 'ServerName' directive globally to suppress this message
mysite_pma | [Mon Jan 20 12:48:44.983984 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
mysite_pma | [Mon Jan 20 12:48:44.984981 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
mysite_php | [20-Jan-2020 12:48:46] NOTICE: fpm is running, pid 1
mysite_php | [20-Jan-2020 12:48:46] NOTICE: ready to handle connections
mysite_apache | [Mon Jan 20 12:48:47.460574 2020] [ssl:info] [pid 1:tid 139997998185800] AH01887: Init: Initializing (virtual) servers for SSL
mysite_apache | [Mon Jan 20 12:48:47.463951 2020] [ssl:info] [pid 1:tid 139997998185800] AH01876: mod_ssl/2.4.41 compiled against Server: Apache/2.4.41, Library: OpenSSL/1.1.1d
mysite_apache | [Mon Jan 20 12:48:47.465618 2020] [http2:debug] [pid 1:tid 139997998185800] mod_http2.c(112): AH03089: initializing post config dry run
mysite_apache | [Mon Jan 20 12:48:47.476190 2020] [ssl:warn] [pid 1:tid 139997998185800] AH01873: Init: Session Cache is not configured [hint: SSLSessionCache]
mysite_apache | [Mon Jan 20 12:48:47.476242 2020] [ssl:info] [pid 1:tid 139997998185800] AH01887: Init: Initializing (virtual) servers for SSL
mysite_apache | [Mon Jan 20 12:48:47.476290 2020] [ssl:info] [pid 1:tid 139997998185800] AH01876: mod_ssl/2.4.41 compiled against Server: Apache/2.4.41, Library: OpenSSL/1.1.1d
mysite_apache | [Mon Jan 20 12:48:47.476324 2020] [http2:info] [pid 1:tid 139997998185800] AH03090: mod_http2 (v1.15.4, feats=CHPRIO+SHA256+INVHD+DWINS, nghttp2 1.39.2), initializing...
mysite_apache | [Mon Jan 20 12:48:47.476455 2020] [ldap:debug] [pid 1:tid 139997998185800] util_ldap.c(2988): AH01316: LDAP merging Shared Cache conf: shm=0x55899bb20110 rmm=0x55899bb20168 for VHOST: default
mysite_apache | [Mon Jan 20 12:48:47.480975 2020] [ldap:info] [pid 1:tid 139997998185800] AH01318: APR LDAP: Built with OpenLDAP LDAP SDK
mysite_apache | [Mon Jan 20 12:48:47.482512 2020] [ldap:info] [pid 1:tid 139997998185800] AH01319: LDAP: SSL support available
mysite_apache | [Mon Jan 20 12:48:47.484600 2020] [proxy_http2:info] [pid 1:tid 139997998185800] AH03349: mod_proxy_http2 (v1.15.4, nghttp2 1.39.2), initializing...
mysite_apache | [Mon Jan 20 12:48:47.487769 2020] [mpm_event:notice] [pid 1:tid 139997998185800] AH00489: Apache/2.4.41 (Unix) OpenSSL/1.1.1d configured -- resuming normal operations
mysite_apache | [Mon Jan 20 12:48:47.487950 2020] [mpm_event:info] [pid 1:tid 139997998185800] AH00490: Server built: Oct 28 2019 20:29:50
mysite_apache | [Mon Jan 20 12:48:47.488315 2020] [core:notice] [pid 1:tid 139997998185800] AH00094: Command line: 'httpd -D FOREGROUND'
mysite_apache | [Mon Jan 20 12:48:47.493896 2020] [proxy:debug] [pid 23:tid 139997998185800] proxy_util.c(1935): AH00925: initializing worker proxy:reverse shared
mysite_apache | [Mon Jan 20 12:48:47.494251 2020] [proxy:debug] [pid 23:tid 139997998185800] proxy_util.c(1992): AH00927: initializing worker proxy:reverse local
mysite_apache | [Mon Jan 20 12:48:47.494433 2020] [proxy:debug] [pid 23:tid 139997998185800] proxy_util.c(2027): AH00930: initialized pool in child 23 for (*) min=0 max=61 smax=61
mysite_apache | [Mon Jan 20 12:48:47.494481 2020] [proxy:debug] [pid 23:tid 139997998185800] proxy_util.c(1935): AH00925: initializing worker fcgi://php:9000/ shared
mysite_apache | [Mon Jan 20 12:48:47.494512 2020] [proxy:debug] [pid 23:tid 139997998185800] proxy_util.c(1992): AH00927: initializing worker fcgi://php:9000/ local
mysite_apache | [Mon Jan 20 12:48:47.494586 2020] [proxy:debug] [pid 23:tid 139997998185800] proxy_util.c(2027): AH00930: initialized pool in child 23 for (php) min=0 max=61 smax=61
mysite_apache | [Mon Jan 20 12:48:47.496906 2020] [core:debug] [pid 1:tid 139997998185800] log.c(1571): AH02639: Using SO_REUSEPORT: yes (1)
mysite_apache | [Mon Jan 20 12:48:47.500512 2020] [mpm_event:debug] [pid 23:tid 139997984541472] event.c(2315): AH02471: start_threads: Using epoll (wakeable)
mysite_apache | [Mon Jan 20 12:48:47.503991 2020] [proxy:debug] [pid 21:tid 139997998185800] proxy_util.c(1935): AH00925: initializing worker proxy:reverse shared
mysite_apache | [Mon Jan 20 12:48:47.504037 2020] [proxy:debug] [pid 21:tid 139997998185800] proxy_util.c(1992): AH00927: initializing worker proxy:reverse local
mysite_apache | [Mon Jan 20 12:48:47.504103 2020] [proxy:debug] [pid 21:tid 139997998185800] proxy_util.c(2027): AH00930: initialized pool in child 21 for (*) min=0 max=61 smax=61
mysite_apache | [Mon Jan 20 12:48:47.504145 2020] [proxy:debug] [pid 21:tid 139997998185800] proxy_util.c(1935): AH00925: initializing worker fcgi://php:9000/ shared
mysite_apache | [Mon Jan 20 12:48:47.504184 2020] [proxy:debug] [pid 21:tid 139997998185800] proxy_util.c(1992): AH00927: initializing worker fcgi://php:9000/ local
mysite_apache | [Mon Jan 20 12:48:47.504335 2020] [proxy:debug] [pid 21:tid 139997998185800] proxy_util.c(2027): AH00930: initialized pool in child 21 for (php) min=0 max=61 smax=61
mysite_apache | [Mon Jan 20 12:48:47.506100 2020] [mpm_event:debug] [pid 21:tid 139997984541472] event.c(2315): AH02471: start_threads: Using epoll (wakeable)
mysite_apache | [Mon Jan 20 12:48:47.519096 2020] [proxy:debug] [pid 22:tid 139997998185800] proxy_util.c(1935): AH00925: initializing worker proxy:reverse shared
mysite_apache | [Mon Jan 20 12:48:47.519131 2020] [proxy:debug] [pid 22:tid 139997998185800] proxy_util.c(1992): AH00927: initializing worker proxy:reverse local
mysite_apache | [Mon Jan 20 12:48:47.519180 2020] [proxy:debug] [pid 22:tid 139997998185800] proxy_util.c(2027): AH00930: initialized pool in child 22 for (*) min=0 max=61 smax=61
mysite_apache | [Mon Jan 20 12:48:47.519211 2020] [proxy:debug] [pid 22:tid 139997998185800] proxy_util.c(1935): AH00925: initializing worker fcgi://php:9000/ shared
mysite_apache | [Mon Jan 20 12:48:47.519236 2020] [proxy:debug] [pid 22:tid 139997998185800] proxy_util.c(1992): AH00927: initializing worker fcgi://php:9000/ local
mysite_apache | [Mon Jan 20 12:48:47.519283 2020] [proxy:debug] [pid 22:tid 139997998185800] proxy_util.c(2027): AH00930: initialized pool in child 22 for (php) min=0 max=61 smax=61
mysite_apache | [Mon Jan 20 12:48:47.520017 2020] [mpm_event:debug] [pid 22:tid 139997984541472] event.c(2315): AH02471: start_threads: Using epoll (wakeable)
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] mysqld (mysqld 10.4.11-MariaDB) starting as process 1 ...
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Using Linux native AIO
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Uses event mutexes
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Number of pools: 1
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Using SSE2 crc32 instructions
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] mysqld: O_TMPFILE is not supported on /tmp (disabling future attempts)
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Initializing buffer pool, total size = 128M, instances = 1, chunk size = 128M
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Completed initialization of buffer pool
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority().
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: 128 out of 128 rollback segments are active.
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Creating shared tablespace for temporary tables
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Waiting for purge to start
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: 10.4.11 started; log sequence number 886293014; transaction id 420619
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] Plugin 'FEEDBACK' is disabled.
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] Server socket created on IP: '0.0.0.0'.
mysite_mariadb | 2020-01-20 12:48:46 0 [Warning] 'proxies_priv' entry '@% root@a0ff889f3fb6' ignored in --skip-name-resolve mode.
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] Reading of all Master_info entries succeeded
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] Added new Master_info '' to hash table
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] mysqld: ready for connections.
mysite_mariadb | Version: '10.4.11-MariaDB'  socket: '/var/run/mysqld/mysqld.sock'  port: 3306  MariaDB Server
mysite_mariadb | 2020-01-20 12:48:46 0 [Note] InnoDB: Buffer pool(s) load completed at 200120 12:48:46
mysite_portainer | 2020/01/20 12:12:04 Templates already registered inside the database. Skipping template import.
mysite_portainer | 2020/01/20 12:12:04 Instance already has defined endpoints. Skipping the endpoint defined via CLI.
mysite_portainer | 2020/01/20 12:12:04 Starting Portainer 1.21.0 on :9000
mysite_portainer | 2020/01/20 12:15:55 Templates already registered inside the database. Skipping template import.
mysite_portainer | 2020/01/20 12:15:55 Instance already has defined endpoints. Skipping the endpoint defined via CLI.
mysite_portainer | 2020/01/20 12:15:55 Starting Portainer 1.21.0 on :9000
mysite_portainer | 2020/01/20 12:29:55 Templates already registered inside the database. Skipping template import.
mysite_portainer | 2020/01/20 12:29:55 Instance already has defined endpoints. Skipping the endpoint defined via CLI.
mysite_portainer | 2020/01/20 12:29:55 Starting Portainer 1.21.0 on :9000
mysite_portainer | 2020/01/20 12:42:10 Templates already registered inside the database. Skipping template import.
mysite_portainer | 2020/01/20 12:42:10 Instance already has defined endpoints. Skipping the endpoint defined via CLI.
mysite_portainer | 2020/01/20 12:42:10 Starting Portainer 1.21.0 on :9000
mysite_portainer | 2020/01/20 12:42:57 Templates already registered inside the database. Skipping template import.
mysite_portainer | 2020/01/20 12:42:57 Instance already has defined endpoints. Skipping the endpoint defined via CLI.
mysite_portainer | 2020/01/20 12:42:57 Starting Portainer 1.21.0 on :9000
mysite_portainer | 2020/01/20 12:44:30 Templates already registered inside the database. Skipping template import.
mysite_portainer | 2020/01/20 12:44:30 Instance already has defined endpoints. Skipping the endpoint defined via CLI.
mysite_portainer | 2020/01/20 12:44:30 Starting Portainer 1.21.0 on :9000
mysite_portainer | 2020/01/20 12:48:44 Templates already registered inside the database. Skipping template import.
mysite_portainer | 2020/01/20 12:48:44 Instance already has defined endpoints. Skipping the endpoint defined via CLI.
mysite_portainer | 2020/01/20 12:48:44 Starting Portainer 1.21.0 on :9000
mysite_traefik | time="2020-01-20T12:44:31Z" level=info msg="Configuration loaded from flags."
mysite_traefik | time="2020-01-20T12:48:31Z" level=error msg="accept tcp [::]:80: use of closed network connection" entryPointName=http
mysite_traefik | time="2020-01-20T12:48:31Z" level=error msg="accept tcp [::]:8080: use of closed network connection" entryPointName=traefik
mysite_traefik | time="2020-01-20T12:48:46Z" level=info msg="Configuration loaded from flags."
mysite_mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mysite_mailhog | Creating API v1 with WebPath: 
mysite_mailhog | Creating API v2 with WebPath: 
mysite_mailhog | 2020/01/20 12:12:03 Using in-memory storage
mysite_mailhog | 2020/01/20 12:12:03 [SMTP] Binding to address: 0.0.0.0:1025
mysite_mailhog | 2020/01/20 12:12:03 Serving under http://0.0.0.0:8025/
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | 2020/01/20 12:15:54 Using in-memory storage
mysite_mailhog | 2020/01/20 12:15:54 [SMTP] Binding to address: 0.0.0.0:1025
mysite_mailhog | 2020/01/20 12:15:54 Serving under http://0.0.0.0:8025/
mysite_mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mysite_mailhog | Creating API v1 with WebPath: 
mysite_mailhog | Creating API v2 with WebPath: 
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | 2020/01/20 12:29:53 Using in-memory storage
mysite_mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mysite_mailhog | Creating API v1 with WebPath: 
mysite_mailhog | 2020/01/20 12:29:53 [SMTP] Binding to address: 0.0.0.0:1025
mysite_mailhog | 2020/01/20 12:29:53 Serving under http://0.0.0.0:8025/
mysite_mailhog | Creating API v2 with WebPath: 
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | 2020/01/20 12:42:10 Using in-memory storage
mysite_mailhog | 2020/01/20 12:42:10 [SMTP] Binding to address: 0.0.0.0:1025
mysite_mailhog | 2020/01/20 12:42:10 Serving under http://0.0.0.0:8025/
mysite_mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mysite_mailhog | Creating API v1 with WebPath: 
mysite_mailhog | Creating API v2 with WebPath: 
mysite_mailhog | 2020/01/20 12:42:57 Using in-memory storage
mysite_mailhog | 2020/01/20 12:42:57 Serving under http://0.0.0.0:8025/
mysite_mailhog | 2020/01/20 12:42:57 [SMTP] Binding to address: 0.0.0.0:1025
mysite_mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mysite_mailhog | Creating API v1 with WebPath: 
mysite_mailhog | Creating API v2 with WebPath: 
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | 2020/01/20 12:44:30 Using in-memory storage
mysite_mailhog | 2020/01/20 12:44:30 [SMTP] Binding to address: 0.0.0.0:1025
mysite_mailhog | 2020/01/20 12:44:30 Serving under http://0.0.0.0:8025/
mysite_mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mysite_mailhog | Creating API v1 with WebPath: 
mysite_mailhog | Creating API v2 with WebPath: 
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | 2020/01/20 12:48:43 Using in-memory storage
mysite_mailhog | 2020/01/20 12:48:43 [SMTP] Binding to address: 0.0.0.0:1025
mysite_mailhog | [HTTP] Binding to address: 0.0.0.0:8025
mysite_mailhog | 2020/01/20 12:48:43 Serving under http://0.0.0.0:8025/
mysite_mailhog | Creating API v1 with WebPath: 
mysite_mailhog | Creating API v2 with WebPath: 
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
mysite_mailhog | [APIv1] KEEPALIVE /api/v1/events
sylven (Author) commented Jan 20, 2020

Solved.
It was because I had an inline comment on the same line as PROJECT_BASE_URL, and the whole line was being picked up as the value. I had added the comment a few days ago, but I guess the container hadn't been rebuilt until today. What a waste of time 😓

PROJECT_BASE_URL=mysite.docker.localhost # If you change this, remember updating trusted_host_patterns
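The failure mode can be illustrated with a small sketch. Some .env parsers (including older docker-compose versions) treat everything after the first `=` as the value, so an inline `#` comment becomes part of `PROJECT_BASE_URL`; Traefik then builds its host rule from a value that never matches any request, and every URL returns 404. This is a minimal, hypothetical illustration of that parsing behavior, not docker-compose's actual parser:

```python
# Sketch of a naive .env parser that does NOT strip inline '#' comments.
# The line below mirrors the one from this issue.

def parse_env_line(line: str):
    """Split KEY=VALUE at the first '=', keeping the rest of the line verbatim."""
    key, _, value = line.partition("=")
    return key.strip(), value.strip()

line = "PROJECT_BASE_URL=mysite.docker.localhost # If you change this, remember updating trusted_host_patterns"
key, value = parse_env_line(line)

print(key)    # PROJECT_BASE_URL
print(value)  # mysite.docker.localhost # If you change this, remember updating trusted_host_patterns
```

Putting the comment on its own line above the variable sidesteps the problem regardless of how any given parser handles inline comments.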

sylven closed this as completed Jan 20, 2020