🐛 Docker version fails to build #242

Closed · Node815 opened this issue Sep 15, 2023 · 13 comments

@Node815 commented Sep 15, 2023

Description

The build errors out when building the Docker version. Here is the log from just before the failure through to the point where it stops:

#0 98.14 error: failed to run custom build command for `mlua v0.8.10`
#0 98.14 
#0 98.14 Caused by:
#0 98.14   process didn't exit successfully: `/app/target/release/build/mlua-cad64dedaf978192/build-script-main` (exit status: 101)
#0 98.14   --- stdout
#0 98.14   cargo:rerun-if-env-changed=LUA_INC
#0 98.14   cargo:rerun-if-env-changed=LUA_LIB
#0 98.14   cargo:rerun-if-env-changed=LUA_LIB_NAME
#0 98.14   cargo:rerun-if-env-changed=LUA_LINK
#0 98.14   cargo:rerun-if-env-changed=LUAJIT_NO_PKG_CONFIG
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_x86_64-unknown-linux-gnu
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_x86_64_unknown_linux_gnu
#0 98.14   cargo:rerun-if-env-changed=HOST_PKG_CONFIG
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG
#0 98.14   cargo:rerun-if-env-changed=LUAJIT_STATIC
#0 98.14   cargo:rerun-if-env-changed=LUAJIT_DYNAMIC
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_ALL_STATIC
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_ALL_DYNAMIC
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_PATH_x86_64-unknown-linux-gnu
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_PATH_x86_64_unknown_linux_gnu
#0 98.14   cargo:rerun-if-env-changed=HOST_PKG_CONFIG_PATH
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_PATH
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_LIBDIR_x86_64-unknown-linux-gnu
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_LIBDIR_x86_64_unknown_linux_gnu
#0 98.14   cargo:rerun-if-env-changed=HOST_PKG_CONFIG_LIBDIR
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_LIBDIR
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_SYSROOT_DIR_x86_64-unknown-linux-gnu
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_SYSROOT_DIR_x86_64_unknown_linux_gnu
#0 98.14   cargo:rerun-if-env-changed=HOST_PKG_CONFIG_SYSROOT_DIR
#0 98.14   cargo:rerun-if-env-changed=PKG_CONFIG_SYSROOT_DIR
#0 98.14 
#0 98.14   --- stderr
#0 98.14   thread 'main' panicked at 'cannot find LuaJIT using `pkg-config`: `PKG_CONFIG_ALLOW_SYSTEM_CFLAGS="1" PKG_CONFIG_ALLOW_SYSTEM_LIBS="1" "pkg-config" "--libs" "--cflags" "luajit" "luajit >= 2.0.4"` did not exit successfully: exit status: 1
#0 98.14   error: could not find system library 'luajit' required by the 'mlua' crate
#0 98.14 
#0 98.14   --- stderr
#0 98.14   Package luajit was not found in the pkg-config search path.
#0 98.14   Perhaps you should add the directory containing `luajit.pc'
#0 98.14   to the PKG_CONFIG_PATH environment variable
#0 98.14   Package 'luajit', required by 'virtual:world', not found
#0 98.14   Package 'luajit', required by 'virtual:world', not found
#0 98.14   ', /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/mlua-0.8.10/build/find_normal.rs:88:13
#0 98.14   note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
#0 98.14 warning: build failed, waiting for other jobs to finish...
#0 116.0 thread 'main' panicked at 'Exited with status code: 101', /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/cargo-chef-0.1.62/src/recipe.rs:204:27
#0 116.0 note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
------
failed to solve: process "/bin/sh -c cargo chef cook --release --recipe-path recipe.json" did not complete successfully: exit code: 101
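
For context: mlua's build script probes for a system LuaJIT through pkg-config and aborts when the probe fails. A minimal way to reproduce and fix the missing dependency inside a Debian-based build image (a sketch, assuming the stock rust:latest image):

# Reproduce the probe that mlua's build script runs (fails when the LuaJIT dev files are absent)
pkg-config --exists --print-errors "luajit >= 2.0.4"

# Install the dev package so pkg-config can find luajit.pc (Debian/Ubuntu-based images)
apt-get update && apt-get install -y --no-install-recommends libluajit-5.1-dev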

Screenshots

No response

Do you want to work on this issue?

None

Additional information

No response

@github-actions

To reduce notifications, issues are locked until they are marked 🏁 status: ready for dev and assigned. You can learn more in our contributing guide https://github.com/neon-mmd/websurfx/blob/rolling/CONTRIBUTING.md

@github-actions

The issue has been unlocked and is now ready for dev. If you would like to work on this issue, you can comment to have it assigned to you. You can learn more in our contributing guide https://github.com/neon-mmd/websurfx/blob/rolling/CONTRIBUTING.md

@neon-mmd (Owner) commented Sep 15, 2023

Thanks ❤️ for taking the time to open this issue. We really appreciate it 👍 as this helps us improve the project 🙂. OK, I think I know what this issue is, and I would suggest trying the Dockerfile below. Let us know if it fixes your issue 🙂

FROM rust:latest AS chef
# We only pay the installation cost once,
# it will be cached from the second build onwards
RUN cargo install cargo-chef

WORKDIR /app

FROM chef AS planner
COPY . . 
RUN cargo chef prepare --recipe-path recipe.json

FROM chef AS builder
COPY --from=planner /app/recipe.json recipe.json
# Build dependencies - this is the caching Docker layer!
RUN apt-get update -y && apt-get install sudo apt-get install -y --no-install-recommends liblua5.4-dev liblua5.3-dev liblua5.2-dev liblua5.1-0-dev libluajit-5.1-dev
RUN cargo chef cook --release --recipe-path recipe.json

# Build application
COPY . .
RUN cargo install --path .

# We do not need the Rust toolchain to run the binary!
FROM gcr.io/distroless/cc-debian12
COPY --from=builder /app/public/ /opt/websurfx/public/
COPY --from=builder /app/websurfx/config.lua /etc/xdg/websurfx/config.lua
COPY --from=builder /app/websurfx/config.lua /etc/xdg/websurfx/allowlist.txt
COPY --from=builder /app/websurfx/config.lua /etc/xdg/websurfx/blocklist.txt
COPY --from=builder /usr/local/cargo/bin/* /usr/local/bin/
CMD ["websurfx"]

@Node815 (Author) commented Sep 15, 2023

I applied the suggested changes to the Dockerfile; new error:

 => [internal] load .dockerignore                                                                                                                 0.0s
 => => transferring context: 127B                                                                                                                 0.0s
 => [internal] load build definition from Dockerfile                                                                                              0.0s
 => => transferring dockerfile: 1.14kB                                                                                                            0.0s
 => [internal] load metadata for docker.io/library/rust:latest                                                                                    0.8s
 => [internal] load metadata for gcr.io/distroless/cc-debian12:latest                                                                             0.4s
 => [chef 1/3] FROM docker.io/library/rust:latest@sha256:8a4ca3ca75afbc97bcf5362e9a694fe049d15734fbbaf82b8b7e224616c1254b                         0.0s
 => CACHED [stage-3 1/6] FROM gcr.io/distroless/cc-debian12@sha256:f44927808110f578fba42bf36eb68a5ecbb268b94543eb9725380ec51e9a39ed               0.0s
 => [internal] load build context                                                                                                                 0.0s
 => => transferring context: 4.33kB                                                                                                               0.0s
 => CACHED [chef 2/3] RUN cargo install cargo-chef                                                                                                0.0s
 => CACHED [chef 3/3] WORKDIR /app                                                                                                                0.0s
 => CACHED [planner 1/2] COPY . .                                                                                                                 0.0s
 => CACHED [planner 2/2] RUN cargo chef prepare --recipe-path recipe.json                                                                         0.0s
 => CACHED [builder 1/5] COPY --from=planner /app/recipe.json recipe.json                                                                         0.0s
 => ERROR [builder 2/5] RUN apt-get update -y && apt-get install sudo apt-get install -y --no-install-recommends liblua5.4-dev liblua5.3-dev lib  3.2s
------                                                                                                                                                 
 > [builder 2/5] RUN apt-get update -y && apt-get install sudo apt-get install -y --no-install-recommends liblua5.4-dev liblua5.3-dev liblua5.2-dev liblua5.1-0-dev libluajit-5.1-dev:                                                                                                                        
#0 0.423 Get:1 http://deb.debian.org/debian bookworm InRelease [151 kB]                                                                                
#0 0.438 Get:2 http://deb.debian.org/debian bookworm-updates InRelease [52.1 kB]                                                                       
#0 0.438 Get:3 http://deb.debian.org/debian-security bookworm-security InRelease [48.0 kB]                                                             
#0 0.537 Get:4 http://deb.debian.org/debian bookworm/main amd64 Packages [8906 kB]
#0 0.657 Get:5 http://deb.debian.org/debian bookworm-updates/main amd64 Packages [6432 B]
#0 0.700 Get:6 http://deb.debian.org/debian-security bookworm-security/main amd64 Packages [62.1 kB]
#0 1.825 Fetched 9226 kB in 1s (6183 kB/s)
#0 1.825 Reading package lists...
#0 2.437 Reading package lists...
#0 3.038 Building dependency tree...
#0 3.163 Reading state information...
#0 3.169 E: Unable to locate package apt-get
#0 3.169 E: Unable to locate package install
------
failed to solve: process "/bin/sh -c apt-get update -y && apt-get install sudo apt-get install -y --no-install-recommends liblua5.4-dev liblua5.3-dev liblua5.2-dev liblua5.1-0-dev libluajit-5.1-dev" did not complete successfully: exit code: 100

EDIT: I changed line 15 of the Dockerfile to drop the stray `sudo` and the duplicated `apt-get install`:
RUN apt-get update -y && apt-get install -y --no-install-recommends liblua5.4-dev liblua5.3-dev liblua5.2-dev liblua5.1-0-dev libluajit-5.1-dev

It seems to be running now, I will update after this.

@Node815 (Author) commented Sep 15, 2023

It built but now won't start:

    09/15/2023 3:05:25 PM
    websurfx: error while loading shared libraries: libluajit-5.1.so.2: cannot open shared object file: No such file or directory
    Container stopped
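
A quick way to see which shared libraries the binary expects is ldd in the builder stage; the distroless runtime image must provide every one of them (a sketch; the binary path follows from the cargo install --path . step in the Dockerfile above):

# Run inside the builder image, not the distroless one (which has no shell)
ldd /usr/local/cargo/bin/websurfx | grep -i lua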

@neon-mmd (Owner) commented:

Sorry for the delayed reply. OK, I would suggest trying the Dockerfile and Cargo.toml below; let us know if it fixes your issue 🙂.

The new Dockerfile

FROM rust:latest AS chef
# We only pay the installation cost once,
# it will be cached from the second build onwards
RUN cargo install cargo-chef --locked

WORKDIR /app

FROM chef AS planner
COPY . . 
RUN cargo chef prepare --recipe-path recipe.json

FROM chef AS builder
COPY --from=planner /app/recipe.json recipe.json
# Build dependencies - this is the caching Docker layer!
RUN cargo chef cook --release --recipe-path recipe.json

# Build application
COPY . .
RUN cargo install --path .

# We do not need the Rust toolchain to run the binary!
FROM gcr.io/distroless/cc-debian12
COPY --from=builder /app/public/ /opt/websurfx/public/
COPY --from=builder /app/websurfx/config.lua /etc/xdg/websurfx/config.lua
COPY --from=builder /app/websurfx/allowlist.txt /etc/xdg/websurfx/allowlist.txt
COPY --from=builder /app/websurfx/blocklist.txt /etc/xdg/websurfx/blocklist.txt
COPY --from=builder /usr/local/cargo/bin/* /usr/local/bin/
CMD ["websurfx"]

The new Cargo.toml file

[package]
name = "websurfx"
version = "0.21.0"
edition = "2021"
description = "An open-source alternative to Searx that provides clean, ad-free, and organic results with incredible speed while keeping privacy and security in mind."
repository = "https://github.com/neon-mmd/websurfx"
license = "AGPL-3.0"

[dependencies]
reqwest = {version="0.11.20",features=["json"]}
tokio = {version="1.32.0",features=["rt-multi-thread","macros"]}
serde = {version="1.0.188",features=["derive"]}
handlebars = { version = "4.4.0", features = ["dir_source"] }
scraper = {version="0.17.1"}
actix-web = {version="4.4.0", features = ["cookies"]}
actix-files = {version="0.6.2"}
actix-cors = {version="0.6.4"}
serde_json = {version="1.0.105"}
fake-useragent = {version="0.1.3"}
env_logger = {version="0.10.0"}
log = {version="0.4.20"}
mlua = {version="0.8.10", features=["luajit", "vendored"]}
redis = {version="0.23.3", features=["tokio-comp","connection-manager"], optional = true}
md5 = {version="0.7.0"}
rand={version="0.8.5"}
once_cell = {version="1.18.0"}
error-stack = {version="0.4.0"}
async-trait = {version="0.1.73"}
regex = {version="1.9.4", features=["perf"]}
smallvec = {version="1.11.0", features=["union", "serde"]}
futures = {version="0.3.28"}
dhat = {version="0.3.2", optional = true}
mimalloc = { version = "0.1.38", default-features = false }
async-once-cell = {version="0.5.3"}
actix-governor = {version="0.4.1"}
mini-moka = { version="0.10", optional = true}

[dev-dependencies]
rusty-hook = "^0.11.2"
criterion = "0.5.1"
tempfile = "3.8.0"

[profile.dev]
opt-level = 0
debug = true
split-debuginfo = '...'
debug-assertions = true
overflow-checks = true
lto = false
panic = 'unwind'
incremental = true
codegen-units = 256
rpath = false

[profile.release]
opt-level = 3
debug = false # This should only be commented when testing with dhat profiler
# debug = 1 # This should only be uncommented when testing with dhat profiler
split-debuginfo = '...'
debug-assertions = false
overflow-checks = false
lto = true
panic = 'abort'
incremental = false
codegen-units = 1
rpath = false
strip = "debuginfo"

[features]
default = ["memory-cache"]
dhat-heap = ["dep:dhat"] 
memory-cache = ["dep:mini-moka"]
redis-cache = ["dep:redis"]
hybrid-cache = ["memory-cache", "redis-cache"]
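
The key change in this Cargo.toml is the vendored feature on mlua, which compiles LuaJIT from source and links it statically, so the runtime image no longer needs libluajit-5.1.so.2. One way to verify after rebuilding (a sketch):

# No lua line should appear among the dynamic dependencies any more
ldd target/release/websurfx | grep -i lua || echo "no dynamic Lua dependency"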

Note
Before you make the changes, make sure you are on the latest version of the branch. To update your local branch, run the following:

cd websurfx
git pull

Then apply the changes and deploy the app by running `docker compose up -d --build`; the app will then be deployed on your system.
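
Put together, the update-and-redeploy sequence is:

cd websurfx
git pull
docker compose up -d --build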

@Node815 (Author) commented Sep 18, 2023

Progress?

09/18/2023 9:28:23 AM
[2023-09-18T16:28:23Z INFO  websurfx::cache::cacher] Using an in-memory cache
09/18/2023 9:28:23 AM
[2023-09-18T16:28:23Z INFO  websurfx] started server on port 8181 and IP 127.0.0.1
09/18/2023 9:28:23 AM
[2023-09-18T16:28:23Z INFO  websurfx] Open http://127.0.0.1:8181/ in your browser
09/18/2023 9:28:23 AM
[2023-09-18T16:28:23Z INFO  actix_server::builder] starting 10 workers
09/18/2023 9:28:23 AM
[2023-09-18T16:28:23Z INFO  actix_server::server] Actix runtime found; starting in Actix runtime

I remapped the port to 8181:8080, though, as I have a CockroachDB instance listening on 8080. But at this point there is no web page; Firefox just says the page is not found on port 8181.

Unable to connect

Firefox can’t establish a connection to the server at 192.168.1.161:8181.

    The site could be temporarily unavailable or too busy. Try again in a few moments.
    If you are unable to load any pages, check your computer’s network connection.
    If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the web.
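
When a published port does not answer, it helps to check what compose actually published: a mapping of 8181:8080 publishes host port 8181 to container port 8080, so the app inside the container must listen on 8080 for that mapping to work (a sketch, run from the Docker host):

docker compose ps                  # shows mappings such as 0.0.0.0:8181->8080/tcp
curl -v http://127.0.0.1:8181/     # test from the Docker host itself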
    

@neon-mmd (Owner) commented:

Yes, the previous error is fixed, so we have some progress here 🙂. I think I know the issue this time as well, so I would suggest updating your config file to look something like this (based on the docs):

-- ### General ###
logging = true -- an option to enable or disable logs.
debug = false -- an option to enable or disable debug mode.
threads = 10 -- the number of threads the app will use (the value should be greater than 0).

-- ### Server ###
port = "8181" -- port on which server should be launched
binding_ip = "0.0.0.0" --ip address on the which server should be launched.
production_use = false -- whether to use production mode or not (in other words this option should be used if it is to be used to host it on the server to provide a service to a large number of users (more than one))
-- if production_use is set to true
-- There will be a random delay before sending the request to the search engines, this is to prevent DDoSing the upstream search engines from a large number of simultaneous requests.
request_timeout = 30 -- timeout for the search requests sent to the upstream search engines to be fetched (value in seconds).
rate_limiter = {
	number_of_requests = 20, -- the number of requests allowed within the time limit below.
	time_limit = 1, -- the time limit within which that number of requests is accepted.
}

-- ### Search ###
-- Filter results based on different levels. The levels provided are:
-- {{
-- 0 - None
-- 1 - Low
-- 2 - Moderate
-- 3 - High
-- 4 - Aggressive
-- }}
safe_search = 2

-- ### Website ###
-- The different colorschemes provided are:
-- {{
-- catppuccin-mocha
-- dark-chocolate
-- dracula
-- gruvbox-dark
-- monokai
-- nord
-- oceanic-next
-- one-dark
-- solarized-dark
-- solarized-light
-- tokyo-night
-- tomorrow-night
-- }}
colorscheme = "catppuccin-mocha" -- the colorscheme name which should be used for the website theme
theme = "simple" -- the theme name which should be used for the website

-- ### Caching ###
redis_url = "redis://redis:6379" -- the Redis connection URL the client should connect to.

-- ### Search Engines ###
upstream_search_engines = {
	DuckDuckGo = true,
	Searx = false,
} -- select the upstream search engines from which the results should be fetched.

Again, let us know if this fixes your issue 🙂 .
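
The two settings doing the work here are binding_ip = "0.0.0.0" (so the server accepts connections from outside the container's loopback) and port = "8181". A quick external check once the container is up (a sketch; substitute your server's IP):

curl -sI http://192.168.1.161:8181/ | head -n1   # expect an HTTP status line if the port is reachable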

@Node815 (Author) commented Sep 18, 2023

No fix. I tried the version you posted above, and double-checked mine beforehand; the page will not load. Mine was set to 127.0.0.1; I changed it to 0.0.0.0 on my first attempt, and it was already set to answer on 8181. I ran `docker compose up -d --build` and it says it's running, but no page is available. So I deleted my config.lua and rebuilt again with your version, and hit the same problem using the same docker compose method.

I had to chuckle at this one: while in Dozzle, looking for any log information (there's none), I think you have achieved the ultra-low-load record on my system!!!!
[screenshot: Dozzle resource stats]

@neon-mmd (Owner) commented:

OK, is it the same when you access it through 127.0.0.1:8181 in your browser? On my system it works fine; I can access the page and everything, so it seems weird that it wouldn't work 🤔. It seems like there is a configuration mismatch or something.

Also, I would suggest running the following on the command line. Let us know what it shows:

docker compose logs -f 

@Node815 (Author) commented Sep 18, 2023

This isn't on a local server on 127.0.0.1; it's on a different computer, so 127.0.0.1:8181 would point to my own machine, which isn't running it. From `docker compose logs -f`:

websurfx-redis-1  | 1:C 18 Sep 2023 16:13:01.997 * oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
websurfx-redis-1  | 1:C 18 Sep 2023 16:13:01.997 * Redis version=7.2.1, bits=64, commit=00000000, modified=0, pid=1, just started
websurfx-redis-1  | 1:C 18 Sep 2023 16:13:01.997 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:01.998 * monotonic clock: POSIX clock_gettime
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:01.999 * Running mode=standalone, port=6379.
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:02.001 * Server initialized
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:02.003 * Loading RDB produced by version 7.2.1
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:02.003 * RDB age 61867 seconds
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:02.003 * RDB memory usage when created 0.83 Mb
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:02.003 * Done loading RDB, keys loaded: 0, keys expired: 0.
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:02.003 * DB loaded from disk: 0.002 seconds
websurfx-redis-1  | 1:M 18 Sep 2023 16:13:02.003 * Ready to accept connections tcp
websurfx-app-1    | [2023-09-18T17:52:14Z INFO  websurfx::cache::cacher] Using an in-memory cache
websurfx-app-1    | [2023-09-18T17:52:14Z INFO  websurfx] started server on port 8181 and IP 0.0.0.0
websurfx-app-1    | [2023-09-18T17:52:14Z INFO  websurfx] Open http://0.0.0.0:8181/ in your browser
websurfx-app-1    | [2023-09-18T17:52:14Z INFO  actix_server::builder] starting 10 workers
websurfx-app-1    | [2023-09-18T17:52:14Z INFO  actix_server::server] Actix runtime found; starting in Actix runtime
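
These logs show the app bound to 0.0.0.0:8181 inside the container, so the remaining suspect is the host-side port publishing. Listing the published ports makes any mismatch visible (a sketch):

docker ps --format '{{.Names}}\t{{.Ports}}'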

@Node815 (Author) commented Sep 18, 2023

Got it working: I changed the network to "Host" and can see the page now.

Search speed is slow, though, but that may be due to the machine:

CPU: quad core Intel Core i5-2400 (-MCP-) speed/min/max: 3294/1600/3400 MHz
Kernel: 6.1.0-12-amd64 x86_64 Up: 7h 23m Mem: 7056.0/15947.0 MiB (44.2%)
Storage: 11.03 TiB (45.9% used) Procs: 566 Shell: Bash inxi: 3.3.26

Anyway, probably can close this now. :)
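
For anyone reproducing the fix: host networking skips port publishing entirely, so the container's 8181 is the host's 8181. In compose this is network_mode: host on the service; the single-container equivalent is (a sketch; the image tag is assumed):

docker run --rm --network host websurfx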

@neon-mmd (Owner) commented:

Yes, maybe; on my system it runs really fast. But anyway, nice to hear that the issue got resolved 🙂. Thanks ❤️ again for opening this issue. Since everything looks good, I will close this issue right away 🚀 🙂.
