
Error: Can't add new command when connection is in closed state #939

Closed
Rakeshvishnoi029 opened this issue Mar 29, 2019 · 60 comments
Comments

@Rakeshvishnoi029 commented Mar 29, 2019

Hi

I've been facing this error for the last 2 days, please help me...

{ Error: read ETIMEDOUT
at TCP.onread (net.js:622:25)
errno: 'ETIMEDOUT',
code: 'ETIMEDOUT',
syscall: 'read',
fatal: true }
{ Error: Can't add new command when connection is in closed state
at PoolConnection._addCommandClosedState

I use mysql2 with a connection pool:

var mysql = require('mysql2');
var mysqlPool = mysql.createPool({
  host: 'localhost',
  user: 'root',
  password: 'xyz',
  database: 'xyz',
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0
});

module.exports = mysqlPool;


@digiperfect-janak commented Apr 4, 2019

+1 we have been getting this issue since yesterday.

Error: This socket has been ended by the other party

followed by

Error: Can't add new command when connection is in closed state

@yunfan commented Apr 14, 2019

we too :[

@zebullax commented May 9, 2019

Same issue here, the only hack is to stop and restart the node app...

@Akash0333

Same issue here, please tell me how we can resolve it

@ovidiu-balau

Anyone found a fix for this?

@nag944 commented Jul 12, 2019

Any news on this problem? After the connection has timed out, it cannot be restored at all, as all network-related setup is performed in the constructor. Also, this error is returned as a text message only; no error code is provided.

Is it safe just to recreate the connection object? Actually it's a very poor idea, as the connection instance is passed to many structures in the program...

@ovidiu-balau

Still an issue, anyone found any solution for this?

@jovemnf commented Aug 2, 2019

me too

@ganchuhang

I got this as well

@sidorares (Owner)

Hey @Coder77 @jovemnf @ovidiu-balau @Akash0333 @zebullax and everyone else experiencing this issue - is there a reliable way to reproduce? Ideally a self-contained dockerized setup, or at least a sequence of instructions.

@nag944 commented Aug 12, 2019

Hey @Coder77 @jovemnf @ovidiu-balau @Akash0333 @zebullax and everyone else experiencing this issue - is there a reliable way to reproduce? Ideally a self-contained dockerized setup, or at least a sequence of instructions.

It's extremely easy to reproduce - just open a connection and wait for the default timeout (you can emulate it by changing the MySQL timeout setting to a smaller value - by default it's around 8 hours).

Btw, I tested it not with a pool but with a raw connection - in that case the error object is completely empty; only the text message is present.

As I checked the source code, it seems this problem cannot be solved in its current state. When the MySQL connection has gone away, the network connection instance inside the component is destroyed, so you cannot just restart the networking - you must create a whole new connection instance (the network connection instance is created in the constructor!).
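The reproduction steps above can be sketched roughly as follows (a hypothetical sketch: the mysql2/promise module is passed in as a parameter, and host/user/database are placeholder values):

```javascript
// Hypothetical repro sketch: shrink the server-side idle timeout, wait it out,
// then issue a query on the now-dead connection.
async function reproduceClosedState(mysql) {
  // `mysql` is assumed to be require('mysql2/promise'); credentials are placeholders.
  const conn = await mysql.createConnection({
    host: 'localhost',
    user: 'root',
    database: 'test',
  });
  // Lower the idle timeout from the default (~8 hours) to 5 seconds.
  await conn.query('SET SESSION wait_timeout = 5');
  // Wait past the timeout so the server closes the socket.
  await new Promise((resolve) => setTimeout(resolve, 7000));
  // This query should now fail, e.g. with
  // "Can't add new command when connection is in closed state".
  return conn.query('SELECT 1');
}
```

Invoked as `reproduceClosedState(require('mysql2/promise'))` against a local server.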

@sidorares (Owner)

As I checked the source code, it seems this problem cannot be solved in its current state.

Yes, an individual Connection is not meant to survive an underlying transport reconnect; it's a 1:1 mapping to a MySQL connection/thread, with the same possible states (authentication, change user, etc.). When the network connection dies or is actively closed by the server, it should not be re-used - this is just how it is designed. If this is your issue, the solution should be in your code (listen for error / end events, etc.). If you want a simpler API, just use a pool - it should handle all of this.
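A minimal sketch of that advice for a single (non-pool) connection: listen for fatal 'error' / 'end' events and swap in a fresh connection instead of reusing the dead one. The wrapper shape here is an assumption, not mysql2 API; `createConnection` is a caller-supplied factory returning a new connection object.

```javascript
// Keep a single usable connection by replacing it whenever it dies.
// Assumes each connection is an EventEmitter, as mysql2 connections are.
function makeManagedConnection(createConnection) {
  let conn;

  function wire(c) {
    c.on('error', (err) => {
      if (err && err.fatal) replace(); // dead connections must not be reused
    });
    c.on('end', replace);
  }

  function replace() {
    conn = createConnection();
    wire(conn);
  }

  replace(); // create and wire the initial connection
  return { get: () => conn };
}
```

Callers always go through `managed.get()` rather than holding a connection reference, so they never touch a connection that has already entered the closed state.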

@nag944 commented Aug 12, 2019

But as I see from previous messages, the pool seems to have the same problem?

The only way I found is to send stub queries every hour to keep the connection alive. Otherwise I cannot even destroy it correctly - it hangs forever - so I must forget about the instance and create a new one, relying on the garbage collector to do its job.

@zebullax

Unless I'm mistaken, the issue is not limited to single connections but also affects connections managed by a pool (that's the setup I was using), and that would make sense: the pool only manages creating and yielding connections, so if an existing connection has been remotely closed and is yielded by the pool, we get the same error as if we were working with a simple connection.

@sidorares (Owner)

the pool seems to have the same problem?

I'm happy to try to investigate; if I had a simple way to reproduce with a pool, that would help me a lot.

if an existing connection has been remotely closed and is yielded by the pool, we get the same error as if we were working with a simple connection

In the case of a pool it's different in terms of who is responsible for tracking and recycling dead connections. The pool should do it all for you, while when you use an individual connection you must listen for errors (both on the connection and on query results).

@MR4online

Having the same problem with pooled connections.
If our MySQL stats are correct, the count of active connections does not vary over time. The last time the error occurred was when a cron job caused many queries. Maybe the acquisition of new connections doesn't work as expected. Is there a way to debug that?

Our code looks pretty much like in the docs https://github.com/mysqljs/mysql#pooling-connections. Most of the logic is simple (get a new connection from the permanent pool, query it, release it).

@ADumaine

I'm using a single connection, and after the connection is lost con.end() or con.release() throws the error.
My current solution is to set the connection object to null and create a new connection on error:

if (con) con = null;
initcon(); // sets up the connection

@T1MOXA commented Oct 5, 2019

I have the same problem

@Xsmael commented Oct 21, 2019

Me too! Any fix?

@juaneuro90

Guys I'm having the same issue.

@jmordica (Contributor)

Me too.

@jmordica (Contributor)

Switching to createPool resolved my issue.

@franck-nadeau

Switching to createPool resolved my issue.

@jmordica What do you mean by switching to createPool? That is what I am using, and I am still getting this problem. Also, that is what Rakeshvishnoi029's initial code used. Are you passing something special as an option?

@zeeshanaligold

Hi @jmordica

I have the same issue. Can you please tell how you fixed it? Thx

const mysql = require('mysql2');

const con = mysql.createConnection({
    host     : process.env.RDS_HOSTNAME,
    user     : process.env.RDS_USERNAME,
    password : process.env.RDS_PASSWORD,
    database : process.env.RDS_DATABASE
});

@ghost commented Apr 2, 2020

I have the same issue!

@rudiantotiofan

Any update on this issue? I got the same problem.

Thanks,

@davehorton

Same issue here, using createPool. Any updates on this?

const pool = mysql.createPool({
  host: process.env.XX,
  user: process.env.XXX,
  password: process.env.XX,
  database: process.env.XX,
  connectionLimit: process.env.XX
});
"err":
{"type":"Error","message":"Can't add new command when connection is in closed state","stack":"Error: Can't add new command when connection is in closed state\n    
at PoolConnection._addCommandClosedState (/../mysql2/lib/connection.js:137:17)\n    
at PoolConnection.query (../mysql2/lib/connection.js:502:17)\n    
at getMysqlConnection (../login.js:31:10)\n    
at process.nextTick (../mysql2/lib/pool.js:44:37)\n    
at process._tickCallback (internal/process/next_tick.js:61:11)","fatal":true},"msg":"Error getting db connection","v":1

At least update the docs to remove the misleading information that the pool handles this for you.

@sidorares (Owner)

@davehorton unfortunately there is not enough information to debug. Can you make a test repo to reproduce?

This can happen if you call .end() on a connection and later try to call .execute(). That might be a logic error in your application, but it could also be a bug in the driver; unfortunately there is not enough information to tell.
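The logic-error case described here boils down to the following anti-pattern (a hypothetical helper for illustration; with a real mysql2/promise connection the second call rejects with the closed-state error):

```javascript
// Anti-pattern sketch: calling .end() and then reusing the same connection
// is exactly the kind of logic error that produces this message.
async function demonstrateMisuse(conn) {
  await conn.end();                 // connection enters the closed state
  return conn.execute('SELECT 1');  // would reject: "Can't add new command..."
}
```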

@runekm commented May 1, 2020

(EDIT: I see that I was running an old version of the package (1.7.0). I tried upgrading to the newest version. I don't know if it makes any difference yet.)

I too get this error from time to time. I haven't investigated the problem in detail, but it seems to occur every couple of days and lasts until the app is restarted.

The connection pool is created with:

var mysql = require('mysql2');
pool = mysql.createPool({
  connectionLimit : config.DB_CONNECTION_LIMIT,
  host            : config.DB_HOST,
  port            : config.DB_PORT,
  user            : config.DB_USER,
  password        : config.DB_PASSWORD,
  database        : config.DB_DATABASE,
  dateStrings     : true,
  debug           : config.DB_DEBUG
});
promisePool = pool.promise();

The error occurs when I run:
[ results, fields ] = await connection.query( sql );
after running:
connection = await promisePool.getConnection();
which gives no error.

The error says:
Error: Can't add new command when connection is in closed state at PromisePoolConnection.query ([path_to_my_app]/node_modules/mysql2/promise.js:92:22

@HsinHeng

In my case, I performed an insertMany with more than 30,000 records at one time, and the error was produced.

I resolved my issue by increasing max_allowed_packet in the MySQL server config, because the transmitted bytes were too large.
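Besides raising the server limit, a hedged application-side workaround for this case is to split very large bulk inserts into smaller batches so no single statement approaches max_allowed_packet. The table, columns, and the bulk `VALUES ?` placeholder below are illustrative assumptions:

```javascript
// Split an array of rows into batches of at most `size` rows.
function chunk(rows, size) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Hypothetical batched insert, assuming `pool` is a mysql2/promise pool and
// `rows` is an array of [colA, colB] tuples. Table/columns are placeholders.
async function insertInBatches(pool, rows, batchSize = 1000) {
  for (const batch of chunk(rows, batchSize)) {
    await pool.query('INSERT INTO my_table (col_a, col_b) VALUES ?', [batch]);
  }
}
```

The right batch size depends on row width; the goal is simply to keep each packet well under the server's max_allowed_packet.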

@sidorares (Owner)

@HsinHeng interesting - maybe that can be solved at the driver level. We should be able to automatically detect max_allowed_packet (from settings, or by querying the variable immediately after connection) and then split the outgoing packet into smaller packets - see https://dev.mysql.com/doc/internals/en/sending-more-than-16mbyte.html

mysql2 does support large incoming packet sizes (see

} else {
  // first large packet - remember its id
  if (this.largePacketParts.length === 0) {
    this.firstPacketSequenceId = sequenceId;
  }
  this.largePacketParts.push(
    chunk.slice(
      start + this.packetHeaderLength,
      start + this.packetHeaderLength + this.length
    )
  );
  if (this.length < MAX_PACKET_LENGTH) {
    this._flushLargePacket();
  }
}

), but your issue is probably related to packet length > max_allowed_packet, not to 16 MB packets. I need to double check - maybe max_allowed_packet is sent automatically at connection time, but if not, it can still be queried using .query().
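Querying the variable with .query(), as suggested, might look like this (a sketch assuming `conn` is a mysql2/promise connection; the helper name is hypothetical):

```javascript
// Fetch the server's max_allowed_packet (in bytes) right after connecting.
async function getMaxAllowedPacket(conn) {
  const [rows] = await conn.query('SELECT @@max_allowed_packet AS maxAllowedPacket');
  return rows[0].maxAllowedPacket;
}
```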

@sidorares (Owner)

@HsinHeng what was the exact error code in your case? Having some help in the error message would be useful, I guess. Something along the lines of: if (error.code === errors.ER_NET_PACKET_TOO_LARGE) { error.message = "Packet too large. Check the max_allowed_packet system/session variable. See [wiki url with more info] for more information" }

@HsinHeng commented Oct 23, 2020

@HsinHeng what was the exact error code in your case? Having some help in the error message would be useful, I guess. Something along the lines of: if (error.code === errors.ER_NET_PACKET_TOO_LARGE) { error.message = "Packet too large. Check the max_allowed_packet system/session variable. See [wiki url with more info] for more information" }

@sidorares
You got it!

Here is my error details.

"name":"SequelizeDatabaseError","parent":{"code":"ER_NET_PACKET_TOO_LARGE","errno":1153,"sqlState":"08S01","sqlMessage":"Got a packet bigger than 'max_allowed_packet' bytes" }

Hope you could resolve this issue.

@sidorares (Owner)

@HsinHeng well, in your case the error text from the server already had some hints.

What we can do is have a maxAllowedPacketSize connection option that is maybe "auto" by default. When set to auto, a query is performed immediately on connection to get the max_allowed_packet variable (downside: slightly worse latency until the first "useful" response, especially when connections are short-lived).

@greedThread

Still an issue, anyone found any solution for this?

@Veijar commented Apr 9, 2021

the same problem

@Desnoo commented Apr 14, 2021

We also have this issue when the connections in the connection pool have been idle for a long time.
I debugged to the getConnection point and printed out the connection state that is returned at

return process.nextTick(() => cb(null, connection));

The contents of the connection are as follows:

  1: connection =
    { _events: Object,
      _eventsCount: 1,
      _maxListeners: 'undefined',
      config: ConnectionConfig,
      stream: Socket,
      ... }
  2: connection._fatalError =
    { errno: 'ECONNRESET',
      code: 'ECONNRESET',
      syscall: 'read',
      fatal: true,
      stack: 'Error: read ECONNRESET\n' +
      '    at TCP.onStreamRead [as…tic-apm-node/lib/instrumentation/index.js:328:27)',
      ... }
  3: connection._protocolError =
    { fatal: true,
      code: 'PROTOCOL_CONNECTION_LOST',
      stack: 'Error: Connection lost: The server closed the conn…js:483:12)\n' +
      '    at TCP.<anonymous> (net.js:675:12)',
      message: 'Connection lost: The server closed the connection.',
      Symbol(callsites): Array(4) }
  4: connection._statements =
    { Symbol(max): 16000,
      Symbol(lengthCalculator): ,
      Symbol(allowStale): false,
      Symbol(maxAge): 0,
      Symbol(dispose): ,
      ... }
  5: connection._internalId = 3

So the error seems to occur when the MySQL server closes the connection ("Connection lost: The server closed the connection."). Do you have any hints on how to prevent this?

@JonHX commented Jun 8, 2021

Same issue here

@SouzaCarleone

Error: Can't add new command when connection is in closed state

Same issue here, any suggestions?

@yo-wan commented Aug 23, 2021

A possible workaround could be to check whether any of the connection's properties _fatalError, _protocolError, or _closing is true before executing the query, and to get a new connection if so. But that is ugly; I'm looking for a better solution, probably based on events.
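That check might be sketched as below. Note that _fatalError, _protocolError and _closing are private, undocumented fields, so this can break between mysql2 versions; the retry loop is also an assumption, not mysql2 API:

```javascript
// Returns true if the (private) state flags suggest the connection is unusable.
function isConnectionDead(conn) {
  return Boolean(conn._fatalError || conn._protocolError || conn._closing);
}

// Usage sketch, assuming `pool` is a mysql2/promise pool: keep asking the pool
// for connections until it hands out one that is not already dead.
async function getLiveConnection(pool) {
  let conn = await pool.getConnection();
  while (isConnectionDead(conn)) {
    conn.destroy(); // drop the dead connection so the pool replaces it
    conn = await pool.getConnection();
  }
  return conn;
}
```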

@irepela commented Sep 8, 2021

I solved this problem by refactoring my code to use pool.query (which releases the connection internally) instead of using connection.query and releasing the connection manually. Looks like there was a connection leak in my code.

@MaksemM commented Sep 13, 2021

I solved this problem by refactoring my code to use pool.query (which releases the connection internally) instead of using connection.query and releasing the connection manually. Looks like there was a connection leak in my code.

Do you have an example? I did the same thing using pool.query, and it worked fine for much longer, but then it happened again. Did yours fix it permanently?

@irepela commented Sep 14, 2021

It didn't happen again in my case. Here is an example of what was changed:

Old code:

let connection;
try {
  connection = await pool.getConnection();
  const [rows] = await connection.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
} finally {
  await connection.release();
}

New code:

try {
  const [rows] = await pool.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
}

@ChromeUniverse commented Sep 27, 2021

(quoting irepela's refactoring example above)

Has anyone else been able to confirm that this works, or found other fixes? I'll try refactoring my code anyway to test this out, but I would love to get more information if possible.

I'm currently using mysql2's promise API to create the connection but also getting the same error:

// connect to MySQL database
async function sql_connect() {
  // connect to database
  db = await mysql.createConnection({
    host     : 'localhost',
    user     : 'lucca',
    password : process.env.MYSQL_PASSWORD, 
    database : 'tank_battle'
  });

  return db;
}

@MaksemM commented Sep 27, 2021

(quoting irepela's refactoring example above)

Thanks for that. I changed my code to use a promise pool and it seems to have resolved itself, but I'm still cautious. Thanks for showing me the snippet.

@MaksemM commented Sep 27, 2021

(quoting irepela's refactoring example above)

(quoting ChromeUniverse's comment above)

I'll show you what is currently working for me:

Database file:

require('dotenv').config()
const mysql = require('mysql2')

module.exports =  mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_DATABASE,
  timezone: 'Australia/Sydney',
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0,
  debug: false
})

Controller file snippet

const pool = require('../models/db/db')
const promisePool = pool.promise()

createSalesOrderPost: async (req, res) => {
        const page_path = '/admin/sales-orders/edit-order?clientid=' + req.body.clientid
        try {

            if (req.file) {
                var xmlfile = req.file.originalname
            }

            // Database details
            const sql_1 = 'INSERT INTO mt_sales_orders (client_id, payment_id, sales_order_amount, xml_order) VALUES (' + req.body.clientid + ', ' + req.body.paymentid + ', ' + req.body.paymentamount + ', "' + xmlfile + '");'

            const [rows1, fields1] = await promisePool.query(sql_1)

            const sql_2 = 'UPDATE mt_payments SET sales_order_id= (SELECT sales_order_id FROM mt_sales_orders WHERE payment_id=' + req.body.paymentid + ') WHERE payment_id=' + req.body.paymentid

            // Create database entry
            const [rows2, fields2] = await promisePool.query(sql_2)

            await req.flash($success_msg, 'XML Order added successfully.')
            res.redirect(page_path)            

        } catch (error) {
            await req.flash($error_msg, 'Something went wrong.' + ' ' + error)
        }
    }

I haven't ever had any issues in my local environment, and the issue seems to be fixed on the live server.

The project is still in development, so when I update the live server with the finished program I'll post an update if there is an issue.

I also have a custom function set up to run multiple SQL statements in a loop, but I am going to remove it, as promise pooling seems to work without issues now.

const pool = require('../models/db/db')

const dbQueryPost = (sql, page_path, res) => {
    pool.getConnection(function (err, conn) {
        const sql_length = sql.length - 1
        sql.forEach((q, i) => {
            conn.query(q, (err, rows, fields) => {
                if (i == sql_length) {
                    res.redirect(page_path)
                }
            })
        })
        // conn.release()
        pool.releaseConnection(conn)
    })
}

module.exports = {
    dbQueryPost
}

And use like this inside controller:

const f = require('./defaultFunctions')

action: (req, res) => {
     const sql_1= 'SELECT * FROM mt_posts'
     const sql_2= 'SELECT * FROM mt_orders'

     const sql= [sql_1,sql_2]

     const page_path = '/posts'
     f.dbQueryPost(sql, page_path, res)
}

@ishmum123

The socket it uses internally has a destroyed property. I was thinking of accessing that and reconnecting when a new query comes in. Please suggest if there's something better than this. Alternatively, you could expose the property.

@ChromeUniverse commented Oct 1, 2021

Thanks for the tips, @irepela and @MaksemM!
I've finished refactoring my code to use connection pooling/pool queries with promises and deployed it to my server. Even so, I guess I'll still have to wait a couple of hours to see if it has any effect. I'll post an update if anything comes up.

@ChromeUniverse commented Oct 2, 2021

Update: it works!

Async/await connection pooling seems to have done the trick. For the first time, my server has been running for a whole day now without throwing any errors. If anyone's still having any issues, I suggest following irepela's and MaksemM's examples. My current setup is described below.

First, I'm creating a connection pool with mysql2/promise and exporting it for reuse across different modules/files.

  • sql_util.js - creates connection pool:
// package imports
const mysql = require('mysql2/promise');
require('dotenv').config();

// exporting MySQL connection pool object
module.exports = mysql.createPool({
  host: 'localhost',
  user: 'lucca',
  password: process.env.MYSQL_PASSWORD,
  database: 'tank_battle',
  waitForConnections: true,
  connectionLimit: 20,
  queueLimit: 0
})

Then, in my routes/middleware/APIs/whatever, I'm just importing the pool object and running the queries on the pool object, just as you would with a regular MySQL connection object. In other words, I'm using await pool.query instead of await connection.execute or await connection.query.

  • api.js - internal app logic for handling API requests:
const { private } = require('../middleware/private');   // -> custom middleware for private routes
const pool = require('../sql_util');                    // -> creates connection pool (see above) 
const jwt = require("jsonwebtoken");                    // -> "jsonwebtoken" package from npm

// Express Router setup
const router = express.Router();
router.use(private);

// GET /user    --> returns user information
router.get('/user', async (req, res) => {

  const decoded = jwt.decode(req.token, {complete: true});
  const name = decoded.payload.username.toString();

  const sql = "select username, elo, lb_rank, tank_color from users where username=?"

  try {
    const [query_result, fields] = await pool.query(sql, [name]);

    if (query_result.length > 0) {

      const q = query_result[0]

      // BinaryRow {
      //   username: 'Lucca',
      //   elo: 1000,
      //   lb_rank: 1,
      //   tank_color: #12fca7
      // }

      const userData = {
        error: false,
        name: q.username,
        elo: q.elo,
        lb_rank: q.lb_rank,
        tank_color: q.tank_color
      }

      res.status(200);
      res.json(userData);
      return;

    } else {
      console.log('user not found');

      res.status(404);
      res.json({
        error: true,
        message: 'User not found'
      });

      return;
    }

  }

  catch (e) {
    res.status(500);
    res.json({
      error: true,
      message: 'Server Error'
    });
    console.log(e);
    return;
  }

});

module.exports = router;

@ArthurMartiros

I came across the same error and found out that I had:
connection.release();
by mistake in my transaction code body before the actual end of the transaction. The connection.release(); should be at the end of execution, as a normal release, or in the rollback branch as a failover release, because in transactional queries the separate queries share the same connection object. Whenever one of the participating queries or code snippets releases the connection before the normal end of the transaction, it causes the connection object to hang the socket, producing error-prone behavior for the subsequent queries in the transaction, and only after those subsequent queries execute do we receive the mentioned error:
Error: Can't add new command when connection is in closed state.

The above behavior is common to all types of queries that share the same connection object. Thanks.
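The release ordering described above can be sketched as follows - a minimal sketch assuming a mysql2/promise-style pool, with the single release point in `finally`, after the transaction has committed or rolled back:

```javascript
// Run `work(conn)` inside a transaction; every query in `work` shares one
// connection, and release happens exactly once, after commit or rollback.
async function runInTransaction(pool, work) {
  const conn = await pool.getConnection();
  try {
    await conn.beginTransaction();
    const result = await work(conn);
    await conn.commit();
    return result;
  } catch (err) {
    await conn.rollback();
    throw err;
  } finally {
    conn.release(); // single release point - never between the transaction's queries
  }
}
```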

@Nedudi commented Oct 14, 2021

I am having the same issue (mysql 8 and knex.js with pool of connections)

@multiwebinc commented Oct 22, 2021

This is what I did to prevent this error from happening. I'm using async functions, so you may need to adapt this to your code. The important things to note are that you can check the connection's _closing state to see if it is closed, and that you can hook onto the connection stream's close event to handle it:

const mysql = require('mysql2/promise');
const bluebird = require('bluebird');

let mysqlConnection = null;
const getMysqlConnection = async () => {
  // Check to see if connection exists and is not in the "closing" state
  if (!mysqlConnection || mysqlConnection?.connection?._closing) {
    mysqlConnection = await createNewMysqlConnection();
  }
  return mysqlConnection;
}

const createNewMysqlConnection = async () => {
  const connection = await mysql.createConnection({
    host: process.env.MYSQL_HOST,
    database: process.env.MYSQL_DATABASE,
    user: process.env.MYSQL_USER,
    password: process.env.MYSQL_PASSWORD,
    Promise: bluebird,
  });

  // You can do something here to handle the connection
  // being closed when it occurs.
  connection.connection.stream.on('close', () => {
    console.log("MySQL connection closed");
  });
  return connection;
}

Then in any functions that need access, just do

const connection = await getMysqlConnection();

@OpakAlex commented Dec 8, 2021

2 years, no solution - amazing. Why do people still use this library?))))

@ChromeUniverse

2 years, no solution, amazing why people still use this library?))))

Many people have already provided their own solutions on this issue thread, and most of them appear to work just fine. Could you at least read through a couple of them?

@OpakAlex commented Dec 8, 2021

yes, copy/paste the best solution, like since 1950 ;) I thought we had made some progress by 2021, but looks like no))) Fine, I will copy-paste code like all the world.

@ChromeUniverse

Alex

????????

<rant>
Look, here's my point: no, this issue hasn't gone without any solutions for two years, people have already solved this problem before as detailed in previous comments (including the ones I made a couple of months ago) - but God forbid actually taking some time to read any of those, right?

Now, If you're so keen on figuring out stuff on your own, then by all means, feel free to keep banging your head on your keyboard while you sift through the meh-ish docs. Alternatively, you could save some time and frustration and take advantage of the work that the kind folks here have done for you by detailing their solutions to this, frankly, very dumb problem. Honestly, you won't be gaining much by not copying and pasting code from this thread.

Since this discussion is incredibly unproductive, I'll shut up now. Sorry.
</rant>

@sidorares (Owner)

While I don't have a good definitive answer, there are some useful proposals here. If anyone wants to help contribute, there are some good ways to do so:

  • a reliable way to reproduce, so local debugging is easy. Good examples: 1) a docker image with everything ready (including the server), 2) a repro repo with code and documentation for external dependencies
  • a PR with a reliably failing test

Closing the issue and locking comments; the discussion here does not seem to add much value.

Repository owner locked as too heated and limited conversation to collaborators Dec 9, 2021