
read error on connection #18

Closed

fr0x opened this issue Oct 17, 2012 · 22 comments

fr0x commented Oct 17, 2012

I saw you popped into this issue with PHPRedis here:
phpredis/phpredis#70

It might reside entirely with them, but I thought I would post here in case you had found a solution outside of phpredis to handle this with Cm_Cache_Backend_Redis.

The only new thing of note I saw in that thread was posted within the past few weeks:

"I encountered this same error message. Using the tip from @redzarf - I did some research and found that some of the strings we were trying to store are indeed more than 64KB (65,536 Characters).

As a quick solution I started gzip'ing data before storing it in redis. Haven't seen the "read error on connection" error since."

Here are some of my dump(s) when this hits:

Error:
read error on connection

Trace:
#0 /app/code/community/Cm/Cache/Backend/Redis.php(128): Credis_Client->__call('hGet', Array)
#1 /app/code/community/Cm/Cache/Backend/Redis.php(128): Credis_Client->hGet('zc:k:139_TRANSL...', 'd')
#2 /lib/Zend/Cache/Core.php(303): Cm_Cache_Backend_Redis->load('139_TRANSLATE_E...', false)
#3 /app/code/core/Mage/Core/Model/Cache.php(351): Zend_Cache_Core->load('TRANSLATE_EN_US...')
#4 /app/code/core/Mage/Core/Model/App.php(1126): Mage_Core_Model_Cache->load('translate_en_US...')
#5 /app/code/core/Mage/Core/Model/Translate.php(521): Mage_Core_Model_App->loadCache('translate_en_US...')
#6 /app/code/core/Mage/Core/Model/Translate.php(121): Mage_Core_Model_Translate->_loadCache()
#7 /app/code/core/Mage/Core/Model/App/Area.php(146): Mage_Core_Model_Translate->init('frontend')
#8 /app/code/core/Mage/Core/Model/App/Area.php(121): Mage_Core_Model_App_Area->_initTranslate()
#9 /app/code/core/Mage/Core/Model/App/Area.php(93): Mage_Core_Model_App_Area->_loadPart('translate')
#10 /app/code/core/Mage/Core/Model/App.php(768): Mage_Core_Model_App_Area->load()
#11 /app/code/core/Mage/Core/Controller/Varien/Action.php(493): Mage_Core_Model_App->loadArea('frontend')
#12 /app/code/core/Mage/Core/Controller/Front/Action.php(59): Mage_Core_Controller_Varien_Action->preDispatch()
#13 /app/code/core/Mage/Core/Controller/Varien/Action.php(409): Mage_Core_Controller_Front_Action->preDispatch()
#14 /app/code/core/Mage/Core/Controller/Varien/Router/Standard.php(250): Mage_Core_Controller_Varien_Action->dispatch('view')
#15 /app/code/core/Mage/Core/Controller/Varien/Front.php(176): Mage_Core_Controller_Varien_Router_Standard->match(Object(Mage_Core_Controller_Request_Http))
#16 /app/code/core/Mage/Core/Model/App.php(349): Mage_Core_Controller_Varien_Front->dispatch()
#17 /app/Mage.php(640): Mage_Core_Model_App->run(Array)
#18 /index.php(83): Mage::run('', 'store')
#19 {main}

Error:
read error on connection

Trace:
#0 /app/code/community/Cm/Cache/Backend/Redis.php(128): Credis_Client->__call('hGet', Array)
#1 /app/code/community/Cm/Cache/Backend/Redis.php(128): Credis_Client->hGet('zc:k:139_DB_PDO...', 'd')
#2 /lib/Zend/Cache/Core.php(303): Cm_Cache_Backend_Redis->load('139_DB_PDO_MYSQ...', false)
#3 /lib/Varien/Db/Adapter/Pdo/Mysql.php(1442): Zend_Cache_Core->load('DB_PDO_MYSQL_DD...')
#4 /lib/Varien/Db/Adapter/Pdo/Mysql.php(1564): Varien_Db_Adapter_Pdo_Mysql->loadDdlCache('eav_attribute', 1)
#5 /app/code/core/Mage/Catalog/Model/Resource/Product/Attribute/Collection.php(55): Varien_Db_Adapter_Pdo_Mysql->describeTable('eav_attribute')
#6 /app/code/core/Mage/Core/Model/Resource/Db/Collection/Abstract.php(135): Mage_Catalog_Model_Resource_Product_Attribute_Collection->_initSelect()
#7 /app/code/core/Mage/Core/Model/Config.php(1350): Mage_Core_Model_Resource_Db_Collection_Abstract->__construct(Array)
#8 /app/code/core/Mage/Core/Model/Config.php(1386): Mage_Core_Model_Config->getModelInstance('catalog_resourc...', Array)
#9 /app/Mage.php(460): Mage_Core_Model_Config->getResourceModelInstance('catalog/product...', Array)
#10 /app/code/core/Mage/Catalog/Model/Layer.php(226): Mage::getResourceModel('catalog/product...')
#11 /app/code/core/Mage/Catalog/Block/Layer/View.php(163): Mage_Catalog_Model_Layer->getFilterableAttributes()
#12 /app/code/local/Lp/Catalog/Block/Layer/View.php(121): Mage_Catalog_Block_Layer_View->_getFilterableAttributes()
#13 /app/code/core/Mage/Core/Block/Abstract.php(238): Lp_Catalog_Block_Layer_View->_prepareLayout()
#14 /app/code/core/Mage/Core/Model/Layout.php(430): Mage_Core_Block_Abstract->setLayout(Object(Mage_Core_Model_Layout))
#15 /app/code/core/Mage/Core/Model/Layout.php(446): Mage_Core_Model_Layout->createBlock('catalog/layer_v...', 'catalog.leftnav')
#16 /app/code/core/Mage/Core/Model/Layout.php(238): Mage_Core_Model_Layout->addBlock('catalog/layer_v...', 'catalog.leftnav')
#17 /app/code/core/Mage/Core/Model/Layout.php(204): Mage_Core_Model_Layout->_generateBlock(Object(Mage_Core_Model_Layout_Element), Object(Mage_Core_Model_Layout_Element))
#18 /app/code/core/Mage/Core/Model/Layout.php(209): Mage_Core_Model_Layout->generateBlocks(Object(Mage_Core_Model_Layout_Element))
#19 /app/code/core/Mage/Core/Controller/Varien/Action.php(345): Mage_Core_Model_Layout->generateBlocks()
#20 /app/code/core/Mage/Catalog/controllers/CategoryController.php(150): Mage_Core_Controller_Varien_Action->generateLayoutBlocks()
#21 /app/code/core/Mage/Core/Controller/Varien/Action.php(420): Mage_Catalog_CategoryController->viewAction()
#22 /app/code/core/Mage/Core/Controller/Varien/Router/Standard.php(250): Mage_Core_Controller_Varien_Action->dispatch('view')
#23 /app/code/core/Mage/Core/Controller/Varien/Front.php(176): Mage_Core_Controller_Varien_Router_Standard->match(Object(Mage_Core_Controller_Request_Http))
#24 /app/code/core/Mage/Core/Model/App.php(349): Mage_Core_Controller_Varien_Front->dispatch()
#25 /app/Mage.php(640): Mage_Core_Model_App->run(Array)
#26 /index.php(83): Mage::run('', 'store')
#27 {main}

fr0x commented Oct 17, 2012

As far as consistency goes, this seems to happen around once a day (when it does trigger, it tends to produce 4-7 errors in a row, all "read error on connection"). The time it happens is not consistent, nor is the load on the server at the time.

colinmollenhour (Owner) commented
Interesting. If you don't like compression, perhaps setting the "compress_threshold" option at 64k would be a good solution then.

You might also check your Redis config regarding disk activity, as the cache is pretty low-priority and should be written to disk rarely, if at all, and the append-only log doesn't need to be enabled, IMO.

fr0x commented Oct 17, 2012

The over-64KB post was actually from someone else. I do have compression enabled in my Cm_Cache_Backend_Redis configuration:

<compress_data>1</compress_data>
<compress_tags>1</compress_tags>
<compress_threshold>20480</compress_threshold>
<compression_lib>gzip</compression_lib>

(I think those are all default values)

I was just more curious whether you still see the "read error on connection" error pop up with Cm_Cache_Backend_Redis, and/or whether, the times you ran into the issue, the data involved was > 64KB.

colinmollenhour (Owner) commented
I don't recall ever discovering an exact size that triggered errors consistently, and I can't remember what the largest size I tested was, but it seems like it may have been over 64k.

jonathanselander commented
I run into this as well, but only when I call flushAll() from PHP; it doesn't seem to happen if I use redis-cli.

davidyilee commented
We had the same issue; turning on lzf compression for strings above 40k fixed this for us. We did not try pushing it up to 64k to test the actual boundary.
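
A minimal sketch of that setup, assuming the same Cm_Cache_Backend_Redis backend option names shown earlier in this thread; the 40960-byte threshold mirrors the ~40k figure above and is illustrative, not a tested boundary, and using lzf assumes the lzf PHP extension is available:

<!-- illustrative values only; lzf assumes the lzf extension is installed -->
<compress_data>1</compress_data>
<compress_tags>1</compress_tags>
<compress_threshold>40960</compress_threshold>
<compression_lib>lzf</compression_lib>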

colinmollenhour (Owner) commented
Yeah this issue is definitely a phpredis bug: phpredis/phpredis#70

The Redis server's redis.h contains a 64k constant named REDIS_INLINE_MAX_SIZE. However, I have no idea how to fix the bug, but I believe the fault lies with phpredis. I suppose either Credis_Client or Cm_Cache_Backend_Redis could try to split larger strings into multiple keys, but with expiration and all, this would be messy.

colinmollenhour (Owner) commented
I just updated the unit tests for lib/Credis and can't reproduce the issue with 128K-length strings. I was using an old phpredis version, and also just built the latest and still couldn't reproduce it.

jonathanselander commented
There are moments when we have the issue on every "flush cache" inside Magento.

Also, it appeared on the frontend when two phpredis CLI calls (that is, php-cli) issued flushAll in parallel to the same Redis server.

The connection is over TCP.


colinmollenhour (Owner) commented
The other stated reason for "read error on connection" issues seemed to be timeouts. Does the flushAll operation take much time to complete? If so, it may be that the read timeout is being hit. I can see this being the case if flushAll causes a wait on disk activity and your disk is slow or highly utilized. A similar situation could explain the errors on cache loads: the Redis server is busy handling other requests and the read timeout is too short.

colinmollenhour (Owner) commented
I somehow overlooked this before, but phpredis currently uses the connect timeout as the read timeout as well. This is not ideal, but as a workaround you could set your timeout higher to avoid the read errors. The drawback is that if there actually is a connection problem, the error will not be thrown as quickly, so don't set it too high.

I added a setReadTimeout method to Credis_Client and support for a read_timeout option in Cm_Cache_Backend_Redis, which will not work for phpredis until phpredis/phpredis#260 is merged. So go post a note there kindly asking the maintainer to merge it. :)

If it is very urgent, you could merge that pull request into phpredis yourself and run your forked build.

If anyone tries this please let me know if it fixes your issues.
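
For reference, a minimal sketch of that workaround, assuming the read_timeout option sits alongside the other Cm_Cache_Backend_Redis backend options; the server, port, and 10-second value are illustrative:

<!-- illustrative sketch; a 10-second read timeout is an example, not a recommendation -->
<server>127.0.0.1</server>
<port>6379</port>
<read_timeout>10</read_timeout>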

DerekMarcinyshyn pushed a commit to DerekMarcinyshyn/Cm_Cache_Backend_Redis that referenced this issue Mar 18, 2013
colinmollenhour (Owner) commented
FYI, a version of phpredis including the read timeout feature has finally been merged and tagged. Does setting a longer read timeout fix your issue?

fr0x commented May 3, 2013

Funny you should ask.

I actually installed the latest version of phpredis yesterday and plan on installing the latest version of Cm_Cache_Backend_Redis either today or next week.

fr0x commented May 16, 2013

After upgrading both phpredis and Cm_Cache_Backend_Redis, I haven't had this issue come back up (it has been about 2 weeks). I'm not sure whether one of the upgrades alone would have been enough or whether both were required (I don't believe my previous version of Cm_Cache_Backend_Redis had the timeout/retry values implemented yet).

colinmollenhour (Owner) commented
Cool, thanks for the update. Upgrading both would probably be required in your case. Closing the ticket unless there are any objections.

LukeHandle commented Apr 23, 2017

Hey all,

This bug is referenced in the configuration example for Cm_RedisSession (and in the current Magento documentation as a result):

https://github.com/colinmollenhour/Cm_RedisSession/blob/master/README.md#configuration-example

[compression_threshold] Known bug with strings over 64k: #18

The implication (or ambiguity) of that text is that there are problems compressing when over 64K, but this bug is saying the opposite as far as I can see? The solutions here involved enabling compression because of the hard-set limit of 64K (I couldn't find it in the 3.2 version).

edit: Though it was then not seen when tested here: #18 (comment)? So there is no bug here anymore, and the Cm_RedisSession docs can be updated?

We should therefore encourage compression with a threshold of, say, ~60K, even on local instances (both for sessions and here with Cm_Cache_Backend_Redis)?

colinmollenhour (Owner) commented
I suppose it is already encouraged by demonstrating a non-zero default value. It is not entirely clear to me whether the 64k issue is still a thing or not. However, I would be happy to accept a PR with README improvements.

midlan commented Aug 10, 2021

I'm confused by that README too. What should I do to avoid the 64k bug?

  1. Should I set compression_threshold to 2048?
  2. Should I set compression_threshold to 0?
  3. Should I set compression_threshold to a different value?

Can someone clarify that for me?

colinmollenhour (Owner) commented
I don't know if there is a one-size-fits-all value, but I think generally a threshold around 2k makes sense. Compressing a really small string offers no benefit; it just adds extra processing.
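
For sessions, a minimal sketch of that recommendation, assuming the redis_session option names from the Cm_RedisSession README linked above; the host and port values are illustrative:

<redis_session>
    <host>127.0.0.1</host>
    <port>6379</port>
    <!-- ~2k threshold: skip compressing tiny payloads, compress anything larger -->
    <compression_threshold>2048</compression_threshold>
    <compression_lib>gzip</compression_lib>
</redis_session>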

midlan commented Aug 13, 2021

OK, I understand, but is there some relationship between compression_threshold and the 64k bug?

LukeHandle commented
@midlan The conclusion I drew previously was that the 64k bug was mitigated (rather than resolved) by compressing the data. Uncompressed, they saw issues when the data was over 64k, but compressed it would have been a fraction of that size when inserted into Redis, so they did not see the bug anymore.

Summary: Set it to 2048. Compression is good.

===

Also note: "read error on connection" does not mean "64k bug" (it is a fairly generic error). If you're seeing that now and assuming it's related, it's probably not. By all means, check the size of the data you're pushing 🤷‍♂️

colinmollenhour (Owner) commented
Yes, compression just makes it less likely to reach 64k. I would hope the 64k bug was fixed long ago at this point, though, so it seems unlikely that was actually the issue you had.

Also, I think 64k is pretty excessive for session data; I would audit what is being stored in the session to make it so large and cut that out, regardless of whether there is a Redis bug or not. At 64KB each, you can only hold 16k sessions in a full gig of RAM, and that just isn't very scalable/economical.
