Redis based session for HHVM #2177

Closed
vaibkamble opened this issue Mar 25, 2014 · 37 comments

Comments

@vaibkamble

Hi,

I was trying to configure Redis based session store for HHVM and made following changes to php.ini for HHVM.

session.save_handler = redis
session.save_path = "tcp://localhost:6379"

However, this is not working. Am I doing something wrong here?

It throws this error:

Fatal error: Failed to initialize storage module: user (path: )

Thanks,
Vaibhav

@vaibkamble
Author

Here are my hhvm details:

HipHop VM 3.0.0-dev+2014.03.21 (rel)
Compiler: heads/master-0-gc8f32c4f783888a7f532252c7d759a8ef01405d1
Repo schema: 04e0be7c507069cc63a4c31b7d94e6c980a733c8

@vaibkamble
Author

@ptarjan Is this an unsupported feature? If so, is any other session store supported?

@ptarjan
Contributor

ptarjan commented Mar 26, 2014

@jmarrama is our local session expert

@jmarrama
Contributor

This could be related to #1883 - to confirm, try setting those values in code via ini_set() after you have called something in the sessions module.

@vaibkamble
Author

@jmarrama Thanks! I shall try this.

@cheets

cheets commented Apr 1, 2014

I've tested this extensively today with 2.4 and then 3.0.1.

I can't get memcached or redis sessions to work at all. I started by configuring php.ini and adding

session.save_handler = redis
session.save_path = "tcp://localhost:6379"

or

session.save_handler = memcache
session.save_path = "tcp://localhost:11211"

With 2.4 and redis I got this error:

HipHop Warning: Failed to write session data (redis). Please verify that the current setting of session.save_path is correct (tcp://localhost)

With 3.0.1 I get these errors in both cases (redis and memcached), so it looks like my settings from php.ini are not read at all:

Warning: open(/tmp/sess_2623667363660313268356262646363366162303835333932393469383534656560313265323434303935616832383662363938383930373566336933383933313838363162383733633034613362356836626631353262356234693532603938346732636935653133333734636166353367316031683362623160373633366, O_RDWR) failed: File name too long (36)
Warning: Failed to write session data (files). Please verify that the current setting of session.save_path is correct ()

Then I tried doing the settings in my php file:

ini_set('session.save_handler', 'memcache');
ini_set('session.save_path', 'tcp://127.0.0.1:11211');

would give

Fatal error: No storage module chosen - failed to initialize session in /home/.../functions.php on line 21

And redis:

ini_set('session.save_handler', 'redis');
ini_set('session.save_path', 'tcp://localhost:6379');

result:

Warning: Division by zero
Fatal error: Uncaught exception 'RedisException' with message 'Ran out of weights selecting redis host for session: 2623667363660313268356262646363366162303835333932393469383534656560313265323434303935616832383662363938383930373566336933383933313838363162383733633034613362356836626631353262356234693532603938346732636935653133333734636166353367316031683362623160373633366' in :
Stack trace:
#0 (): RedisSessionModule->selectWeight()
#1 (): RedisSessionModule->connect()
#2 /home/.../functions.php(21): RedisSessionModule->read()
#3 /home/.../wp-includes/plugin.php(429): init_session()
#4 /home/.../wp-settings.php(347): do_action()
#5 /home/.../wp-config.php(92): include()
#6 /home/.../wp-load.php(29): include()
#7 /home/.../wp-blog-header.php(12): include()
#8 /home/.../index.php(17): include()
#9 {main}

I'm using Ubuntu 12.04 and Nginx 1.5.12. With HHVM 2.4 I used hhvm-fastcgi.

@scragg0x

scragg0x commented Apr 1, 2014

Having the same issue. Ubuntu 12.04 HHVM 3.0.1
I think the session.save_path coming to RedisSessionModule::open is blank.

@csdougliss
Contributor

#2075 is still an issue for me as well, though I'm using file-based sessions.

@scragg0x

scragg0x commented Apr 1, 2014

@craigcarnell If this is an issue with file-based sessions too, are sessions currently usable with HHVM at all? What is the default save path, if any?

@vaibkamble
Author

@craigcarnell @scragg0x File-based sessions are working well for me right now. I hope that by the time we port our application to HHVM at my company, Redis-based sessions will be available. To make it work, set the session settings using ini_set().

@kristapsk
Contributor

I think I will test this soon in our environment, as we plan to migrate to HHVM 2.4 or 3.0 shortly. About half a year ago, when we moved our production environment to a patched HHVM 2.2.0, Redis-based sessions were the only ones working without problems (see #1234).

@cheets

cheets commented Apr 2, 2014

Currently default (file-based) sessions are not working for me either. I have nothing configured for sessions in php.ini or in the php file when starting the session. I get these errors:

Warning: open(/tmp/sess_2623667363660313268356262646363366162303835333932393469383534656560313265323434303935616832383662363938383930373566336933383933313838363162383733633034613362356836626631353262356234693532603938346732636935653133333734636166353367316031683362623160373633366, O_RDWR) failed: File name too long (36)
Warning: Failed to write session data (files). Please verify that the current setting of session.save_path is correct ()

I only have these two lines in my php-file to start the session:

ini_set('session.hash_function', 'whirlpool');
session_start();

@scragg0x

scragg0x commented Apr 2, 2014

@cheets

Try to remove

ini_set('session.hash_function', 'whirlpool');

whirlpool has a large digest size; I think the default is sha1.
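The digest lengths behind this are easy to check. With the default session.hash_bits_per_character of 4, the session id is as long as the hex digest, so a whirlpool-derived id is more than three times longer than a sha1 one (this is a quick illustration, not the exact filename arithmetic HHVM uses):

```php
<?php
// whirlpool produces a 512-bit digest (128 hex characters), while
// sha1 produces 160 bits (40 hex characters).
echo strlen(hash('whirlpool', 'example')), "\n"; // 128
echo strlen(hash('sha1', 'example')), "\n";      // 40
```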

@scragg0x

scragg0x commented Apr 2, 2014

Here is a workaround I made until a proper fix is released.

https://gist.github.com/scragg0x/9935933
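The general shape of such a userland workaround is to register your own handler with session_set_save_handler(), so HHVM's broken built-in 'redis' save handler is never used. The sketch below is illustrative only (not the gist's actual code) and assumes a phpredis-style client exposing get/setex/del:

```php
<?php
// Sketch: a userland Redis session handler. The class name, key prefix,
// and TTL default are illustrative assumptions, not taken from the gist.
class RedisSessionHandler implements SessionHandlerInterface
{
    private $redis;
    private $ttl;
    private $prefix;

    // $redis: a connected phpredis-style client (get/setex/del).
    public function __construct($redis, $ttl = 1440, $prefix = 'PHPREDIS_SESSION:')
    {
        $this->redis  = $redis;
        $this->ttl    = $ttl;
        $this->prefix = $prefix;
    }

    public function open($savePath, $sessionName) { return true; }
    public function close() { return true; }

    public function read($sessionId)
    {
        $data = $this->redis->get($this->prefix . $sessionId);
        return $data === false ? '' : $data; // read() must return a string
    }

    public function write($sessionId, $data)
    {
        // SETEX stores the payload and lets Redis expire it itself.
        return (bool) $this->redis->setex($this->prefix . $sessionId, $this->ttl, $data);
    }

    public function destroy($sessionId)
    {
        $this->redis->del($this->prefix . $sessionId);
        return true;
    }

    public function gc($maxLifetime) { return true; } // TTLs make GC a no-op
}
```

Usage would then look something like this (assuming a reachable Redis server): create a `Redis` client, `connect('127.0.0.1', 6379)`, call `session_set_save_handler(new RedisSessionHandler($redis), true)`, and only then `session_start()`.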

@sinaa
Contributor

sinaa commented Apr 2, 2014

The default path /tmp works fine for us (on a large platform in production). Though it creates millions of tiny sess_ files over a few days.

I really hope the memcache/redis issue can be resolved soon.

Also, is the garbage collector not supposed to clean old session files? Is there anyone else facing garbage collector issues?

@csdougliss
Contributor

I created a separate issue #2289 for file based sessions so they don't get mixed up with the redis issue, but I think they might be related.

@cheets

cheets commented Apr 2, 2014

@scragg0x I tried removing it and changing to md5 and sha1, but I always get the same error. I tried accessing the page from different machines, and found it odd that every machine/browser gave an error about the same sess_ number. Could this have something to do with nginx caching? I use sessions for a custom login on a WordPress site and have enabled nginx caching there. Or could it be HHVM caching? I understood that HHVM doesn't always recompile a PHP file if it thinks there are no changes, or am I wrong?

I tried your workaround and it works. I monitored Redis for a while and can see GET, SETEX and DEL on PHPREDIS_SESSION keys when a user logs in, browses and logs out. So thank you!

@scragg0x

scragg0x commented Apr 2, 2014

@cheets

Good. This one might work too; it's smaller, though I haven't tested it.
https://gist.github.com/scragg0x/9938053

What do you have fastcgi_ignore_headers in nginx set to?

@cheets

cheets commented Apr 2, 2014

@scragg0x

The shorter workaround code works too, so the current HHVM code does not properly set the save_path.

I have

fastcgi_ignore_headers Cache-Control Expires Set-Cookie;

@scragg0x

scragg0x commented Apr 2, 2014

@cheets

Try removing Set-Cookie

@cheets

cheets commented Apr 3, 2014

@scragg0x

Removed it and restarted nginx and HHVM. Errors from different hosts are still about the same sess_ number.

If I use Redis and monitor it, the session numbers are different for different users, so there's no actual problem there.

Still, we shouldn't be getting any errors about sessions not working (even file-based ones) in the first place, so I hope all will be fine once this issue is fixed.

@kristapsk
Contributor

Just tested with the following code (which works in HHVM 2.2.0):

<?php
session_name('PHPSESSID');
ini_set('session.save_handler', 'redis');
ini_set('session.save_path', 'localhost:6379');
ini_set('session.gc_maxlifetime', 1440);
session_start();

It doesn't work with 3.0.1, but it's already fixed in master.

@sinaa
Contributor

sinaa commented Apr 13, 2014

Trying with the latest master (e084186), save_path still does not work for me on Redis (unlike for @kristapsk). I get:

 Warning: Failed to write session data (redis). Please verify that the current setting of session.save_path is correct (tcp://localhost:6379)

Without the tcp:// I get an additional error saying Notice: Undefined index: scheme on my session_start().

However, @scragg0x's short solution works fine.

@r3wt

r3wt commented Apr 20, 2014

I too am using @scragg0x's solution, after fighting with this for days. Bizarrely, I can't connect to Redis with phpRedisAdmin either, so it looks like it could be an issue with the Redis extension, IMO.

@csdougliss
Contributor

I can connect to Redis just fine for caching (Magento), but not for sessions.

@pepijnblom

What is the content of your local.xml? Make sure you are not using persistent connections; I had problems with that before.

@csdougliss
Contributor

 <redis_session> 
    <host><![CDATA[/tmp/redis_6380.sock]]></host>            <!-- Specify an absolute path if using a unix socket -->
    <port>6380</port>
    <password></password>             <!-- Specify if your Redis server requires authentication -->
    <timeout>2.5</timeout>            <!-- This is the Redis connection timeout, not the locking timeout -->
    <persistent></persistent>         <!-- Specify unique string to enable persistent connections. E.g.: sess-db0; bugs with phpredis and php-fpm are known: https://github.com/nicolasff/phpredis/issues/70 -->
    <db>3</db>                        <!-- Redis database number; protection from accidental loss is improved by using a unique DB number for sessions -->
    <compression_threshold>0</compression_threshold>  <!-- Set to 0 to disable compression (recommended when suhosin.session.encrypt=on); known bug with strings over 64k: https://github.com/colinmollenhour/Cm_Cache_Backend_Redis/issues/18 -->
    <compression_lib>lzf</compression_lib>              <!-- gzip, lzf or snappy -->
    <log_level>1</log_level>               <!-- 0 (emergency: system is unusable), 4 (warning; additional information, recommended), 5 (notice: normal but significant condition), 6 (info: informational messages), 7 (debug: the most information for development/testing) -->
    <max_concurrency>6</max_concurrency>                 <!-- maximum number of processes that can wait for a lock on one session; for large production clusters, set this to at least 10% of the number of PHP processes -->
    <break_after_frontend>5</break_after_frontend>       <!-- seconds to wait for a session lock in the frontend; not as critical as admin -->
    <break_after_adminhtml>30</break_after_adminhtml>
    <bot_lifetime>7200</bot_lifetime>                    <!-- Bots get shorter session lifetimes. 0 to disable -->
    <disable_locking>0</disable_locking>                 <!-- Disable session locking entirely. -->
 </redis_session>

@danslo
Contributor

danslo commented Apr 30, 2014

Try using gzip compression also, I've heard there were some issues with the other ones.

@csdougliss
Contributor

Thanks @danslo, but compression is disabled via compression_threshold. I'll try it again tomorrow.

@r3wt

r3wt commented May 14, 2014

any news on this?

@lbogdan
Contributor

lbogdan commented May 18, 2014

I fixed this. Please let me know the required steps to become a contributor and submit pull requests.

@sinaa
Contributor

sinaa commented May 18, 2014

@lbogdan this sounds great. Please read the section "Contributing" here https://github.com/facebook/hhvm .

@lbogdan
Contributor

lbogdan commented May 18, 2014

@sinaa Unfortunately I don't have a printer/scanner handy at the moment. Is there some other way?

@danslo
Contributor

danslo commented May 18, 2014

@lbogdan No facebook account either? You can do it online if you have one.

@sinaa
Contributor

sinaa commented May 18, 2014

@lbogdan Just go here and Agree: https://code.facebook.com/cla
The print/scan is just an alternative to this.

@lbogdan
Contributor

lbogdan commented May 18, 2014

Dumb me, I only clicked on the "pdf" link. 😒

Is there a tag or branch I should base my pull request on, or is master fine?

@WizKid
Contributor

WizKid commented May 19, 2014

Master is best.
