Notice: unserialize() [function.unserialize]: Error at offset #268

Open
tontof opened this Issue Sep 2, 2013 · 3 comments

tontof (Owner) commented Sep 2, 2013

Trying to find out why there is some data corruption.
For now I don't think it's caused by kriss_feed, since the only PHP functions used are:

PHPPREFIX
. base64_encode(gzdeflate(serialize($this->_data)))
. PHPSUFFIX

and

 $this->_data = unserialize(
                    gzinflate(
                        base64_decode(
                            substr(
                                file_get_contents($this->dataFile),
                                strlen(PHPPREFIX),
                                -strlen(PHPSUFFIX)
                                )
                            )
                        )
                    );
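For reference, the write and read paths above are symmetric, so the round trip alone should be lossless. A minimal sketch demonstrating that (the PHPPREFIX/PHPSUFFIX values here are illustrative guesses, not necessarily kriss_feed's actual ones):

```php
<?php
// Guard strings that make the data file a no-op PHP script (assumed values).
define('PHPPREFIX', '<?php /* ');
define('PHPSUFFIX', ' */ ?>');

$data = array('key' => array(0, 1364966392, 1, 1));

// Write path: serialize, deflate, base64-encode, wrap in the PHP guard.
$blob = PHPPREFIX . base64_encode(gzdeflate(serialize($data))) . PHPSUFFIX;

// Read path: strip the guard, then reverse each step in order.
$restored = unserialize(
    gzinflate(
        base64_decode(
            substr($blob, strlen(PHPPREFIX), -strlen(PHPSUFFIX))
        )
    )
);

var_dump($restored === $data); // bool(true): the round trip itself is lossless
```

Since the round trip is lossless in isolation, the corruption has to be introduced between the two steps, e.g. by how the file is written to or read from disk.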

The problem is that PHP seems to fail to serialize the data correctly.

This issue will collect affected PHP versions and try to find out why such a strange problem happens :-(
#140 #159

Seb-C commented Sep 9, 2013
I didn't read all the source or how you coded it, but it seems that you're not using any locking mechanism.
So what happens if the user reads or writes data while a cron job is writing at the same time?
You should replace file_get_contents and file_put_contents everywhere with manual reads and writes, and use the flock function to force other threads to wait for the end of one operation before starting another.

EDIT: it seems that file_put_contents has a flag named LOCK_EX; maybe that's an easier solution.

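To make the suggestion concrete, here is a minimal sketch of both options (the file name and payload are illustrative, not kriss_feed's actual ones):

```php
<?php
$file = sys_get_temp_dir() . '/kriss_lock_demo.dat';
$payload = 'some serialized blob';

// Option 1 (the EDIT above): exclusive lock for the whole write, one flag.
file_put_contents($file, $payload, LOCK_EX);

// Option 2: a manual read guarded by a shared lock, so a reader never
// observes a half-written file while a writer holds the exclusive lock.
$fh = fopen($file, 'rb');
if ($fh !== false && flock($fh, LOCK_SH)) {
    $contents = stream_get_contents($fh);
    flock($fh, LOCK_UN);
    fclose($fh);
}
```

Note that plain file_get_contents takes no lock at all, so adding LOCK_EX only on the writer still leaves readers able to race with an in-progress write.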

tontof (Owner) commented Sep 9, 2013
I've thought about that, but the strange thing is that the data is fully written, which suggests to me that it's not linked to concurrent access to the file. It's actually still possible to unzip and decode the file; the problem only occurs when unserializing the data.
For example, in problematic data I get:

aFy8gDA0k7DF:{ia:2s:10at10:"64879083:101t10atee";

while it should look like this:

s:12:"1pXdLQZlExjw";a:2:{i:0;i:1364966392;i:1;i:1;}

I don't know if it's possible that this is actually linked to locking.
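For what it's worth, unserialize() rejects the corrupted fragment exactly as the notice in the title suggests. A quick check, using a standalone piece of the valid data above:

```php
<?php
// A standalone valid piece of the expected data unserializes fine...
var_dump(unserialize('s:12:"1pXdLQZlExjw";')); // string(12) "1pXdLQZlExjw"

// ...while the corrupted fragment emits a notice and returns false
// (@ suppresses the notice here just to keep the demo quiet).
var_dump(@unserialize('aFy8gDA0k7DF:{ia:2s:10at10:"64879083:101t10atee";')); // bool(false)
```

So the bytes themselves are scrambled before unserialize() ever sees them, which is consistent with corruption happening at the storage layer rather than in serialize() itself.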


tontof (Owner) commented Sep 10, 2013
I've added LOCK_EX when writing the data and the cache: 175d888
Wait and see.

