Patch from crowbot:
I've done a bit more work on trying to isolate this bug, using the same data set as frabcus. I've had some success in tracking it down further, and have a fix, but not a definitive answer as to what the issue is.
Basically, the Actions plugin in Piwik makes nested arrays to represent different 'actions' (URL segments) in the logs of site visits. These then get made into DataTables, with various sorting and limiting filters being applied to them.
The situation that I think causes the segfault is that there's a URL with more than 10 slash-separated path segments in it, on a day when there are also more than 100 rows to record and archive. The URL causes a 10-deep array to be built (which is normally fine), but then the fact that there are more than 100 records causes filters to be applied to sort the records in the resulting data table, remove some, and add a summary row. This process somehow corrupts the data table such that any subsequent attempt to access rows in it causes a segfault (or on OS X, a bus error). It's fairly icky, as the filter classes are generated on the fly, and I think the problem may have something to do with several of these on-the-fly created classes holding a reference to this deeply nested array with objects in it. For now, the problem seems solved by unsetting a reference to it in the final filter after use.
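To make the shape of the workaround concrete, here is a minimal sketch, not Piwik's actual filter code: the class name, row layout, and `filter()` signature are all assumed for illustration. The only part taken from the ticket is the idea of unsetting the local reference once the summary row has been appended.

```php
<?php
// Hypothetical sketch of the workaround shape (names and row layout assumed,
// not Piwik's actual AddSummaryRow implementation).
class AddSummaryRowSketch
{
    public function filter(array &$rows)
    {
        // Build a summary row aggregating the existing rows.
        $summary = array('label' => '-1', 'hits' => 0);
        foreach ($rows as $row) {
            $summary['hits'] += $row['hits'];
        }
        $rows[] = $summary;
        // The workaround from the ticket: drop the local reference after use,
        // so the filter object doesn't keep the structure alive any longer
        // than needed.
        unset($summary);
    }
}

$rows = array(array('hits' => 3), array('hits' => 7));
$f = new AddSummaryRowSketch();
$f->filter($rows);
echo $rows[2]['hits']; // 10
```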
--- core/DataTable/Filter/AddSummaryRow.php (revision 1152)
+++ core/DataTable/Filter/AddSummaryRow.php (working copy)
@@ -68,5 +68,6 @@
$this->labelSummaryRow) + $newRow->getColumns());
Did you test this patch in prod?
Did the unit tests pass?
Unit tests pass and it doesn't seem to have any negative side effects on my system, but I don't have deeply nested URLs. crowbot proposed the fix and another user confirmed it on the forum.
I'll try to set up some dummy data to test before committing.
That said, I'm perplexed how this is a fix as it unsets a local variable. How messed up is reference counting in PHP?!
Ok, an unofficial explanation: http://ca.php.net/manual/en/function.unset.php#56332
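The gist of that manual note can be shown in a few lines (a standalone demonstration, not Piwik code): `unset()` destroys only the named variable, decrementing the value's reference count; the underlying data survives as long as any other reference to it exists.

```php
<?php
// unset() removes one reference, not the value itself.
$a = array('deep' => array('nested' => 'value'));
$b = &$a;            // $a and $b now refer to the same array
unset($b);           // destroys only the variable $b
var_dump(isset($a)); // bool(true) - the data is still reachable via $a
```

Which is exactly why it's surprising that unsetting a local variable in the filter helps: it should only drop one reference, not free anything that is still referenced elsewhere.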
This is still being investigated, as I haven't been able to reproduce the segfault or memory leak. (In fact, when I add the unset(), memory_get_usage() appears to be higher than without it.) I also tried increasing the recursion level to its maximum (15) in test_serializeWithDeepNesting().
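For reference, a reproduction attempt along the lines of that test can be sketched like this (an assumed shape, not the actual test_serializeWithDeepNesting() code): build an array nested deeper than 10 levels, mimicking a URL with many slash-separated path segments, then serialize and unserialize it as the archiving step would.

```php
<?php
// Build an array nested 15 levels deep, one level per hypothetical URL
// path segment, then round-trip it through serialize()/unserialize().
$depth = 15;
$nested = 'leaf';
for ($i = 0; $i < $depth; $i++) {
    $nested = array('segment' . $i => $nested);
}

$restored = unserialize(serialize($nested));

// Walk back down through every level to confirm the round trip.
for ($i = $depth - 1; $i >= 0; $i--) {
    $restored = $restored['segment' . $i];
}
echo $restored; // leaf
```

On the builds I could test, this completes without a crash, which matches the comment above: the segfault hasn't been reproducible outside the affected PHP builds/versions.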
(In ) refs #824 - workaround reported seg fault in some php builds/versions with deeply nested action URLs