Update to v. 2.0.15: tt_news catmenu and pagebrowser not working anymore #244

Closed
triograf opened this Issue Sep 8, 2016 · 82 comments

triograf commented Sep 8, 2016

After updating realurl to v. 2.0.15 (TYPO3 v. 7.6.9), the category menu and the pagebrowser of the ext tt_news are not working anymore. The URL changes, but the content in the frontend remains the same, without any change.

realurl for tt_news is configured like this:

'postVarSets' => array(
                '_DEFAULT' => array(

                    'artikel' => array(
                        array(
                            'GETvar' => 'tx_ttnews[tt_news]',
                            'lookUpTable' => array(
                                'table' => 'tt_news',
                                'id_field' => 'uid',
                                'alias_field' => 'title',
                                'addWhereClause' => ' AND NOT deleted',
                                'useUniqueCache' => 1,
                                'useUniqueCache_conf' => array(
                                    'strtolower' => 1,
                                    'spaceCharacter' => '-',
                                ),
                            ),
                        ),
                    ),

                    'kategorie' => array(
                        array(
                            'GETvar' => 'tx_ttnews[cat]',
                            'lookUpTable' => array(
                                'table' => 'tt_news_cat',
                                'id_field' => 'uid',
                                'alias_field' => 'title',
                                'addWhereClause' => ' AND NOT deleted',
                                'useUniqueCache' => 1,
                                'useUniqueCache_conf' => array(
                                    'strtolower' => 1,
                                    'spaceCharacter' => '-',
                                ),
                            ),
                        ),
                    ),

                    'browse' => array(
                        array(
                            'GETvar' => 'tx_ttnews[pointer]',
                        ),
                    ),
                ),
            ),
fiedomedia commented Sep 8, 2016

Same issue with ext:news: the URL of the single view changes, but it always shows the same news item.
Deactivating the TYPO3 cache helps, but is no solution.

dmitryd (Owner) commented Sep 8, 2016

Guys, please ask on the mailing lists. You have cHash issues in your links. This is not something that realurl can fix for you anymore; it did, until the security team decided that was not wanted. So you have to set the cHash correctly now...
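Setting the cHash "correctly" here means letting TYPO3 compute it when the link is built. A hypothetical TypoScript sketch (page uid 123 and news uid 42 are invented values; tt_news normally builds such links itself):

```typoscript
# Sketch only: a typolink whose additionalParams require a cHash.
# useCacheHash = 1 makes TYPO3 compute and append the cHash itself.
lib.newsLink = TEXT
lib.newsLink {
    value = Read more
    typolink {
        parameter = 123
        additionalParams = &tx_ttnews[tt_news]=42
        useCacheHash = 1
    }
}
```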

AndreasA (Contributor) commented Sep 9, 2016

What happens with the 2.0.15 change if one calls a URL that requires a cHash before the link has been generated?
If I understand it correctly, that means no cache entry was stored, so in the decoder there is no cHash value at all, decoding fails, and the result is a 404 if pageNotFoundOnCHashError is enabled?

Also, during realurl encoding it is possible that parameters are removed from the URL due to a noMatch bypass, which would change the cHash of the originally calculated URL, although as long as the speaking URL and the desired URL are stored, that shouldn't be a problem.

I think one solution would be to only generate the cHash if all parameters are valid speaking-URL parameters: no arbitrary parameters that could have been modified in any way, as all of them would have been decoded (or removed) by the realurl decoding process.

All other URLs would have the cHash appended as a GET parameter anyway.

dmitryd (Owner) commented Sep 9, 2016

@AndreasA You are repeating approximately the same arguments that I made to the security team. Guess the answer.

seirerman commented Sep 9, 2016

Since updating from 2.0.14 to 2.0.15 I get broken links for tt_news detail views, too. The RealURL config, and therefore the URLs, stayed the same.

Even with the RealURL autoconfig the errors remain. No hints in the Apache error log, either.

I also noticed that after truncating the RealURL tables the URLs work once, but fail to load on the second try (cache is disabled on the test site).

dmitryd (Owner) commented Sep 9, 2016

@seirerman You should never have the cache disabled, even on a test site. Disabling it on a test site while enabling it in production makes your two environments very different; your test site then does not represent your live site.

And with 2.0.15 you will always get 404s if you have the cache disabled. This may change in 2.1.0, because I am thinking about bringing this functionality back, but as an option (it will require explicit enabling in the configuration).

AndreasA (Contributor) commented Sep 9, 2016

@dmitryd
So only calculating the cHash if all arguments were decoded via realurl is a no-go too? Or, to be more precise, for valueMap and lookUpTable only. I think that shouldn't create any security issues, because the path would only be correct if the parameter actually exists in the database or valueMap.

I also remember an instance (with realurl 1.x) where I had an issue in that regard: I had removed parameters via the realurl configuration because they were unnecessary, but then some links appended the cHash and some didn't, which caused problems because the speaking URL was always the same. The solution, of course, was to ensure that the unnecessary parameters were never added.

seirerman commented Sep 9, 2016

Well, I humbly disagree with you on the no_cache thing for development sites.

But even with config.no_cache = 0 I still get the same behaviour (404s on the second try).

dmitryd (Owner) commented Sep 9, 2016

@AndreasA This is what realurl actually did before 2.0.15: if there was no entry in the URL cache and there were decoded URL parameters, it created a cHash. This allowed the pageNotFoundOnCHashError check and reqCHash() calls to see a proper cHash for the URL. In my opinion this is a valid part of the decoding process, because the encoder strips the cHash; if the encoder strips it, the decoder should re-create it. However, it opens the possibility for an attacker to request every page of the site with, for example, a tx_news URL part and potentially fill thousands of page cache entries. That can be treated as a DoS attack, because the page cache may grow too much.

In my opinion this should not be a realurl issue but a page cache issue. The cache should be able to keep its size under control and work efficiently even with many entries. For example, memcached does that: it has a memory limit and starts discarding less-used entries when there are too many of them. Nothing like that exists in the TYPO3 caching framework; it just keeps adding new data. However, the security team disagreed that this should be solved on the page cache side. I do not blame them and understand their position: it is easier to remove cHash recalculation than to rework the caching framework to keep caches under control. So we got the easy but worse solution to the page cache DoS issue. If I were in their place, I would probably vote for that solution as well. Given the lack of full-time TYPO3 core developers, solving this on the core side is very unlikely.
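The memcached-style eviction described above can be sketched as a small size-capped cache that discards the least recently used entry instead of growing without bound (illustrative only; this is not TYPO3 or memcached code):

```python
from collections import OrderedDict


class BoundedPageCache:
    """Sketch of a size-capped, least-recently-used page cache: once the
    entry limit is reached, the least recently used entry is discarded
    instead of letting the cache grow forever."""

    def __init__(self, max_entries):
        self.max_entries = max_entries
        self._entries = OrderedDict()

    def get(self, key):
        if key not in self._entries:
            return None
        self._entries.move_to_end(key)  # mark as recently used
        return self._entries[key]

    def set(self, key, value):
        if key in self._entries:
            self._entries.move_to_end(key)
        self._entries[key] = value
        if len(self._entries) > self.max_entries:
            self._entries.popitem(last=False)  # evict least recently used
```

With such a bound in place, an attacker flooding the cache with bogus URLs can only displace old entries, not exhaust storage.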

dmitryd (Owner) commented Sep 9, 2016

@seirerman

Well, I humbly disagree with you on the no_cache thing for development sites.

It is up to you :) But you should realize that this puts your dev site into completely different behavior than your live site. For example, if you forget a typolink.userCacheHash = 1, it will work properly on your dev host, but it will not work on your live host. Just beware of the consequences.

seirerman commented Sep 9, 2016

@dmitryd: True, but my point still stands: I get 404s regardless of the cache settings with RealURL 2.0.15.

And you mean useCacheHash, right?

dmitryd (Owner) commented Sep 9, 2016

@seirerman

True, but my point still stands: I get 404s regardless of the cache settings with RealURL 2.0.15.

Yes, you will get those if you are logged in to the BE, because that disables the realurl cache.

And you mean useCacheHash, right?

Yes. That was a typo.

seirerman commented Sep 9, 2016

@dmitryd: But I am not logged in!

dmitryd (Owner) commented Sep 9, 2016

@seirerman Being logged in to the BE and having the cache disabled amount to the same thing. Here is how it works with and without the cache.

With cache:
A user opens the page. All links are processed by the encoder and saved to the URL cache together with the cHash. When the user clicks a link, the decoder checks the cache, finds the "speaking" URL and the corresponding original URL with its cHash, and re-creates all parameters, including the cHash, in $_GET. TYPO3 is happy.

When a BE user is logged in, or the URL cache is disabled:
A user opens the page. All links are processed by the encoder, but not saved to the URL cache. When the user clicks a link, the decoder checks the cache, finds nothing, and decodes the URL according to the configuration.

Prior to version 2.0.15 the decoder also checked whether there were URL parameters but no cHash; if so, it created one. Since 2.0.15 the cHash is not created. The decoder then puts all parameters into $_GET and TYPO3 calls the plugin. The plugin tells TYPO3 that a cHash is required ($GLOBALS['TSFE']->reqCHash()). TYPO3 checks for a cHash, finds none, and aborts with a 404. There is your current 404.
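The cHash the decoder used to re-create is, in essence, a keyed digest over the decoded parameters. A minimal sketch of that principle (the function name and serialization format are invented here; this is not TYPO3's exact implementation):

```python
import hashlib


def calculate_chash(get_params, encryption_key):
    """Sketch of a TYPO3-style cHash: a digest over the relevant GET
    parameters plus a server-side secret (the encryption key).
    Illustrates the principle only, not TYPO3's exact serialization."""
    relevant = dict(get_params)
    relevant["encryptionKey"] = encryption_key  # secret known only to the server
    # Sort so the hash does not depend on parameter order in the URL.
    serialized = "&".join(f"{k}={v}" for k, v in sorted(relevant.items()))
    return hashlib.md5(serialized.encode("utf-8")).hexdigest()
```

Because the secret goes into the digest, only a party that knows the encryption key (i.e. the server) can produce a cHash that will later validate.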

dmitryd (Owner) commented Sep 9, 2016

Until 2.1.0 is out, everybody who suffers from this issue can use one of two solutions:

  • Make sure that the URL cache is enabled (no cache => disable = true in your config). Use a different browser to view the FE when you are logged in to the BE.
  • Install 2.0.14 again (if you cannot get it from TER, use dmitryd/typo3-realurl in composer instead of typo3-ter/realurl).
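The cache option referred to in the first bullet corresponds to a fragment like the following in the realurl configuration (a sketch based on the key names mentioned in this thread; the point is that 'disable' must NOT be set to true):

```php
// Sketch of the relevant realurl configuration fragment.
// The URL cache must stay enabled, i.e. 'disable' must NOT be true.
$GLOBALS['TYPO3_CONF_VARS']['EXTCONF']['realurl']['_DEFAULT'] = array(
    'cache' => array(
        'disable' => false,
    ),
    // ... the rest of the configuration (pagePath, postVarSets, ...)
);
```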
seirerman commented Sep 9, 2016

The pages are cached (config.no_cache = 0) AND I'm not logged in (different URL than the BE, different browser, doesn't matter), is what I'm trying to say. ;-)

Anyway, would going back to RealURL 1.13.6 be a valid option, too, regarding the DoS possibility, until 2.1.0 comes out? We're still on TYPO3 6.2 and I never had a problem with that version.

triograf commented Sep 9, 2016

@dmitryd Thanks for the two suggested solutions until v. 2.1.0 is out.

But will v. 2.1.0 fix the initial problem I had with the catmenu and the page browser of tt_news? Or is that just a configuration issue which should be fixed in the realurl configuration itself, regardless of the realurl version?

Thank you for the quick support/response.

dmitryd closed this in b0901d4 Sep 9, 2016

dmitryd (Owner) commented Sep 9, 2016

@triograf The issue should not happen if you do not clear or disable the URL cache.

AndreasA (Contributor) commented Sep 9, 2016

Shouldn't that ifMissing actually solve the DoS attack too? Because there can only be one cHash value stored then.
[EDIT]
Sorry, of course it wouldn't fix it, as modifying the parameters would result in another cache entry without a cHash.
[/EDIT]

In regards to the DoS: that is why I said we would have to add a check that only does the cHash calculation if ALL parameters are coming from realurl and are of a limited range, like lookUpTable or valueMap. And maybe userFunc, if it sets a certain value.

You are probably right, though, that the best solution would be better page cache handling in the core in regards to the DoS attack, but I also don't think that is likely to happen in the near future.

Another thing: so one should never clear the decode cache afterwards? What about old URLs that have to be updated? I don't want to clear the cache per page here; would it be enough to clear the encode cache, or would storing cHash values separately again be a good idea? Or also have a task that then recreates all necessary URLs.

dmitryd (Owner) commented Sep 9, 2016

That is why I said we would have to add a check that only does the cHash calculation if ALL parameters are coming from realurl and are of a limited range, like lookUpTable or valueMap. And maybe userFunc, if it sets a certain value.

Not really, because noMatch=bypass can remove some variables that were in the original URL and thus change the cHash.

You are probably right, though, that the best solution would be better page cache handling in the core in regards to the DoS attack, but I also don't think that is likely to happen in the near future.

Unfortunately, yes.

Another thing: so one should never clear the decode cache afterwards? What about old URLs that have to be updated?

Starting from 2.1.0, realurl will handle page renames by adding an expiration time to such URLs, so URL cache clearing will be semi-automatic.

I'd say that the cache should be left alone. Realurl is good enough to manage it.

triograf commented Sep 9, 2016

@dmitryd But I neither touched nor disabled the URL cache; I just updated to v. 2.0.15. After that the catmenu and the page browser did not work, while the detail pages work as expected. So where's the fault? Wrong config?

dmitryd (Owner) commented Sep 9, 2016

@triograf Check whether you actually have entries in the URL log on the corresponding page.

AndreasA (Contributor) commented Sep 9, 2016

Starting from 2.1.0, realurl will handle page renames by adding expiration time to such URLs. So URL cache clearing will be semi-automatic.

That's good, because I have already had that problem once (old URLs).

In that regard: will a change of a page name (in any language) clear all URLs of that page, including those of other languages? Due to content fallback those might need to be cleared too, at least if you have a corresponding getPageOverlay hook.

A nice feature here would be (in regards to SEO) an option to automatically redirect from the old URL to the new URL (with a 301).

dmitryd (Owner) commented Sep 9, 2016

It will mark the URL as expired. If somebody visits such a URL, they will be redirected to the current URL. And only the URL for the edited language is marked as expired.

AndreasA (Contributor) commented Sep 9, 2016

So if one wants to support content fallback or similar, additional hooks will be required? It would be great if you added a possibility to hook in at that position.

Will the redirect be a 301 (Google etc., I think, need 301 redirects to transfer the rating of the old page to the new page), or can the type of redirect be defined?

helhum (Contributor) commented Sep 9, 2016

What happens with the 2.0.15 change if one calls an URL that requires a cHash prior to the Link being generated?

@AndreasA @dmitryd

Let's see how TYPO3 works without realurl:

Can anybody know the URI to a location (e.g. a news detail view) which requires a cHash before TYPO3 has generated it? No. It is impossible, because the cHash is built from (some of) the URI arguments and a secret (the encryption key), which is known only to TYPO3. This means that a URI which requires a cHash to work correctly does not exist before TYPO3 has created it! The secret is included in the cHash so that attackers cannot create arbitrary cache entries.

This is a base concept of TYPO3 (page caching), which we have to live with until we change it.

realurl, as an extension to TYPO3, has to build upon these concepts.

As a logical consequence, "if one calls an URL that requires a cHash prior to the link being generated", it will fail like any other link that requires a cHash, even without realurl.

If realurl now removes the cHash from the URL, it is responsible for remembering the original URL including the removed cHash argument. Since this is vital information, it should NOT be stored in a volatile cache, but in a more persistent place, to make clear that removing this information equals removing the URLs, resulting in 404s.
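The point that such a URI "does not exist" before TYPO3 created it can be illustrated with a decode-time check: the server recomputes the digest from the decoded parameters and its secret and compares it with the cHash in the request. A sketch with hypothetical names (not TYPO3's actual implementation):

```python
import hashlib


def chash(params, key):
    """Keyed digest over sorted params; stands in for TYPO3's cHash."""
    items = sorted(list(params.items()) + [("encryptionKey", key)])
    return hashlib.md5("&".join(f"{k}={v}" for k, v in items).encode()).hexdigest()


def handle_request(get_params, key, page_not_found_on_chash_error=True):
    """Decode-time check: recompute the cHash from the decoded parameters
    and compare with the one supplied in the request. A missing or wrong
    cHash yields a 404 when pageNotFoundOnCHashError is enabled."""
    supplied = get_params.pop("cHash", None)
    expected = chash(get_params, key)
    if supplied != expected:
        return 404 if page_not_found_on_chash_error else 200  # uncached render
    return 200  # cacheable render
```

An attacker who guesses a URL without knowing the encryption key cannot supply a matching cHash, so the request fails the check.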

@dmitryd

This comment has been minimized.

Show comment
Hide comment
@dmitryd

dmitryd Sep 9, 2016

Owner

Can anybody know the URI to a location (e.g. news detail view) which requires a cHash before TYPO3 generated it? No.

Actually, it can be guessed easily. Sometimes I even type those URLs manually during development :)

As a logical consequence, "if one calls an URL that requires a cHash prior to the Link being generated" it'll fail like any other link that requires a cHash even without realurl.

Incorrect :( This will work only if cHash is persistently stored somewhere for the URL.

If realurl now removes the cHash from the URL, it is responsible to remember the original URL including the removed including the cHash argument....

All correct but I want to add/stress two points here:

  1. Realurl has always removed the cHash (even in 1.x) if there are no other parameters in the URL.
  2. Realurl stores the cHash if the URL cache is enabled. In version 1.x there was a separate cHash cache, but it often got out of control and people struggled with millions of entries in that cache, which could cause a DoS with realurl 1.x. Version 2.x solves the issue by storing the cHash together with the URL in the URL cache. If the URL cache is empty, cleared, or disabled, the only way to get a proper cHash is to recalculate it. Therefore it is very important to leave the realurl caches alone and not touch them.
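The second point (the cHash travelling with the URL-cache entry) can be modelled roughly like this. An illustrative Python sketch of the described behaviour with made-up names, not realurl code:

```python
# Model of realurl 2.x: the original query string, cHash included, is
# stored next to the speaking URL, so decoding is a cache lookup rather
# than a recalculation.
url_cache: dict = {}

def encode(speaking_url: str, original_params: str) -> None:
    url_cache[speaking_url] = original_params  # cHash persisted alongside

def decode(speaking_url: str) -> str:
    if speaking_url not in url_cache:
        # Cache cleared or disabled: the stored cHash is gone and the only
        # alternative would be to recalculate it server-side.
        raise LookupError("404: no cached parameters for this URL")
    return url_cache[speaking_url]

encode("/artikel/my-story/", "tx_ttnews[tt_news]=7&cHash=0123abcd")
assert decode("/artikel/my-story/") == "tx_ttnews[tt_news]=7&cHash=0123abcd"
```

This is why clearing the URL cache under 2.0.15 is destructive: the lookup table is the only place the cHash survives.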

dmitryd Sep 10, 2016

Owner

@AndreasA You are welcome to study realurl code to get answers to all your questions.

@helmum I do not want to argue. I spent enough time in this thread and on realurl in general to continue the same discussion again and again. I suggest to stop here. There are other things in life that deserve time...


dmitryd Sep 10, 2016

Owner

Oops, sorry for the typo :(


AndreasA Sep 10, 2016

Contributor

As mentioned, I checked, and yes, there is an expire and clear cache function, but as noted in the code it only works for pages, and you won't get all possible parameters here. Or am I missing a place where the urlcache is expired/cleared because realurl parameters no longer exist?
Of course, in most such cases the corresponding page has been removed too, so that works, but that need not be the case; e.g. I could just swap one plugin for another on the same page.

Also, as mentioned, 'cache/disable' will disable caching, and therefore the cHash won't be stored at all.

And then there is the backend module that allows one to clear the URL cache, but with the cHash change in 2.0.15 one cannot do so anymore without "consequences".


dmitryd Sep 10, 2016

Owner

And then there is the backend module that allows one to clear the URL cache, but with the cHash change due to 2.0.15 one cannot do so anymore without "consequences".

Yes, this is one of the reasons why the cHash calculation was there since the beginning. For the rest, you can search the code more.


AndreasA Sep 10, 2016

Contributor

1.) You said you had problems with people asking questions about why things didn't work correctly due to them e.g. deleting cache tables; with the current solution you are back to that "issue".

2.) I agree that with the previous method everything was fine in regards to caches.
With the current one it isn't anymore, as the caches contain necessary information.
And I also agree that I would prefer to always generate the correct cHash upon decoding, but
thanks to helhum we know that creates a possible DoS attack, so there needs to be a better solution.
Which is why I still think the solution idea from helhum has merit.

It only generates a hash if realurl is allowed to generate the cHash on its own during decode, nothing more.
Also: nothing prevents you from cleaning up that table the same way you clean up the current urlcache / pathcache.
For instance, you can store a page id with it and use that to clean up the table if a page is deleted.
You could store further metadata in a separate table, like which records are used (and which parameters, to remove the values if the parameters don't exist in the realurl configuration anymore), etc., to clean it by DataHandler hook or scheduler task (as performance in the backend isn't that big an issue).

If one goes with the 256 bytes of overhead that helhum mentioned, you would need over 4 million entries to reach one gigabyte.
If you reach that many, you will run into the same problem by adding the cHash to the current encode/decode cache.

3.) I checked the code more, and:
You are taking care of various page cache clearing by using clearCachePostProc, and some via a DataHandler hook that checks for unique aliases.
However, that also means that cHash information is removed upon cache clearing, e.g. due to a configured clearCacheCmd.
Or you clear the urlcache for a whole page id if any language record is deleted, which results in the cHash values of other languages being lost too.
Also, there can be other parameters that are not stored in unique aliases.

The issue now is just that we are storing necessary data (the cHash) in tables that are not supposed to be necessary (caches).


AndreasA Sep 12, 2016

Contributor

I just saw that you renamed those tables, which is great: that way people will not just delete data that might be necessary.


dmitryd Sep 12, 2016

Owner

Please don't post entries like this to the bug tracker, because they only waste my time and do not add any value to the product.


Chrissitopher Sep 13, 2016

The following fix has been merged in TYPO3 6.2, 7.6 and in master: https://review.typo3.org/#/c/49925/.

Setting [FE][cHashIncludePageId] = true will now enable some kind of cache flooding protection. This setting is set to true for new installations. It is false for existing installations, but the reports module will warn users and advise them to manually set it to true.
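For reference, the option can also be set in configuration code rather than via the Install Tool; a minimal sketch (the file path depends on your setup):

```php
<?php
// e.g. typo3conf/AdditionalConfiguration.php (sketch)
$GLOBALS['TYPO3_CONF_VARS']['FE']['cHashIncludePageId'] = true;
```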


dmitryd Sep 13, 2016

Owner

Yes, I am aware of that. It will solve issues for those who still use 2.0.14.


helhum Sep 13, 2016

Contributor

It will solve issues for those, who still use 2.0.14.

Uhm, no. With [FE][cHashIncludePageId] = true, realurl 2.0.14 will cause an exception to be thrown. Thus 2.0.15 is the only realurl release that will work (with the exceptions from above) in that case.


dmitryd Sep 13, 2016

Owner

Okay, good to know. I expected it to work. @helhum If you already tested it, do you have the exception text at hand (no problem if not, just wondering what the exception is)?


helhum Sep 13, 2016

Contributor

The problematic lines in realurl are these:

$requestVariables = $cacheEntry->getRequestVariables();
$cacheHashCalculator = GeneralUtility::makeInstance('TYPO3\\CMS\\Frontend\\Page\\CacheHashCalculator');
/* @var \TYPO3\CMS\Frontend\Page\CacheHashCalculator $cacheHashCalculator */
$cHashParameters = $cacheHashCalculator->getRelevantParameters(GeneralUtility::implodeArrayForUrl('', $requestVariables));

This exception will be thrown: https://github.com/TYPO3/TYPO3.CMS/blob/master/typo3/sysext/frontend/Classes/Page/CacheHashCalculator.php#L131

The reason is that cHash calculation will then require the id argument to be passed to \TYPO3\CMS\Frontend\Page\CacheHashCalculator::getRelevantParameters.
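The guard can be illustrated with a small Python model (not TYPO3 code; the function name and parsing are simplified): with the page-id requirement enabled, the calculator rejects any parameter set lacking an `id`.

```python
# Simplified model of the "id required" guard described above.
def get_relevant_parameters(query: str, include_page_id: bool = True) -> dict:
    params = dict(pair.split("=", 1) for pair in query.split("&") if pair)
    if include_page_id and "id" not in params:
        raise ValueError("id argument required for cHash calculation")
    return params

try:
    get_relevant_parameters("tx_ttnews[tt_news]=7")  # no id -> exception
except ValueError:
    pass
# Adding the page id first, as discussed later in the thread, avoids it:
assert "id" in get_relevant_parameters("id=42&tx_ttnews[tt_news]=7")
```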


AndreasA Sep 13, 2016

Contributor

Which is not done, because doDecoding does not add the 'id' to the $requestVariables.

Also, I think it doesn't really replace the bugfix of 2.0.15 for 2.0.14, because one could still create arbitrary arguments.

Also, in UrlEncoder's storeInUrlCache there is a comment that says:
// $this->originalUrlParameters['uid'] can be an alias, we need a number here!

which might also create trouble?


dmitryd Sep 13, 2016

Owner

@helhum Thanks! So I should modify the code to add 'id' there if I use cHash calculator...


AndreasA Sep 13, 2016

Contributor

As far as I understand:
You would just have to add the id to the requestVariables prior to calling getRelevantParameters, as that one would already fail otherwise.

But I think you actually meant that already.
My fault prior to the EDIT. I thought only about adding the parameter to calculateCacheHash before.


AndreasA Sep 13, 2016

Contributor

Btw, I forgot to ask: when using the current development version or 2.0.15 with bypass variables, can that create problems in regards to cHash in v2.x?
I know it sometimes created problems with v1.x, because the same URL could be accessed with different parameters (both URLs were valid).
But I guess because the request_parameters are now stored with the URL, that should work fine?


Chrissitopher Sep 13, 2016

So I should modify the code to add 'id' there if I use cHash calculator...

Yes, that is also what the security bulletin advises:

Additionally, calling the CacheHashCalculator API will require the id argument to be set in the URL provided. This means, that switching this option on, may break existing extensions, that are using this API.


helhum Sep 13, 2016

Contributor

the bugfix of 2.0.15

@AndreasA What bugfix of 2.0.15?


dmitryd Sep 13, 2016

Owner

FYI: I am not receiving notifications in this thread anymore because it makes too much email noise.


helhum Sep 13, 2016

Contributor

I unsubscribed as well now


AndreasA Sep 13, 2016

Contributor

I meant the one regarding storing the cHash instead of calculating it, the reason for the 2.0.15 release (security issue).


MirekKmiec Dec 1, 2016

For tt_news, a fix for the "read more" link without a cHash:

plugin.tt_news {
    allowCaching = 1
}
