First of all, I know quite a few issues have been written about link cards not working. However, I believe in this case the solution does lie on our end if we could just get the right info.
In short: link preview fetching fails because the Akamai configuration for our site is set up to block most bots and crawlers (returning a 403) except for those we whitelisted. As such, if we can create a pattern that matches the cardyb crawler, this would fix the problem. This is an example of a link that fails.
Is there any info on the referrer, user agent, and/or IP range being used, so we can add a rule?
I've just tested this - the user agent for Cardy B is (currently) Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Bluesky Cardyb/1.1; +mailto:support@bsky.app) Chrome/W.X.Y.Z Safari/537.36. So it looks like the solution is to allow UAs containing Bluesky Cardyb through. (Personally, I wouldn't match on the version number, since we don't know how often that's likely to change.)
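To illustrate, here's a sketch of the matching logic in Python. The real rule would of course be defined in the Akamai configuration itself (whose match-criteria syntax differs), so this only demonstrates the substring test being proposed; the UA string is the one reported above, with the version placeholders kept as-is.

```python
import re

# The Cardy B user agent as reported above (version placeholders kept as-is).
CARDYB_UA = (
    "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; "
    "compatible; Bluesky Cardyb/1.1; +mailto:support@bsky.app) "
    "Chrome/W.X.Y.Z Safari/537.36"
)

# Match on the stable "Bluesky Cardyb" substring, deliberately ignoring
# the version number since that may change over time.
ALLOW_PATTERN = re.compile(r"Bluesky Cardyb", re.IGNORECASE)

def is_allowed(user_agent: str) -> bool:
    """Return True if this UA should bypass the bot/crawler block."""
    return ALLOW_PATTERN.search(user_agent) is not None

print(is_allowed(CARDYB_UA))                       # → True
print(is_allowed("Mozilla/5.0 (generic browser)")) # → False
```

Matching the bare substring rather than the full UA string keeps the rule robust against version bumps to either the crawler or the embedded Chrome/Safari tokens.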