'Data too long for column' issue with URLs #1525

amygdala opened this Issue Apr 1, 2013 · 9 comments



7 participants


I noticed an issue with the field sizes in the *_links and *_links_short tables, with thinkup-2.0-beta.4.

When the ExpandURLs plugin runs, I often see errors like this:

PDOException: Database error! ThinkUp could not execute the following query: INSERT INTO tu_links_short (link_id, short_url, click_count) VALUES (:link_id , :short_url, 0) PDOException: SQLSTATE[22001]: String data, right truncated: 1406 Data too long for column 'short_url' at row 1

(For me this would cause the crawler to bail).

This can happen when the original shortlink triggers a series of redirects/expansions -- some end up being astoundingly long.
I would get these errors at times for the url, expanded_url, short_url, and error fields. (staggeringly, some of the URLs were over 255 chars).

I patched this for myself by checking the strlen of the relevant fields before doing an insert, and skipping the insert if the string was too long. However, that's not quite the right fix: it doesn't mark the original url as processed, so it will get re-checked when the crawler runs in the future.

Anyway, a better approach might be to catch the PDOException.
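To make the two mitigations above concrete, here's a minimal sketch (in Python rather than ThinkUp's PHP): the length guard described in the patch, and the catch-and-mark-processed approach suggested instead. All names here (`MAX_SHORT_URL_LEN`, `mark_processed`, etc.) are hypothetical illustrations, not ThinkUp's actual API.

```python
# Hypothetical sketch of the two mitigation strategies discussed above.
# Names and the column size are assumptions, not ThinkUp internals.

MAX_SHORT_URL_LEN = 100  # assumed size of the short_url column


def should_insert(short_url: str, max_len: int = MAX_SHORT_URL_LEN) -> bool:
    """Length guard: skip the INSERT when the value would not fit.
    Downside: by itself this never marks the URL as processed."""
    return len(short_url) <= max_len


def expand_one(short_url: str, insert, mark_processed) -> None:
    """Catch-and-continue: attempt the insert, swallow the database
    error (PDOException in ThinkUp's case), and always mark the URL
    processed so the crawler does not retry it on every run."""
    try:
        insert(short_url)
    except Exception:
        pass  # the row is lost, but the crawl can continue
    finally:
        mark_processed(short_url)
```

The key difference is the `finally` clause: even when the insert fails, the URL is recorded as handled, so the crawler won't bail or re-check it forever.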

ThinkUp LLC member

Crud. There's a unique key on the *_links.url and post_key fields, which we can't have if we make it text. Gotta think this over... thanks Amy.


I got this issue too on my first crawl :(

Fresh install of 2.0-beta.8

For noobs just installing like me, I was able to delete the plugins/expandurls dir to disable the offending plugin for now


Wondered if there was any progress on this -- or at least a workaround? Haven't been able to run the crawler since May! Any help gratefully received..

Also - should there be a unique key on *_links.url? Does that mean I can't archive more than one tweet referring to the same link?

@ginatrapani ginatrapani referenced this issue Aug 11, 2013

Crawler #1663


Has this been resolved yet? I am getting it after install. Downloaded the latest version.


It looks like setting your link loopback value to zero in the ExpandURLs plugin configuration also avoids this issue.


Still an issue - still no updates since May. If I delete ExpandURLs, doesn't that mean the t.co links will be stored instead, and will they be futureproof?

Wondering if this is connected with Twitter changing their t.co link length -- wasn't that around May?

@tingham 'link loopback value to zero in the ExpandURLs plugin configuration' - I can't find where to set this? And does that mean it still decodes?

ThinkUp LLC member

@csc4 What @tingham was referencing was that if you go to /account/?p=expandurls under your ThinkUp install, you'll see the config for the Expand URLs plugin. (You'll need to be logged in as an Admin, and click "Show Settings".) The first field is "Links to expand per crawl:" and you'll want to set that value to zero.

Doing this will not expand the URLs, but will allow the crawl to continue. Once this is fixed, ThinkUp will properly go back and expand the links.


Sorry about the lack of response there - I got super busy and completely forgot about it. Thanks Anil!

ThinkUp LLC member

Closed in 24375b7 (mistyped issue number in commit message)
