
In case of a very long string in a text column, FDW fails on the row without any error message #181

Open
michaj01 opened this issue Mar 12, 2019 · 6 comments

Comments

@michaj01

michaj01 commented Mar 12, 2019

I have a problem with mysql_fdw handling long text in a text column. When a row contains very long text, the FDW does not return all rows from the table; it returns only the rows before the offending row, and it emits no error message or warning.

My docker-compose.yml

version: '3'
services:
   postgre:
      image: "geographica/postgis-fdw-mysql"
      ports: 
      - "5432:5432"
      volumes:
      - ..:/scripts
      environment:
         POSTGRES_PASSWD: "my-secret-pw"
   mysql:
      image: "mariadb:10.1"
      restart: "always"
      environment:
         MYSQL_ROOT_PASSWORD: "my-secret-pw"
      ports: 
      - "3306:3306"

Data in MySQL

CREATE TABLE `tab` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `data` longtext NOT NULL,
   PRIMARY KEY (id)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8;

INSERT INTO `tab` (`data`) SELECT repeat('abcdefgh ', 7500);
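For context on why this particular value triggers the bug: 'abcdefgh ' is 9 characters, so the inserted string is 67,500 characters, past the 65,535-character TEXT limit. A quick check, a sketch using the standard repeat() and length() functions:

```sql
-- 9 characters repeated 7500 times = 67500 characters,
-- which exceeds the 65535-character maximum of a MySQL TEXT column
-- (and, as discussed below, the FDW's MAXDATALEN buffer).
SELECT length(repeat('abcdefgh ', 7500)) AS chars;
```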

In PostgreSQL

BEGIN;

CREATE EXTENSION mysql_fdw;

CREATE SERVER mysql_server
     FOREIGN DATA WRAPPER mysql_fdw
     OPTIONS (host 'mysql', port '3306');

CREATE USER MAPPING FOR postgres
SERVER mysql_server
OPTIONS (username 'root', password 'my-secret-pw');


CREATE SCHEMA foreign_:db;

IMPORT FOREIGN SCHEMA :db 
FROM SERVER mysql_server INTO foreign_:db;

COMMIT;

Problem

test=# SELECT id FROM foreign_test.tab ;
┌────┐
│ id │
╞════╡
│  1 │
└────┘
(1 row)

test=# SELECT id, data FROM foreign_test.tab ;
┌────┬──────┐
│ id │ data │
╞════╪══════╡
└────┴──────┘
(0 rows)

test=# EXPLAIN ANALYZE SELECT id FROM foreign_test.tab ;
┌────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                               QUERY PLAN                                               │
╞════════════════════════════════════════════════════════════════════════════════════════════════════════╡
│ Foreign Scan on tab  (cost=25.00..1025.00 rows=1000 width=4) (actual time=0.059..0.060 rows=1 loops=1) │
│ Planning time: 0.157 ms                                                                                │
│ Execution time: 0.416 ms                                                                               │
└────────────────────────────────────────────────────────────────────────────────────────────────────────┘
(3 rows)

test=# EXPLAIN ANALYZE SELECT id, data FROM foreign_test.tab ;
┌─────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                               QUERY PLAN                                                │
╞═════════════════════════════════════════════════════════════════════════════════════════════════════════╡
│ Foreign Scan on tab  (cost=25.00..1025.00 rows=1000 width=36) (actual time=0.343..0.343 rows=0 loops=1) │
│ Planning time: 0.158 ms                                                                                 │
│ Execution time: 1.073 ms                                                                                │
└─────────────────────────────────────────────────────────────────────────────────────────────────────────┘
(3 rows)


@michaj01
Author

If I create the foreign table column with bytea type instead of text, it works (though I must use the encode function to obtain the data). Maybe a CASE branch for text in mysql_bind_sql_var (mysql_query.c) can help (but I'm not able to write the correct code).
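A sketch of the workaround described above, assuming a hand-created foreign table with the data column declared as bytea (the dbname option value and table name here are taken from the example, and convert_from is one way to turn the bytes back into text):

```sql
-- Hypothetical foreign table mapping `data` as bytea instead of text,
-- to sidestep the silent truncation of long TEXT values.
CREATE FOREIGN TABLE foreign_test.tab_bytea (
    id   int,
    data bytea
) SERVER mysql_server OPTIONS (dbname 'test', table_name 'tab');

-- Decode the bytes back to text on the PostgreSQL side.
SELECT id, convert_from(data, 'UTF8') AS data
FROM foreign_test.tab_bytea;
```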

@michaj01
Author

The problem is in MAXDATALEN, which is sized for 65535 characters. That is the maximum length of a TEXT column, but if I add more characters, MySQL changes the type to longtext, and longtext is not handled in the FDW. A simple solution is to increase MAXDATALEN; alternatively, a CASE branch for the length of the TEXT value could be added in mysql_bind_result, as is done for the BYTEA type.

I hotfixed mysql_query.c like this

450a451,457
>                       case TEXTOID:
>                                       mbind->buffer_type = MYSQL_TYPE_VAR_STRING;
>                                       column->value = (Datum) palloc0(MAX_BLOB_WIDTH);
>                                       mbind->buffer = (char *) column->value;
>                                       mbind->buffer_length = MAX_BLOB_WIDTH;
>                                       break;
>

@renlm

renlm commented Feb 8, 2020

I have the same problem as you

@gracieliao

(quoting @michaj01's MAXDATALEN comment and hotfix above)

I updated mysql_query.c with your method, but it still doesn't work. Is there something else I should do to activate the change in mysql_query.c?
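In case it helps: editing mysql_query.c alone has no effect, because the extension is a compiled shared library. The usual sequence is to rebuild and reinstall it (make && make install in the mysql_fdw source tree, with pg_config on the PATH), then start a fresh session so the new .so is loaded. For a C-only change a new session is often enough; recreating the extension is the belt-and-braces route, roughly:

```sql
-- After `make && make install`, in a fresh psql session:
-- CASCADE also drops the dependent server, user mapping,
-- and foreign tables, so they must be recreated afterwards.
DROP EXTENSION mysql_fdw CASCADE;
CREATE EXTENSION mysql_fdw;
```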

mhw added a commit to mhw/mysql_fdw that referenced this issue Jul 16, 2020
@mhw

mhw commented Jul 16, 2020

I've run into the same issue on a system I'm working on, and found that the addition above resolves the problem. My change is here in my fork. I'll see how it performs over the next few days, as I've got it running on a production site.

mhw added a commit to mhw/mysql_fdw that referenced this issue Sep 9, 2020
mhw added a commit to mhw/mysql_fdw that referenced this issue Oct 14, 2020
@slava-pagerduty

slava-pagerduty commented Oct 28, 2020

Nice. I just ran into the same issue with a mediumtext column 86K long. Thanks for the fix, it worked great!

mhw added a commit to mhw/mysql_fdw that referenced this issue Oct 26, 2021