A query like this:

SELECT fp.particleId
FROM MDR1.FOFParticles AS fp
WHERE fp.fofId = 85000000000
LIMIT 10

-- The query plan used to run this query: --------------------------------------------------
-- CALL paquExec('SELECT fp.particleId AS fp.particleId FROM MDR1.FOFParticles AS fp WHERE ( fp.fofId = 85000000000 ) LIMIT 0,10', 'aggregation_tmp_86794282')
-- USE spider_tmp_shard
-- SET @i=0
-- CREATE TABLE multidark_user_kristin.test-simple ENGINE=MyISAM SELECT @i:=@i+1 AS row_id, distinct_res_table.* FROM ( SELECT DISTINCT fp.particleId FROM aggregation_tmp_86794282 LIMIT 0,10 ) AS distinct_res_table
-- CALL paquDropTmp('aggregation_tmp_86794282')
gives the error:
Table 'aggregation_tmp_86794282' already exists
I can't spot a mistake in the query plan, however.
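The plan itself may indeed be correct; the failure mode is consistent with the same physical node being contacted twice, so the per-query temp table gets created a second time on a node where it already exists. A minimal sketch of that collision, using sqlite3 as a stand-in for one worker node (the `paqu_exec` helper is hypothetical, not PaQu's actual API):

```python
import sqlite3

node = sqlite3.connect(":memory:")  # stand-in for a single worker node

def paqu_exec(conn, tmp_table):
    # Hypothetical dispatch step: create the per-query aggregation
    # temp table on the worker before filling it with partial results.
    conn.execute(f"CREATE TABLE {tmp_table} (particleId INTEGER)")

paqu_exec(node, "aggregation_tmp_86794282")      # first dispatch: fine
try:
    paqu_exec(node, "aggregation_tmp_86794282")  # same node contacted again
except sqlite3.OperationalError as e:
    print(e)  # reports that the table already exists
```

The second `CREATE` fails exactly like the reported error, even though each individual dispatch is well-formed.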
Retrieve the IP lists properly from the spider tables. The problem is that if multiple tables point to different nodes, all of those nodes are asked to send results; when the IPs differ but the nodes are in fact identical, the same node is contacted more than once, and the problem above arises.
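The fix described above amounts to deduplicating the dispatch targets by physical node identity rather than by the raw IP strings taken from the spider tables. A hedged sketch of that idea (the function and its inputs are illustrative, not PaQu code):

```python
import socket

def dedup_nodes(endpoints):
    """endpoints: list of (host_or_ip, port) collected from spider tables.
    Returns one entry per physical node, resolving hostnames so that two
    different strings naming the same machine collapse to one target."""
    seen, unique = set(), []
    for host, port in endpoints:
        try:
            canonical = socket.gethostbyname(host)  # resolve alias -> IPv4
        except OSError:
            canonical = host  # unresolvable: fall back to the raw string
        key = (canonical, port)
        if key not in seen:
            seen.add(key)
            unique.append((host, port))
    return unique

# Two spider tables list the same node under different names:
print(dedup_nodes([("127.0.0.1", 3306), ("localhost", 3306)]))
```

With the duplicate collapsed, each node receives the `CREATE TABLE aggregation_tmp_...` exactly once.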