Content-Type header [] is not supported #350
Comments
I answered my own question with a simple read of the documentation. I added:
|
For me it's not working. I am using elasticdump to migrate some data from 5.5.1.1 (1 node) to 5.6.2.1 (3 nodes).
|
I've noticed something: the error appears after the first batch of elements.
|
Update: This does not happen when I am using the file method, only when I use Elasticsearch instances as input and output. |
This is a required change for version 6; it might make sense to have this be a default. I don't think it has a negative impact when working with older versions. |
@cRUSHr2012 I have the same issues that you had. Did you figure out a solution? |
@cRUSHr2012 @fernandowiek I have the same error after 100 documents, but also when I'm using the file method. Is there any solution? I started using ES 6 from the beginning, so without any migration process. |
OK, I answered my own post. For me, adding --limit=2000 as a parameter works fine!
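For reference, a full invocation with that flag might look like the sketch below. The hostnames and index name are placeholders, not from this thread; --limit controls how many documents elasticdump moves per request.

```
# Hypothetical hosts and index; --limit sets the batch size per request.
elasticdump \
  --input=http://source-cluster:9200/my_index \
  --output=http://target-cluster:9200/my_index \
  --type=data \
  --limit=2000
```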
I'm getting: Fri, 24 Nov 2017 03:51:17 GMT | starting dump |
@Amit-A |
@JelmenGuhlke Could you share your elasticdump command here? I added --limit=2000 to mine and the same issue persists. |
@cRUSHr2012 Your 🐛 seems to be related to #344 |
@ferronrsmith I am not using Elasticsearch 6.x |
@cRUSHr2012 yea I know, but I was having the same issue and it fixed it for me. |
Just an added note, might help someone. |
Thanks for the tip.
I am using it with AWS ES6.0. Is it because of this https://www.elastic.co/guide/en/elasticsearch/reference/6.x/removal-of-types.html ? Edit: Apparently so, there should be only one mapping type. Thanks everyone. |
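To illustrate the mapping-types change linked above: in Elasticsearch 6.x an index can declare only a single mapping type, so a create-index body (sent with PUT to the index; the index, type, and field names here are made up for illustration) looks roughly like:

```
{
  "mappings": {
    "_doc": {
      "properties": {
        "title": { "type": "text" }
      }
    }
  }
}
```

Declaring a second type alongside `_doc` in the same index is what 6.x rejects.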
I am getting @maniankara's error (unrecognized token): I am sending both
|
@aliostad I get the same error as you by adding both --type=data and --header. I am using ES 6.0. Is there any solution? Thanks. |
Closing in favor of the above. |
when is the release? |
For those who get error using |
I tried using -H 'Content-Type: application/json', but I got the same error.
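A possible reason the curl-style flag fails: elasticdump passes custom headers through its own --headers option, which takes a JSON string rather than a `-H` pair. Something like the following might work (treat the exact shape as an assumption and check your installed version's --help):

```
# Assumes elasticdump accepts --headers as a JSON object of header values.
elasticdump \
  --input=http://localhost:9200/my_index \
  --output=dump.json \
  --type=data \
  --headers='{"Content-Type": "application/json"}'
```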
@morgango using your solution and '--limit=10000' I can only dump 10000 records, but my data has more than 10000. I need your help, thanks. |
I found the way to solve the [Error Emitted => {"root_cause":[{"type":"illegal_argument_exception".....] problem. |
Error: {"error":"Content-Type header [application/x-ndjson] is not supported","status":406}. |
@smileTou: any luck with figuring out how to dump more than 10000 records? I am facing the same issue. Also, I'm curious why the error I got goes away when I specify a limit of 10000 for the command. How does just increasing the number of records affect the way the request body is being parsed? The error I am referring to is: Mon, 29 Jan 2018 21:49:32 GMT | Error Emitted => {"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Failed to parse request body"}],"type":"illegal_argument_exception","reason":"Failed to parse request body","caused_by":{"type":"json_parse_exception","reason":"Unrecognized token 'DnF1ZXJ5VGhlbkZldGNoBQAAAAAAAIaxFjg5V1pqT1Z2UlVPYXlHMWNtUXltWXcAAAAAAACJWRZNSjMxS2dGUFNnZWZnV0RHbkw1WUNnAAAAAAAAjTIWYXhOVm5FamtRZ3FMTHdJblpIV19vQQAAAAAAAI0zFmF4TlZuRWprUWdxTEx3SW5aSFdfb0EAAAAAAACJWhZNSjMxS2dGUFNnZWZnV0RHbkw1WUNn': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@4651072e; line: 1, column: 457]"}},"status":400} The Elasticsearch version that I am using is 6.1.0.
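A note on what that error is actually saying: the base64-looking token in the message is a scroll ID, and Elasticsearch is trying (and failing) to parse it as JSON, which suggests the raw ID was sent as the request body. A scroll continuation request needs the ID wrapped in a JSON object and sent with Content-Type: application/json. A minimal sketch of the expected body shape (the function name is my own; the body matches the documented _search/scroll API):

```python
import json

def scroll_request_body(scroll_id, keep_alive="10m"):
    """Build the JSON body for continuing a scroll.

    Elasticsearch 6.x rejects a bare scroll ID string as the request
    body; the ID must be wrapped in a JSON object like this one.
    """
    return json.dumps({"scroll": keep_alive, "scroll_id": scroll_id})

# Truncated example ID, same shape as the one in the error above.
body = scroll_request_body("DnF1ZXJ5VGhlbkZldGNo...")
```

Sending `body` instead of the bare ID avoids the "Unrecognized token" parse failure.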
Elasticdump 3.3.2 onwards seems to solve it. This commit, specifically: |
This might happen because of a self-signed SSL certificate. If that is the case, none of the above solutions will help.
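For the self-signed-certificate case, one commonly suggested workaround is to disable Node's TLS certificate validation for the elasticdump process via an environment variable. This turns off certificate checking entirely, so it is only reasonable on trusted networks; the hosts below are placeholders:

```
# NODE_TLS_REJECT_UNAUTHORIZED=0 disables TLS certificate validation
# for this invocation only -- unsafe outside trusted networks.
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump \
  --input=https://source-cluster:9200/my_index \
  --output=https://target-cluster:9200/my_index \
  --type=data
```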
I am getting data from Elasticsearch into pandas. My document count is in the millions to billions. I noticed a weird thing: when I match the numbers between pandas and Kibana, they are not the same. Sometimes it's more in Kibana and sometimes in pandas for the same time period. Is this normal, or is it happening because of the volume of data I am parsing?
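One common cause of this kind of mismatch is that the two sides are not actually counting the same window (Kibana's relative time pickers resolve "now" at query time, and recently indexed documents may not yet be refreshed). A way to rule that out is to pin an explicit, fixed time range and compare counts for exactly that window on both sides. A sketch of building such a _count request body — the `@timestamp` field name is an assumption, adjust to your index:

```python
import json
from datetime import datetime, timezone

def count_query(start, end, time_field="@timestamp"):
    """Build an Elasticsearch _count request body pinned to an explicit
    half-open time window [start, end), so the identical window can be
    queried from both pandas-side code and Kibana.

    The time field name defaults to '@timestamp' -- an assumption.
    """
    return {
        "query": {
            "range": {
                time_field: {
                    "gte": start.isoformat(),
                    "lt": end.isoformat(),
                }
            }
        }
    }

start = datetime(2018, 1, 1, tzinfo=timezone.utc)
end = datetime(2018, 1, 2, tzinfo=timezone.utc)
body = json.dumps(count_query(start, end))
```

POSTing that body to `<index>/_count` (with Content-Type: application/json) and entering the same absolute range in Kibana should give matching numbers if the data itself agrees.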
In order to help us troubleshoot issues with this project, you must provide the following details:
I seem to be getting the error
Content-Type header [] is not supported
with the Elastic 6 beta. Is there a command line argument I can add to include the content type? This isn't a problem specific to elasticdump; it is a change in Elasticsearch 6.