Using es.mapping.exclude/include and still getting StrictDynamicMappingException on excluded fields #1015

Closed
JohnS89 opened this issue Jul 18, 2017 · 7 comments

Comments

@JohnS89

JohnS89 commented Jul 18, 2017

What kind of an issue is this?

  • [x] Bug report. If you’ve found a bug, please provide a code snippet or test to reproduce it below.
    The easier it is to track down the bug, the faster it is solved.
  • Feature Request. Start by telling us what problem you’re trying to solve.
    Often a solution already exists! Don’t send pull requests to implement new features without
    first getting our support. Sometimes we leave features out on purpose to keep the project small.

Issue description

Description:
I am trying to import data into ES using ES-Hadoop. I need the document ID to be two existing fields concatenated together with an underscore between them. I don't want the concatenated string stored as a field in the document, since the two source fields are already present.
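
The concatenation itself happens upstream (elided below under "... Various filters and joins ..."). For illustration only, a minimal Pig sketch of that kind of step is shown here; the relation names are taken from the script below, but the exact source fields are an assumption (the tuples in the worker logs suggest retnGlobalTransactionID plus a line-item sequence number joined by an underscore):

-- Illustrative sketch, not the original code: build the ID only for use with es.mapping.id.
-- Assumes documentID = retnGlobalTransactionID + '_' + retnTransactionLineItemSequenceNumber.
retn_details_without_voids = FOREACH retn_detail_with_void_data GENERATE
        CONCAT(CONCAT((chararray)retnGlobalTransactionID, '_'),
               (chararray)retnTransactionLineItemSequenceNumber) AS documentID,
        businessDayDate, retailStoreID, workstationID, transactionNumber, transactionDate,
        transactionLineItemSequenceNumber, retnRetailStoreID, retnWorkstationID, retnTransactionNumber,
        retnBusinessDayDate, retnDataSourceCode, retnGlobalTransactionID, retnTransactionLineItemSequenceNumber;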

Steps to reproduce

Code:

DEFINE EsStorage org.elasticsearch.hadoop.pig.EsStorage('es.nodes=${es_client_nodes}',
                'es.port=443'
                , 'es.write.operation=upsert'
                , 'es.mapping.exclude=documentID'
                , 'es.mapping.include=businessDayDate, retailStoreID, workstationID, transactionNumber, transactionDate, transactionLineItemSequenceNumber, retnRetailStoreID, retnWorkstationID, retnTransactionNumber, retnBusinessDayDate, retnDataSourceCode, retnGlobalTransactionID, retnTransactionLineItemSequenceNumber'
                , 'es.mapping.id=documentID'
                , 'es.http.timeout=1000m'
                , 'es.batch.write.retry.count=1'
                , 'es.batch.http.retries=1'
                ,'es.index.auto.create = true'
                ,'es.net.ssl= true'
                ,'es.nodes.wan.only=true'
                ,'es.net.http.auth.user=${es_user}'
                ,'es.net.http.auth.pass=${es_key}'
                ,'es.net.ssl.protocol=TLS'
                );

... Various filters and joins ...

retn_items = FOREACH retn_details_without_voids GENERATE
                        documentID as documentID
                        , businessDayDate as businessDayDate
                        , retailStoreID as retailStoreID
                        , workstationID as workstationID
                        , transactionNumber as transactionNumber
                        , transactionDate as transactionDate
                        , transactionLineItemSequenceNumber as transactionLineItemSequenceNumber
                        , retnRetailStoreID as retnRetailStoreID
                        , retnWorkstationID as retnWorkstationID
                        , retnTransactionNumber as retnTransactionNumber
                        , retnBusinessDayDate as retnBusinessDayDate
                        , retnDataSourceCode as retnDataSourceCode
                        , retnGlobalTransactionID as retnGlobalTransactionID
                        , retnTransactionLineItemSequenceNumber as retnTransactionLineItemSequenceNumber;

STORE retn_items INTO '$tran_index/retnitem' USING EsStorage;

Stack trace:

Driver:

6825 [main] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder  - For vertex - scope-370: parallelism=70, memory=7296, java opts=-Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx5836m -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=TRACE,CLA
17/07/18 18:44:41 INFO tez.TezDagBuilder: For vertex - scope-370: parallelism=70, memory=7296, java opts=-Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx5836m -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=TRACE,CLA
6825 [main] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder  - Processing aliases: retn_detail_with_void_data,retn_details_without_voids,retn_items
17/07/18 18:44:41 INFO tez.TezDagBuilder: Processing aliases: retn_detail_with_void_data,retn_details_without_voids,retn_items
6825 [main] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder  - Detailed locations: retn_detail_with_void_data[477,29],retn_detail_with_void_data[-1,-1],retn_details_without_voids[478,29],retn_items[480,13]
17/07/18 18:44:41 INFO tez.TezDagBuilder: Detailed locations: retn_detail_with_void_data[477,29],retn_detail_with_void_data[-1,-1],retn_details_without_voids[478,29],retn_items[480,13]
6826 [main] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder  - Pig features in the vertex: HASH_JOIN
17/07/18 18:44:41 INFO tez.TezDagBuilder: Pig features in the vertex: HASH_JOIN
17/07/18 18:44:41 WARN mr.EsOutputFormat: Speculative execution enabled for reducer - consider disabling it to prevent data corruption
6878 [main] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezJobCompiler  - Total estimated parallelism is 89
17/07/18 18:44:41 INFO tez.TezJobCompiler: Total estimated parallelism is 89
6987 [PigTezLauncher-0] INFO  org.apache.pig.tools.pigstats.tez.TezScriptState  - Pig script settings are added to the job
17/07/18 18:44:41 INFO tez.TezScriptState: Pig script settings are added to the job
17/07/18 18:44:41 INFO client.TezClient: Tez Client Version: [ component=tez-api, version=0.8.4, revision=154f1ef53e2d6ed126b0957d7995e0a610947608, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2017-06-01T05:53:19Z ]
17/07/18 18:44:41 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-18-109-126.us-west-2.compute.internal:8188/ws/v1/timeline/
17/07/18 18:44:41 INFO client.RMProxy: Connecting to ResourceManager at ip-172-18-109-126.us-west-2.compute.internal/172.18.109.126:8032
17/07/18 18:44:41 INFO client.TezClient: Using org.apache.tez.dag.history.ats.acls.ATSHistoryACLPolicyManager to manage Timeline ACLs
17/07/18 18:44:41 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-18-109-126.us-west-2.compute.internal:8188/ws/v1/timeline/
17/07/18 18:44:41 INFO client.TezClient: Session mode. Starting session.
17/07/18 18:44:41 INFO client.TezClientUtils: Using tez.lib.uris value from configuration: hdfs:///apps/tez/tez.tar.gz
17/07/18 18:44:41 INFO client.TezClientUtils: Using tez.lib.uris.classpath value from configuration: null
17/07/18 18:44:42 INFO client.TezClient: Tez system stage directory hdfs://ip-172-18-109-126.us-west-2.compute.internal:8020/tmp/temp-1469282549/.tez/application_1500403084622_0001 doesn't exist and is created
17/07/18 18:44:42 INFO acls.ATSHistoryACLPolicyManager: Created Timeline Domain for History ACLs, domainId=Tez_ATS_application_1500403084622_0001
17/07/18 18:44:42 INFO Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
17/07/18 18:44:42 INFO impl.YarnClientImpl: Submitted application application_1500403084622_0001
17/07/18 18:44:42 INFO client.TezClient: The url to track the Tez Session: http://ip-172-18-109-126.us-west-2.compute.internal:20888/proxy/application_1500403084622_0001/
14588 [PigTezLauncher-0] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezJob  - Submitting DAG PigLatin:retnItems2.pig-0_scope-0
17/07/18 18:44:48 INFO tez.TezJob: Submitting DAG PigLatin:retnItems2.pig-0_scope-0
17/07/18 18:44:48 INFO client.TezClient: Submitting dag to TezSession, sessionName=PigLatin:retnItems2.pig, applicationId=application_1500403084622_0001, dagName=PigLatin:retnItems2.pig-0_scope-0, callerContext={ context=PIG, callerType=PIG_SCRIPT_ID, callerId=PIG-retnItems2.pig-6f4dd558-53be-4c9d-bbc5-941d2aee240b }
17/07/18 18:44:49 INFO client.TezClient: Submitted dag to TezSession, sessionName=PigLatin:retnItems2.pig, applicationId=application_1500403084622_0001, dagName=PigLatin:retnItems2.pig-0_scope-0
17/07/18 18:44:49 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-18-109-126.us-west-2.compute.internal:8188/ws/v1/timeline/
17/07/18 18:44:49 INFO client.RMProxy: Connecting to ResourceManager at ip-172-18-109-126.us-west-2.compute.internal/172.18.109.126:8032
15505 [PigTezLauncher-0] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezJob  - Submitted DAG PigLatin:retnItems2.pig-0_scope-0. Application id: application_1500403084622_0001
17/07/18 18:44:49 INFO tez.TezJob: Submitted DAG PigLatin:retnItems2.pig-0_scope-0. Application id: application_1500403084622_0001
15940 [main] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezLauncher  - HadoopJobId: job_1500403084622_0001
17/07/18 18:44:50 INFO tez.TezLauncher: HadoopJobId: job_1500403084622_0001
16512 [Timer-0] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezJob  - DAG Status: status=RUNNING, progress=TotalTasks: 74 Succeeded: 0 Running: 0 Failed: 0 Killed: 0, diagnostics=, counters=null
17/07/18 18:44:50 INFO tez.TezJob: DAG Status: status=RUNNING, progress=TotalTasks: 74 Succeeded: 0 Running: 0 Failed: 0 Killed: 0, diagnostics=, counters=null
36512 [Timer-0] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezJob  - DAG Status: status=RUNNING, progress=TotalTasks: 6 Succeeded: 5 Running: 1 Failed: 0 Killed: 0 FailedTaskAttempts: 2, diagnostics=, counters=null
17/07/18 18:45:10 INFO tez.TezJob: DAG Status: status=RUNNING, progress=TotalTasks: 6 Succeeded: 5 Running: 1 Failed: 0 Killed: 0 FailedTaskAttempts: 2, diagnostics=, counters=null
17/07/18 18:45:15 INFO counters.Limits: Counter limits initialized with parameters:  GROUP_NAME_MAX=256, MAX_GROUPS=500, COUNTER_NAME_MAX=64, MAX_COUNTERS=120
40670 [PigTezLauncher-0] INFO  org.apache.pig.backend.hadoop.executionengine.tez.TezJob  - DAG Status: status=FAILED, progress=TotalTasks: 6 Succeeded: 5 Running: 0 Failed: 1 Killed: 0 FailedTaskAttempts: 4, diagnostics=Vertex failed, vertexName=scope-370, vertexId=vertex_1500403084622_0001_1_05, diagnostics=[Task failed, taskId=task_1500403084622_0001_1_05_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1500403084622_0001_1_05_000000_0:org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: Found unrecoverable error [es-elb.ertm-e3.aws.cloud.nordstrom.net:443] returned Bad Request(400) - [StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]]; Bailing out..
        at org.elasticsearch.hadoop.rest.RestClient.retryFailedEntries(RestClient.java:207)
        at org.elasticsearch.hadoop.rest.RestClient.bulk(RestClient.java:170)
        at org.elasticsearch.hadoop.rest.RestRepository.tryFlush(RestRepository.java:225)
        at org.elasticsearch.hadoop.rest.RestRepository.flush(RestRepository.java:248)
        at org.elasticsearch.hadoop.rest.RestRepository.close(RestRepository.java:267)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.doClose(EsOutputFormat.java:214)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.close(EsOutputFormat.java:196)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.close(PigOutputFormat.java:154)
        at org.apache.tez.mapreduce.output.MROutput.flush(MROutput.java:546)
        at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.run(PigProcessor.java:272)
        at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
        at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
        at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
        at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
        at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)
], TaskAttempt 1 failed, info=[Error: Error while running task ( failure ) : attempt_1500403084622_0001_1_05_000000_1:org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: Found unrecoverable error [es-elb.ertm-e3.aws.cloud.nordstrom.net:443] returned Bad Request(400) - [StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]]; Bailing out..
        at org.elasticsearch.hadoop.rest.RestClient.retryFailedEntries(RestClient.java:207)
        at org.elasticsearch.hadoop.rest.RestClient.bulk(RestClient.java:170)
        at org.elasticsearch.hadoop.rest.RestRepository.tryFlush(RestRepository.java:225)
        at org.elasticsearch.hadoop.rest.RestRepository.flush(RestRepository.java:248)
        at org.elasticsearch.hadoop.rest.RestRepository.close(RestRepository.java:267)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.doClose(EsOutputFormat.java:214)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.close(EsOutputFormat.java:196)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.close(PigOutputFormat.java:154)
        at org.apache.tez.mapreduce.output.MROutput.flush(MROutput.java:546)
        at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.run(PigProcessor.java:272)
        at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
        at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
        at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
        at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
        at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)
], TaskAttempt 2 failed, info=[Error: Error while running task ( failure ) : attempt_1500403084622_0001_1_05_000000_2:org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: Found unrecoverable error [es-elb.ertm-e3.aws.cloud.nordstrom.net:443] returned Bad Request(400) - [StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]]; Bailing out..

Worker:

2017-07-18 19:24:45,801 [DEBUG] [TezChild] |rest.NetworkClient|: Opening (pinned) network client to es-elb.x-e3.aws.cloud.example.net:443
2017-07-18 19:24:45,801 [DEBUG] [TezChild] |params.DefaultHttpParams|: Set parameter http.method.retry-handler = org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport$1@169ef9f2
2017-07-18 19:24:45,801 [DEBUG] [TezChild] |params.DefaultHttpParams|: Set parameter http.connection-manager.timeout = 60000000
2017-07-18 19:24:45,801 [DEBUG] [TezChild] |params.DefaultHttpParams|: Set parameter http.socket.timeout = 60000000
2017-07-18 19:24:45,801 [DEBUG] [TezChild] |params.DefaultHttpParams|: Set parameter http.protocol.credential-charset = UTF-8
2017-07-18 19:24:45,801 [DEBUG] [TezChild] |params.DefaultHttpParams|: Set parameter http.protocol.content-charset = UTF-8
2017-07-18 19:24:45,801 [DEBUG] [TezChild] |commonshttp.CommonsHttpTransport|: SSL Connection enabled
2017-07-18 19:24:45,801 [TRACE] [TezChild] |httpclient.HttpState|: enter HttpState.setCredentials(AuthScope, Credentials)
2017-07-18 19:24:45,802 [INFO] [TezChild] |commonshttp.CommonsHttpTransport|: Using detected HTTP Auth credentials...
2017-07-18 19:24:45,802 [DEBUG] [TezChild] |params.DefaultHttpParams|: Set parameter http.authentication.preemptive = true
2017-07-18 19:24:45,802 [DEBUG] [TezChild] |params.DefaultHttpParams|: Set parameter http.tcp.nodelay = true
2017-07-18 19:24:45,802 [TRACE] [TezChild] |commonshttp.CommonsHttpTransport|: Opening HTTP transport to es-elb.x-e3.aws.cloud.example.net:443
2017-07-18 19:24:45,803 [TRACE] [TezChild] |mr.EsOutputFormat|: Starting heartbeat for task_15004030846225_0004_r_000000
2017-07-18 19:24:45,809 [TRACE] [TezChild] |bulk.AbstractBulkFactory|: Instantiated value writer [org.elasticsearch.hadoop.pig.PigValueWriter@9723224]
2017-07-18 19:24:45,809 [TRACE] [TezChild] |bulk.AbstractBulkFactory|: Instantiated id extractor [PigFieldExtractor for field [[documentID]]]
2017-07-18 19:24:45,828 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=172, length=140
2017-07-18 19:24:45,828 [TRACE] [TezChild] |pig.EsStorage|: Writing out tuple (1110108647_1,2017-01-12T16:00:00.000-0800,0808,8567,4082,2017-01-07T16:00:00.000-0800,1,808,8568,6552,2017-01-16T16:00:00.000-0800,RETN,1110108647,1)
2017-07-18 19:24:45,828 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=42, length=49
2017-07-18 19:24:45,829 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=22, length=35
2017-07-18 19:24:45,829 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=168, length=135
2017-07-18 19:24:45,829 [TRACE] [TezChild] |pig.EsStorage|: Writing out tuple (1110122661_1,2017-01-08T16:00:00.000-0800,4,4316,1371,2017-01-16T16:00:00.000-0800,1,4,4316,1410,2017-01-23T16:00:00.000-0800,RETN,1110122661,1)
2017-07-18 19:24:45,829 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=22, length=35
2017-07-18 19:24:45,830 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=42, length=50
2017-07-18 19:24:45,830 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=62, length=59
2017-07-18 19:24:45,830 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=22, length=35
2017-07-18 19:24:45,830 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=166, length=137
2017-07-18 19:24:45,830 [TRACE] [TezChild] |pig.EsStorage|: Writing out tuple (1110327539_1,2017-01-09T16:00:00.000-0800,48,1201,7,2017-01-18T16:00:00.000-0800,1,48,1206,584,2017-02-20T16:00:00.000-0800,RETN,1110327539,1)
2017-07-18 19:24:45,830 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=166, length=136
2017-07-18 19:24:45,831 [TRACE] [TezChild] |pig.EsStorage|: Writing out tuple (1110327545_1,2017-01-09T16:00:00.000-0800,48,1201,10,2017-01-18T16:00:00.000-0800,3,48,1221,11,2017-02-20T16:00:00.000-0800,RETN,1110327545,1)
2017-07-18 19:24:45,831 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=165, length=138
2017-07-18 19:24:45,831 [TRACE] [TezChild] |pig.EsStorage|: Writing out tuple (1110327546_1,2017-01-09T16:00:00.000-0800,48,1201,9,2017-01-18T16:00:00.000-0800,3,48,1221,12,2017-02-20T16:00:00.000-0800,RETN,1110327546,1)
2017-07-18 19:24:45,831 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=42, length=50
2017-07-18 19:24:45,832 [DEBUG] [TezChild] |impl.IFile|: currentKeyLength=-1, currentValueLength=-1, bytesRead=22, length=35
2017-07-18 19:24:45,832 [DEBUG] [IPC Client (1754444726) connection to ip-XXX-XXX-XXX-XXX.us-west-2.compute.internal/XXX.XXX.XXX.XXX:43380 from application_1500403084622_0004] |security.SaslRpcClient|: reading next wrapped RPC packet
2017-07-18 19:24:45,832 [DEBUG] [IPC Parameter Sending Thread #0] |ipc.Client|: IPC Client (1754444726) connection to ip-XXX-XXX-XXX-XXX.us-west-2.compute.internal/XXX.XXX.XXX.XXX:43380 from application_1500403084622_0004 sending #8
2017-07-18 19:24:45,832 [DEBUG] [IPC Parameter Sending Thread #0] |security.SaslRpcClient|: wrapping token of length:232
2017-07-18 19:24:45,833 [DEBUG] [IPC Client (1754444726) connection to ip-XXX-XXX-XXX-XXX.us-west-2.compute.internal/XXX.XXX.XXX.XXX:43380 from application_1500403084622_0004] |security.SaslRpcClient|: unwrapping token of length:62
2017-07-18 19:24:45,834 [DEBUG] [IPC Client (1754444726) connection to ip-XXX-XXX-XXX-XXX.us-west-2.compute.internal/XXX.XXX.XXX.XXX:43380 from application_1500403084622_0004] |ipc.Client|: IPC Client (1754444726) connection to ip-XXX-XXX-XXX-XXX.us-west-2.compute.internal/XXX.XXX.XXX.XXX:43380 from application_1500403084622_0004 got value #8
2017-07-18 19:24:45,834 [DEBUG] [TezChild] |ipc.RPC|: Call: canCommit 2
2017-07-18 19:24:45,834 [TRACE] [TezChild] |mr.EsOutputFormat|: Closing RecordWriter [null][null]
2017-07-18 19:24:45,834 [TRACE] [TezChild] |mr.EsOutputFormat|: Stopping heartbeat for task_15004030846225_0004_r_000000
2017-07-18 19:24:45,834 [DEBUG] [TezChild] |rest.RestRepository|: Closing repository and connection to Elasticsearch ...
2017-07-18 19:24:45,834 [DEBUG] [TezChild] |rest.RestRepository|: Sending batch of [3236] bytes/[6] entries
2017-07-18 19:24:45,836 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.clearRequestBody()
2017-07-18 19:24:45,836 [TRACE] [TezChild] |commonshttp.CommonsHttpTransport|: Tx [PUT]@[es-elb.x-e3.aws.cloud.example.net:443][retnitems-e3-2017-01/retnitem/_bulk] w/ payload [{"update":{"_id":"1110108646_1"}}
{"doc_as_upsert":true,"doc":{"documentID":"1110108646_1",---sensitive fields removed----, "retnTransactionLineItemSequenceNumber":"1"}}
{"update":{"_id":"1110108647_1"}}
{"doc_as_upsert":true,"doc":{"documentID":"1110108647_1",---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}
{"update":{"_id":"1110122661_1"}}
{"doc_as_upsert":true,"doc":{"documentID":"1110122661_1",---sensitive fields removed----"retnTransactionLineItemSequenceNumber":"1"}}
{"update":{"_id":"1110327539_1"}}
{"doc_as_upsert":true,"doc":{"documentID":"1110327539_1",---sensitive fields removed----"retnTransactionLineItemSequenceNumber":"1"}}
{"update":{"_id":"1110327545_1"}}
{"doc_as_upsert":true,"doc":{"documentID":"1110327545_1",---sensitive fields removed----"retnTransactionLineItemSequenceNumber":"1"}}
{"update":{"_id":"1110327546_1"}}
{"doc_as_upsert":true,"doc":{"documentID":"1110327546_1",---sensitive fields removed----"retnTransactionLineItemSequenceNumber":"1"}}
]
2017-07-18 19:24:45,836 [TRACE] [TezChild] |httpclient.HttpClient|: enter HttpClient.executeMethod(HttpMethod)
2017-07-18 19:24:45,836 [TRACE] [TezChild] |httpclient.HttpClient|: enter HttpClient.executeMethod(HostConfiguration,HttpMethod,HttpState)
2017-07-18 19:24:45,836 [DEBUG] [TezChild] |httpclient.HttpMethodDirector|: Preemptively sending default basic credentials
2017-07-18 19:24:45,836 [DEBUG] [TezChild] |httpclient.HttpMethodDirector|: Authenticating with BASIC <any realm>@es-elb.x-e3.aws.cloud.example.net:443
2017-07-18 19:24:45,836 [TRACE] [TezChild] |httpclient.HttpState|: enter HttpState.getCredentials(AuthScope)
2017-07-18 19:24:45,836 [TRACE] [TezChild] |auth.BasicScheme|: enter BasicScheme.authenticate(Credentials, HttpMethod)
2017-07-18 19:24:45,836 [TRACE] [TezChild] |auth.BasicScheme|: enter BasicScheme.authenticate(UsernamePasswordCredentials, String)
2017-07-18 19:24:45,836 [TRACE] [TezChild] |httpclient.HttpMethodBase|: HttpMethodBase.addRequestHeader(Header)
2017-07-18 19:24:45,836 [TRACE] [TezChild] |httpclient.HttpMethodDirector|: Attempt number 1 to process request
2017-07-18 19:24:45,836 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.open()
2017-07-18 19:24:45,836 [DEBUG] [TezChild] |httpclient.HttpConnection|: Open connection to es-elb.x-e3.aws.cloud.example.net:443
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.execute(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.writeRequest(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.writeRequestLine(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.generateRequestLine(HttpConnection, String, String, String, String)
2017-07-18 19:24:45,840 [DEBUG] [TezChild] |wire.header|: >> "PUT /retnitems-e3-2017-01/retnitem/_bulk HTTP/1.1[\r][\n]"
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.print(String)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[])
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[], int, int)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.writeRequestHeaders(HttpState,HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.addRequestHeaders(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |methods.ExpectContinueMethod|: enter ExpectContinueMethod.addRequestHeaders(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.addRequestHeaders(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.addUserAgentRequestHeaders(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.addHostRequestHeader(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [DEBUG] [TezChild] |httpclient.HttpMethodBase|: Adding Host request header
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.addCookieRequestHeader(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpState|: enter HttpState.getCookies()
2017-07-18 19:24:45,840 [TRACE] [TezChild] |cookie.CookieSpec|: enter CookieSpecBase.match(String, int, String, boolean, Cookie[])
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.addProxyConnectionHeader(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.addContentLengthRequestHeader(HttpState, HttpConnection)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.getRequestContentLength()
2017-07-18 19:24:45,840 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.hasRequestContent()
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpMethodBase|: HttpMethodBase.addRequestHeader(Header)
2017-07-18 19:24:45,840 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.renerateRequestBody()
2017-07-18 19:24:45,840 [DEBUG] [TezChild] |wire.header|: >> "Authorization: Basic ZXNfdXNlcjpFcnRtX2VzX3VzM3I=[\r][\n]"
2017-07-18 19:24:45,840 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.print(String)
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[])
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[], int, int)
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.header|: >> "User-Agent: Jakarta Commons-HttpClient/3.1[\r][\n]"
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.print(String)
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[])
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[], int, int)
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.header|: >> "Host: es-elb.x-e3.aws.cloud.example.net[\r][\n]"
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.print(String)
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[])
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[], int, int)
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.header|: >> "Content-Length: 3236[\r][\n]"
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.print(String)
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[])
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[], int, int)
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.header|: >> "Content-Type: application/json; charset=UTF-8[\r][\n]"
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.print(String)
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[])
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[], int, int)
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.writeLine()
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.writeLine()
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[])
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.write(byte[], int, int)
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.header|: >> "[\r][\n]"
2017-07-18 19:24:45,841 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.writeRequestBody(HttpState, HttpConnection)
2017-07-18 19:24:45,841 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.hasRequestContent()
2017-07-18 19:24:45,841 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.getRequestContentLength()
2017-07-18 19:24:45,841 [TRACE] [TezChild] |methods.EntityEnclosingMethod|: enter EntityEnclosingMethod.hasRequestContent()
2017-07-18 19:24:45,841 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.getRequestOutputStream()
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.content|: >> "{"update":{"_id":"1110108646_1"}}[\n]"
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.content|: >> "{"doc_as_upsert":true,"doc":{"documentID":"1110108646_1",---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}[\n]"
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.content|: >> "{"update":{"_id":"1110108647_1"}}[\n]"
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.content|: >> "{"doc_as_upsert":true,"doc":{"documentID":"1110108647_1",---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}[\n]"
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.content|: >> "{"update":{"_id":"1110122661_1"}}[\n]"
2017-07-18 19:24:45,841 [DEBUG] [TezChild] |wire.content|: >> "{"doc_as_upsert":true,"doc":{"documentID":"1110122661_1",---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}[\n]"
2017-07-18 19:24:45,853 [DEBUG] [TezChild] |wire.content|: >> "{"update":{"_id":"1110327539_1"}}[\n]"
2017-07-18 19:24:45,854 [DEBUG] [TezChild] |wire.content|:---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}[\n]"
2017-07-18 19:24:45,854 [DEBUG] [TezChild] |wire.content|: >> "{"update":{"_id":"1110327545_1"}}[\n]"
2017-07-18 19:24:45,854 [DEBUG] [TezChild] |wire.content|: >> "{"doc_as_upsert":true,"doc":{"documentID":"1110327545_1",---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}[\n]"
2017-07-18 19:24:45,854 [DEBUG] [TezChild] |wire.content|: >> "{"update":{"_id":"1110327546_1"}}[\n]"
2017-07-18 19:24:45,854 [DEBUG] [TezChild] |wire.content|: >> "{"doc_as_upsert":true,"doc":{"documentID":"1110327546_1",---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}[\n]"
2017-07-18 19:24:45,855 [DEBUG] [TezChild] |methods.EntityEnclosingMethod|: Request body sent
2017-07-18 19:24:45,855 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.flushRequestOutputStream()
2017-07-18 19:24:45,855 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.readResponse(HttpState, HttpConnection)
2017-07-18 19:24:45,855 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.readStatusLine(HttpState, HttpConnection)
2017-07-18 19:24:45,855 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.readLine()
2017-07-18 19:24:45,855 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readLine(InputStream, String)
2017-07-18 19:24:45,855 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readRawLine()
2017-07-18 19:24:45,860 [DEBUG] [TezChild] |wire.header|: << "HTTP/1.1 200 OK[\r][\n]"
2017-07-18 19:24:45,860 [DEBUG] [TezChild] |wire.header|: << "HTTP/1.1 200 OK[\r][\n]"
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.readResponseHeaders(HttpState,HttpConnection)
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.getResponseInputStream()
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HeaderParser.parseHeaders(InputStream, String)
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readLine(InputStream, String)
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readRawLine()
2017-07-18 19:24:45,860 [DEBUG] [TezChild] |wire.header|: << "Content-Type: application/json; charset=UTF-8[\r][\n]"
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readLine(InputStream, String)
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readRawLine()
2017-07-18 19:24:45,860 [DEBUG] [TezChild] |wire.header|: << "Content-Length: 1426[\r][\n]"
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readLine(InputStream, String)
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readRawLine()
2017-07-18 19:24:45,860 [DEBUG] [TezChild] |wire.header|: << "Connection: keep-alive[\r][\n]"
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readLine(InputStream, String)
2017-07-18 19:24:45,860 [TRACE] [TezChild] |httpclient.HttpParser|: enter HttpParser.readRawLine()
2017-07-18 19:24:45,860 [DEBUG] [TezChild] |wire.header|: << "[\r][\n]"
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.processResponseHeaders(HttpState, HttpConnection)
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.processCookieHeaders(Header[], HttpState, HttpConnection)
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.readResponseBody(HttpState, HttpConnection)
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.readResponseBody(HttpConnection)
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.getResponseInputStream()
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter HttpMethodBase.canResponseHaveBody(int)
2017-07-18 19:24:45,861 [DEBUG] [TezChild] |httpclient.HttpMethodBase|: Buffering response body
2017-07-18 19:24:45,861 [DEBUG] [TezChild] |wire.content|: << "{"took":4,"errors":true,"items":[{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110108646_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110108647_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110122661_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110327539_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110327545_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110327546_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}}]}"
2017-07-18 19:24:45,861 [DEBUG] [TezChild] |httpclient.HttpMethodBase|: Should NOT close connection in response to directive: keep-alive
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.isResponseAvailable()
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.releaseConnection()
2017-07-18 19:24:45,861 [DEBUG] [TezChild] |httpclient.HttpConnection|: Releasing connection back to connection manager.
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HttpMethodBase|: enter getContentCharSet( Header contentheader )
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HeaderElement|: enter HeaderElement.parseElements(String)
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HeaderElement|: enter HeaderElement.parseElements(char[])
2017-07-18 19:24:45,861 [TRACE] [TezChild] |httpclient.HeaderElement|: enter HeaderElement.getParameterByName(String)
2017-07-18 19:24:45,861 [TRACE] [TezChild] |commonshttp.CommonsHttpTransport|: Rx @[XXX.XXX.XXX.XXX] [200-OK] [{"took":4,"errors":true,"items":[{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110108646_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110108647_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110122661_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110327539_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110327545_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}},{"update":{"_index":"retnitems-e3-2017-01","_type":"retnitem","_id":"1110327546_1","status":400,"error":"StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]"}}]}]
2017-07-18 19:24:45,861 [DEBUG] [TezChild] |httpclient.HttpMethodBase|: re-creating response stream from byte array
2017-07-18 19:24:45,861 [DEBUG] [TezChild] |httpclient.HttpMethodBase|: re-creating response stream from byte array
2017-07-18 19:24:45,871 [TRACE] [TezChild] |commonshttp.CommonsHttpTransport|: Closing HTTP transport to es-elb.x-e3.aws.cloud.example.net:443
2017-07-18 19:24:45,871 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.close()
2017-07-18 19:24:45,871 [TRACE] [TezChild] |httpclient.HttpConnection|: enter HttpConnection.closeSockedAndStreams()
2017-07-18 19:24:45,872 [ERROR] [TezChild] |runtime.PigProcessor|: Encountered exception while processing:
org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: Found unrecoverable error [es-elb.x-e3.aws.cloud.example.net:443] returned Bad Request(400) - [StrictDynamicMappingException[mapping set to strict, dynamic introduction of [documentID] within [retnitem] is not allowed]]; Bailing out..
        at org.elasticsearch.hadoop.rest.RestClient.retryFailedEntries(RestClient.java:207)
        at org.elasticsearch.hadoop.rest.RestClient.bulk(RestClient.java:170)
        at org.elasticsearch.hadoop.rest.RestRepository.tryFlush(RestRepository.java:225)
        at org.elasticsearch.hadoop.rest.RestRepository.flush(RestRepository.java:248)
        at org.elasticsearch.hadoop.rest.RestRepository.close(RestRepository.java:267)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.doClose(EsOutputFormat.java:214)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.close(EsOutputFormat.java:196)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.close(PigOutputFormat.java:154)
        at org.apache.tez.mapreduce.output.MROutput.flush(MROutput.java:546)
        at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.run(PigProcessor.java:272)
        at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
        at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
        at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
        at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
        at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)



Version Info

OS: 4.4.35-33.55.amzn1.x86_64
JVM : openjdk version "1.8.0_131"
Hadoop/Spark: Hadoop 2.7.3-amzn-2
ES-Hadoop : pig-2.4.2
ES : 1.x


@jbaiera
Member

jbaiera commented Jul 18, 2017

Hi there, could you include TRACE level logging here from your job? Hopefully this will help clarify the issue.

@JohnS89
Author

JohnS89 commented Jul 18, 2017

I set the log level to TRACE and updated the stack trace.

Also, is creating a new field and excluding it from being inserted the only way to get elasticsearch-hadoop to use a document id that is a combination of two fields?

@jbaiera
Member

jbaiera commented Jul 18, 2017

is creating a new field and excluding it from being inserted the only way to get elasticsearch-hadoop to use a document id that is a combination of two fields?

For the time being, that is correct. It's fairly easy to generate such a field with the processing tools we integrate with, so the connector only supports pulling metadata values like the document ID from an existing field. This keeps the connector simpler and easier to test.

Thanks for the logs, but the attached log contents seem like they're just from the job driver. Could you include the logs from the workers? I have a hunch on where this might be breaking, but I want to make sure that it's not something else.

Edit: To elaborate - it seems that the Pig serialization code does not apply the field filtering logic the same way the serialization code for the other integrations does. I still have some work to do to pin down the exact place where it breaks, but it definitely smells like a bug in the Pig support.
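
Concretely, the worker logs in the issue above show entries like this going out on the wire:

{"update":{"_id":"1110108647_1"}}
{"doc_as_upsert":true,"doc":{"documentID":"1110108647_1",---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}

whereas with es.mapping.exclude=documentID applied one would expect the same entry without the documentID key in the document body (illustrative only; the other fields are elided exactly as in the logs):

{"update":{"_id":"1110108647_1"}}
{"doc_as_upsert":true,"doc":{---sensitive fields removed----,"retnTransactionLineItemSequenceNumber":"1"}}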

@jbaiera jbaiera added the bug label Jul 18, 2017
@JohnS89
Author

JohnS89 commented Jul 18, 2017

Added the worker logs.

@jbaiera
Member

jbaiera commented Jul 19, 2017

Thanks for the worker logs. I'll see about getting a clean reproduction of this.

@JohnS89
Author

JohnS89 commented Jul 21, 2017

Just for the record, writing a Hive script with the same parameters, instead of using Pig, worked as expected.

@jbaiera
Member

jbaiera commented Jul 21, 2017

@JohnS89 I haven't had a chance to look too deeply into this yet, but I have a suspicion that it has to do with how the connector's Pig serialization logic handles tuples: See these lines. Normally, before an entry's field name is written, that name should be filtered against the mapping exclusion and inclusion properties; it looks like that filtering is simply missing here. Will hopefully get a fix in soon.
