On my system the pipeline-controller starts faster than spark-thriftserver and therefore fails to launch (full log below). Do I need to adjust the compose config to control the startup order/timing?
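For illustration, this is the kind of change I have in mind, a minimal sketch only: the service names are taken from the log, but the healthcheck command is an assumption (I have not checked whether `nc` exists in the Bitnami Spark image), and I don't know whether the project would prefer this over retry logic in the controller itself.

```yaml
# Sketch against docker/compose-controller-spark-sql-single.yaml (not verified)
services:
  spark-thriftserver:
    # Assumed healthcheck: probe the Thrift JDBC port until it accepts connections.
    healthcheck:
      test: ["CMD-SHELL", "nc -z localhost 10000"]
      interval: 10s
      timeout: 5s
      retries: 12
  pipeline-controller:
    depends_on:
      spark-thriftserver:
        # Start the controller only once the thriftserver reports healthy.
        condition: service_healthy
```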
$ docker-compose -f docker/compose-controller-spark-sql-single.yaml up --force-recreate
WARNING: Found orphan containers (hapi-fhir-db, hapi-server) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up.
Creating pipeline-controller ... done
Creating spark-thriftserver ... done
Attaching to pipeline-controller, spark-thriftserver
spark-thriftserver | spark 22:22:02.55
spark-thriftserver | spark 22:22:02.55 Welcome to the Bitnami spark container
spark-thriftserver | spark 22:22:02.56 Subscribe to project updates by watching https://github.com/bitnami/containers
spark-thriftserver | spark 22:22:02.56 Submit issues and feature requests at https://github.com/bitnami/containers/issues
spark-thriftserver | spark 22:22:02.56
spark-thriftserver |
spark-thriftserver | starting org.apache.spark.sql.hive.thriftserver.HiveThriftServer2, logging to /opt/bitnami/spark/logs/spark--org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-88bc91abb91a.out
spark-thriftserver | Spark Command: /opt/bitnami/java/bin/java -cp /opt/bitnami/spark/conf/:/opt/bitnami/spark/jars/* -Xmx1g --add-exports java.base/sun.nio.ch=ALL-UNNAMED -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED org.apache.spark.deploy.SparkSubmit --conf spark.driver.extraJavaOptions=--add-exports java.base/sun.nio.ch=ALL-UNNAMED --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --name Thrift JDBC/ODBC Server spark-internal
spark-thriftserver | ========================================
spark-thriftserver | 23/05/23 22:22:04 INFO HiveThriftServer2: Started daemon with process name: 20@88bc91abb91a
spark-thriftserver | 23/05/23 22:22:04 INFO SignalUtils: Registering signal handler for TERM
spark-thriftserver | 23/05/23 22:22:04 INFO SignalUtils: Registering signal handler for HUP
spark-thriftserver | 23/05/23 22:22:04 INFO SignalUtils: Registering signal handler for INT
spark-thriftserver | 23/05/23 22:22:04 INFO HiveThriftServer2: Starting SparkContext
pipeline-controller | 22:22:04,781 |-WARN in Logger[org.springframework.jndi.JndiTemplate] - No appenders present in context [default] for logger [org.springframework.jndi.JndiTemplate].
pipeline-controller | 22:22:04,790 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@49c386c8 - URL [jar:file:/app/controller.jar!/BOOT-INF/lib/common-0.1.0-SNAPSHOT.jar!/logback.xml] is not of type file
pipeline-controller | 22:22:04,797 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [STDOUT]
pipeline-controller | 22:22:04,797 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
pipeline-controller | 22:22:04,798 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
pipeline-controller | 22:22:04,798 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.openmrs.analytics] to INFO
pipeline-controller | 22:22:04,798 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [ca.uhn.fhir] to INFO
pipeline-controller | 22:22:04,798 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [com.cerner.bunsen] to INFO
pipeline-controller | 22:22:04,798 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.flink.metrics] to ERROR
pipeline-controller | 22:22:04,798 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to WARN
pipeline-controller | 22:22:04,799 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [STDOUT] to Logger[ROOT]
pipeline-controller | 22:22:04,799 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@56528192 - End of configuration.
pipeline-controller | 22:22:04,799 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@6e0dec4a - Registering current configuration as safe fallback point
pipeline-controller |
pipeline-controller |
pipeline-controller | . ____ _ __ _ _
pipeline-controller | /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
pipeline-controller | ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
pipeline-controller | \\/ ___)| |_)| | | | | || (_| | ) ) ) )
pipeline-controller | ' |____| .__|_| |_|_| |_\__, | / / / /
pipeline-controller | =========|_|==============|___/=/_/_/_/
pipeline-controller | :: Spring Boot :: (v3.0.5)
pipeline-controller |
spark-thriftserver | 23/05/23 22:22:04 INFO HiveConf: Found configuration file null
pipeline-controller | 22:22:04.933 [main] INFO o.o.a.ControlPanelApplication org.springframework.boot.StartupInfoLogger.logStarting:51 - Starting ControlPanelApplication using Java 17.0.7 with PID 7 (/app/controller.jar started by root in /app)
pipeline-controller | 22:22:04.942 [main] INFO o.o.a.ControlPanelApplication org.springframework.boot.SpringApplication.logStartupProfileInfo:632 - No active profile set, falling back to 1 default profile: "default"
spark-thriftserver | 23/05/23 22:22:05 INFO SparkContext: Running Spark version 3.3.2
spark-thriftserver | 23/05/23 22:22:05 INFO ResourceUtils: ==============================================================
spark-thriftserver | 23/05/23 22:22:05 INFO ResourceUtils: No custom resources configured for spark.driver.
spark-thriftserver | 23/05/23 22:22:05 INFO ResourceUtils: ==============================================================
spark-thriftserver | 23/05/23 22:22:05 INFO SparkContext: Submitted application: Thrift JDBC/ODBC Server
spark-thriftserver | 23/05/23 22:22:05 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
spark-thriftserver | 23/05/23 22:22:05 INFO ResourceProfile: Limiting resource is cpu
spark-thriftserver | 23/05/23 22:22:05 INFO ResourceProfileManager: Added ResourceProfile id: 0
spark-thriftserver | 23/05/23 22:22:05 INFO SecurityManager: Changing view acls to: spark
spark-thriftserver | 23/05/23 22:22:05 INFO SecurityManager: Changing modify acls to: spark
spark-thriftserver | 23/05/23 22:22:05 INFO SecurityManager: Changing view acls groups to:
spark-thriftserver | 23/05/23 22:22:05 INFO SecurityManager: Changing modify acls groups to:
spark-thriftserver | 23/05/23 22:22:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); groups with view permissions: Set(); users with modify permissions: Set(spark); groups with modify permissions: Set()
spark-thriftserver | 23/05/23 22:22:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
spark-thriftserver | 23/05/23 22:22:05 INFO Utils: Successfully started service 'sparkDriver' on port 44219.
spark-thriftserver | 23/05/23 22:22:05 INFO SparkEnv: Registering MapOutputTracker
spark-thriftserver | 23/05/23 22:22:05 INFO SparkEnv: Registering BlockManagerMaster
spark-thriftserver | 23/05/23 22:22:05 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
spark-thriftserver | 23/05/23 22:22:05 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
spark-thriftserver | 23/05/23 22:22:05 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
spark-thriftserver | 23/05/23 22:22:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-56012894-2320-450f-a799-c5a9e97ef5be
spark-thriftserver | 23/05/23 22:22:05 INFO MemoryStore: MemoryStore started with capacity 434.4 MiB
spark-thriftserver | 23/05/23 22:22:05 INFO SparkEnv: Registering OutputCommitCoordinator
spark-thriftserver | 23/05/23 22:22:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
spark-thriftserver | 23/05/23 22:22:06 INFO Executor: Starting executor ID driver on host 88bc91abb91a
spark-thriftserver | 23/05/23 22:22:06 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
spark-thriftserver | 23/05/23 22:22:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40741.
spark-thriftserver | 23/05/23 22:22:06 INFO NettyBlockTransferService: Server created on 88bc91abb91a:40741
spark-thriftserver | 23/05/23 22:22:06 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
spark-thriftserver | 23/05/23 22:22:06 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 88bc91abb91a, 40741, None)
spark-thriftserver | 23/05/23 22:22:06 INFO BlockManagerMasterEndpoint: Registering block manager 88bc91abb91a:40741 with 434.4 MiB RAM, BlockManagerId(driver, 88bc91abb91a, 40741, None)
spark-thriftserver | 23/05/23 22:22:06 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 88bc91abb91a, 40741, None)
spark-thriftserver | 23/05/23 22:22:06 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 88bc91abb91a, 40741, None)
spark-thriftserver | 23/05/23 22:22:06 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
spark-thriftserver | 23/05/23 22:22:06 INFO SharedState: Warehouse path is 'file:/opt/bitnami/spark/spark-warehouse'.
pipeline-controller | May 23, 2023 10:22:06 PM org.apache.coyote.AbstractProtocol init
pipeline-controller | INFO: Initializing ProtocolHandler ["http-nio-8080"]
pipeline-controller | May 23, 2023 10:22:06 PM org.apache.catalina.core.StandardService startInternal
pipeline-controller | INFO: Starting service [Tomcat]
pipeline-controller | May 23, 2023 10:22:06 PM org.apache.catalina.core.StandardEngine startInternal
pipeline-controller | INFO: Starting Servlet engine: [Apache Tomcat/10.1.7]
pipeline-controller | May 23, 2023 10:22:06 PM org.apache.catalina.core.ApplicationContext log
pipeline-controller | INFO: Initializing Spring embedded WebApplicationContext
pipeline-controller | 22:22:06.843 [main] WARN o.s.c.LocalVariableTableParameterNameDiscoverer o.s.core.LocalVariableTableParameterNameDiscoverer.inspectClass:123 - Using deprecated '-debug' fallback for parameter name resolution. Compile the affected code with '-parameters' instead or avoid its introspection: org.openmrs.analytics.DataProperties$ConfigFields
pipeline-controller | 22:22:06.864 [main] INFO o.openmrs.analytics.DataProperties org.openmrs.analytics.DataProperties.validateProperties:106 - Using JDBC mode since dbConfig is set.
spark-thriftserver | 23/05/23 22:22:07 INFO HiveUtils: Initializing HiveMetastoreConnection version 2.3.9 using Spark classes.
spark-thriftserver | 23/05/23 22:22:07 INFO HiveClientImpl: Warehouse location for Hive client (version 2.3.9) is file:/opt/bitnami/spark/spark-warehouse
spark-thriftserver | 23/05/23 22:22:07 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
spark-thriftserver | 23/05/23 22:22:07 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
spark-thriftserver | 23/05/23 22:22:07 INFO HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
spark-thriftserver | 23/05/23 22:22:07 INFO ObjectStore: ObjectStore, initialize called
spark-thriftserver | 23/05/23 22:22:07 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
spark-thriftserver | 23/05/23 22:22:07 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
pipeline-controller | 22:22:08.111 [main] INFO o.openmrs.analytics.DataProperties org.openmrs.analytics.DataProperties.createBatchOptions:126 - Converting options for fhirServerUrl null and dbConfig config/hapi-postgres-config_local.json
pipeline-controller | 22:22:08.159 [main] INFO o.openmrs.analytics.DwhFilesManager org.openmrs.analytics.DwhFilesManager.getAllChildDirectories:353 - Child directories : [/dwh/controller_DWH_TIMESTAMP_2023_05_23T22_15_34_195079088Z/]
pipeline-controller | 22:22:08.161 [main] INFO o.openmrs.analytics.PipelineManager org.openmrs.analytics.PipelineManager.initDwhStatus:149 - Initializing with most recent DWH controller_DWH_TIMESTAMP_2023_05_23T22_15_34_195079088Z
pipeline-controller | 22:22:08.190 [main] INFO ca.uhn.fhir.util.VersionUtil ca.uhn.fhir.util.VersionUtil.initialize:85 - HAPI FHIR version 6.4.4 - Rev 107a1bd073
pipeline-controller | 22:22:08.199 [main] INFO ca.uhn.fhir.context.FhirContext ca.uhn.fhir.context.FhirContext.<init>:211 - Creating new FHIR context for FHIR version [R4]
pipeline-controller | 22:22:08.271 [main] INFO c.u.f.c.s.DefaultProfileValidationSupport c.u.f.c.support.DefaultProfileValidationSupport.loadStructureDefinitions:379 - Loading structure definitions from classpath: /org/hl7/fhir/r4/model/profile/profiles-resources.xml
pipeline-controller | 22:22:08.316 [main] INFO ca.uhn.fhir.util.XmlUtil ca.uhn.fhir.util.jar.DependencyLogImpl.logStaxImplementation:75 - FHIR XML procesing will use StAX implementation 'Woodstox' version '5.4.0'
spark-thriftserver | 23/05/23 22:22:09 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
pipeline-controller | 22:22:10.121 [main] INFO c.u.f.c.s.DefaultProfileValidationSupport c.u.f.c.support.DefaultProfileValidationSupport.loadStructureDefinitions:379 - Loading structure definitions from classpath: /org/hl7/fhir/r4/model/profile/profiles-types.xml
pipeline-controller | 22:22:10.202 [main] INFO c.u.f.c.s.DefaultProfileValidationSupport c.u.f.c.support.DefaultProfileValidationSupport.loadStructureDefinitions:379 - Loading structure definitions from classpath: /org/hl7/fhir/r4/model/profile/profiles-others.xml
pipeline-controller | 22:22:10.393 [main] INFO c.u.f.c.s.DefaultProfileValidationSupport c.u.f.c.support.DefaultProfileValidationSupport.loadStructureDefinitions:379 - Loading structure definitions from classpath: /org/hl7/fhir/r4/model/extension/extension-definitions.xml
spark-thriftserver | 23/05/23 22:22:11 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
spark-thriftserver | 23/05/23 22:22:11 INFO ObjectStore: Initialized ObjectStore
spark-thriftserver | 23/05/23 22:22:11 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
spark-thriftserver | 23/05/23 22:22:11 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore UNKNOWN@192.168.10.5
spark-thriftserver | 23/05/23 22:22:11 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
pipeline-controller | May 23, 2023 10:22:11 PM org.apache.coyote.AbstractProtocol start
pipeline-controller | INFO: Starting ProtocolHandler ["http-nio-8080"]
pipeline-controller | 22:22:11.905 [scheduling-1] INFO o.openmrs.analytics.DwhFilesManager org.openmrs.analytics.DwhFilesManager.checkPurgeScheduleAndTrigger:117 - Last purge run was at null next run is at 2023-05-23T22:22:11.905454039
pipeline-controller | 22:22:11.906 [scheduling-1] INFO o.openmrs.analytics.DwhFilesManager org.openmrs.analytics.DwhFilesManager.checkPurgeScheduleAndTrigger:119 - Purge run triggered at 2023-05-23T22:22:11.906881076
pipeline-controller | 22:22:11.911 [scheduling-1] INFO o.openmrs.analytics.DwhFilesManager org.openmrs.analytics.DwhFilesManager.getAllChildDirectories:353 - Child directories : [/dwh/controller_DWH_TIMESTAMP_2023_05_23T22_15_34_195079088Z/]
pipeline-controller | 22:22:11.913 [scheduling-1] INFO o.openmrs.analytics.DwhFilesManager org.openmrs.analytics.DwhFilesManager.checkPurgeScheduleAndTrigger:121 - Purge run completed at 2023-05-23T22:22:11.913088267
pipeline-controller | 22:22:11.913 [scheduling-1] INFO o.openmrs.analytics.PipelineManager org.openmrs.analytics.PipelineManager.checkSchedule:192 - Last run was at 2023-05-23T22:22:11.308690048 next run is at 2023-05-23T23:00
pipeline-controller | 22:22:11.917 [main] INFO o.o.a.ControlPanelApplication org.springframework.boot.StartupInfoLogger.logStarted:57 - Started ControlPanelApplication in 7.684 seconds (process running for 9.382)
pipeline-controller | 22:22:11.936 [main] INFO o.openmrs.analytics.DwhFilesManager org.openmrs.analytics.DwhFilesManager.getAllChildDirectories:353 - Child directories : [/dwh/controller_DWH_TIMESTAMP_2023_05_23T22_15_34_195079088Z/]
pipeline-controller | 22:22:11.940 [main] INFO o.openmrs.analytics.DwhFilesManager org.openmrs.analytics.DwhFilesManager.getAllChildDirectories:353 - Child directories : [/dwh/controller_DWH_TIMESTAMP_2023_05_23T22_15_34_195079088Z/Observation/, /dwh/controller_DWH_TIMESTAMP_2023_05_23T22_15_34_195079088Z/Encounter/, /dwh/controller_DWH_TIMESTAMP_2023_05_23T22_15_34_195079088Z/Patient/]
spark-thriftserver | 23/05/23 22:22:12 INFO HiveMetaStore: Added admin role in metastore
spark-thriftserver | 23/05/23 22:22:12 INFO HiveMetaStore: Added public role in metastore
pipeline-controller | 22:22:12.118 [main] WARN org.apache.hive.jdbc.HiveConnection org.apache.hive.jdbc.HiveConnection.<init>:237 - Failed to connect to spark-thriftserver:10000
pipeline-controller | 22:22:12.145 [main] ERROR o.openmrs.analytics.PipelineManager org.openmrs.analytics.PipelineManager.createResourceTables:336 - Exception while creating resource tables on thriftserver:
pipeline-controller | java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://spark-thriftserver:10000/default?enabledTLSProtocols=TLSv1.2: java.net.ConnectException: Connection refused
pipeline-controller | at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:256)
pipeline-controller | at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
pipeline-controller | at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:681)
pipeline-controller | at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:252)
pipeline-controller | at org.openmrs.analytics.HiveTableManager.createResourceTable(HiveTableManager.java:118)
pipeline-controller | at org.openmrs.analytics.PipelineManager.createResourceTables(PipelineManager.java:324)
pipeline-controller | at org.openmrs.analytics.PipelineManager.onApplicationEvent(PipelineManager.java:261)
pipeline-controller | at org.openmrs.analytics.PipelineManager.onApplicationEvent(PipelineManager.java:58)
pipeline-controller | at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:176)
pipeline-controller | at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:169)
pipeline-controller | at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:143)
pipeline-controller | at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:413)
pipeline-controller | at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:370)
pipeline-controller | at org.springframework.boot.context.event.EventPublishingRunListener.ready(EventPublishingRunListener.java:109)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.lambda$ready$6(SpringApplicationRunListeners.java:80)
pipeline-controller | at java.base/java.lang.Iterable.forEach(Iterable.java:75)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:118)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:112)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.ready(SpringApplicationRunListeners.java:80)
pipeline-controller | at org.springframework.boot.SpringApplication.run(SpringApplication.java:329)
pipeline-controller | at org.springframework.boot.SpringApplication.run(SpringApplication.java:1304)
pipeline-controller | at org.springframework.boot.SpringApplication.run(SpringApplication.java:1293)
pipeline-controller | at org.openmrs.analytics.ControlPanelApplication.main(ControlPanelApplication.java:25)
pipeline-controller | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
pipeline-controller | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
pipeline-controller | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
pipeline-controller | at java.base/java.lang.reflect.Method.invoke(Method.java:568)
pipeline-controller | at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
pipeline-controller | at org.springframework.boot.loader.Launcher.launch(Launcher.java:95)
pipeline-controller | at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
pipeline-controller | at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:65)
pipeline-controller | Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
pipeline-controller | at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
pipeline-controller | at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:266)
pipeline-controller | at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
pipeline-controller | at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:343)
pipeline-controller | at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:228)
pipeline-controller | ... 30 common frames omitted
pipeline-controller | Caused by: java.net.ConnectException: Connection refused
pipeline-controller | at java.base/sun.nio.ch.Net.connect0(Native Method)
pipeline-controller | at java.base/sun.nio.ch.Net.connect(Net.java:579)
pipeline-controller | at java.base/sun.nio.ch.Net.connect(Net.java:568)
pipeline-controller | at java.base/sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:588)
pipeline-controller | at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327)
pipeline-controller | at java.base/java.net.Socket.connect(Socket.java:633)
pipeline-controller | at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
pipeline-controller | ... 34 common frames omitted
spark-thriftserver | 23/05/23 22:22:12 INFO HiveMetaStore: No user is added in admin role, since config is empty
pipeline-controller | 22:22:12.185 [main] ERROR o.s.boot.SpringApplication org.springframework.boot.SpringApplication.reportFailure:822 - Application run failed
pipeline-controller | java.lang.RuntimeException: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://spark-thriftserver:10000/default?enabledTLSProtocols=TLSv1.2: java.net.ConnectException: Connection refused
pipeline-controller | at org.openmrs.analytics.PipelineManager.createResourceTables(PipelineManager.java:337)
pipeline-controller | at org.openmrs.analytics.PipelineManager.onApplicationEvent(PipelineManager.java:261)
pipeline-controller | at org.openmrs.analytics.PipelineManager.onApplicationEvent(PipelineManager.java:58)
pipeline-controller | at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:176)
pipeline-controller | at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:169)
pipeline-controller | at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:143)
pipeline-controller | at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:413)
pipeline-controller | at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:370)
pipeline-controller | at org.springframework.boot.context.event.EventPublishingRunListener.ready(EventPublishingRunListener.java:109)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.lambda$ready$6(SpringApplicationRunListeners.java:80)
pipeline-controller | at java.base/java.lang.Iterable.forEach(Iterable.java:75)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:118)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:112)
pipeline-controller | at org.springframework.boot.SpringApplicationRunListeners.ready(SpringApplicationRunListeners.java:80)
pipeline-controller | at org.springframework.boot.SpringApplication.run(SpringApplication.java:329)
pipeline-controller | at org.springframework.boot.SpringApplication.run(SpringApplication.java:1304)
pipeline-controller | at org.springframework.boot.SpringApplication.run(SpringApplication.java:1293)
pipeline-controller | at org.openmrs.analytics.ControlPanelApplication.main(ControlPanelApplication.java:25)
pipeline-controller | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
pipeline-controller | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
pipeline-controller | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
pipeline-controller | at java.base/java.lang.reflect.Method.invoke(Method.java:568)
pipeline-controller | at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
pipeline-controller | at org.springframework.boot.loader.Launcher.launch(Launcher.java:95)
pipeline-controller | at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
pipeline-controller | at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:65)
pipeline-controller | Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://spark-thriftserver:10000/default?enabledTLSProtocols=TLSv1.2: java.net.ConnectException: Connection refused
pipeline-controller | at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:256)
pipeline-controller | at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
pipeline-controller | at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:681)
pipeline-controller | at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:252)
pipeline-controller | at org.openmrs.analytics.HiveTableManager.createResourceTable(HiveTableManager.java:118)
pipeline-controller | at org.openmrs.analytics.PipelineManager.createResourceTables(PipelineManager.java:324)
pipeline-controller | ... 25 common frames omitted
pipeline-controller | Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
pipeline-controller | at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
pipeline-controller | at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:266)
pipeline-controller | at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
pipeline-controller | at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:343)
pipeline-controller | at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:228)
pipeline-controller | ... 30 common frames omitted
pipeline-controller | Caused by: java.net.ConnectException: Connection refused
pipeline-controller | at java.base/sun.nio.ch.Net.connect0(Native Method)
pipeline-controller | at java.base/sun.nio.ch.Net.connect(Net.java:579)
pipeline-controller | at java.base/sun.nio.ch.Net.connect(Net.java:568)
pipeline-controller | at java.base/sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:588)
pipeline-controller | at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327)
pipeline-controller | at java.base/java.net.Socket.connect(Socket.java:633)
pipeline-controller | at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
pipeline-controller | ... 34 common frames omitted
pipeline-controller | May 23, 2023 10:22:12 PM org.apache.coyote.AbstractProtocol pause
pipeline-controller | INFO: Pausing ProtocolHandler ["http-nio-8080"]
pipeline-controller | May 23, 2023 10:22:12 PM org.apache.catalina.core.StandardService stopInternal
pipeline-controller | INFO: Stopping service [Tomcat]
pipeline-controller | May 23, 2023 10:22:12 PM org.apache.coyote.AbstractProtocol stop
pipeline-controller | INFO: Stopping ProtocolHandler ["http-nio-8080"]
pipeline-controller | May 23, 2023 10:22:12 PM org.apache.coyote.AbstractProtocol destroy
pipeline-controller | INFO: Destroying ProtocolHandler ["http-nio-8080"]
spark-thriftserver | 23/05/23 22:22:12 INFO HiveMetaStore: 0: get_database: default
spark-thriftserver | 23/05/23 22:22:12 INFO audit: ugi=spark ip=unknown-ip-addr cmd=get_database: default
spark-thriftserver | 23/05/23 22:22:12 INFO HiveUtils: Initializing execution hive, version 2.3.9
spark-thriftserver | 23/05/23 22:22:12 INFO HiveClientImpl: Warehouse location for Hive client (version 2.3.9) is file:/opt/bitnami/spark/spark-warehouse
spark-thriftserver | 23/05/23 22:22:12 INFO SessionManager: Operation log root directory is created: /tmp/spark/operation_logs
spark-thriftserver | 23/05/23 22:22:12 INFO SessionManager: HiveServer2: Background operation thread pool size: 100
spark-thriftserver | 23/05/23 22:22:12 INFO SessionManager: HiveServer2: Background operation thread wait queue size: 100
spark-thriftserver | 23/05/23 22:22:12 INFO SessionManager: HiveServer2: Background operation thread keepalive time: 10 seconds
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service:OperationManager is inited.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service:SessionManager is inited.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service: CLIService is inited.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service:ThriftBinaryCLIService is inited.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service: HiveServer2 is inited.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service:OperationManager is started.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service:SessionManager is started.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service: CLIService is started.
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service:ThriftBinaryCLIService is started.
spark-thriftserver | 23/05/23 22:22:12 INFO ThriftCLIService: Starting ThriftBinaryCLIService on port 10000 with 5...500 worker threads
spark-thriftserver | 23/05/23 22:22:12 INFO AbstractService: Service:HiveServer2 is started.
spark-thriftserver | 23/05/23 22:22:12 INFO HiveThriftServer2: HiveThriftServer2 started
pipeline-controller exited with code 1