
RHEL 7.8 - Execution failed for task ':datahub-web:execYarn'. #1642

Closed
hamzahafeez7 opened this issue Apr 21, 2020 · 5 comments
Labels
question Question

Comments

hamzahafeez7 commented Apr 21, 2020

Context:
I have previously set up and tested the features of DataHub on Ubuntu 18.04 LTS by creating Docker containers for each component in the following order:
1. MySQL
2. Elasticsearch + Kibana
3. Neo4j
4. Kafka + ZooKeeper + Schema Registry
5. DataHub GMS
6. Frontend
7. MCE Consumer
8. MAE Consumer
9. MCE Ingestion
With that I was able to set up the core DataHub components.

Problem:
While replicating the same setup on a Red Hat 7.8 distribution (I am limited to this OS) on an EC2 instance, the Frontend service fails to build with the error logs below. Apart from the flavor of Linux, the only significant change I made is the images' default directory, from "default" to my "directory-in-non-root-ebs-partition".

Error Logs:

> Task :datahub-web:execYarn
[4/4] Building fresh packages...

> Task :metadata-events:mxe-utils-avro-1.7:classes
> Task :metadata-events:mxe-utils-avro-1.7:jar
> Task :metadata-dao-impl:elasticsearch-dao:classes
> Task :metadata-dao-impl:elasticsearch-dao:jar
> Task :metadata-utils:compileJava
> Task :metadata-utils:classes
> Task :metadata-utils:jar
error /datahub-src/datahub-web/node_modules/broccoli-eyeglass/node_modules/node-sass, /datahub-src/datahub-web/node_modules/eyeglass/node_modules/node-sass: Command failed.
Exit code: 1
Command: node scripts/build.js
Arguments:
Directory: /datahub-src/datahub-web/node_modules/broccoli-eyeglass/node_modules/node-sass
Output:
Building: /datahub-src/datahub-web/build/nodejs/node-v10.6.0-linux-x64/bin/node /datahub-src/datahub-web/node_modules/node-gyp/bin/node-gyp.js rebuild --verbose --libsass_ext= --libsass_cflags= --libsass_ldflags= --libsass_library=
gyp info it worked if it ends with ok
gyp verb cli [ '/datahub-src/datahub-web/build/nodejs/node-v10.6.0-linux-x64/bin/node',
gyp verb cli   '/datahub-src/datahub-web/node_modules/node-gyp/bin/node-gyp.js',
gyp verb cli   'rebuild',
gyp verb cli   '--verbose',
gyp verb cli   '--libsass_ext=',
gyp verb cli   '--libsass_cflags=',
gyp verb cli   '--libsass_ldflags=',
gyp verb cli   '--libsass_library=' ]
gyp info using node-gyp@3.8.0
gyp info using node@10.6.0 | linux | x64
gyp verb command rebuild []
gyp verb command clean []
gyp verb clean removing "build" directory
gyp verb command configure []
gyp verb check python checking for Python executable "python2" in the PATH
gyp verb `which` succeeded python2 /usr/bin/python2
gyp verb check python version `/usr/bin/python2 -c "import sys; print "2.7.16
gyp verb check python version .%s.%s" % sys.version_info[:3];"` returned: %j
gyp verb get node dir no --target version specified, falling back to host node version: 10.6.0
gyp verb command install [ '10.6.0' ]
gyp verb install input version string "10.6.0"
gyp verb install installing version: 10.6.0
gyp verb install --ensure was passed, so won't reinstall if already installed
gyp verb install version not already installed, continuing with install 10.6.0
gyp verb ensuring nodedir is created /root/.node-gyp/10.6.0
gyp verb created nodedir /root/.node-gyp
gyp http GET https://nodejs.org/download/release/v10.6.0/node-v10.6.0-headers.tar.gz
gyp http 200 https://nodejs.org/download/release/v10.6.0/node-v10.6.0-headers.tar.gz
gyp verb extracted file from tarball include/node/common.gypi
gyp verb extracted file from tarball include/node/config.gypi
gyp verb extracted file from tarball include/node/node.h
gyp verb extracted file from tarball include/node/node_api.h
gyp verb extracted file from tarball include/node/node_api_types.h
gyp verb extracted file from tarball include/node/node_buffer.h
gyp verb extracted file from tarball include/node/node_object_wrap.h
gyp verb extracted file from tarball include/node/node_version.h
gyp verb extracted file from tarball include/node/uv.h
gyp verb extracted file from tarball include/node/v8-inspector-protocol.h
gyp verb extracted file from tarball include/node/v8-inspector.h
.
.
.
gyp info spawn args   '--depth=.',
gyp info spawn args   '--no-parallel',
gyp info spawn args   '--generator-output',
gyp info spawn args   'build',
gyp info spawn args   '-Goutput_dir=.' ]
gyp verb command build []
gyp verb build type Release
gyp verb architecture x64
gyp verb node dev dir /root/.node-gyp/10.6.0
gyp ERR! build error
gyp ERR! stack Error: not found: make
gyp ERR! stack     at getNotFoundError (/datahub-src/datahub-web/node_modules/which/which.js:13:12)
gyp ERR! stack     at F (/datahub-src/datahub-web/node_modules/which/which.js:68:19)
gyp ERR! stack     at E (/datahub-src/datahub-web/node_modules/which/which.js:80:29)
gyp ERR! stack     at /datahub-src/datahub-web/node_modules/which/which.js:89:16
gyp ERR! stack     at /datahub-src/datahub-web/node_modules/isexe/index.js:42:5
gyp ERR! stack     at /datahub-src/datahub-web/node_modules/isexe/mode.js:8:5
gyp ERR! stack     at FSReqWrap.oncomplete (fs.js:158:21)
gyp ERR! System Linux 3.10.0-1127.el7.x86_64
gyp ERR! command "/datahub-src/datahub-web/build/nodejs/node-v10.6.0-linux-x64/bin/node" "/datahub-src/datahub-web/node_modules/node-gyp/bin/node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
gyp ERR! cwd /datahub-src/datahub-web/node_modules/broccoli-eyeglass/node_modules/node-sass
gyp ERR! node -v v10.6.0
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok
Build failed with error code: 1

> Task :datahub-web:execYarn FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':datahub-web:execYarn'.
> Process 'command '/datahub-src/datahub-web/build/yarn/yarn-v1.13.0/bin/yarn'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 2m 57s

I have shortened the log a bit. I can provide the complete one if required.
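The key line in the log above is `gyp ERR! stack Error: not found: make`: node-gyp shells out to a C/C++ toolchain when compiling node-sass. A minimal diagnostic sketch (not part of the original report; the tool list reflects the node-gyp 3.x requirements visible in this log) that reports which build prerequisites are missing from the PATH:

```shell
# Report which of the given build tools are missing from the PATH.
# node-gyp 3.x building node-sass needs at least make, gcc/g++ and python2.
check_tools() {
  missing=""
  for tool in "$@"; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  printf '%s' "$missing"
}

missing=$(check_tools make gcc g++ python2)
if [ -n "$missing" ]; then
  echo "missing build tools:$missing"
else
  echo "all node-gyp build prerequisites found"
fi
```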

@hamzahafeez7 hamzahafeez7 added the question Question label Apr 21, 2020

hamzahafeez7 commented Apr 22, 2020

Update:
I tried installing via the quickstart script.
The application is set up, but it won't let me log in to the Frontend.

Quickstart logs for the Frontend:

08:40:12,446 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [datahub-frontend/conf/logback.xml] at [file:/datahub-frontend/conf/logback.xml]
08:40:12,660 |-INFO in ch.qos.logback.core.joran.action.TimestampAction - Using current interpretation time, i.e. now, as time reference.
08:40:12,661 |-INFO in ch.qos.logback.core.joran.action.TimestampAction - Adding property to the context with key="bySecond" and value="2020-04-22_08-40-12" to the LOCAL scope
08:40:12,661 |-INFO in ch.qos.logback.core.joran.action.TimestampAction - Using current interpretation time, i.e. now, as time reference.
08:40:12,661 |-INFO in ch.qos.logback.core.joran.action.TimestampAction - Adding property to the context with key="byDate" and value="2020-04-22" to the LOCAL scope
08:40:12,661 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
08:40:12,666 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [STDOUT]
08:40:12,717 |-WARN in ch.qos.logback.core.ConsoleAppender[STDOUT] - This appender no longer admits a layout as a sub-component, set an encoder instead.
08:40:12,717 |-WARN in ch.qos.logback.core.ConsoleAppender[STDOUT] - To ensure compatibility, wrapping your layout in LayoutWrappingEncoder.
08:40:12,717 |-WARN in ch.qos.logback.core.ConsoleAppender[STDOUT] - See also http://logback.qos.ch/codes.html#layoutInsteadOfEncoder for details
08:40:12,718 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.core.rolling.RollingFileAppender]
08:40:12,726 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [FILE]
08:40:12,729 |-INFO in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
08:40:12,748 |-INFO in ch.qos.logback.core.rolling.FixedWindowRollingPolicy@7486b455 - No compression will be used
08:40:12,756 |-INFO in ch.qos.logback.core.rolling.RollingFileAppender[FILE] - Active log file name: /var/tmp/datahub/datahub-frontend-2020-04-22_08-40-12.log
08:40:12,756 |-INFO in ch.qos.logback.core.rolling.RollingFileAppender[FILE] - File property is set to [/var/tmp/datahub/datahub-frontend-2020-04-22_08-40-12.log]
08:40:12,759 |-INFO in ch.qos.logback.classic.joran.action.RootLoggerAction - Setting level of ROOT logger to INFO
08:40:12,759 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[ROOT]
08:40:12,760 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [FILE] to Logger[ROOT]
08:40:12,760 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - End of configuration.
08:40:12,761 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@7dc3712 - Registering current configuration as safe fallback point
08:40:12 [main] WARN  application - Configuration not found for database: No configuration setting found for key 'db'
08:40:16 [application-akka.actor.default-dispatcher-2] INFO  akka.event.slf4j.Slf4jLogger - Slf4jLogger started
08:40:17 [main] WARN  c.l.r.t.h.client.HttpClientFactory - No scheduled executor is provided to HttpClientFactory, using it's own scheduled executor.
08:40:17 [main] WARN  c.l.r.t.h.client.HttpClientFactory - No callback executor is provided to HttpClientFactory, using it's own call back executor.
08:40:17 [main] WARN  c.l.r.t.h.client.HttpClientFactory - No Compression executor is provided to HttpClientFactory, using it's own compression executor.
08:40:17 [main] INFO  c.l.r.t.h.client.HttpClientFactory - The service 'null' has been assigned to the ChannelPoolManager with key 'noSpecifiedNamePrefix 1138266797 '
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console. Set system property 'log4j2.debug' to show Log4j2 internal initialization logging.
08:40:20 [main] INFO  play.api.Play - Application started (Prod)
08:40:21 [main] INFO  play.core.server.AkkaHttpServer - Listening for HTTP on /0.0.0.0:9001

Logs when Login Fails:

08:45:22 [application-akka.actor.default-dispatcher-10] ERROR application -

! @7fh26p910 - Internal server error, for (GET) [/api/v1/user/me] ->

play.api.UnexpectedException: Unexpected exception[RuntimeException: com.linkedin.restli.client.RestLiResponseException: Response status 500, serviceErrorMessage: javax.persistence.PersistenceException: Query threw SQLException:Table 'datahub.metadata_aspect' doesn't exist Bind values:[] Query was:select t0.urn, t0.aspect, t0.version, t0.createdOn, t0.createdBy, t0.createdFor from metadata_aspect t0 where ((urn = ? and aspect = ? and version = ?) or (urn = ? and aspect = ? and version = ?))]
        at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:247)
        at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:176)
        at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:363)
        at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:361)
        at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:346)
        at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:345)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
        at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
        at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
        at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
        at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
        at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:43)
        at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: com.linkedin.restli.client.RestLiResponseException: Response status 500, serviceErrorMessage: javax.persistence.PersistenceException: Query threw SQLException:Table 'datahub.metadata_aspect' doesn't exist Bind values:[] Query was:select t0.urn, t0.aspect, t0.version, t0.createdOn, t0.createdBy, t0.createdFor from metadata_aspect t0 where ((urn = ? and aspect = ? and version = ?) or (urn = ? and aspect = ? and version = ?))
        at controllers.api.v1.User.getLoggedInUser(User.java:51)
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$12$$anonfun$apply$12.apply(Routes.scala:772)
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$12$$anonfun$apply$12.apply(Routes.scala:772)
        at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:134)
        at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:133)
        at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$8$$anon$2$$anon$1.invocation(HandlerInvoker.scala:108)
        at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:88)
        at play.http.DefaultActionCreator$1.call(DefaultActionCreator.java:31)
        at play.mvc.Security$AuthenticatedAction.call(Security.java:69)
        at play.core.j.JavaAction$$anonfun$9.apply(JavaAction.scala:138)
        at play.core.j.JavaAction$$anonfun$9.apply(JavaAction.scala:138)
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
        at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:56)
        at play.api.libs.streams.Execution$trampoline$.execute(Execution.scala:70)
        at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:48)
        at scala.concurrent.impl.Future$.apply(Future.scala:31)
        at scala.concurrent.Future$.apply(Future.scala:494)
        at play.core.j.JavaAction.apply(JavaAction.scala:138)
        at play.api.mvc.Action$$anonfun$apply$2.apply(Action.scala:96)
        at play.api.mvc.Action$$anonfun$apply$2.apply(Action.scala:89)
        at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2$$anonfun$1.apply(Accumulator.scala:174)
        at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2$$anonfun$1.apply(Accumulator.scala:174)
        at scala.util.Try$.apply(Try.scala:192)
        at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2.apply(Accumulator.scala:174)
        at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2.apply(Accumulator.scala:170)
        at scala.Function1$$anonfun$andThen$1.apply(Function1.scala:52)
        at play.api.libs.streams.StrictAccumulator.run(Accumulator.scala:207)
        at play.core.server.AkkaHttpServer$$anonfun$14.apply(AkkaHttpServer.scala:357)
        at play.core.server.AkkaHttpServer$$anonfun$14.apply(AkkaHttpServer.scala:355)
        at akka.http.scaladsl.util.FastFuture$.akka$http$scaladsl$util$FastFuture$$strictTransform$1(FastFuture.scala:41)
        at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension1$1.apply(FastFuture.scala:51)
        at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension1$1.apply(FastFuture.scala:50)
        ... 13 common frames omitted
Caused by: com.linkedin.restli.client.RestLiResponseException: com.linkedin.restli.client.RestLiResponseException: Response status 500, serviceErrorMessage: javax.persistence.PersistenceException: Query threw SQLException:Table 'datahub.metadata_aspect' doesn't exist Bind values:[] Query was:select t0.urn, t0.aspect, t0.version, t0.createdOn, t0.createdBy, t0.createdFor from metadata_aspect t0 where ((urn = ? and aspect = ? and version = ?) or (urn = ? and aspect = ? and version = ?))
        at com.linkedin.restli.internal.client.ExceptionUtil.wrapThrowable(ExceptionUtil.java:130)
        at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponseImpl(ResponseFutureImpl.java:130)
        at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponse(ResponseFutureImpl.java:94)
        at com.linkedin.identity.client.CorpUsers.get(CorpUsers.java:59)
        at com.linkedin.datahub.dao.view.CorpUserViewDao.get(CorpUserViewDao.java:33)
        at com.linkedin.datahub.dao.view.CorpUserViewDao.getByUserName(CorpUserViewDao.java:51)
        at controllers.api.v1.User.getLoggedInUser(User.java:49)
        ... 45 common frames omitted
Caused by: com.linkedin.restli.client.RestLiResponseException: RestException{_response=RestResponse[headers={content-length=11179, Content-Type=application/json, Date=Wed, 22 Apr 2020 08:45:22 GMT, Server=Jetty(9.4.20.v20190813), X-RestLi-Error-Response=true, X-RestLi-Protocol-Version=2.0.0},cookies=[],status=500,entityLength=11179]}
        at com.linkedin.restli.internal.client.ExceptionUtil.exceptionForThrowable(ExceptionUtil.java:102)
        at com.linkedin.restli.client.RestLiCallbackAdapter.convertError(RestLiCallbackAdapter.java:50)
        at com.linkedin.common.callback.CallbackAdapter.onError(CallbackAdapter.java:86)
        at com.linkedin.r2.transport.common.bridge.client.TransportCallbackAdapter.onResponse(TransportCallbackAdapter.java:47)
        at com.linkedin.r2.filter.transport.ResponseFilter.onRestError(ResponseFilter.java:79)
        at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
        at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
        at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
        at com.linkedin.r2.filter.message.rest.RestFilter.onRestError(RestFilter.java:84)
        at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
        at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
        at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
        at com.linkedin.r2.filter.message.rest.RestFilter.onRestError(RestFilter.java:84)
        at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
        at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
        at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
        at com.linkedin.r2.filter.message.rest.RestFilter.onRestError(RestFilter.java:84)
        at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
        at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
        at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
        at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
        at com.linkedin.r2.filter.transport.ClientRequestFilter.lambda$createCallback$0(ClientRequestFilter.java:95)
        at com.linkedin.r2.transport.http.common.HttpBridge$1.onResponse(HttpBridge.java:82)
        at com.linkedin.r2.transport.http.client.rest.ExecutionCallback.lambda$onResponse$0(ExecutionCallback.java:64)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: com.linkedin.r2.message.rest.RestException: Received error 500 from server for URI http://datahub-gms:8080/corpUsers/($params:(),name:datahub)
        at com.linkedin.r2.transport.http.common.HttpBridge$1.onResponse(HttpBridge.java:76)
        ... 4 common frames omitted

clojurians-org (Contributor) commented

For the login-related error:
You should check the GMS backend log, and mainly its MySQL store backend.

  1. Check the MySQL connection.
  2. Ensure the MySQL schema and seed data already exist:
     https://github.com/linkedin/datahub/blob/master/docker/mysql/init.sql
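The two checks above might be scripted roughly as follows. This is a sketch only: the container name `mysql` and the `datahub`/`datahub` credentials match the quickstart command quoted elsewhere in this thread, so adjust them to your deployment. With `DRY_RUN=1` (the default here) each command is printed rather than executed:

```shell
# Sketch: verify the MySQL backend behind GMS.
# DRY_RUN=1 (default) prints each command instead of executing it.
: "${DRY_RUN:=1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then printf '%s\n' "$*"; else eval "$*"; fi
}

# 1. Check the MySQL connection.
run "docker exec mysql mysql datahub -udatahub -pdatahub -e 'SELECT 1'"

# 2. Ensure the schema and seed data exist; re-apply init.sql if they do not.
run "docker exec -i mysql sh -c 'exec mysql datahub -udatahub -pdatahub' < docker/mysql/init.sql"
```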

mars-lan (Contributor) commented

Yes, it seems your DB was somehow not initialized properly. As pointed out by @clojurians-org, running that init script directly should fix the login issue.

hamzahafeez7 (Author) commented

Thanks all for the help. I have resolved the issue. It seems I was missing some packages similar to Ubuntu's "build-essential". I resolved it with the following commands on RHEL:

yum install -y gcc
yum groupinstall -y 'Development Tools'

I faced the MySQL "init.sql" issue as well, and followed @mars-lan's instructions on how to resolve it from another question:

docker exec -i mysql sh -c 'exec mysql datahub -udatahub -pdatahub' < docker/mysql/init.sql

However, I will create a new question for help on resuming the containers after the first-time installation; that I am still unable to fix. The same login issue appears when resuming the containers after a system reboot.
Thanks again.
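On the recurring login failure after a reboot: one possible cause (an assumption, not confirmed in this thread) is that the MySQL container's data directory is not persisted, so the tables created by init.sql disappear whenever the container is recreated. A hypothetical sketch that mounts a named volume so the data survives restarts; the image tag and environment variables are illustrative, not taken from the DataHub compose files:

```shell
# Hypothetical: persist MySQL data across container restarts with a
# named volume, so init.sql only has to be applied once.
docker volume create datahub-mysql-data
docker run -d --name mysql \
  -v datahub-mysql-data:/var/lib/mysql \
  -e MYSQL_DATABASE=datahub \
  -e MYSQL_USER=datahub \
  -e MYSQL_PASSWORD=datahub \
  -e MYSQL_ROOT_PASSWORD=datahub \
  mysql:5.7
```

After a reboot, `docker start mysql` (rather than recreating the container) would then bring the backend up with its data intact.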

hamzahafeez7 (Author) commented

Closing the issue since it has been resolved.
