[rokshan.jahan@ip-10-48-3-63 bin]$ ./adam-submit transformAlignments /home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.sam /home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.adam
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using SPARK_SUBMIT=/usr/bin/spark-submit
17/09/12 18:08:49 INFO cli.ADAMMain: ADAM invoked with args: "transformAlignments" "/home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.sam" "/home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.adam"
17/09/12 18:08:49 INFO spark.SparkContext: Running Spark version 1.6.0
17/09/12 18:08:50 INFO spark.SecurityManager: Changing view acls to: rokshan.jahan
17/09/12 18:08:50 INFO spark.SecurityManager: Changing modify acls to: rokshan.jahan
17/09/12 18:08:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(rokshan.jahan); users with modify permissions: Set(rokshan.jahan)
17/09/12 18:08:50 INFO util.Utils: Successfully started service 'sparkDriver' on port 34795.
17/09/12 18:08:50 INFO slf4j.Slf4jLogger: Slf4jLogger started
17/09/12 18:08:50 INFO Remoting: Starting remoting
17/09/12 18:08:50 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.48.3.63:42235]
17/09/12 18:08:50 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@10.48.3.63:42235]
17/09/12 18:08:50 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 42235.
17/09/12 18:08:50 INFO spark.SparkEnv: Registering MapOutputTracker
17/09/12 18:08:50 INFO spark.SparkEnv: Registering BlockManagerMaster
17/09/12 18:08:50 INFO storage.DiskBlockManager: Created local directory at /data1/tmp/blockmgr-5a9a89eb-3074-49df-a9e8-dcde9371c980
17/09/12 18:08:50 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
17/09/12 18:08:51 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/09/12 18:08:51 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/09/12 18:08:51 INFO ui.SparkUI: Started SparkUI at http://10.48.3.63:4040
17/09/12 18:08:51 INFO spark.SparkContext: Added JAR file:/home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/bin/../adam-assembly/target/adam-assembly_2.10-0.23.0-SNAPSHOT.jar at spark://10.48.3.63:34795/jars/adam-assembly_2.10-0.23.0-SNAPSHOT.jar with timestamp 1505254131268
17/09/12 18:08:51 INFO client.RMProxy: Connecting to ResourceManager at ip-10-48-3-5.ips.local/10.48.3.5:8032
17/09/12 18:08:51 INFO yarn.Client: Requesting a new application from cluster with 4 NodeManagers
17/09/12 18:08:51 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (28672 MB per container)
17/09/12 18:08:51 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/09/12 18:08:51 INFO yarn.Client: Setting up container launch context for our AM
17/09/12 18:08:51 INFO yarn.Client: Setting up the launch environment for our AM container
17/09/12 18:08:51 INFO yarn.Client: Preparing resources for our AM container
17/09/12 18:08:52 INFO yarn.YarnSparkHadoopUtil: getting token for namenode: hdfs://ip-10-48-3-5.ips.local:8020/user/rokshan.jahan/.sparkStaging/application_1502373891039_16191
17/09/12 18:08:52 INFO hdfs.DFSClient: Created token for rokshan.jahan: HDFS_DELEGATION_TOKEN owner=rokshan.jahan@IPS.LOCAL, renewer=yarn, realUser=, issueDate=1505254132498, maxDate=1505858932498, sequenceNumber=81600, masterKeyId=748 on 10.48.3.5:8020
17/09/12 18:08:53 INFO hive.metastore: Trying to connect to metastore with URI thrift://ip-10-48-3-5.ips.local:9083
17/09/12 18:08:53 INFO hive.metastore: Opened a connection to metastore, current connections: 1
17/09/12 18:08:53 INFO hive.metastore: Connected to metastore.
17/09/12 18:08:53 INFO hive.metastore: Closed a connection to metastore, current connections: 0
17/09/12 18:08:53 INFO yarn.Client: Uploading resource file:/data1/tmp/spark-f5a591d0-12ae-4ea0-baf7-09fb9722477e/__spark_conf__8470767735747936118.zip -> hdfs://ip-10-48-3-5.ips.local:8020/user/rokshan.jahan/.sparkStaging/application_1502373891039_16191/__spark_conf__8470767735747936118.zip
17/09/12 18:08:54 INFO spark.SecurityManager: Changing view acls to: rokshan.jahan
17/09/12 18:08:54 INFO spark.SecurityManager: Changing modify acls to: rokshan.jahan
17/09/12 18:08:54 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(rokshan.jahan); users with modify permissions: Set(rokshan.jahan)
17/09/12 18:08:54 INFO yarn.Client: Submitting application 16191 to ResourceManager
17/09/12 18:08:54 INFO impl.YarnClientImpl: Submitted application application_1502373891039_16191
17/09/12 18:08:55 INFO yarn.Client: Application report for application_1502373891039_16191 (state: ACCEPTED)
17/09/12 18:08:55 INFO yarn.Client:
	 client token: Token { kind: YARN_CLIENT_TOKEN, service: }
	 diagnostics: N/A
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: root.users.rokshan_dot_jahan
	 start time: 1505254134291
	 final status: UNDEFINED
	 tracking URL: http://ip-10-48-3-5.ips.local:8088/proxy/application_1502373891039_16191/
	 user: rokshan.jahan
17/09/12 18:08:56 INFO yarn.Client: Application report for application_1502373891039_16191 (state: ACCEPTED)
17/09/12 18:08:57 INFO yarn.Client: Application report for application_1502373891039_16191 (state: ACCEPTED)
17/09/12 18:08:58 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
17/09/12 18:08:58 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> ip-10-48-3-5.ips.local, PROXY_URI_BASES -> http://ip-10-48-3-5.ips.local:8088/proxy/application_1502373891039_16191), /proxy/application_1502373891039_16191
17/09/12 18:08:58 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/09/12 18:08:58 INFO yarn.Client: Application report for application_1502373891039_16191 (state: ACCEPTED)
17/09/12 18:08:59 INFO yarn.Client: Application report for application_1502373891039_16191 (state: RUNNING)
17/09/12 18:08:59 INFO yarn.Client:
	 client token: Token { kind: YARN_CLIENT_TOKEN, service: }
	 diagnostics: N/A
	 ApplicationMaster host: 10.48.3.65
	 ApplicationMaster RPC port: 0
	 queue: root.users.rokshan_dot_jahan
	 start time: 1505254134291
	 final status: UNDEFINED
	 tracking URL: http://ip-10-48-3-5.ips.local:8088/proxy/application_1502373891039_16191/
	 user: rokshan.jahan
17/09/12 18:08:59 INFO cluster.YarnClientSchedulerBackend: Application application_1502373891039_16191 has started running.
17/09/12 18:08:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34798.
17/09/12 18:08:59 INFO netty.NettyBlockTransferService: Server created on 34798
17/09/12 18:08:59 INFO storage.BlockManager: external shuffle service port = 7337
17/09/12 18:08:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
17/09/12 18:08:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.48.3.63:34798 with 530.0 MB RAM, BlockManagerId(driver, 10.48.3.63, 34798)
17/09/12 18:08:59 INFO storage.BlockManagerMaster: Registered BlockManager
17/09/12 18:08:59 INFO scheduler.EventLoggingListener: Logging events to hdfs://ip-10-48-3-5.ips.local:8020/user/spark/applicationHistory/application_1502373891039_16191
17/09/12 18:08:59 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/09/12 18:08:59 INFO rdd.ADAMContext: Loading /home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.sam as BAM/CRAM/SAM and converting to AlignmentRecords.
Command body threw exception:
java.io.FileNotFoundException: Couldn't find any files matching /home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.sam. If you are trying to glob a directory of Parquet files, you need to glob inside the directory as well (e.g., "glob.me.*.adam/*", instead of "glob.me.*.adam".
17/09/12 18:08:59 INFO cli.TransformAlignments: Overall Duration: 10.13 secs
Exception in thread "main" java.io.FileNotFoundException: Couldn't find any files matching /home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.sam. If you are trying to glob a directory of Parquet files, you need to glob inside the directory as well (e.g., "glob.me.*.adam/*", instead of "glob.me.*.adam".
	at org.bdgenomics.adam.rdd.ADAMContext.getFiles(ADAMContext.scala:1293)
	at org.bdgenomics.adam.rdd.ADAMContext.getFsAndFiles(ADAMContext.scala:1322)
	at org.bdgenomics.adam.rdd.ADAMContext$$anonfun$loadBam$1.apply(ADAMContext.scala:1449)
	at org.bdgenomics.adam.rdd.ADAMContext$$anonfun$loadBam$1.apply(ADAMContext.scala:1446)
	at scala.Option.fold(Option.scala:157)
	at org.apache.spark.rdd.Timer.time(Timer.scala:48)
	at org.bdgenomics.adam.rdd.ADAMContext.loadBam(ADAMContext.scala:1446)
	at org.bdgenomics.adam.rdd.ADAMContext$$anonfun$loadAlignments$1.apply(ADAMContext.scala:2837)
	at org.bdgenomics.adam.rdd.ADAMContext$$anonfun$loadAlignments$1.apply(ADAMContext.scala:2828)
	at scala.Option.fold(Option.scala:157)
	at org.apache.spark.rdd.Timer.time(Timer.scala:48)
	at org.bdgenomics.adam.rdd.ADAMContext.loadAlignments(ADAMContext.scala:2828)
	at org.bdgenomics.adam.cli.TransformAlignments.run(TransformAlignments.scala:481)
	at org.bdgenomics.utils.cli.BDGSparkCommand$class.run(BDGCommand.scala:55)
	at org.bdgenomics.adam.cli.TransformAlignments.run(TransformAlignments.scala:138)
	at org.bdgenomics.adam.cli.ADAMMain.apply(ADAMMain.scala:126)
	at org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:65)
	at org.bdgenomics.adam.cli.ADAMMain.main(ADAMMain.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/09/12 18:08:59 INFO spark.SparkContext: Invoking stop() from shutdown hook
17/09/12 18:08:59 INFO ui.SparkUI: Stopped Spark web UI at http://10.48.3.63:4040
17/09/12 18:08:59 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
17/09/12 18:08:59 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
17/09/12 18:08:59 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
17/09/12 18:08:59 INFO cluster.YarnClientSchedulerBackend: Stopped
17/09/12 18:08:59 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/09/12 18:09:00 INFO storage.MemoryStore: MemoryStore cleared
17/09/12 18:09:00 INFO storage.BlockManager: BlockManager stopped
17/09/12 18:09:00 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
17/09/12 18:09:00 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/09/12 18:09:00 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/09/12 18:09:00 INFO spark.SparkContext: Successfully stopped SparkContext
17/09/12 18:09:00 INFO util.ShutdownHookManager: Shutdown hook called
17/09/12 18:09:00 INFO util.ShutdownHookManager: Deleting directory /data1/tmp/spark-f5a591d0-12ae-4ea0-baf7-09fb9722477e
17/09/12 18:09:00 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
[rokshan.jahan@ip-10-48-3-63 bin]$ ./adam-submit transformAlignments /home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.sam /home/rokshan.jahan/project/spark-genome-alignment-demo/build/adam/adam-core/src/test/resources/small.adam