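Submitting the packaged WordCount job to YARN in client mode produces the following console output: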
[root@node1 spark-3.0.0-preview2-bin-hadoop3.2]# bin/spark-submit --class cn.spark.WordCount --master yarn --deploy-mode client --executor-memory 512M --total-executor-cores 1 ~/ApacheSpark-0.0.3-alpha.jar
2020-02-24 06:24:32,539 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-02-24 06:24:34,229 INFO spark.SparkContext: Running Spark version 3.0.0-preview2
2020-02-24 06:24:34,664 INFO resource.ResourceUtils: ==============================================================
2020-02-24 06:24:34,688 INFO resource.ResourceUtils: Resources for spark.driver:
2020-02-24 06:24:34,701 INFO resource.ResourceUtils: ==============================================================
2020-02-24 06:24:34,705 INFO spark.SparkContext: Submitted application: myWordCount2
2020-02-24 06:24:35,279 INFO spark.SecurityManager: Changing view acls to: root
2020-02-24 06:24:35,281 INFO spark.SecurityManager: Changing modify acls to: root
2020-02-24 06:24:35,282 INFO spark.SecurityManager: Changing view acls groups to:
2020-02-24 06:24:35,283 INFO spark.SecurityManager: Changing modify acls groups to:
2020-02-24 06:24:35,284 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2020-02-24 06:24:36,767 INFO util.Utils: Successfully started service 'sparkDriver' on port 39858.
2020-02-24 06:24:37,009 INFO spark.SparkEnv: Registering MapOutputTracker
2020-02-24 06:24:37,222 INFO spark.SparkEnv: Registering BlockManagerMaster
2020-02-24 06:24:37,344 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2020-02-24 06:24:37,350 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2020-02-24 06:24:37,499 INFO spark.SparkEnv: Registering BlockManagerMasterHeartbeat
2020-02-24 06:24:37,681 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-cdf6b798-743a-42ca-ba76-3c7b05160dbd
2020-02-24 06:24:37,827 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MiB
2020-02-24 06:24:38,021 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2020-02-24 06:24:38,500 INFO util.log: Logging initialized @14853ms to org.sparkproject.jetty.util.log.Slf4jLog
2020-02-24 06:24:39,003 INFO server.Server: jetty-9.4.z-SNAPSHOT; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 1.8.0_191-b12
2020-02-24 06:24:39,155 INFO server.Server: Started @15518ms
2020-02-24 06:24:39,352 INFO server.AbstractConnector: Started ServerConnector@6c4f9535{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-02-24 06:24:39,353 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
2020-02-24 06:24:39,579 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e33c391{/jobs,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,591 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@251f7d26{/jobs/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,595 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@52d10fb8{/jobs/job,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,621 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22680f52{/jobs/job/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,628 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@39c11e6c{/stages,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,632 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@503d56b5{/stages/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,635 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@433ffad1{/stages/stage,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,654 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@ecf9fb3{/stages/stage/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,657 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@27f9e982{/stages/pool,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,668 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@37d3d232{/stages/pool/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,676 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@581d969c{/storage,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,685 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2b46a8c1{/storage/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,689 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@29caf222{/storage/rdd,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,701 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5851bd4f{/storage/rdd/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,707 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f40a43{/environment,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,712 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69c43e48{/environment/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,716 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a80515c{/executors,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,720 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c807b1d{/executors/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,727 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b39fd82{/executors/threadDump,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,738 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@21680803{/executors/threadDump/json,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,800 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c8b96ec{/static,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,803 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3228d990{/,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,812 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50b8ae8d{/api,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,816 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@40e4ea87{/jobs/job/kill,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,828 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a7b503d{/stages/stage/kill,null,AVAILABLE,@Spark}
2020-02-24 06:24:39,860 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://node1.docker:4040
2020-02-24 06:24:39,940 INFO spark.SparkContext: Added JAR file:/root/ApacheSpark-0.0.3-alpha.jar at spark://node1.docker:39858/jars/ApacheSpark-0.0.3-alpha.jar with timestamp 1582543479935
2020-02-24 06:24:41,767 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2020-02-24 06:24:43,177 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
2020-02-24 06:24:45,318 INFO conf.Configuration: resource-types.xml not found
2020-02-24 06:24:45,320 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2020-02-24 06:24:45,413 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2020-02-24 06:24:45,419 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
2020-02-24 06:24:45,422 INFO yarn.Client: Setting up container launch context for our AM
2020-02-24 06:24:45,425 INFO yarn.Client: Setting up the launch environment for our AM container
2020-02-24 06:24:45,513 INFO yarn.Client: Preparing resources for our AM container
2020-02-24 06:24:45,760 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2020-02-24 06:24:49,134 INFO yarn.Client: Uploading resource file:/tmp/spark-cfa3ca7a-a22d-42c2-aa91-879e71de00b7/__spark_libs__1361487278565526469.zip -> hdfs://192.168.84.133:9000/user/root/.sparkStaging/application_1582541540155_0004/__spark_libs__1361487278565526469.zip
2020-02-24 06:24:56,675 INFO yarn.Client: Uploading resource file:/tmp/spark-cfa3ca7a-a22d-42c2-aa91-879e71de00b7/__spark_conf__5111073800977427415.zip -> hdfs://192.168.84.133:9000/user/root/.sparkStaging/application_1582541540155_0004/__spark_conf__.zip
2020-02-24 06:24:57,543 INFO spark.SecurityManager: Changing view acls to: root
2020-02-24 06:24:57,546 INFO spark.SecurityManager: Changing modify acls to: root
2020-02-24 06:24:57,547 INFO spark.SecurityManager: Changing view acls groups to:
2020-02-24 06:24:57,548 INFO spark.SecurityManager: Changing modify acls groups to:
2020-02-24 06:24:57,548 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2020-02-24 06:24:57,672 INFO yarn.Client: Submitting application application_1582541540155_0004 to ResourceManager
2020-02-24 06:24:58,019 INFO impl.YarnClientImpl: Submitted application application_1582541540155_0004
2020-02-24 06:24:59,076 INFO yarn.Client: Application report for application_1582541540155_0004 (state: ACCEPTED)
2020-02-24 06:24:59,131 INFO yarn.Client:
client token: N/A
diagnostics: AM container is launched, waiting for AM container to Register with RM
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1582543497791
final status: UNDEFINED
tracking URL: http://node1.docker:8088/proxy/application_1582541540155_0004/
user: root
2020-02-24 06:25:00,147 INFO yarn.Client: Application report for application_1582541540155_0004 (state: ACCEPTED)
...(the same ACCEPTED report repeats roughly once per second until the state changes at 06:25:20)...
2020-02-24 06:25:20,747 INFO yarn.Client: Application report for application_1582541540155_0004 (state: RUNNING)
2020-02-24 06:25:20,749 INFO yarn.Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: 192.168.84.133
ApplicationMaster RPC port: -1
queue: default
start time: 1582543497791
final status: UNDEFINED
tracking URL: http://node1.docker:8088/proxy/application_1582541540155_0004/
user: root
2020-02-24 06:25:20,794 INFO cluster.YarnClientSchedulerBackend: Application application_1582541540155_0004 has started running.
2020-02-24 06:25:20,961 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39294.
2020-02-24 06:25:20,976 INFO netty.NettyBlockTransferService: Server created on node1.docker:39294
2020-02-24 06:25:21,016 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2020-02-24 06:25:21,099 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, node1.docker, 39294, None)
2020-02-24 06:25:21,179 INFO storage.BlockManagerMasterEndpoint: Registering block manager node1.docker:39294 with 366.3 MiB RAM, BlockManagerId(driver, node1.docker, 39294, None)
2020-02-24 06:25:21,227 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, node1.docker, 39294, None)
2020-02-24 06:25:21,263 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, node1.docker, 39294, None)
2020-02-24 06:25:22,856 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4bd5849e{/metrics/json,null,AVAILABLE,@Spark}
2020-02-24 06:25:23,161 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000000000(ns)
2020-02-24 06:25:23,317 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> node1.docker, PROXY_URI_BASES -> http://node1.docker:8088/proxy/application_1582541540155_0004), /proxy/application_1582541540155_0004
2020-02-24 06:25:27,072 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 295.4 KiB, free 366.0 MiB)
2020-02-24 06:25:27,585 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 27.0 KiB, free 366.0 MiB)
2020-02-24 06:25:27,613 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on node1.docker:39294 (size: 27.0 KiB, free: 366.3 MiB)
2020-02-24 06:25:27,672 INFO spark.SparkContext: Created broadcast 0 from textFile at WordCount.java:17
2020-02-24 06:25:28,955 INFO mapred.FileInputFormat: Total input files to process : 2
2020-02-24 06:25:29,802 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
2020-02-24 06:25:29,858 INFO io.HadoopMapRedCommitProtocol: Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
2020-02-24 06:25:29,868 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 2
2020-02-24 06:25:29,870 INFO output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2020-02-24 06:25:30,110 INFO spark.SparkContext: Starting job: runJob at SparkHadoopWriter.scala:78
2020-02-24 06:25:30,444 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2020-02-24 06:25:30,568 INFO scheduler.DAGScheduler: Registering RDD 3 (mapToPair at WordCount.java:20) as input to shuffle 0
2020-02-24 06:25:30,605 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopWriter.scala:78) with 3 output partitions
2020-02-24 06:25:30,613 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopWriter.scala:78)
2020-02-24 06:25:30,622 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
2020-02-24 06:25:30,640 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
2020-02-24 06:25:30,698 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at WordCount.java:20), which has no missing parents
2020-02-24 06:25:31,468 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 7.6 KiB, free 366.0 MiB)
2020-02-24 06:25:31,532 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 4.1 KiB, free 366.0 MiB)
2020-02-24 06:25:31,552 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on node1.docker:39294 (size: 4.1 KiB, free: 366.3 MiB)
2020-02-24 06:25:31,558 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1206
2020-02-24 06:25:31,828 INFO scheduler.DAGScheduler: Submitting 3 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at WordCount.java:20) (first 15 tasks are for partitions Vector(0, 1, 2))
2020-02-24 06:25:31,844 INFO cluster.YarnScheduler: Adding task set 0.0 with 3 tasks
2020-02-24 06:25:47,687 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:26:02,658 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:26:04,776 INFO scheduler.AsyncEventQueue: Process of event SparkListenerExecutorMetricsUpdate(driver,WrappedArray(),Map((-1,-1) -> org.apache.spark.executor.ExecutorMetrics@7f82bff)) by listener AppStatusListener took 1.623004319s.
2020-02-24 06:26:18,130 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:26:32,091 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:26:47,298 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:27:25,999 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:27:26,296 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:27:32,292 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:27:33,388 INFO scheduler.AsyncEventQueue: Process of event SparkListenerExecutorMetricsUpdate(driver,WrappedArray(),Map((-1,-1) -> org.apache.spark.executor.ExecutorMetrics@18b820e6)) by listener AppStatusListener took 4.828930604s.
2020-02-24 06:27:50,059 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:27:51,058 INFO scheduler.AsyncEventQueue: Process of event SparkListenerExecutorMetricsUpdate(driver,WrappedArray(),Map((-1,-1) -> org.apache.spark.executor.ExecutorMetrics@270f58cf)) by listener AppStatusListener took 1.021110816s.
2020-02-24 06:28:02,085 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:28:18,226 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:28:32,198 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:28:47,588 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:29:03,072 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:29:17,084 WARN cluster.YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2020-02-24 06:29:21,791 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.84.133:60378) with ID 2
2020-02-24 06:29:21,879 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.84.133:60380) with ID 1
2020-02-24 06:29:23,828 INFO storage.BlockManagerMasterEndpoint: Registering block manager node1.docker:42413 with 93.3 MiB RAM, BlockManagerId(1, node1.docker, 42413, None)
2020-02-24 06:29:23,882 INFO storage.BlockManagerMasterEndpoint: Registering block manager node1.docker:43317 with 93.3 MiB RAM, BlockManagerId(2, node1.docker, 43317, None)
2020-02-24 06:29:24,545 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, node1.docker, executor 2, partition 0, NODE_LOCAL, 7407 bytes)
2020-02-24 06:29:24,830 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, node1.docker, executor 1, partition 1, NODE_LOCAL, 7404 bytes)
2020-02-24 06:29:28,575 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on node1.docker:42413 (size: 4.1 KiB, free: 93.3 MiB)
2020-02-24 06:29:28,593 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on node1.docker:43317 (size: 4.1 KiB, free: 93.3 MiB)
2020-02-24 06:29:35,189 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on node1.docker:43317 (size: 27.0 KiB, free: 93.3 MiB)
2020-02-24 06:29:35,538 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on node1.docker:42413 (size: 27.0 KiB, free: 93.3 MiB)
2020-02-24 06:29:46,853 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, node1.docker, executor 2, partition 2, NODE_LOCAL, 7404 bytes)
2020-02-24 06:29:47,206 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 22795 ms on node1.docker (executor 2) (1/3)
2020-02-24 06:29:47,289 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 22476 ms on node1.docker (executor 1) (2/3)
2020-02-24 06:29:48,029 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 1188 ms on node1.docker (executor 2) (3/3)
2020-02-24 06:29:48,050 INFO cluster.YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
2020-02-24 06:29:48,115 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (mapToPair at WordCount.java:20) finished in 257.052 s
2020-02-24 06:29:48,128 INFO scheduler.DAGScheduler: looking for newly runnable stages
2020-02-24 06:29:48,135 INFO scheduler.DAGScheduler: running: Set()
2020-02-24 06:29:48,161 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
2020-02-24 06:29:48,168 INFO scheduler.DAGScheduler: failed: Set()
2020-02-24 06:29:48,427 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at saveAsTextFile at WordCount.java:22), which has no missing parents
2020-02-24 06:29:48,826 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 85.6 KiB, free 365.9 MiB)
2020-02-24 06:29:48,861 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 30.7 KiB, free 365.9 MiB)
2020-02-24 06:29:48,946 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on node1.docker:39294 (size: 30.7 KiB, free: 366.2 MiB)
2020-02-24 06:29:48,966 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1206
2020-02-24 06:29:49,044 INFO scheduler.DAGScheduler: Submitting 3 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at saveAsTextFile at WordCount.java:22) (first 15 tasks are for partitions Vector(0, 1, 2))
2020-02-24 06:29:49,044 INFO cluster.YarnScheduler: Adding task set 1.0 with 3 tasks
2020-02-24 06:29:49,170 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 3, node1.docker, executor 1, partition 0, NODE_LOCAL, 7154 bytes)
2020-02-24 06:29:49,175 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 1.0 (TID 4, node1.docker, executor 2, partition 1, NODE_LOCAL, 7154 bytes)
2020-02-24 06:29:51,707 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on node1.docker:43317 (size: 30.7 KiB, free: 93.2 MiB)
2020-02-24 06:29:51,808 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on node1.docker:42413 (size: 30.7 KiB, free: 93.2 MiB)
2020-02-24 06:29:58,629 INFO spark.MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 192.168.84.133:60380
2020-02-24 06:29:58,630 INFO spark.MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 192.168.84.133:60378
2020-02-24 06:30:04,792 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 1.0 (TID 5, node1.docker, executor 2, partition 2, NODE_LOCAL, 7154 bytes)
2020-02-24 06:30:04,912 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 3) in 15849 ms on node1.docker (executor 1) (1/3)
2020-02-24 06:30:04,915 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 1.0 (TID 4) in 15745 ms on node1.docker (executor 2) (2/3)
2020-02-24 06:30:08,065 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 1.0 (TID 5) in 3276 ms on node1.docker (executor 2) (3/3)
2020-02-24 06:30:08,068 INFO cluster.YarnScheduler: Removed TaskSet 1.0, whose tasks have all completed, from pool
2020-02-24 06:30:08,127 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopWriter.scala:78) finished in 19.450 s
2020-02-24 06:30:08,315 INFO scheduler.DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
2020-02-24 06:30:08,329 INFO cluster.YarnScheduler: Killing all running tasks in stage 1: Stage finished
2020-02-24 06:30:08,386 INFO scheduler.DAGScheduler: Job 0 finished: runJob at SparkHadoopWriter.scala:78, took 278.272779 s
2020-02-24 06:30:09,095 INFO io.SparkHadoopWriter: Job job_20200224062529_0005 committed.
=================================================================================
wordcount is end
=================================================================================
2020-02-24 06:30:09,391 INFO spark.SparkContext: Invoking stop() from shutdown hook
2020-02-24 06:30:09,676 INFO server.AbstractConnector: Stopped Spark@6c4f9535{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-02-24 06:30:09,907 INFO ui.SparkUI: Stopped Spark web UI at http://node1.docker:4040
2020-02-24 06:30:10,329 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
2020-02-24 06:30:11,184 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
2020-02-24 06:30:11,207 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
2020-02-24 06:30:11,395 INFO cluster.YarnClientSchedulerBackend: Stopped
2020-02-24 06:30:11,840 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2020-02-24 06:30:12,589 INFO memory.MemoryStore: MemoryStore cleared
2020-02-24 06:30:12,616 INFO storage.BlockManager: BlockManager stopped
2020-02-24 06:30:12,929 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
2020-02-24 06:30:13,142 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2020-02-24 06:30:13,398 INFO spark.SparkContext: Successfully stopped SparkContext
2020-02-24 06:30:13,403 INFO util.ShutdownHookManager: Shutdown hook called
2020-02-24 06:30:13,467 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-cfa3ca7a-a22d-42c2-aa91-879e71de00b7
2020-02-24 06:30:13,625 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-56cd506f-54ff-4d55-b939-7857e6af0a00
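A few points in this run are worth calling out. The 896 MB AM container is the default spark.yarn.am.memory of 512 MB plus a memory overhead of max(384 MB, 0.10 × 512 MB) = 384 MB (these are the Spark 3.0 defaults; nothing in the command overrides them). The repeated "Initial job has not accepted any resources" warnings do not indicate a failure here: they only mean no executor had registered yet. On this single-NodeManager cluster the two executor containers took about four minutes to come up (they register at 06:29:21), after which all six tasks complete in under a minute. Note also that --total-executor-cores applies to standalone and Mesos masters and is ignored on YARN, where --num-executors and --executor-cores are the equivalent options.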
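The WARN at 06:24:45 ("Neither spark.yarn.jars nor spark.yarn.archive is set") explains why every submit re-uploads the whole __spark_libs__ zip to HDFS before the AM can start. A common fix is to stage the Spark runtime jars on HDFS once and point spark.yarn.jars at them; a minimal sketch, assuming the illustrative HDFS path below (not taken from this cluster):

```bash
# One-time: stage the Spark runtime jars on HDFS (illustrative path).
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put "$SPARK_HOME"/jars/* /spark/jars/

# Subsequent submits can then skip the library upload:
bin/spark-submit --class cn.spark.WordCount --master yarn --deploy-mode client \
  --executor-memory 512M \
  --conf "spark.yarn.jars=hdfs://192.168.84.133:9000/spark/jars/*" \
  ~/ApacheSpark-0.0.3-alpha.jar
```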
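For reference, the call sites in the log (textFile at WordCount.java:17, mapToPair at WordCount.java:20, saveAsTextFile at WordCount.java:22) and the application name myWordCount2 pin down the shape of the submitted class. Below is a minimal sketch of cn.spark.WordCount consistent with this run; the input/output paths and the whitespace split are assumptions:

```java
package cn.spark;

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class WordCount {
    public static void main(String[] args) {
        // Matches "Submitted application: myWordCount2" in the log.
        SparkConf conf = new SparkConf().setAppName("myWordCount2");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // textFile at WordCount.java:17 -- the input path is a placeholder.
        JavaRDD<String> lines = sc.textFile("hdfs://192.168.84.133:9000/user/root/input");

        // Split on whitespace (an assumption) and emit (word, 1) pairs;
        // mapToPair at WordCount.java:20 is the input to shuffle 0.
        JavaPairRDD<String, Integer> counts = lines
                .flatMap(line -> Arrays.asList(line.split(" ")).iterator())
                .mapToPair(word -> new Tuple2<>(word, 1))
                .reduceByKey(Integer::sum);

        // saveAsTextFile at WordCount.java:22 -- the output path is a placeholder.
        counts.saveAsTextFile("hdfs://192.168.84.133:9000/user/root/output");

        System.out.println("=================================================================================");
        System.out.println("wordcount is end");
        System.out.println("=================================================================================");

        sc.stop();
    }
}
```

The reduceByKey step is what the log registers as shuffle 0, and the three ResultStage tasks (TID 3-5) match the job's three output partitions.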