...... [INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar.
[WARNING] hadoop-yarn-common-2.6.0.jar, hadoop-yarn-api-2.6.0.jar define 3 overlapping classes:
[WARNING]   - org.apache.hadoop.yarn.factories.package-info
[WARNING]   - org.apache.hadoop.yarn.util.package-info
[WARNING]   - org.apache.hadoop.yarn.factory.providers.package-info
[WARNING] unused-1.0.0.jar, spark-streaming-kafka_2.10-1.6.0.jar define 1 overlapping classes:
[WARNING]   - org.apache.spark.unused.UnusedStubClass
[WARNING] hadoop-yarn-common-2.6.0.jar, hadoop-yarn-client-2.6.0.jar define 2 overlapping classes:
[WARNING]   - org.apache.hadoop.yarn.client.api.impl.package-info
[WARNING]   - org.apache.hadoop.yarn.client.api.package-info
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try to manually exclude artifacts based on
[WARNING] mvn dependency:tree -Ddetail=true and the above output.
[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /usr/local/src/spark-1.6.0/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.6.0.jar with /usr/local/src/spark-1.6.0/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.6.0-shaded.jar
[INFO] Dependency-reduced POM written at: /usr/local/src/spark-1.6.0/external/kafka-assembly/dependency-reduced-pom.xml
[INFO] Dependency-reduced POM written at: /usr/local/src/spark-1.6.0/external/kafka-assembly/dependency-reduced-pom.xml
[INFO]
[INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @ spark-streaming-kafka-assembly_2.10 ---
[INFO] Building jar: /usr/local/src/spark-1.6.0/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.6.0-sources.jar
[INFO]
[INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @ spark-streaming-kafka-assembly_2.10 ---
[INFO] Building jar: /usr/local/src/spark-1.6.0/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.6.0-test-sources.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 29.783 s]
[INFO] Spark Project Test Tags ............................ SUCCESS [  7.380 s]
[INFO] Spark Project Launcher ............................. SUCCESS [  7.855 s]
[INFO] Spark Project Networking ........................... SUCCESS [  8.191 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  4.095 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [  3.700 s]
[INFO] Spark Project Core ................................. SUCCESS [02:05 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 13.732 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 30.345 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 49.660 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [01:19 min]
[INFO] Spark Project SQL .................................. SUCCESS [01:13 min]
[INFO] Spark Project ML Library ........................... SUCCESS [01:27 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 13.390 s]
[INFO] Spark Project Hive ................................. SUCCESS [ 53.630 s]
[INFO] Spark Project Docker Integration Tests ............. SUCCESS [ 10.886 s]
[INFO] Spark Project REPL ................................. SUCCESS [ 32.466 s]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [  9.397 s]
[INFO] Spark Project YARN ................................. SUCCESS [ 25.443 s]
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [ 19.905 s]
[INFO] Spark Project Assembly ............................. SUCCESS [02:55 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [ 12.837 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 16.321 s]
[INFO] Spark Project External Flume ....................... SUCCESS [ 16.459 s]
[INFO] Spark Project External Flume Assembly .............. SUCCESS [  4.650 s]
[INFO] Spark Project External MQTT ........................ SUCCESS [ 24.364 s]
[INFO] Spark Project External MQTT Assembly ............... SUCCESS [  8.288 s]
[INFO] Spark Project External ZeroMQ ...................... SUCCESS [  9.547 s]
[INFO] Spark Project External Kafka ....................... SUCCESS [ 19.217 s]
[INFO] Spark Project Examples ............................. SUCCESS [02:53 min]
[INFO] Spark Project External Kafka Assembly .............. SUCCESS [  9.699 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:18 min
[INFO] Finished at: 2016-01-12T14:45:36+08:00
[INFO] Final Memory: 424M/2042M
[INFO] ------------------------------------------------------------------------
[root@Colonel-Hou spark-1.6.0]#
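The overlapping-class warnings above are harmless here (they cover only package-info stubs and Spark's UnusedStubClass placeholder), so the build can be accepted as-is. If you ever did need to follow the plugin's advice and exclude an artifact manually, a minimal maven-shade-plugin sketch would look like the following; the excluded coordinate is purely illustrative, and the real coordinates should come from running mvn dependency:tree -Ddetail=true as the log suggests:

```xml
<!-- Hypothetical shade-plugin configuration fragment: excludes one
     artifact (example coordinate only) from the shaded/uber jar. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <!-- groupId:artifactId of the duplicate dependency to drop -->
        <exclude>org.apache.hadoop:hadoop-yarn-api</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>
```

Excluding an artifact this way removes all of its classes from the uber jar, so it is only safe when another jar on the list already provides the same classes.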