Preface
The first joint release of Ambari and Bigtop is finally here! It is the first free, open-source big data platform distribution since HDP and CDH went closed-source. Below is a detailed walkthrough of how to build it on CentOS 7.
Component Versions

| Component | Version |
| --- | --- |
| Ambari | 2.8.0 |
| Ambari-metrics | 3.0.0 |
| Hadoop | 3.3.4 |
| HBase | 2.4.13 |
| Hive | 3.1.3 |
| Tez | 0.10.1 |
| ZooKeeper | 3.5.9 |
| Kafka | 2.8.1 |
| Flink | 1.15.3 |
| Spark | 3.2.3 |
| Zeppelin | 0.10.1 |
| Solr | 8.11.2 |
Build Approach
Ambari and Bigtop are not packaged together; they correspond to three separate projects: ambari, ambari-metrics, and bigtop. Each must be built on its own, and the resulting packages are then collected into one directory and turned into a local repository. Also note that Bigtop 3.2.0 has not adapted every component for Ambari; you only need to build the components listed in the table above.
Build Environment
The build environment setup was covered in an earlier article; see "Preparing the Build Environment" for the details. The build needs network access to sites outside mainland China, so you will have to arrange a proxy yourself.
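For concreteness, here is a minimal sketch of pointing wget, git, and Maven at a proxy. The address 127.0.0.1:7890 is only a placeholder, not something defined by this article; substitute whatever proxy you actually have.

```bash
# A minimal sketch, assuming a local HTTP proxy listening on 127.0.0.1:7890
# (placeholder address; replace it with your own proxy endpoint).

# Shell-level proxy for wget/curl and most build scripts
export http_proxy=http://127.0.0.1:7890
export https_proxy=http://127.0.0.1:7890

# git: use the same proxy for https:// clones
git config --global http.proxy http://127.0.0.1:7890

# Maven: declare the proxy in ~/.m2/settings.xml (this overwrites an existing
# file; merge by hand if you already maintain one)
mkdir -p ~/.m2
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <proxies>
    <proxy>
      <id>local-proxy</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>127.0.0.1</host>
      <port>7890</port>
    </proxy>
  </proxies>
</settings>
EOF
```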
Build Procedure
Building Ambari
```bash
# Clone the Ambari source code
git clone https://github.com/apache/ambari.git
# Enter the Ambari root directory
cd ambari
# Switch to the 2.8 branch
git checkout -b branch-2.8 origin/branch-2.8
# Start the build
mvn clean install rpm:rpm -DskipTests
```
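Before moving on, it is worth confirming that the RPMs were actually produced. A quick check from the ambari root directory (the paths match the copy commands used in the repository step later in this article):

```bash
# List the RPMs produced by the Ambari build; you should see at least the
# ambari-server and ambari-agent packages under target/rpm/.../RPMS/x86_64/
find . -path '*/RPMS/*' -name '*.rpm' -print
```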
Building ambari-metrics
```bash
# Clone ambari-metrics
git clone https://github.com/apache/ambari-metrics.git
# Enter the ambari-metrics root directory
cd ambari-metrics
# Switch to the 3.0 branch
git checkout -b branch-3.0 origin/branch-3.0
# Pre-download the 4 tarballs needed by the build (speeds up compilation)
wget http://repo.bigtop.apache.org.s3.amazonaws.com/bigtop-stack-binary/3.2.0/centos-7/x86_64/hbase-2.4.13-bin.tar.gz
wget http://repo.bigtop.apache.org.s3.amazonaws.com/bigtop-stack-binary/3.2.0/centos-7/x86_64/hadoop-3.3.4.tar.gz
wget https://dl.grafana.com/oss/release/grafana-9.3.2.linux-amd64.tar.gz
wget http://repo.bigtop.apache.org.s3.amazonaws.com/bigtop-stack-binary/3.2.0/centos-7/x86_64/phoenix-hbase-2.4-5.1.2-bin.tar.gz
# Edit ambari-metrics/pom.xml and replace the URLs of the 4 tarballs above
# with the local paths you just downloaded them to:
#   file://{download path}/hbase-2.4.13-bin.tar.gz
#   file://{download path}/hadoop-3.3.4.tar.gz
#   file://{download path}/grafana-9.3.2.linux-amd64.tar.gz
#   file://{download path}/phoenix-hbase-2.4-5.1.2-bin.tar.gz
# Build
mvn clean install -DskipTests -Dbuild-rpm
```
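The pom.xml edit in the middle of the block above can also be scripted, to be run before the final mvn command. The sketch below assumes the four tarballs were downloaded to /opt/ambari-metrics-tarballs (an arbitrary example path) and that the download URLs appear literally in pom.xml; if the pom composes them from version properties instead, the sed patterns will not match and the corresponding properties have to be edited by hand.

```bash
# A sketch of automating the URL swap; run it from the ambari-metrics root.
# TARBALL_DIR is an example path; point it at wherever you ran wget above.
TARBALL_DIR=/opt/ambari-metrics-tarballs
sed -i \
  -e "s|http://repo.bigtop.apache.org.s3.amazonaws.com/bigtop-stack-binary/3.2.0/centos-7/x86_64/hbase-2.4.13-bin.tar.gz|file://${TARBALL_DIR}/hbase-2.4.13-bin.tar.gz|g" \
  -e "s|http://repo.bigtop.apache.org.s3.amazonaws.com/bigtop-stack-binary/3.2.0/centos-7/x86_64/hadoop-3.3.4.tar.gz|file://${TARBALL_DIR}/hadoop-3.3.4.tar.gz|g" \
  -e "s|https://dl.grafana.com/oss/release/grafana-9.3.2.linux-amd64.tar.gz|file://${TARBALL_DIR}/grafana-9.3.2.linux-amd64.tar.gz|g" \
  -e "s|http://repo.bigtop.apache.org.s3.amazonaws.com/bigtop-stack-binary/3.2.0/centos-7/x86_64/phoenix-hbase-2.4-5.1.2-bin.tar.gz|file://${TARBALL_DIR}/phoenix-hbase-2.4-5.1.2-bin.tar.gz|g" \
  pom.xml
```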
Building Bigtop
```bash
# Clone the Bigtop source code
git clone https://github.com/apache/bigtop.git
# Enter the Bigtop root directory
cd bigtop
# Switch to the 3.2 branch
git checkout -b branch-3.2 origin/branch-3.2

# Edit bigtop/bigtop.bom; two changes are needed:
# 1. Point the mirrors at Chinese mirrors (lines 103-104)
#      APACHE_MIRROR  = "https://repo.huaweicloud.com/apache"
#      APACHE_ARCHIVE = "https://mirrors.aliyun.com/apache"
# 2. Uncomment the bigtop-select component by deleting lines 273 and 281

# Install the dependencies the components need at build time
# 1. Hadoop dependencies
yum -y install fuse-devel cmake cmake3 lzo-devel openssl-devel protobuf* cyrus-*
cp /usr/bin/cmake3 /usr/bin/cmake
# 2. ZooKeeper dependencies
yum -y install cppunit-devel
# 3. Spark dependencies
yum -y install R* harfbuzz-devel fribidi-devel libcurl-devel libxml2-devel freetype-devel libpng-devel libtiff-devel libjpeg-turbo-devel pandoc* libgit2-devel
Rscript -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'testthat', 'e1071', 'survival'), repos='http://mirrors.tuna.tsinghua.edu.cn/CRAN/')"

# Patch the source of a few components
# 1. Download them first
./gradlew tez-download zeppelin-download flink-download
# 2. Enter the download directory
cd dl
# 3. Unpack the 3 tarballs
tar -zxvf flink-1.15.3.tar.gz
tar -zxvf apache-tez-0.10.1-src.tar.gz
tar -zxvf zeppelin-0.10.1.tar.gz
# 4. Patch Flink
vi flink-1.15.3/flink-runtime-web/pom.xml
#    line 275: change nodeVersion to v12.22.1
#    line 276: change npmVersion to 6.14.12
# 5. Patch Tez
vi apache-tez-0.10.1-src/tez-ui/pom.xml
#    line 37: change allow-root-build to --allow-root=true
# 6. Patch Zeppelin
vi zeppelin-0.10.1/pom.xml
#    line 209: change plugin.gitcommitid.useNativeGit to true
vi zeppelin-0.10.1/spark/pom.xml
#    line 50: change spark.src.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}.tgz
#    line 53: change spark.bin.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
vi zeppelin-0.10.1/rlang/pom.xml
#    line 41: change spark.src.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}.tgz
#    line 44: change spark.bin.download.url to https://repo.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
vi zeppelin-0.10.1/flink/flink-scala-parent/pom.xml
#    line 45: change flink.bin.download.url to https://repo.huaweicloud.com/apache/flink/flink-${flink.version}/flink-${flink.version}-bin-scala_${flink.scala.binary.version}.tgz
# 7. Repack the 3 tarballs
tar -zcvf flink-1.15.3.tar.gz flink-1.15.3
tar -zcvf apache-tez-0.10.1-src.tar.gz apache-tez-0.10.1-src
tar -zcvf zeppelin-0.10.1.tar.gz zeppelin-0.10.1

# Go back to the Bigtop root directory
cd ../
# Build (expect this to take an hour and a half or more)
./gradlew allclean bigtop-groovy-rpm bigtop-jsvc-rpm bigtop-select-rpm bigtop-utils-rpm flink-rpm hadoop-rpm hbase-rpm hive-rpm kafka-rpm solr-rpm spark-rpm tez-rpm zeppelin-rpm zookeeper-rpm -Dbuildwithdeps=true -PparentDir=/usr/bigtop -PpkgSuffix
```
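When the Gradle build finishes, each component's RPMs end up under the output/ directory of the Bigtop source tree (the same directory the repository step below copies from). A quick sanity check from the bigtop root:

```bash
# List the RPMs Bigtop produced and count them; every component you built
# should have its own subdirectory under output/
find output -name '*.rpm' | sort
find output -name '*.rpm' | wc -l
```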
Creating the Local Repository
```bash
# Create the bigdatarepo directory (put it wherever you like)
mkdir -p bigdatarepo
# Copy the Ambari packages
mkdir -p bigdatarepo/ambari
cp ambari/ambari-server/target/rpm/ambari-server/RPMS/x86_64/ambari-server-2.8.0.0-0.x86_64.rpm bigdatarepo/ambari/
cp ambari/ambari-agent/target/rpm/ambari-agent/RPMS/x86_64/ambari-agent-2.8.0.0-0.x86_64.rpm bigdatarepo/ambari/
# Copy the ambari-metrics packages
mkdir -p bigdatarepo/ambari-metrics
cp ambari-metrics/ambari-metrics-assembly/target/rpm/ambari-metrics-collector/RPMS/x86_64/ambari-metrics-collector-3.0.1-1.x86_64.rpm bigdatarepo/ambari-metrics/
cp ambari-metrics/ambari-metrics-assembly/target/rpm/ambari-metrics-grafana/RPMS/x86_64/ambari-metrics-grafana-3.0.1-1.x86_64.rpm bigdatarepo/ambari-metrics/
cp ambari-metrics/ambari-metrics-assembly/target/rpm/ambari-metrics-hadoop-sink/RPMS/x86_64/ambari-metrics-hadoop-sink-3.0.1-1.x86_64.rpm bigdatarepo/ambari-metrics/
cp ambari-metrics/ambari-metrics-assembly/target/rpm/ambari-metrics-monitor/RPMS/x86_64/ambari-metrics-monitor-3.0.1-1.x86_64.rpm bigdatarepo/ambari-metrics/
# Copy the Bigtop packages
cp -r bigtop/output/* bigdatarepo/
# Build the repository metadata
createrepo bigdatarepo/
```
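To consume the repository, each cluster node needs a .repo file pointing at it. A minimal sketch, assuming the bigdatarepo directory was copied to /opt/bigdatarepo on the node (an example path; use an http:// baseurl instead if you serve the directory with nginx or httpd):

```bash
# A client-side repo definition; repo id, name, and baseurl are examples
cat > /etc/yum.repos.d/bigdata.repo <<'EOF'
[bigdata]
name=Local Ambari + Bigtop 3.2.0 repository
baseurl=file:///opt/bigdatarepo
gpgcheck=0
enabled=1
EOF
# Refresh the yum cache so the new repository is picked up
yum clean all
yum makecache
```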
Final Notes
If you follow the steps above, the build should complete successfully, but pay close attention to network access throughout. If building everything yourself feels like too much trouble, you can also download my pre-built packages: they are shared in the group files, and you are welcome to join QQ group 722014912 to discuss.