
Hive Installation Pitfall Guide

Hive is a data warehouse tool built on top of Hadoop, so installing Hive starts with installing Hadoop. This post records how I installed Hadoop and Hive, along with the pitfalls I ran into, in the hope that it saves others some time.

Contents

一、Installation environment

二、Installing Hadoop

1、Download Hadoop

2、Set environment variables

3、Edit the configuration files

三、Installing Hive

1、Download Hive

2、Set environment variables

3、Edit the hive-site.xml configuration

4、Verify the installation

四、Error log

1、Abnormal characters in the configuration file

2、guava version mismatch

一、Installation environment

JDK 1.8
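A quick sanity check that the JDK is available on the PATH (a small extra step, not part of the original write-up):

java -version    # should report version 1.8.x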

二、Installing Hadoop

1、Download Hadoop

Pick a suitable version from http://mirror.bit.edu.cn/apache/hadoop/

Download Hadoop:

wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.3.0/hadoop-3.3.0.tar.gz
           

Extract the archive, and rename the resulting directory with mv to keep the path short:

tar -xzvf hadoop-3.3.0.tar.gz
mv hadoop-3.3.0 hadoop
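The environment variables in the next step assume the directory lives under /opt; if it was unpacked somewhere else, move it there first (the /opt path is an assumption carried over from the HADOOP_HOME value below, not an extra requirement):

# move the renamed directory to the location referenced by HADOOP_HOME
mv hadoop /opt/hadoop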
           

2、Set environment variables

将hadoop環境資訊寫入環境變量中

vim /etc/profile

export HADOOP_HOME=/opt/hadoop
export PATH=$HADOOP_HOME/bin:$PATH
           

Run source /etc/profile to apply the changes.
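To confirm the variables took effect (a quick check, assuming the paths above):

echo $HADOOP_HOME    # should print /opt/hadoop
hadoop version       # should report Hadoop 3.3.0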

3、Edit the configuration files

Edit hadoop-env.sh (vim etc/hadoop/hadoop-env.sh) and set JAVA_HOME:

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.262.b10-0.el7_8.x86_64
           

Verify that Hadoop is installed correctly by running one of the examples that ships with it: hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.0.jar grep input output 'dfs[a-z]' (a fuller sketch, including preparing the input directory, follows below).
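The grep example needs an input directory to read from, and the output directory must not already exist. A minimal sketch of the whole run, assuming Hadoop lives in /opt/hadoop as set up above:

cd /opt/hadoop
mkdir input
cp etc/hadoop/*.xml input    # sample input files
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.0.jar grep input output 'dfs[a-z]'
cat output/*                 # inspect the result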

三、Installing Hive

1、Download Hive

wget http://mirror.bit.edu.cn/apache/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz

Extract it: tar -zxvf apache-hive-3.1.2-bin.tar.gz

Rename the directory: mv apache-hive-3.1.2-bin hive
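HIVE_HOME in the next step points at /opt/hive, so if the archive was unpacked elsewhere, move the renamed directory there as well (again an assumption carried over from the environment variables, not stated explicitly in the original):

mv hive /opt/hive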

2、Set environment variables

vim /etc/profile

export HIVE_HOME=/opt/hive
export PATH=$MAVEN_HOME/bin:$HIVE_HOME/bin:$HADOOP_HOME/bin:$PATH

(The $MAVEN_HOME/bin entry is only relevant if Maven is also configured on the machine; drop it otherwise.)
           

Run source /etc/profile again to apply the changes.

3、Edit the hive-site.xml configuration
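The write-up does not show where hive-site.xml comes from; a common approach, assumed here, is to copy the bundled template into place and then edit that copy:

cd /opt/hive/conf
cp hive-default.xml.template hive-site.xml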

<!-- WARNING!!! This file is auto generated for documentation purposes ONLY! -->
<!-- WARNING!!! Any changes you make to this file will be ignored by Hive.   -->
<!-- WARNING!!! You must make your changes in hive-site.xml instead.         -->
<!-- Hive Execution Parameters -->

<!-- The properties below already exist in the default configuration; search for each one and edit it in place, or delete it and re-add it here in one block. -->
<property>
    <!-- metastore database user name -->
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
</property>
<property>
    <!-- metastore database password -->
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
</property>
<property>
    <!-- MySQL connection URL -->
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://127.0.0.1:3306/hive</value>
</property>
<property>
    <!-- MySQL JDBC driver class -->
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>
<property>
    <name>hive.exec.script.wrapper</name>
    <value/>
    <description/>
</property>

Copy the MySQL JDBC driver jar into hive/lib (see the sketch below), then change into hive/bin and initialize the metastore schema:
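A sketch of that preparation, assuming the MySQL Connector/J jar has already been downloaded (the jar file name below is a placeholder) and that the hive database referenced in the JDBC URL does not exist yet:

# create the metastore database referenced by javax.jdo.option.ConnectionURL
mysql -uroot -p -e "CREATE DATABASE IF NOT EXISTS hive DEFAULT CHARACTER SET utf8;"
# put the JDBC driver on Hive's classpath (jar name is an example)
cp mysql-connector-java-5.1.49.jar /opt/hive/lib/
cd /opt/hive/bin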

schematool -dbType mysql -initSchema
           

4、Verify the installation

hive --version shows the installed version.

Run hive; if it drops you into the Hive CLI prompt, the installation succeeded.
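As an extra check (not in the original write-up), a one-off query confirms that the CLI can reach the metastore:

hive -e "show databases;"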


四、Error log

1、Abnormal characters in the configuration file

Starting the Hive CLI failed with the following error:

Logging initialized using configuration in jar:file:/opt/hive/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
    at org.apache.hadoop.fs.Path.initialize(Path.java:263)
    at org.apache.hadoop.fs.Path.<init>(Path.java:221)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:710)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:627)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:591)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:747)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
    at java.net.URI.checkPath(URI.java:1823)
    at java.net.URI.<init>(URI.java:745)
    at org.apache.hadoop.fs.Path.initialize(Path.java:260)
    ... 12 more

Solution:

Locate the following properties in hive-site.xml and replace their default ${system:java.io.tmpdir}/${system:user.name} style values with fixed local paths, as shown:

 <property>
    <name>hive.exec.scratchdir</name>
    <value>/tmp/hive</value>
    <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
  </property>

  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/tmp/hive/local</value>
    <description>Local scratch space for Hive jobs</description>
  </property>

  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/tmp/hive/resources</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
  </property>
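Depending on the environment, the directories referenced above may also need to exist and be writable; a sketch under that assumption (the hdfs commands only matter once HDFS is actually in use):

# local scratch and resource directories from the config above
mkdir -p /tmp/hive/local /tmp/hive/resources
chmod -R 777 /tmp/hive
# scratch directory on HDFS used by Hive jobs
hdfs dfs -mkdir -p /tmp/hive
hdfs dfs -chmod 733 /tmp/hive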
           

2、guava version mismatch

Running schematool to initialize the schema failed with the following error:
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8)
 at [row,col,system-id]: [3215,96,"file:/opt/hive/conf/hive-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3051)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:3000)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2875)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1484)
    at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:4996)
    at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:5069)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5156)
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5104)
    at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
    at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8)
 at [row,col,system-id]: [3215,96,"file:/opt/hive/conf/hive-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.reportIllegalChar(StreamScanner.java:2456)
    at com.ctc.wstx.sr.StreamScanner.validateChar(StreamScanner.java:2403)
    at com.ctc.wstx.sr.StreamScanner.resolveCharEnt(StreamScanner.java:2369)
    at com.ctc.wstx.sr.StreamScanner.fullyResolveEntity(StreamScanner.java:1515)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2828)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3347)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3141)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3034)
    ... 15 more

Solution:

(Strictly speaking, the WstxParsingException above complains about an illegal character near line 3215 of hive-site.xml; when hive-site.xml is copied from the bundled template, deleting that stray &#8; entity in the offending description clears the parse error. The guava mismatch itself typically surfaces as a NoSuchMethodError on com.google.common.base.Preconditions.checkArgument.)

1、The class com.google.common.base.Preconditions.checkArgument lives in guava.jar.

2、Hadoop 3.x (hadoop/share/hadoop/common/lib) bundles guava-27.0-jre.jar, while Hive 3.1.2 (hive/lib) ships a much older guava (19.0).

3、将jar包變成一緻的版本:删除hive中低版本jar包,将hadoop中高版本的複制到hive的lib中。

Start Hive again and the problem is gone.
