
Hive: JDBC Example

1) The file test4.txt in the local directory /home/hadoop/test (fields on each line are separated by a tab) contains:

[hadoop@master test]$ sudo vim test4.txt

    1	dajiangtai
    2	hadoop
    3	hive
    4	hbase
    5	spark
           
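The table created later uses the schema (key int, value string) with tab as the field delimiter, so each line of test4.txt needs an integer key, a tab, then the value. As a sketch (the MakeTestFile class and its writeTabFile helper are illustrative, not part of the original post), such a file could be generated like this:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class MakeTestFile {
    // Write one "key<TAB>value" line per entry, with keys numbered from 1,
    // matching the (key int, value string) tab-delimited schema used below.
    static void writeTabFile(Path path, List<String> values) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < values.size(); i++) {
            sb.append(i + 1).append('\t').append(values.get(i)).append('\n');
        }
        Files.write(path, sb.toString().getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws IOException {
        // Writing to a temp file here; on the cluster this would be
        // /home/hadoop/test/test4.txt as in the example above.
        Path p = Files.createTempFile("test4", ".txt");
        writeTabFile(p, List.of("dajiangtai", "hadoop", "hive", "hbase", "spark"));
        System.out.print(Files.readString(p));
    }
}
```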

2) Start hiveserver2

[[email protected] test]$ cd ${HIVE_HOME}/bin
[[email protected] bin]$ ll
total 
-rwxr-xr-x  hadoop hadoop  Jan   beeline
drwxr-xr-x  hadoop hadoop  May  : ext
-rwxr-xr-x  hadoop hadoop  Jan   hive
-rwxr-xr-x  hadoop hadoop  Jan   hive-config.sh
-rwxr-xr-x  hadoop hadoop  Jan   hiveserver2
-rwxr-xr-x  hadoop hadoop  Jan   metatool
-rwxr-xr-x  hadoop hadoop  Jan   schematool

[[email protected] bin]$ ./hiveserver2 
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/modules/hadoop-/share/hadoop/common/lib/slf4j-log4j12-.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/modules/hive1/lib/hive-jdbc--standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
           
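Before running the JDBC client, it is worth checking that HiveServer2 is actually listening on its port (10000, per the JDBC URL used below). A minimal sketch using only the JDK (the PortCheck class and isListening helper are ours, not part of Hive):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Return true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isListening(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "master" and 10000 come from the JDBC URL used in this example.
        System.out.println(isListening("master", 10000, 2000)
                ? "HiveServer2 port is open"
                : "HiveServer2 port is not reachable");
    }
}
```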

3) Program code

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
public class Hive {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver"; // Hive JDBC driver class
    private static String url = "jdbc:hive2://master:10000/default"; // connection URL for the HiveServer2 service (new in Hive 0.11.0+)
    private static String user = "hadoop"; // a user with the required HDFS permissions
    private static String password = ""; // in non-secure mode the query runs as the given user and the password is ignored
    private static String sql = "";
    private static ResultSet res;
    public static void main(String[] args) {
        try {
            Class.forName(driverName); // load the HiveServer2 driver
            Connection conn = DriverManager.getConnection(url, user, password); // connect to the database given by the URL
            Statement stmt = conn.createStatement();

            // name of the table to create
            String tableName = "testHiveDriverTable";

            /** Step 1: drop the table if it already exists **/
            sql = "drop table if exists " + tableName;
            stmt.execute(sql);

            /** Step 2: create the table **/
            sql = "create table " + tableName + " (key int, value string) row format delimited fields terminated by '\t' STORED AS TEXTFILE";
            stmt.execute(sql);

            // run "show tables"
            sql = "show tables '" + tableName + "'";
            res = stmt.executeQuery(sql);
            if (res.next()) {
                System.out.println(res.getString(1));
            }

            // run "describe table"
            sql = "describe " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1) + "\t" + res.getString(2));
            }

            // run "load data into table"
            String filepath = "/home/hadoop/test/test4.txt"; // local file path on the node where the Hive service runs
            sql = "load data local inpath '" + filepath + "' into table " + tableName;
            stmt.execute(sql);

            // run "select * query"
            sql = "select * from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2));
            }

            // run a regular Hive query; this one is compiled into a MapReduce job
            sql = "select count(*) from " + tableName;
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1));
            }
            conn.close();
            conn = null;
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        } catch (SQLException e) {
            e.printStackTrace();
            System.exit(1);
        }
    }
}
           
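The program above assembles its HiveQL by string concatenation inside main, which cannot be checked without a live cluster. One option (our own refactoring, not from the original post) is to pull the statement construction into a small helper class so it can be verified offline:

```java
public class HiveSql {
    // Build the CREATE TABLE statement used in this example:
    // a two-column, tab-delimited text table.
    static String createTable(String tableName) {
        return "create table " + tableName
                + " (key int, value string)"
                + " row format delimited fields terminated by '\t'"
                + " stored as textfile";
    }

    // Build the LOAD DATA statement for a local file on the Hive node.
    static String loadLocal(String filepath, String tableName) {
        return "load data local inpath '" + filepath + "' into table " + tableName;
    }

    public static void main(String[] args) {
        System.out.println(createTable("testHiveDriverTable"));
        System.out.println(loadLocal("/home/hadoop/test/test4.txt", "testHiveDriverTable"));
    }
}
```

The same `stmt.execute(...)` calls as above would then take `HiveSql.createTable(tableName)` and `HiveSql.loadLocal(filepath, tableName)` as arguments.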

4) Run result (right-click -> Run as -> Run on Hadoop)

Running it directly at this point fails; for the fix, see the next post: HiveSQLException: Error while compiling statement: No privilege 'Create' found for outputs{database}.

The run log is as follows:

INFO  [org.apache.hive.jdbc.Utils] - Supplied authorities: master:10000
INFO  [org.apache.hive.jdbc.Utils] - Resolved authority: master:10000
INFO  [org.apache.hive.jdbc.HiveConnection] - Will try to open client transport with JDBC Uri: jdbc:hive2://master:10000/default
DEBUG [org.apache.thrift.transport.TSaslTransport] - opening transport org.apache.thrift.transport.TSaslClientTransport@3834d63f
DEBUG [org.apache.thrift.transport.TSaslClientTransport] - Sending mechanism name PLAIN
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status START
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Writing message with status COMPLETE
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Start message handled
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Main negotiation loop complete
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: SASL Client receiving last message
DEBUG [org.apache.thrift.transport.TSaslTransport] - CLIENT: Received message with status COMPLETE
(repeated TSaslTransport "writing data length" / "reading data length" lines omitted)
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**testhivedrivertable**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**key    int**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**value    string**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**1    dajiangtai**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**2    hadoop**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**3    hive**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**4    hbase**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**5    spark**
DEBUG [org.apache.hive.jdbc.HiveQueryResultSet] - Fetched row string: 
**5**
           

Output of "show tables":

testhivedrivertable
           

Output of "describe table":

key    int
value    string
           

Output of "select * query":

1    dajiangtai
2    hadoop
3    hive
4    hbase
5    spark
           

Alternatively, check the results from the Hive CLI on the cluster:

hive> show tables;
OK
copy_student1
copy_student2
copy_student3
copy_student4
employee
group_gender_agg
group_gender_sum
group_test
index_test
index_tmp
partition_test
student1
student2
test
test_view
testhivedrivertable
user
Time taken:  seconds, Fetched:  row(s)
hive> desc testhivedrivertable;
OK
key                     int                                         
value                   string                                      
Time taken:  seconds, Fetched:  row(s)
hive> select * from testhivedrivertable;
OK
1	dajiangtai
2	hadoop
3	hive
4	hbase
5	spark
Time taken:  seconds, Fetched:  row(s)
           

That is the main content of this section. It reflects my own learning process, and I hope it offers some guidance. If it helped, please leave a like; if it didn't, please bear with me, and do point out any mistakes. Follow me if you would like to get updates as soon as they are posted. Thanks!
