
Kylin pitfalls (3) - java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/hive?

Exception description:

When building a Kylin cube, the build fails at "#2 Step Name: Extract Fact Table Distinct Columns" with a java.sql.SQLException, shown below:

java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&useSSL=false

The full error is as follows:

Caused by: MetaException(message:Unable to open a test connection to the given database. JDBC url = jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&useSSL=false, username = hadoop. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&useSSL=false
	at java.sql.DriverManager.getConnection(DriverManager.java:689)
	at java.sql.DriverManager.getConnection(DriverManager.java:208)
	at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
	at sun.reflect.GeneratedConstructorAccessor94.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	
result code:2
           

I dug through a lot of material on this. Most of it blames either a mismatched JDBC driver version or a hive-site.xml in which the <value></value> entry for the metastore URL is not written on a single line. I tried each of these fixes, and the error persisted. Stepping back to the root of the problem: Hive itself worked fine, so neither the configuration file nor the JDBC driver version could be the cause. The real issue was that Kylin could not connect to Hive's metastore database while extracting the fact table, which pointed to the Kylin side of the setup.
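Before blaming Kylin, it is still worth ruling out those usual suspects quickly. A minimal sketch of the sanity checks, assuming Hive is installed under /usr/hive and the driver jar follows the common mysql-connector-java-*.jar naming (adjust both paths to your installation):

# Is the MySQL JDBC driver jar present in Hive's lib directory?
ls /usr/hive/lib | grep -i mysql
# Is the metastore JDBC URL written on a single line in hive-site.xml?
grep -A 1 'javax.jdo.option.ConnectionURL' /usr/hive/conf/hive-site.xml
# Can Hive itself reach its metastore?
hive -e "show databases;"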

Solution:

The cause is that once Kylin hands the job to MapReduce, the Hadoop cluster distributes tasks to the slave nodes, and those tasks need Hive's dependencies. Hive is normally installed on only one node, so the slave nodes cannot find the Hive dependencies and the error above is thrown. The fix is to copy the Hive directory from the master node to the two slave nodes and to add the corresponding environment variables to /etc/profile on each of them (see the commands and the /etc/profile sketch below).

# Run on the master node to copy the Hive directory to the slave nodes
# (the slave host names below are placeholders; substitute your own)
scp -r /usr/hive hadoop@slave1:/usr
scp -r /usr/hive hadoop@slave2:/usr
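The /etc/profile changes mentioned above would typically look like the following sketch, again assuming Hive was copied to /usr/hive (the variable values are assumptions; match them to the master node's settings):

# Append to /etc/profile on each slave node
export HIVE_HOME=/usr/hive
export PATH=$PATH:$HIVE_HOME/bin
# Reload the environment on that node
source /etc/profile

With the Hive directory and environment variables in place on every node, re-running the cube build should get past the "Extract Fact Table Distinct Columns" step without the driver error.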