
[sqoop] Column length too big for column 'TYPE_NAME' (max = 21845); use BLOB or TEXT instead

Running `sqoop create-hive-table` to copy a relational (MySQL) table's structure into Hive failed with an exception.

Error:

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'TYPE_NAME' (max = 21845); use BLOB or TEXT instead

ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Hive exited with status 1


Command and output:

[[email protected] sqoop-1.4.6.bin__hadoop-2.0.4-alpha]# bin/sqoop create-hive-table \
> --connect jdbc:mysql://node01:3306/userdb \
> --table emp \
> --username root \
> --password 123456 \
> --hive-table test.emp_add_sp
Warning: /export/servers/sqoop-1.4.6.bin__hadoop-2.0.4-alpha//../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /export/servers/sqoop-1.4.6.bin__hadoop-2.0.4-alpha//../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /export/servers/sqoop-1.4.6.bin__hadoop-2.0.4-alpha//../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /export/servers/sqoop-1.4.6.bin__hadoop-2.0.4-alpha//../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
19/12/03 07:26:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
19/12/03 07:26:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/12/03 07:26:51 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/12/03 07:26:51 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/12/03 07:26:51 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/12/03 07:26:52 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
19/12/03 07:26:52 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
19/12/03 07:26:52 INFO hive.HiveImport: Loading uploaded data into Hive
19/12/03 07:26:53 INFO hive.HiveImport: which: no hbase in (/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/export/servers/hadoop-2.6.0-cdh5.14.0/bin:/export/servers/hadoop-2.6.0-cdh5.14.0/sbin:/export/servers/hive-1.1.0-cdh5.14.0/bin:/export/servers/jdk1.8.0_144/bin:/export/servers/sqoop-1.4.6.bin__hadoop-2.0.4-alpha//bin:/root/bin)
19/12/03 07:26:55 INFO hive.HiveImport: 
19/12/03 07:26:55 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/export/servers/hive-1.1.0-cdh5.14.0/lib/hive-common-1.1.0-cdh5.14.0.jar!/hive-log4j.properties
19/12/03 07:27:21 INFO hive.HiveImport: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:An exception was thrown while adding/validating class(es) : Column length too big for column 'TYPE_NAME' (max = 21845); use BLOB or TEXT instead
19/12/03 07:27:21 INFO hive.HiveImport: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'TYPE_NAME' (max = 21845); use BLOB or TEXT instead
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.GeneratedConstructorAccessor32.newInstance(Unknown Source)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
19/12/03 07:27:21 INFO hive.HiveImport:         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.Util.getInstance(Util.java:387)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:939)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3878)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3814)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2478)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2625)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2547)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2505)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.StatementImpl.executeInternal(StatementImpl.java:840)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:740)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:711)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.table.AbstractTable.create(AbstractTable.java:425)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:488)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3380)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:902)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
19/12/03 07:27:21 INFO hive.HiveImport:         at java.lang.reflect.Method.invoke(Method.java:498)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.sun.proxy.$Proxy19.createTable(Unknown Source)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1515)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1557)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
19/12/03 07:27:21 INFO hive.HiveImport:         at java.lang.reflect.Method.invoke(Method.java:498)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.sun.proxy.$Proxy20.create_table_with_environment_context(Unknown Source)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2202)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:740)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:728)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
19/12/03 07:27:21 INFO hive.HiveImport:         at java.lang.reflect.Method.invoke(Method.java:498)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
19/12/03 07:27:21 INFO hive.HiveImport:         at com.sun.proxy.$Proxy21.createTable(Unknown Source)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:784)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4177)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:311)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:99)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2052)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1748)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1501)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1285)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1275)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:226)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:175)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:389)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:324)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:422)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:438)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:732)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:634)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
19/12/03 07:27:21 INFO hive.HiveImport:         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
19/12/03 07:27:21 INFO hive.HiveImport:         at java.lang.reflect.Method.invoke(Method.java:498)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
19/12/03 07:27:21 INFO hive.HiveImport:         at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
19/12/03 07:27:21 INFO hive.HiveImport: )
19/12/03 07:27:21 ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Hive exited with status 1
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:389)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:339)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
        at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

[[email protected] sqoop-1.4.6.bin__hadoop-2.0.4-alpha]#
           

Cause in my case:

The MySQL instance backing the Hive metastore (installed on my VM node) uses the utf8 character set, and that is what breaks the DDL. MySQL's utf8 stores up to 3 bytes per character, so under the 65535-byte row/column size limit a VARCHAR column can hold at most 65535 / 3 = 21845 characters; the column definition DataNucleus generates for 'TYPE_NAME' exceeds that, hence the error message's "max = 21845".

-- Check the database character set
mysql> show variables like '%char%';
+--------------------------+----------------------------+
| Variable_name            | Value                      |
+--------------------------+----------------------------+
| character_set_client     | utf8                       |
| character_set_connection | utf8                       |
| character_set_database   | utf8                       |
| character_set_filesystem | binary                     |
| character_set_results    | utf8                       |
| character_set_server     | utf8                       |
| character_set_system     | utf8                       |
| character_sets_dir       | /usr/share/mysql/charsets/ |
+--------------------------+----------------------------+
8 rows in set (0.00 sec)

mysql>
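The variables above are server-wide defaults; what actually matters for this error is the character set of the metastore database itself. A quick check, assuming the Hive metastore lives in a MySQL database named `hive` (that name is an assumption; use whatever database appears in `javax.jdo.option.ConnectionURL` in your hive-site.xml):

```sql
-- Show the default character set of the metastore database.
-- 'hive' is an assumed database name; substitute the one from
-- javax.jdo.option.ConnectionURL in hive-site.xml.
SHOW CREATE DATABASE hive;

-- Or query information_schema directly:
SELECT default_character_set_name
FROM information_schema.SCHEMATA
WHERE schema_name = 'hive';
```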
           

Solution:

Change the character set of the MySQL database that backs the Hive metastore from utf8 to latin1, then re-run the Sqoop command. (The original post showed this step only as a screenshot.)
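This is the commonly reported fix for this exact error: with latin1 (1 byte per character) the same 65535-byte limit allows VARCHAR columns up to 65535 characters instead of 21845, so the metastore DDL succeeds. A minimal sketch, again assuming the metastore database is named `hive`:

```sql
-- Assumption: the Hive metastore is the MySQL database named 'hive'.
-- Switch its default character set from utf8 to latin1.
ALTER DATABASE hive CHARACTER SET latin1;
```

Note that ALTER DATABASE only changes the default for tables created afterwards. If DataNucleus already created some metastore tables under utf8, it may be simpler to drop and recreate the (empty) metastore database with latin1 before re-running the Sqoop job.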

I'm not sure whether every occurrence of this exception can be resolved this way, but I hope it helps.
