
Hive query failure: no LazyObject for VOID

An ETL job in production stopped running and threw an exception. To illustrate the problem, the table structure is simplified here:

hive> desc void_t;
OK
x                       int                     None
z                       void                    None

select * from void_t

This does indeed throw an exception:

14/03/02 01:28:58 ERROR CliDriver: Failed with exception java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating x

This exception is puzzling at first: what does it have to do with column x? Let's look at the detailed log:


java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating x
        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:150)
        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1412)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:271)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:756)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating x
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:80)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:502)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:832)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:90)
        at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:490)
        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:136)
        ... 11 more
Caused by: java.lang.RuntimeException: Internal error: no LazyObject for VOID
        at org.apache.hadoop.hive.serde2.lazy.LazyFactory.createLazyPrimitiveClass(LazyFactory.java:119)
        at org.apache.hadoop.hive.serde2.lazy.LazyFactory.createLazyObject(LazyFactory.java:155)
        at org.apache.hadoop.hive.serde2.lazy.LazyStruct.parse(LazyStruct.java:108)
        at org.apache.hadoop.hive.serde2.lazy.LazyStruct.getField(LazyStruct.java:190)
        at org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector.getStructFieldData(LazySimpleStructObjectInspector.java:188)
        at org.apache.hadoop.hive.serde2.objectinspector.DelegatedStructObjectInspector.getStructFieldData(DelegatedStructObjectInspector.java:79)
        at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator.evaluate(ExprNodeColumnEvaluator.java:98)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:76)

Seeing "no LazyObject for VOID" finally makes it clear where the problem is: column z. Looking through the queries in the ETL job, one of the table-creation statements was written as create table xxx as select null as z from xxx. A bare NULL like this produces a column of type VOID, and Hive itself cannot handle that type when reading the table back. There is indeed an unresolved bug for this in JIRA: HIVE-2615.
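As an illustration, here is a minimal HiveQL sketch of how such a table can come into existence; source_table and the column list are made up for this example. The untyped NULL literal gives the new column the VOID type in the CTAS result schema:

-- Hypothetical reproduction: source_table stands in for the real ETL input.
create table void_t as
select
    x,             -- an ordinary int column copied from the source
    null as z      -- untyped NULL: the resulting column type is void
from source_table;

-- desc void_t then shows z as void, and select * from void_t fails at read
-- time with "no LazyObject for VOID" when the row is deserialized.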

The workaround is fairly simple: 1. Create the table first, then populate it with insert ... select. 2. In the CTAS statement, write cast(null as &lt;type&gt;) z to force a concrete type.
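A minimal sketch of both workarounds; the table names and the choice of string as the concrete type for z are arbitrary stand-ins, not from the original job:

-- Workaround 1: create the table with explicit column types first,
-- then populate it with insert ... select.
create table void_fixed (x int, z string);
insert overwrite table void_fixed
select x, null from source_table;

-- Workaround 2: keep the CTAS but cast the NULL so the column gets a real type.
create table void_fixed2 as
select x, cast(null as string) as z
from source_table;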

This article was reposted from MIKE老畢's 51CTO blog. Original link: http://blog.51cto.com/boylook/1365747. Please contact the original author before reposting.