1. coalesce error
FAILED: SemanticException [Error 10014]: Line 197:4 Wrong arguments ''10'': Unsafe compares between different types are disabled for safety reasons. If you know what you are doing, please set hive.strict.checks.type.safety to false and make sure that hive.mapred.mode is not set to 'strict' to proceed. Note that you may get errors or incorrect results if you make a mistake while using some of the unsafe features.
Fix: set hive.strict.checks.type.safety=false (the default is true).
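If the cross-type comparison is intentional, the settings named in the error message can be applied for the current session before rerunning the query. A minimal sketch (the table and column names here are made up for illustration):

```sql
-- Allow comparisons between different types (default is true)
SET hive.strict.checks.type.safety=false;
-- hive.mapred.mode must not be 'strict' for this to take effect
SET hive.mapred.mode=nonstrict;

-- Hypothetical query that triggered the error: a string column
-- compared against numeric literals
SELECT * FROM t WHERE str_col BETWEEN 10 AND 20;
```

A cleaner alternative is to make the types explicit, e.g. `WHERE CAST(str_col AS INT) BETWEEN 10 AND 20`, which avoids the unsafe compare without loosening the safety checks.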
2. MR job failure
Task with the most failures(4):
-----
Task ID:
task_1566481621886_15450555_m_000004
URL:
http://TXIDC65-bigdata-resourcemanager1:8088/taskdetails.jsp?jobid=job_1566481621886_15450555&tipid=task_1566481621886_15450555_m_000004
-----
Diagnostic Messages for this Task:
Task KILL is received. Killing attempt!
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 65 Reduce: 11 Cumulative CPU: 804.45 sec HDFS Read: 1354997759 HDFS Write: 10808642 SUCCESS
Stage-Stage-6: Map: 6 Reduce: 3 Cumulative CPU: 64.73 sec HDFS Read: 212028947 HDFS Write: 19101152 SUCCESS
Stage-Stage-7: Map: 5 FAIL
Total MapReduce CPU Time Spent: 14 minutes 29 seconds 179 msec
Checking the MR logs shows:
java.lang.ClassCastException: org.apache.hadoop.hive.ql.exec.vector.LongColumnVector cannot be cast to org.apache.hadoop.hive.ql.exec.vector.DecimalColumnVector
Fix: disabling vectorized execution works around the vector cast error: hive.vectorized.execution.enabled=false.
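The workaround can be applied per session before rerunning the failed query, as a sketch:

```sql
-- Disable vectorized execution to work around the
-- LongColumnVector -> DecimalColumnVector ClassCastException
SET hive.vectorized.execution.enabled=false;
```

This trades some scan performance for correctness; re-enable vectorization once the underlying type mismatch (here a decimal column being read as long) is resolved.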
3. Reserved keyword conflict
FAILED: ParseException line 7:1 cannot recognize input near 'application' 'string' 'comment' in column name or constraint
Fix: the column name application conflicts with a Hive reserved keyword, so it must be wrapped in backticks as `application`. Other commonly affected names include date, user, etc.
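A sketch of the backtick quoting (the table and the other column definitions are made up for illustration):

```sql
-- Reserved words used as identifiers must be backtick-quoted
CREATE TABLE demo (
  `application` STRING COMMENT 'app name',
  `date`        STRING,
  `user`        STRING
);

-- The same quoting is needed when the columns are referenced
SELECT `application`, `user` FROM demo WHERE `date` = '2020-07-01';
```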
4. from_unixtime default timezone change
A huge pitfall! Fortunately, a teammate on the data warehouse side caught this before release.
Since Hive 3.1.0, all timestamps are handled in UTC, so the results of time UDFs such as from_unixtime come out 8 hours earlier (for a UTC+8 deployment).
Code change:
Fix: roll back the relevant code changes from HIVE-12192.
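As a sketch of the symptom (the outputs below assume a server in UTC+8, matching the 8-hour gap described above):

```sql
-- Before Hive 3.1.0, from_unixtime formatted the epoch in the
-- server's local timezone; afterwards it uses UTC.
SELECT from_unixtime(0);
-- Hive < 3.1.0 on a UTC+8 server: 1970-01-01 08:00:00
-- Hive >= 3.1.0:                  1970-01-01 00:00:00
```

Any downstream table populated with such UDFs across the upgrade boundary should be checked for the 8-hour shift.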
5. Additions to mapred-site.xml
————————————————
Copyright notice: this is an original article by CSDN blogger "Deegue", licensed under CC 4.0 BY-SA. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/zyzzxycj/article/details/107607873