Hive: Abnormal Characters in Partition Fields

1. Problem Description

Hive supports dynamic partitioning: rows returned by a SELECT are inserted into target partitions chosen dynamically from the value of the partition field. For example:

set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.dynamic.partition=true;

INSERT OVERWRITE TABLE target_table_name PARTITION(partition_field_name)
SELECT
  ...
  partition_field_name
FROM source_table_name
WHERE ...

Note that dynamic partitioning requires both of the following properties: hive.exec.dynamic.partition enables the feature, and setting hive.exec.dynamic.partition.mode to nonstrict allows every partition column to be determined dynamically (in strict mode at least one partition column must be given a static value):

set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.dynamic.partition=true;
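
You can check that both properties are in effect for the current session; in the Hive CLI, running SET with only the property name prints its current value:

set hive.exec.dynamic.partition;
set hive.exec.dynamic.partition.mode;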

If the partition_field_name values returned by the SELECT contain abnormal characters, the job fails with the error below:

NestedThrowablesStackTrace:
java.sql.SQLException: Illegal mix of collations (latin1_bin,IMPLICIT) and (utf8_general_ci,COERCIBLE) for operation '='
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1072)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3563)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3495)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1959)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2113)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2693)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2102)
    at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2261)
    at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174)
    at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
    at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:651)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:290)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMPartition(ObjectStore.java:1369)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionWithAuth(ObjectStore.java:1613)
    at sun.reflect.GeneratedMethodAccessor30.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
    at com.sun.proxy.$Proxy8.getPartitionWithAuth(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_partition_with_auth(HiveMetaStore.java:2544)
    at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
    at com.sun.proxy.$Proxy9.get_partition_with_auth(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getPartitionWithAuthInfo(HiveMetaStoreClient.java:979)
    at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy10.getPartitionWithAuthInfo(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:1608)
    at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:1562)
    at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1215)
    at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:1438)
    at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:346)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1519)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1279)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1091)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:914)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:904)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:748)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar._main(RunJar.java:227)
    at org.apache.hadoop.util.RunJar.access$000(RunJar.java:49)
    at org.apache.hadoop.util.RunJar$1.run(RunJar.java:126)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.security.SecurityUtil.doAsConfigUser(SecurityUtil.java:646)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:123)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
)
Fail in Action load

The key line in the error output above is:

java.sql.SQLException: Illegal mix of collations (latin1_bin,IMPLICIT) and (utf8_general_ci,COERCIBLE) for operation '='

The metastore lookup fails because MySQL cannot compare its latin1-collated partition-name column with a UTF-8 value containing such characters. Inspecting the partition data shows values like the following:

[Screenshot: partition field values containing abnormal characters]
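
To confirm which values are responsible before reloading, you can query the source table directly; a minimal sketch, reusing the placeholder table and column names from above:

-- list the distinct partition values that contain anything other than
-- letters, digits or underscores; these are the values that break the
-- metastore lookup
SELECT DISTINCT partition_field_name
FROM source_table_name
WHERE NOT (partition_field_name RLIKE '^\\w*$');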

2. Solution

The fix is simply to filter out the abnormal values with a regular expression, for example:

WHERE partition_field_name RLIKE '^\\w*$'
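
Applied to the dynamic-partition insert from above, the complete statement looks like this (a sketch; col1 and col2 stand in for the real non-partition columns). The pattern '^\\w*$' only matches values made up entirely of word characters (letters, digits and underscores), so any row whose partition value contains garbled characters is dropped:

set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.dynamic.partition=true;

INSERT OVERWRITE TABLE target_table_name PARTITION(partition_field_name)
SELECT
  col1,                  -- non-partition columns (placeholders)
  col2,
  partition_field_name   -- the dynamic partition column must be listed last
FROM source_table_name
WHERE partition_field_name RLIKE '^\\w*$';  -- keep only clean partition values

Keep in mind that this pattern also drops legitimate values containing characters outside \w (hyphens, dots, non-ASCII text, and so on), so adjust the expression to whatever your partition values are supposed to look like.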


----EOF-----

Categories: hive Tags: hive join