Hive error when dropping a table: Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException


1. The operations that triggered the error:

[jifeng@jifeng02 hive-0.12.0-bin]$ hive

Logging initialized using configuration in jar:file:/home/jifeng/hadoop/hive-0.12.0-bin/lib/hive-common-0.12.0.jar!/hive-log4j.properties
hive> show tables;
OK
t1
tianq
tianqi
Time taken: 3.338 seconds, Fetched: 3 row(s)
hive> drop table t1;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDODataStoreException: Iteration request failed : SELECT `A0`.`COMMENT`,`A0`.`COLUMN_NAME`,`A0`.`TYPE_NAME`,`A0`.`INTEGER_IDX` AS NUCORDER0 FROM `COLUMNS_V2` `A0` WHERE `A0`.`CD_ID` = ? AND `A0`.`INTEGER_IDX` >= 0 ORDER BY NUCORDER0
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoRetrieve(JDOPersistenceManager.java:610)
        at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:622)
        at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:631)
        at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2293)
        at org.apache.hadoop.hive.metastore.ObjectStore.preDropStorageDescriptor(ObjectStore.java:2321)
        at org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:742)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
        at com.sun.proxy.$Proxy10.dropTable(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1192)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1328)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
        at com.sun.proxy.$Proxy11.drop_table_with_environment_context(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:671)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:647)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
        at com.sun.proxy.$Proxy12.dropTable(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:869)
        at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:836)
        at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3329)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:277)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_SELECT_LIMIT=DEFAULT' at line 1
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
        at com.mysql.jdbc.Util.getInstance(Util.java:381)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1030)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3491)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3423)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1936)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2060)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2536)
        at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1463)
        at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1869)
        at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:172)
        at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
        at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
        at org.datanucleus.store.rdbms.scostore.JoinListStore.listIterator(JoinListStore.java:748)
        at org.datanucleus.store.rdbms.scostore.AbstractListStore.listIterator(AbstractListStore.java:92)
        at org.datanucleus.store.rdbms.scostore.AbstractListStore.iterator(AbstractListStore.java:82)
        at org.datanucleus.store.types.backed.List.loadFromStore(List.java:300)
        at org.datanucleus.store.types.backed.List.load(List.java:273)
        at org.datanucleus.state.JDOStateManager.loadUnloadedFields(JDOStateManager.java:2967)
        at org.datanucleus.api.jdo.state.Hollow.transitionRetrieve(Hollow.java:168)
        at org.datanucleus.state.AbstractStateManager.retrieve(AbstractStateManager.java:598)
        at org.datanucleus.ExecutionContextImpl.retrieveObject(ExecutionContextImpl.java:1778)
        at org.datanucleus.ExecutionContextThreadedImpl.retrieveObject(ExecutionContextThreadedImpl.java:203)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoRetrieve(JDOPersistenceManager.java:605)
        at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:622)
        at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:631)
        at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2293)
        at org.apache.hadoop.hive.metastore.ObjectStore.preDropStorageDescriptor(ObjectStore.java:2321)
        at org.apache.hadoop.hive.metastore.ObjectStore.dropTable(ObjectStore.java:742)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
        at com.sun.proxy.$Proxy10.dropTable(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1192)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1328)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
        at com.sun.proxy.$Proxy11.drop_table_with_environment_context(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:671)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:647)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
        at com.sun.proxy.$Proxy12.dropTable(Unknown Source)
        at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:869)
        at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:836)
        at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3329)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:277)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
)
hive> show tables;  
OK
t1
tianq
tianqi
Time taken: 0.033 seconds, Fetched: 3 row(s)
hive> drop table tianq;
FAILED: SemanticException [Error 10001]: Table not found tianq
hive> show tables;     
OK
t1
tianq
tianqi
Time taken: 0.022 seconds, Fetched: 3 row(s)


I remembered that a couple of days ago, when importing data from MySQL into Hive with Sqoop, I hit a similar error and fixed it by switching to the mysql-connector-java-5.1.32-bin.jar driver.

So I tried the newer jar here as well, and deleted mysql-connector-java-5.1.6-bin.jar:


[jifeng@jifeng02 mysql-connector-java-5.1.32]$ cp mysql-connector-java-5.1.32-bin.jar /home/jifeng/hadoop/hive-0.12.0-bin/lib
[jifeng@jifeng02 mysql-connector-java-5.1.32]$ cd ..
[jifeng@jifeng02 ~]$ cd  /home/jifeng/hadoop/hive-0.12.0-bin/lib
[jifeng@jifeng02 lib]$ rm mysql-connector-java-5.1.6-bin.jar 

That solved the problem. Rather than a MySQL bug, this is an incompatibility between the old JDBC driver and a newer MySQL server: Connector/J 5.1.6 still sends SET OPTION SQL_SELECT_LIMIT=DEFAULT, and the SET OPTION syntax was removed in MySQL 5.6, which is exactly the statement the nested MySQLSyntaxErrorException above complains about. The 5.1.32 driver uses the current syntax, so the metastore queries go through again.
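If you want to double-check the cause independently of the fix, you can run both forms of the statement directly in the mysql client against the metastore database (a quick sketch; on a MySQL 5.6 or later server the first form fails with the same ERROR 1064 seen in the stack trace above, while the second succeeds):

mysql> SET OPTION SQL_SELECT_LIMIT=DEFAULT;   -- syntax sent by old Connector/J versions such as 5.1.6
mysql> SET SQL_SELECT_LIMIT=DEFAULT;          -- syntax sent by newer drivers such as 5.1.32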

[jifeng@jifeng02 hive-0.12.0-bin]$ hive

Logging initialized using configuration in jar:file:/home/jifeng/hadoop/hive-0.12.0-bin/lib/hive-common-0.12.0.jar!/hive-log4j.properties
hive> show tables;
OK
t1
tianq
tianqi
Time taken: 2.182 seconds, Fetched: 3 row(s)
hive> drop table t1;
OK
Time taken: 1.02 seconds
hive> select * from tianq;
OK
Time taken: 0.192 seconds
hive> drop table tianq;
OK
Time taken: 0.28 seconds




In Hive 0.7.0, executing INSERT OVERWRITE TABLE hbase_table_1 SELECT * FROM pokes; fails with an error.

You can start the CLI from the /hive/bin directory with hive -hiveconf hive.root.logger=debug,console
and then rerun the failing HQL statement; the console will print a much more detailed error message. This debug mode is very useful and should help track the problem down.
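For the statement above, the debug run would look roughly like this (a sketch only: the /hive/bin path and table names come from the question, and the amount of DEBUG output depends on your setup):

$ cd /hive/bin
$ ./hive -hiveconf hive.root.logger=DEBUG,console
hive> INSERT OVERWRITE TABLE hbase_table_1 SELECT * FROM pokes;
-- the console now prints DEBUG-level logs, including the full stack trace of the underlying failure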
 

An exception came up while installing Hive in standalone mode. How do I fix it?

The value property should not contain any spaces or carriage returns. It should appear all on one line.

This tripped me up for quite a while too; at first I thought a JAR was missing.
It actually turned out that the property's value element must not contain any carriage returns or spaces.
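For reference, a minimal sketch of what such an entry in hive-site.xml should look like (the host, port, and database name in the URL are placeholders, not values taken from this article); the key point is that the text inside <value> stays on a single line with no surrounding whitespace or line breaks:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  <description>Metastore JDBC connection string; keep the value on one line</description>
</property>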
 
