
On an HDP cluster, I am trying to create Hive tables integrated with existing HBase tables. The Hive table is created successfully, but querying it throws the following exception, especially when the number of columns exceeds 200.

I have checked that the number of columns in HBase and Hive is the same, and I cannot find a proper way to debug this.

hive> select * from hbase_hive.lead;

FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException 
org.apache.hadoop.hive.hbase.HBaseSerDe: 
columns has 273 elements while hbase.columns.mapping has 204 elements (counting the key if implicit))

Is there any column limitation in this case?

Please suggest a solution.
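For context, a Hive-on-HBase table of this shape is typically created with the HBase storage handler along these lines (a minimal sketch with illustrative column and table names; the actual DDL is not shown in the question):

```sql
-- Sketch of a Hive table backed by an existing HBase table (names illustrative).
CREATE EXTERNAL TABLE hbase_hive.lead (
  row_key STRING,
  col1    STRING,
  col2    STRING
  -- ... remaining columns
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  -- one mapping entry per Hive column; ':key' maps the HBase row key
  'hbase.columns.mapping' = ':key,cf:col1,cf:col2'
)
TBLPROPERTIES ('hbase.table.name' = 'lead');
```

The error above means Hive sees 273 declared columns but only 204 entries in the stored hbase.columns.mapping string, even though the DDL originally listed one entry per column.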


1 Answer


This has fixed the issue.

https://hortonworks.my.salesforce.com/kA2E0000000LZQ5?srPos=0&srKp=ka2&lang=en_US

ROOT CAUSE:

The PARAM_VALUE field of the SERDE_PARAMS table in the Hive metastore is limited to 4000 characters. For tables with a large number of columns, the hbase.columns.mapping value exceeds this limit and is truncated, which eventually causes DESCRIBE or SELECT * on the table to fail with the error above.
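You can confirm the truncation by inspecting the stored mapping directly in the metastore database (a sketch assuming a MySQL-backed metastore; SERDE_PARAMS, PARAM_KEY, and PARAM_VALUE are from the standard metastore schema):

```sql
-- A stored length of exactly 4000 indicates the mapping was cut off at the column limit.
SELECT SERDE_ID, LENGTH(PARAM_VALUE)
FROM SERDE_PARAMS
WHERE PARAM_KEY = 'hbase.columns.mapping';
```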

WORKAROUND: This issue can be worked around by widening the column in the Hive metastore database:

-- log into the Hive metastore DB, then run:
alter table SERDE_PARAMS MODIFY PARAM_VALUE VARCHAR(400000000);

1 Comment

Hi, you could also change the column type to TEXT; this is highly beneficial for formats like JSON, where you may have a large number of columns.
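On a MySQL-backed metastore, that change would look something like the following (a sketch of the suggestion above; back up the metastore before altering its schema):

```sql
-- Switch PARAM_VALUE from VARCHAR(4000) to TEXT to lift the 4000-character limit.
alter table SERDE_PARAMS MODIFY PARAM_VALUE TEXT;
```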
