Exception -
Caused by:
org.apache.spark.sql.AnalysisException: Could not read schema from the hive metastore because
it is corrupted. (missing part 0 of the schema, 2 parts are expected).;
Analysis -
·
Check the table definition. In TBLPROPERTIES you might find entries like these –
> 'spark.sql.sources.schema.numPartCols'
> 'spark.sql.sources.schema.numParts'
> 'spark.sql.sources.schema.part.0'
> 'spark.sql.sources.schema.part.1'
> 'spark.sql.sources.schema.part.2'
> 'spark.sql.sources.schema.partCol.0'
> 'spark.sql.sources.schema.partCol.1'
Spark stores a large table schema split across multiple 'spark.sql.sources.schema.part.N' properties, with 'numParts' recording how many parts to expect. The error says exactly that: 'numParts' claims 2 parts, but 'part.0' is missing, so the schema cannot be reassembled.
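As a quick check (a sketch – replace <db_name>.<table_name> with the actual table), the stored schema properties can be inspected directly:

```sql
-- List all table properties; look for the spark.sql.sources.schema.* entries
SHOW TBLPROPERTIES <db_name>.<table_name>;

-- Or inspect a single property
SHOW TBLPROPERTIES <db_name>.<table_name> ('spark.sql.sources.schema.numParts');
```

If 'numParts' reports 2 but there is no 'spark.sql.sources.schema.part.0' entry, you are hitting exactly this error.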
Solution -
Drop and re-create the table. If the table was partitioned, dropping it removes all partition metadata, so restore the partitions with either of the following –
·
MSCK REPAIR TABLE <db_name>.<table_name>
·
ALTER TABLE <db_name>.<table_name>
ADD PARTITION (<partition_name>='<partition_value>')
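Putting the drop-and-recreate path together (a sketch assuming an external, partitioned table whose data files are still present at the original location; all table, column, and path names below are placeholders):

```sql
-- 1. Drop the table; for an EXTERNAL table this removes only metadata, not data
DROP TABLE IF EXISTS <db_name>.<table_name>;

-- 2. Re-create the table with its original DDL
CREATE EXTERNAL TABLE <db_name>.<table_name> (<col1> STRING, <col2> INT)
PARTITIONED BY (<partition_name> STRING)
STORED AS PARQUET
LOCATION '<original_table_path>';

-- 3. Re-register all partitions found on disk in one shot
MSCK REPAIR TABLE <db_name>.<table_name>;
```

MSCK REPAIR TABLE scans the table location and adds any partitions it finds, which is usually faster than issuing one ADD PARTITION statement per partition.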
Instead of dropping the table, we can unset the corrupted TBLPROPERTIES –
ALTER TABLE <db_name>.<table_name>
UNSET TBLPROPERTIES ('spark.sql.sources.schema.numParts',
'spark.sql.sources.schema.numPartCols')
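Once those properties are unset, Spark falls back to the schema stored in the Hive metastore table definition itself. A quick verification sketch (<db_name>.<table_name> is a placeholder):

```sql
-- Refresh any cached metadata, then confirm the table is readable again
REFRESH TABLE <db_name>.<table_name>;
DESCRIBE FORMATTED <db_name>.<table_name>;
SELECT * FROM <db_name>.<table_name> LIMIT 10;
```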