This post walks through how to resolve a tblproperties-related exception that occurs when Spark SQL reads a Hive table.
Cluster environment

Spark SQL fails when reading a Parquet-format Hive table

The table is a Parquet-backed Hive table. Hive and Impala both read it without issue, but spark-sql throws an exception.
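Any spark-sql statement that forces Spark to restore the table's schema from its table properties triggers the failure; the database and table names below are illustrative:

```sql
-- Run in spark-sql; fails while Spark restores the schema from tblproperties
SELECT * FROM mydb.parquet_table LIMIT 10;
```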
Exception details
com.fasterxml.jackson.core.JsonParseException: Unexpected end-of-input within/between Object entries
 at [Source: (String)"{"type":"struct","fields":[{"name":"timestamp","type":"string","nullable":true,"metadata":{"HIVE_TYPE_STRING":"string"}},{"name":"xxx","type":"string","nullable":true,"metadata":{"HIVE_TYPE_STRING":"string"}},{"name":"xxx","type":"string","nullable":true,"; line: 1, column: 513]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipAfterComma2(ReaderBasedJsonParser.java:2323)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipComma(ReaderBasedJsonParser.java:2293)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:664)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:47)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:39)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:32)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:46)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:39)
	at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1611)
	at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1219)
	at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:25)
	at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:55)
	at org.apache.spark.sql.types.DataType$.fromJson(DataType.scala:127)
	at org.apache.spark.sql.hive.HiveExternalCatalog$.org$apache$spark$sql$hive$HiveExternalCatalog$$getSchemaFromTableProperties(HiveExternalCatalog.scala:1382)
	at org.apache.spark.sql.hive.HiveExternalCatalog.restoreDataSourceTable(HiveExternalCatalog.scala:845)
	at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$restoreTableMetadata(HiveExternalCatalog.scala:765)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:734)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:734)
The schema JSON in tblproperties is incomplete, which suggests the metastore table that stores table properties is truncating long values. Checking the TABLE_PARAMS table in the metastore database confirms it: the PARAM_VALUE column is defined with a length of only 256, so the schema JSON that Spark stores there gets cut off once it exceeds 256 characters.
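The failure mode is easy to simulate outside Spark: build a schema JSON like the one Spark stores in its table properties, truncate it at 256 characters the way a varchar(256) column would, and try to parse it back. (The 10-field schema and column names below are made up for illustration.)

```python
import json

# Build a schema JSON like the one Spark keeps in the
# spark.sql.sources.schema.* table properties (field names are made up)
fields = [
    {"name": f"col_{i}", "type": "string", "nullable": True,
     "metadata": {"HIVE_TYPE_STRING": "string"}}
    for i in range(10)
]
full_schema = json.dumps({"type": "struct", "fields": fields})

# Simulate a varchar(256) PARAM_VALUE column truncating the stored value
truncated = full_schema[:256]

json.loads(full_schema)  # the intact JSON parses fine
try:
    json.loads(truncated)
except json.JSONDecodeError as e:
    # Same class of failure as Jackson's "Unexpected end-of-input"
    print("truncated schema failed to parse:", e.msg)
```

This is exactly what `DataType.fromJson` runs into in the stack trace above: the JSON ends mid-object, so the parser reports an unexpected end of input.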
Increasing the length of PARAM_VALUE to 8000 resolved the issue.
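Assuming a MySQL-backed metastore (the schema name `metastore` below is illustrative), the check and the fix could look like this:

```sql
-- Confirm the current column definition:
SELECT COLUMN_TYPE
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'metastore'
  AND TABLE_NAME = 'TABLE_PARAMS'
  AND COLUMN_NAME = 'PARAM_VALUE';

-- Widen the column so long schema JSON is no longer truncated:
ALTER TABLE metastore.TABLE_PARAMS MODIFY PARAM_VALUE VARCHAR(8000);
```

Note that widening the column does not repair values that were already cut short; affected tables likely need their properties written again (for example, by re-creating the table) before Spark can read the full schema.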