Points to check: 3 + 1 (three configuration items, one execution method)
- Whether the HADOOP_HOME environment variable is set correctly
- Whether the configuration files under HADOOP_HOME/etc/hadoop are correct
- Whether spark-defaults.conf contains the following 6 options:
  ```
  spark.jars                                  /Users/ada/opt/spark-aux-jar/iceberg-spark-runtime-3.2_2.12-0.14.0.jar
  spark.sql.catalog.spark_catalog             org.apache.iceberg.spark.SparkSessionCatalog
  spark.sql.catalog.spark_catalog.type        hive
  spark.sql.catalog.spark_catalog.uri         thrift://hadoop3x1:9083
  spark.sql.catalog.spark_catalog.warehouse   /user/hive/warehouse
  spark.sql.extensions                        org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
  ```
- Execute Iceberg SQL via bin/spark-sql or Kyuubi (a minimal programmatic check is sketched after this list)
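As an extra sanity check beyond the three configuration items, the sketch below sets the same catalog options programmatically in Scala and runs a trivial Iceberg SQL round trip. It assumes the Iceberg runtime jar is already on the classpath (e.g. through the spark.jars entry above or spark-submit --jars); the application name iceberg-smoke-test and the table default.iceberg_smoke are hypothetical names used only for this test.

```scala
import org.apache.spark.sql.SparkSession

object IcebergSmokeTest {
  def main(args: Array[String]): Unit = {
    // Mirrors the catalog-related options from the spark-defaults.conf snippet above.
    // The Iceberg runtime jar is assumed to be supplied via spark-defaults.conf or
    // spark-submit --jars; the URI and warehouse path must match your environment.
    val spark = SparkSession.builder()
      .appName("iceberg-smoke-test")
      .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
      .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkSessionCatalog")
      .config("spark.sql.catalog.spark_catalog.type", "hive")
      .config("spark.sql.catalog.spark_catalog.uri", "thrift://hadoop3x1:9083")
      .config("spark.sql.catalog.spark_catalog.warehouse", "/user/hive/warehouse")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical table used only to verify that Iceberg DDL/DML goes through the Hive catalog.
    spark.sql("CREATE TABLE IF NOT EXISTS default.iceberg_smoke (id BIGINT, msg STRING) USING iceberg")
    spark.sql("INSERT INTO default.iceberg_smoke VALUES (1, 'ok')")
    spark.sql("SELECT * FROM default.iceberg_smoke").show()

    spark.stop()
  }
}
```

If this sketch succeeds but bin/spark-sql or Kyuubi still fails, the difference is likely in which spark-defaults.conf is actually being picked up by that launcher rather than in the Iceberg setup itself.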