
Spark SQL Parameter Configuration

2017-06-26 14:59
Reposted from: http://www.cnblogs.com/wwxbi/p/6114410.html

To view the SQL parameter configuration of the current environment:
spark.sql("SET -v")
key                                                     value
spark.sql.hive.version                                  1.2.1
spark.sql.sources.parallelPartitionDiscovery.threshold  32
spark.sql.hive.metastore.barrierPrefixes
spark.sql.shuffle.partitions                            200
spark.sql.hive.metastorePartitionPruning                FALSE
spark.sql.broadcastTimeout                              300
spark.sql.sources.bucketing.enabled                     TRUE
spark.sql.parquet.filterPushdown                        TRUE
spark.sql.statistics.fallBackToHdfs                     FALSE
spark.sql.adaptive.enabled                              FALSE
spark.sql.parquet.cacheMetadata                         TRUE
spark.sql.hive.metastore.sharedPrefixes                 com.mysql.jdbc
spark.sql.parquet.respectSummaryFiles                   FALSE
spark.sql.warehouse.dir                                 hdfs:///user/spark/warehouse
spark.sql.orderByOrdinal                                TRUE
spark.sql.hive.convertMetastoreParquet                  TRUE
spark.sql.groupByOrdinal                                TRUE
spark.sql.hive.thriftServer.async                       TRUE
spark.sql.thriftserver.scheduler.pool                   <undefined>
spark.sql.orc.filterPushdown                            FALSE
spark.sql.adaptive.shuffle.targetPostShuffleInputSize   67108864b
spark.sql.sources.default                               parquet
spark.sql.parquet.compression.codec                     snappy
spark.sql.hive.metastore.version                        1.2.1
spark.sql.sources.partitionDiscovery.enabled            TRUE
spark.sql.crossJoin.enabled                             FALSE
spark.sql.parquet.writeLegacyFormat                     FALSE
spark.sql.hive.verifyPartitionPath                      FALSE
spark.sql.variable.substitute                           TRUE
spark.sql.thriftserver.ui.retainedStatements            200
spark.sql.hive.convertMetastoreParquet.mergeSchema      FALSE
spark.sql.parquet.enableVectorizedReader                TRUE
spark.sql.parquet.mergeSchema                           FALSE
spark.sql.parquet.binaryAsString                        FALSE
spark.sql.columnNameOfCorruptRecord                     _corrupt_record
spark.sql.files.maxPartitionBytes                       134217728
spark.sql.streaming.checkpointLocation                  <undefined>
spark.sql.variable.substitute.depth                     40
spark.sql.parquet.int96AsTimestamp                      TRUE
spark.sql.autoBroadcastJoinThreshold                    10485760
spark.sql.pivotMaxValues                                10000
spark.sql.sources.partitionColumnTypeInference.enabled  TRUE
spark.sql.hive.metastore.jars                           builtin
spark.sql.thriftserver.ui.retainedSessions              200
spark.sql.sources.maxConcurrentWrites                   1
spark.sql.parquet.output.committer.class                org.apache.parquet.hadoop.ParquetOutputCommitter
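Beyond listing them with `SET -v`, these parameters can be read and overridden programmatically through the SparkSession configuration API. The following is a minimal Scala sketch; it assumes a local Spark environment, and the app name and the specific parameters chosen (`spark.sql.shuffle.partitions`, `spark.sql.autoBroadcastJoinThreshold`) are illustrative, not prescribed by the original post.

```scala
import org.apache.spark.sql.SparkSession

// In spark-shell a session is already bound to `spark`; standalone apps build one.
val spark = SparkSession.builder()
  .appName("sql-conf-demo")          // illustrative app name
  .master("local[*]")
  // Session-level defaults can be supplied at build time:
  .config("spark.sql.shuffle.partitions", "200")
  .getOrCreate()

// Read a single parameter for the current session.
println(spark.conf.get("spark.sql.shuffle.partitions"))

// Override a runtime-settable parameter for this session only.
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", 10485760)

// `SET` alone lists parameters explicitly set in this session;
// `SET -v` lists all parameters with values and descriptions, as in the table above.
spark.sql("SET -v").show(truncate = false)
```

Note that `spark.conf.set` only affects runtime-settable SQL parameters; static ones (e.g. `spark.sql.warehouse.dir`) must be set before the SparkSession is created.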
Tags: spark sparksql