Reading data from Elasticsearch over HTTPS with Spark
2016-12-13 16:54
For security reasons, Elasticsearch was configured for HTTPS access, but Spark was reading from ES over plain HTTP, so every read now fails with an error. After some digging, here is the working code:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("es test") // .setMaster("local") for local testing
conf.set("es.net.ssl", "true")
conf.set("es.net.ssl.keystore.location", "D:\\test_jar\\node-0-keystore.jks")
conf.set("es.net.ssl.keystore.pass", "changeit")
conf.set("es.net.ssl.keystore.type", "JKS")
conf.set("es.net.ssl.truststore.location", "D:\\test_jar\\truststore.jks") // when running on Linux, use a full file:/// path here
conf.set("es.net.ssl.truststore.pass", "changeit")
conf.set("es.net.ssl.cert.allow.self.signed", "false")
conf.set("es.net.ssl.protocol", "TLSv1.2") // the official docs say "TLS", but the version number is required, otherwise an error is thrown
conf.set("es.index.auto.create", "true")
conf.set("es.net.http.auth.user", "admin") // ES username
conf.set("es.net.http.auth.pass", "admin") // ES password
// conf.set("es.nodes.wan.only", "true")
conf.set("es.index.read.missing.as.empty", "true")
conf.set("es.nodes", args(2)) // ES node IPs
val sc = new SparkContext(conf)

With the configuration above, Spark can read data from Elasticsearch over HTTPS. The values for these keys come from elasticsearch.yml; here is the relevant part of my Elasticsearch configuration:
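Once the SparkContext is built, the actual read goes through elasticsearch-hadoop's Scala API, which the original post stops short of showing. A minimal sketch, assuming the elasticsearch-hadoop jar is on the classpath; the index name `my_index/my_type` is a placeholder, not from the original post:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._ // brings sc.esRDD into scope

object EsReadTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("es test")
    // ... all the es.net.ssl.* / es.net.http.auth.* settings shown above ...
    conf.set("es.nodes", args(2))
    val sc = new SparkContext(conf)

    // esRDD returns an RDD[(String, Map[String, AnyRef])]: (_id, source document)
    val rdd = sc.esRDD("my_index/my_type")
    println(rdd.count())
  }
}
```

This needs a reachable ES cluster to run, so treat it as a shape to adapt rather than a drop-in program.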
searchguard.ssl.transport.keystore_filepath: node-0-keystore.jks
searchguard.ssl.transport.keystore_password: changeit
searchguard.ssl.transport.truststore_filepath: truststore.jks
searchguard.ssl.transport.truststore_password: changeit
searchguard.ssl.transport.enforce_hostname_verification: false
searchguard.ssl.http.enabled: true
searchguard.ssl.http.keystore_filepath: node-0-keystore.jks
searchguard.ssl.http.keystore_password: changeit
searchguard.ssl.http.truststore_filepath: truststore.jks
searchguard.ssl.http.truststore_password: changeit
As for the encryption algorithms (cipher suites), that still needs more research; this will do for now.
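On the cipher-suite question above: the suites the connector can negotiate are whatever the JVM's default SSLContext enables, so a quick way to see what is available is to ask the JVM directly. A small standalone sketch, plain JVM with no Spark or ES involved:

```scala
import javax.net.ssl.SSLContext

object TlsDefaults {
  def main(args: Array[String]): Unit = {
    val params = SSLContext.getDefault.getDefaultSSLParameters
    // Protocols the default context will negotiate (should include TLSv1.2)
    println(params.getProtocols.mkString(", "))
    // Cipher suites enabled by default in this JVM
    params.getCipherSuites.foreach(println)
  }
}
```

Comparing this list against the suites enabled on the ES node is a first step when an HTTPS handshake fails.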