This post explains the difference between SparkSession, SparkContext, SQLContext and HiveContext. Which of these entry points you use depends on the Spark version of your application: in Spark 1.x, SQLContext and HiveContext are built on top of a SparkContext, while from Spark 2.0 onwards the SparkSession unifies all of them. [Figure: pictorial representation of the hierarchy between SparkSession, SparkContext, SQLContext and HiveContext]
################
### Spark Conf
################
from pyspark import SparkConf, SparkContext

spark_conf = (SparkConf()
              .setAppName("Your App Name")
              .set("spark.some.config.option", "some-value"))
sc = SparkContext(conf=spark_conf)
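In Spark 1.x, this SparkContext was also the base for the SQL entry points: SQLContext and HiveContext were constructed on top of it. A minimal sketch of that legacy pattern (assuming PySpark; both classes are deprecated, and may be removed in newer releases, in favour of SparkSession):
##########################################
### SQLContext & HiveContext (Spark 1.x)
##########################################
from pyspark.sql import SQLContext, HiveContext

sqlContext = SQLContext(sc)        # DataFrame and SQL support on top of the SparkContext
hiveContext = HiveContext(sc)      # adds Hive metastore / HiveQL support

df = sqlContext.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.show()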
###################
### Spark Session
###################
from pyspark.sql import SparkSession

spark = (SparkSession
         .builder
         .appName("Your App Name")
         .config("spark.some.config.option", "some-value")
         .getOrCreate())
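From Spark 2.0 onwards this single SparkSession sits at the top of the hierarchy: it carries the SparkContext underneath and provides the SQL functionality that previously required a separate SQLContext, while Hive support is enabled with .enableHiveSupport() on the builder instead of a HiveContext. A small sketch continuing from the spark session created above (the query is only an illustrative placeholder):
sc_from_session = spark.sparkContext        # the underlying SparkContext lives inside the session
spark.sql("SELECT 1 AS id").show()          # SQL runs directly on the session, no SQLContext needed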
The builder's .config() call (or .set() on a SparkConf) takes a key-value tuple such as ("spark.some.config.option", "some-value"). Some common examples of such tuples are:
('spark.executor.memory', '4g')
('spark.executor.cores', '4')
('spark.cores.max', '4')
('spark.driver.memory', '4g')
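Several such tuples can be applied in one call with SparkConf.setAll() and then handed to the session builder. A short sketch (assuming PySpark; the memory and core values are only illustrative):
#######################################
### Setting several options at once
#######################################
from pyspark import SparkConf
from pyspark.sql import SparkSession

spark_conf = (SparkConf()
              .setAppName("Your App Name")
              .setAll([('spark.executor.memory', '4g'),
                       ('spark.executor.cores', '4'),
                       ('spark.cores.max', '4'),
                       ('spark.driver.memory', '4g')]))

spark = (SparkSession
         .builder
         .config(conf=spark_conf)
         .getOrCreate())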
To verify which configuration values are actually in effect, you can list all of them from either entry point:
sc._conf.getAll()
spark.sparkContext.getConf().getAll()
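Individual settings can also be looked up by key through the session's runtime configuration, for example (the keys are the placeholders used earlier in this post and the default value is only illustrative):
spark.conf.get("spark.some.config.option")       # e.g. "some-value" as set above
spark.conf.get("spark.executor.memory", "2g")    # falls back to the default when the key is not set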
Additional Reads:
- Different Parts of a Spark Application Code, Class & Jars
- How to Improve Spark Application Performance - Part 1?