




How To Fix - "Service 'SparkDriver' Could Not Bind on Port" Issue in Spark ?



In this post, we will explore how to fix the "Service 'SparkDriver' Could Not Bind on Port" issue in Spark. The error appears in a few different forms, such as the ones below.


"Service 'Driver' could not bind on port x. Attempting port y."


WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.

 

Checks:

  • First things first: check spark.driver.port. Does it match the port the driver is actually trying to use ?
  • These issues often happen after network changes (e.g. connecting to a VPN) when the IP addresses are no longer correctly defined.
  • The major point is that the hostname, bind address etc. conflict with each other if not defined consistently. So it is good practice to keep them uniform while setting up the config.
  • Note down the hostname with the below command for later use -

hostname

  • Note down the internal IP with the below command for later use -

ip a
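
Before touching any Spark config, a quick diagnostic can confirm whether hostname resolution is actually the problem. The sketch below (plain Python, no Spark required; the function name check_bind is just an illustration) checks that the local hostname resolves and that a socket can be bound on the resolved address - essentially what the Spark driver does at startup:

```python
import socket

def check_bind(port=0):
    """Return (hostname, ip, ok). ip is None if the hostname does not
    resolve; ok is True only if a socket could be bound on that address.
    port=0 asks the OS for any free port, mirroring Spark's default."""
    host = socket.gethostname()
    try:
        ip = socket.gethostbyname(host)
    except socket.gaierror:
        return host, None, False   # hostname does not resolve at all
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind((ip, port))
            return host, ip, True
    except OSError:
        return host, ip, False     # address resolves but cannot be bound

print(check_bind())
```

If the hostname does not resolve, or the bind fails, fixing the host / bind address as described in the solutions below is the right direction.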

   

Solution 1:

 

  • Go to the Spark config and set the host address - spark.driver.host. Set this explicitly so that there is uniformity and the system does not use the machine's system name as the hostname.
 


spark.driver.host = xx.xx.xx.xx

 

  • Go to the Spark config and set the bind address - spark.driver.bindAddress
 


spark.driver.bindAddress = xx.xx.xx.xx

 

  • The above two config changes ensure that the hostname and the bind address are the same. Note that you can also use the localhost IP, i.e. 127.0.0.1, in place of xx.xx.xx.xx if you are running on your local machine. For a Production cluster, however, it should be the actual IP address, or the Public IP in the case of a cloud system like AWS.
    You can also make this change dynamically at run time as shown below -


from pyspark.sql import SparkSession

# Replace xx.xx.xx.xx with the IP address noted earlier
spark = SparkSession.builder.appName("Spark_App") \
    .master("local[*]") \
    .config("spark.driver.host", "xx.xx.xx.xx") \
    .config("spark.driver.bindAddress", "xx.xx.xx.xx") \
    .getOrCreate()
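
Equivalently, the same two properties can be made permanent in $SPARK_HOME/conf/spark-defaults.conf, so that every job picks them up without code changes (substitute your own address for xx.xx.xx.xx):

```
spark.driver.host         xx.xx.xx.xx
spark.driver.bindAddress  xx.xx.xx.xx
```
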

   

Solution 2:

  • Are you using a VPN while running the Spark job ?
 

  • If yes, try the steps below. Even if you are not using a VPN, the steps are still worth trying to see if they work.
 

  • Run the below command
 
 


$ sudo hostname -s 127.0.0.1

  Alternatively, you can set the hostname mapping in the /etc/hosts file.

  • Go to the /etc/hosts file and make the below entry (or change the existing one)
 


127.0.0.1 <hostname>

    Check if the hostname change took effect by running the below -


ping $(hostname)

   

Solution 3:

 

  • Go to your ~/.bash_profile. Add the below -
 


export SPARK_LOCAL_IP="127.0.0.1"

 

  • Go to your Spark Home dir, i.e. $SPARK_HOME. Locate the spark-env.sh file and add the below change. This will set the Spark local IP.

export SPARK_LOCAL_IP=127.0.0.1

 

  • Go to your Spark Home dir, i.e. $SPARK_HOME. Locate the load-spark-env.sh file and add the below change. This will set the Spark local IP.

export SPARK_LOCAL_IP=127.0.0.1
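
If editing shell profiles or Spark scripts is not an option, the same environment variable can also be set from inside the Python script itself, before the SparkSession is created. A minimal sketch:

```python
import os

# Must run before any Spark code starts the driver JVM; once the
# SparkSession is up, changing SPARK_LOCAL_IP has no effect.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

print(os.environ["SPARK_LOCAL_IP"])
```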

  Hope the above helps to fix the issue.
