Apache Spark cluster not binding to external IP

Keywords: apache-spark apache-spark-standalone

Question: 

I am trying to run my cluster master on my external IP so that workers on multiple PCs can connect to it, but I'm getting this:


spark-class org.apache.spark.deploy.master.Master --host <myIpIsHere>
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/06/04 20:26:27 INFO Master: Started daemon with process name: 11324@Pusky
19/06/04 20:26:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/06/04 20:26:28 INFO SecurityManager: Changing view acls to: Pusky
19/06/04 20:26:28 INFO SecurityManager: Changing modify acls to: Pusky
19/06/04 20:26:28 INFO SecurityManager: Changing view acls groups to:
19/06/04 20:26:28 INFO SecurityManager: Changing modify acls groups to:
19/06/04 20:26:28 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Pusky); groups with view permissions: Set(); users  with modify permissions: Set(Pusky); groups with modify permissions: Set()
19/06/04 20:26:29 WARN Utils: Service 'sparkMaster' could not bind on port 7077.

Answers:
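
The warning most likely means the master could not bind a socket to the address passed with --host. A process can only bind to an IP address that is actually assigned to one of the machine's network interfaces; an external/public IP that your router NATs to the machine is not assigned locally, so the bind on port 7077 fails. Bind the master to the machine's own LAN address (or 0.0.0.0 to listen on all interfaces) instead. A minimal sketch, assuming a hypothetical private address of 192.168.1.10:

# On the master machine: bind to an address that exists on a local interface.
# 192.168.1.10 is a placeholder; use your machine's own LAN IP, or 0.0.0.0
# to listen on every interface.
spark-class org.apache.spark.deploy.master.Master --host 192.168.1.10 --port 7077

# On each worker machine on the same network: connect using the master's URL.
spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.10:7077

If workers must reach the master from outside your network, forward TCP port 7077 on the router to the master machine. Note that a standalone master advertises the address it was started with, so NAT setups can be fragile; keeping all machines on one network (or a VPN) is the more reliable approach.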