 

Hadoop Installation, Error: getSubject is supported only if a security manager is allowed

I tried to install Hadoop on my macOS Ventura machine but it has failed several times. I also tried downloading lower versions of Hadoop, but no luck so far.

Hadoop versions tried: 3.4.0 and 3.3.6
Java version: 23

The error message I get when running hdfs namenode -format is:

Exiting with status 1: java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed

Similarly, when I run start-all.sh:

Starting namenodes
Starting namenodes on [Sumans-MacBook-Pro.local]
Starting datanodes
localhost: ERROR: Cannot set priority of datanode process 3556
2024-09-23 16:10:29,438 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.UnsupportedOperationException: getSubject is supported only if a security manager is allowed
    at java.base/javax.security.auth.Subject.getSubject(Subject.java:347)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:577)
    at org.apache.hadoop.hdfs.tools.GetConf.run(GetConf.java:344)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:97)
    at org.apache.hadoop.hdfs.tools.GetConf.main(GetConf.java:361)
Starting resourcemanagers on []
Starting nodemanagers
localhost: ERROR: Cannot set priority of nodemanager process 3966

The configuration I tried is as follows:

core-site.xml

<configuration>
  <property>
      <name>hadoop.tmp.dir</name>
      <value>/Users/sumanbhattarai/hdfs/tmp/</value>
  </property>
  <property>
      <name>fs.default.name</name>
      <value>hdfs://127.0.0.1:9000</value>
  </property>
</configuration>

hdfs-site.xml

<configuration>
  <property>
      <name>dfs.name.dir</name>
      <value>/Users/sumanbhattarai/hdfs/namenode</value>
  </property>
  <property>
      <name>dfs.data.dir</name>
      <value>/Users/sumanbhattarai/hdfs/datanode</value>
  </property>
  <property>
      <name>dfs.replication</name>
      <value>1</value>
  </property>
</configuration>

mapred-site.xml

<configuration> 
  <property> 
    <name>mapreduce.framework.name</name> 
    <value>yarn</value> 
  </property> 
</configuration>

yarn-site.xml

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>127.0.0.1</value>
  </property>
  <property>
    <name>yarn.acl.enable</name>
    <value>0</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>   
      <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
  </property>
</configuration>
Asked Jan 25 '26 by Suman Bhattarai


1 Answer

Hadoop only works with Java 8, 11 and 17 at the time of writing this answer. On Java 23, javax.security.auth.Subject.getSubject() throws UnsupportedOperationException unless a Security Manager is allowed, which is exactly what Hadoop's UserGroupInformation is hitting in your stack trace. Consider downgrading from the Java 23 you are using now to one of the supported versions.

I recommend version 11 as the most stable option:

https://www.oracle.com/java/technologies/downloads/#java11?er=221886
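As a minimal sketch of the switch on macOS (assuming Homebrew and its openjdk@11 formula; the paths are illustrative and depend on your setup):

# Install a Java 11 JDK (any JDK 11 distribution works)
brew install openjdk@11

# Point the current shell at the JDK 11 install.
# /usr/libexec/java_home only lists JDKs registered with macOS, so you may need
# to export the Homebrew path directly instead (Apple Silicon path shown below;
# Intel Macs use /usr/local/opt/openjdk@11/...).
export JAVA_HOME="$(/usr/libexec/java_home -v 11)"
# export JAVA_HOME="/opt/homebrew/opt/openjdk@11/libexec/openjdk.jdk/Contents/Home"

# Set the same JAVA_HOME in $HADOOP_HOME/etc/hadoop/hadoop-env.sh so the
# daemons pick it up, then re-format the namenode and start the cluster again:
hdfs namenode -format
start-all.sh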

Answered Jan 27 '26 by Yasin Amini