
[Spark] WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable error

아나엘 2023. 12. 5. 21:26

After installing pyspark in a local virtual environment on my MacBook, I tried to create a Spark session from the terminal and from a Jupyter notebook, but the error below came up.
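The traceback refers to a sparktest.py that builds the session with .appName("AuthorsAges"); the original script isn't reproduced here, so the following is only a minimal sketch of that kind of setup, not the exact code that was run:

from pyspark.sql import SparkSession

# Minimal session setup; getOrCreate() is where the error below is raised
# when the installed Spark release does not support the local JDK.
spark = (SparkSession.builder
         .appName("AuthorsAges")
         .getOrCreate())

print(spark.version)
spark.stop()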

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
23/12/05 21:14:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):
  File "sparktest.py", line 8, in <module>
    .appName("AuthorsAges")
  File "/Users/macpro/miniconda3/envs/datamining/lib/python3.6/site-packages/pyspark/sql/session.py", line 228, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/Users/macpro/miniconda3/envs/datamining/lib/python3.6/site-packages/pyspark/context.py", line 392, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/Users/macpro/miniconda3/envs/datamining/lib/python3.6/site-packages/pyspark/context.py", line 147, in __init__
    conf, jsc, profiler_cls)
  File "/Users/macpro/miniconda3/envs/datamining/lib/python3.6/site-packages/pyspark/context.py", line 209, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/Users/macpro/miniconda3/envs/datamining/lib/python3.6/site-packages/pyspark/context.py", line 329, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/Users/macpro/miniconda3/envs/datamining/lib/python3.6/site-packages/py4j/java_gateway.py", line 1586, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/Users/macpro/miniconda3/envs/datamining/lib/python3.6/site-packages/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x755c9148) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x755c9148
	at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
	at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
	at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
	at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
	at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:238)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
	at java.base/java.lang.Thread.run(Thread.java:833)

This turned out to be a Java version compatibility problem with the virtual environment. The NativeCodeLoader line is only a warning; the run actually fails on the IllegalAccessError further down, which typically appears when the installed Spark release does not support the JDK it picks up (Spark versions before 3.3.0 do not run on Java 17 and later). After running deactivate to leave the virtual environment, the same test worked.
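To confirm the mismatch before deciding how to fix it, it's enough to compare the pyspark release in the active environment with the JDK it picks up. None of this is from the original post; it's just a diagnostic sketch:

import os
import subprocess

import pyspark

# PySpark release installed in the currently active environment.
print("pyspark version:", pyspark.__version__)

# JAVA_HOME that Spark's launcher will prefer, if it is set.
print("JAVA_HOME:", os.environ.get("JAVA_HOME"))

# 'java -version' writes its output to stderr.
result = subprocess.run(
    ["java", "-version"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True,
)
print(result.stderr)

Instead of deactivating the environment, pointing JAVA_HOME at a JDK the installed Spark release supports (8 or 11 for releases before 3.3.0), or upgrading pyspark inside the environment, should also clear the error.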
