
java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging #254

Closed
out0 opened this issue Sep 20, 2019 · 3 comments
Labels
question Further information is requested

out0 commented Sep 20, 2019

Problem encountered on https://dotnet.microsoft.com/learn/data/spark-tutorial/run
Operating System: Linux CentOS 7, x64

I'm using CDH and installed Spark as a CDH resource. I tested it with Python, both in the PySpark shell and as a standalone script via spark-submit.

I then tried to run the C# example, but ran into the exception below.

spark-submit --class org.apache.spark.deploy.dotnet.DotnetRunner --master local bin/Debug/netcoreapp2.2/microsoft-spark-2.4.x-0.4.0.jar dotnet bin/Debug/netcoreapp2.2/MySparkApp.dll
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.deploy.dotnet.DotnetRunner.main(DotnetRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 21 more

Program.cs

using Microsoft.Spark.Sql;

namespace MySparkApp
{
    class Program
    {
        static void Main(string[] args)
        {
            // Create a Spark session
            var spark = SparkSession
                .Builder()
                .AppName("word_count_sample")
                .GetOrCreate();

            // Create initial DataFrame
            DataFrame dataFrame = spark.Read().Text("input.txt");

            // Count words
            var words = dataFrame
                .Select(Functions.Split(Functions.Col("value"), " ").Alias("words"))
                .Select(Functions.Explode(Functions.Col("words"))
                    .Alias("word"))
                .GroupBy("word")
                .Count()
                .OrderBy(Functions.Col("count").Desc());

            // Show results
            words.Show();
        }
    }
}

input.txt
Hello World
This .NET app uses .NET for Apache Spark
This .NET app counts words with Apache Spark

imback82 (Contributor) commented:

Looks like you are using Spark 2.4.2 (or a Spark build compiled with Scala 2.12): #60 (comment)

Can you download Spark here: https://spark.apache.org/downloads.html?
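For reference, the `spark-submit --version` banner names both the Spark version and the Scala version it was compiled against, which is what matters for this error. A minimal sketch that parses a sample banner (the banner text and `sed` patterns are illustrative, not the exact output of every build):

```shell
# `spark-submit --version` prints a banner (to stderr) naming both the
# Spark version and the Scala version it was built with. A sample banner:
banner='Welcome to Spark version 2.4.1
Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_222'

# Pull the two version numbers out of the banner text.
spark_ver=$(printf '%s\n' "$banner" | sed -n 's/.*Spark version \([0-9.]*\).*/\1/p')
scala_ver=$(printf '%s\n' "$banner" | sed -n 's/.*Scala version \([0-9]*\.[0-9]*\).*/\1/p')

echo "Spark $spark_ver / Scala $scala_ver"
```

A Spark 2.4.2 banner would report Scala 2.12, which the microsoft-spark jars of this era (built against Scala 2.11) cannot load, producing exactly this `NoClassDefFoundError`.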

imback82 added the question label on Sep 21, 2019
out0 (Author) commented Sep 23, 2019:

I'm using Spark 1.6.0 (per spark-submit --version).

imback82 (Contributor) commented:

Please check the supported Spark versions here: https://github.com/dotnet/spark#supported-apache-spark
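To connect the two version numbers in this thread: the worker jar's file name encodes the Spark line it targets, so a quick sanity check is to compare it against the cluster's Spark version. A sketch (the jar name and version are the ones from this issue; the name parsing is illustrative):

```shell
# microsoft-spark-2.4.x-0.4.0.jar -> targets the Spark 2.4.* line.
jar="microsoft-spark-2.4.x-0.4.0.jar"
spark_version="1.6.0"   # value reported by spark-submit --version

# Extract the Spark line (major.minor) encoded in the jar name.
jar_line=$(printf '%s\n' "$jar" | sed -n 's/^microsoft-spark-\([0-9]*\.[0-9]*\)\..*/\1/p')

if [ "${spark_version%.*}" = "$jar_line" ]; then
    echo "OK: Spark $spark_version matches worker jar line $jar_line"
else
    echo "Mismatch: jar targets Spark $jar_line.x but cluster runs Spark $spark_version"
fi
```

Here the check reports a mismatch: Spark 1.6.0 predates the 2.3/2.4 lines that .NET for Apache Spark supported at the time, so the jar cannot run against it.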
