Comments (6)
I have the same issue, what image should we use?
from spark-on-k8s-operator.
Good morning,
As of #2010, the examples have been updated to reference the official Spark image available on Docker Hub: spark:3.5.0. Unfortunately, the legacy images are no longer available; fortunately, the official Spark images are fully compatible with this operator.
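For reference, a minimal manifest along the lines of the updated examples might look like the sketch below. The metadata names, namespace, and service account are illustrative placeholders; only the image reference reflects the change described above.

```yaml
# Illustrative SparkApplication using the official Docker Hub image.
# Names, namespace, and serviceAccount are placeholders for your setup.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: spark:3.5.0   # official image, replacing gcr.io/spark-operator/spark:*
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.0.jar
  sparkVersion: 3.5.0
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark-operator-spark   # assumed; match your RBAC setup
  executor:
    instances: 1
    cores: 1
    memory: 512m
```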
Thank you for the information, but the spark-pi-prometheus example has the same problem: the image gcr.io/spark-operator/spark:v3.1.0-gcs-prometheus does not work. Could you please help us?
@sidi-elwely #2010 didn't update the Prometheus-enabled image, which is currently not published by any of the CI jobs. I'll defer to a maintainer as to whether it's worth re-enabling, but I think it's likely to need some rework regardless: right now the image is tied specifically to GCP, which isn't optimal. As far as Prometheus is concerned, the meat of the image is a single jar and a couple of conf files, so it's perhaps not worth maintaining as a separate image, but I can imagine a few ways to ease usage.
See https://github.com/kubeflow/spark-operator/tree/master/spark-docker if you're interested in creating your own Prometheus-enabled image in the meantime.
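One way to roll your own, sketched as a Dockerfile built on the official image: add the JMX Prometheus Java agent jar from the prometheus/jmx_exporter project plus your metrics configuration. The agent version, file names, and paths below are assumptions chosen for illustration; adapt them to whatever the spark-docker directory in the repo actually uses.

```dockerfile
# Hypothetical sketch of a Prometheus-enabled image on top of the official
# Spark image. Versions and paths are assumptions, not the project's canonical
# build; compare against the spark-docker directory before relying on it.
FROM spark:3.5.0

USER root

# JMX Prometheus Java agent (from prometheus/jmx_exporter); version is
# illustrative only.
ADD https://repo1.maven.org/maven2/io/prometheus/jmx/jmx_prometheus_javaagent/0.20.0/jmx_prometheus_javaagent-0.20.0.jar /prometheus/jmx_prometheus_javaagent.jar

# Metrics rules/config files maintained alongside your manifests (placeholders).
COPY metrics.properties /etc/metrics/conf/metrics.properties
COPY prometheus.yaml    /etc/metrics/conf/prometheus.yaml

RUN chmod 644 /prometheus/jmx_prometheus_javaagent.jar

# Drop back to the unprivileged user the official image runs as.
USER spark
```

The agent jar can then be pointed at from the SparkApplication's monitoring section (the operator exposes a field along the lines of spec.monitoring.prometheus.jmxExporterJar; check the CRD API reference for the exact name and semantics).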
After updating the base image to Spark 3.5.0 and the spark-operator to the latest version, I get the error below and my apps won't start. The application version was also updated from 3.2.0 to 3.5.0.
Files local:///opt/spark-jars/spark-3-rules.yaml from /opt/spark-jars/spark-3-rules.yaml to /opt/spark-jars/spark-3-rules.yaml
2024-07-18 12:44:07.005 WARN [main ] org.apache.spark.network.util.JavaUtils:112 - Attempt to delete using native Unix OS command failed for path = /opt/spark-jars/spark-3-rules.yaml. Falling back to Java IO way
java.io.IOException: Failed to delete: /opt/spark-jars/spark-3-rules.yaml
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingUnixNative(JavaUtils.java:173) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:109) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:90) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.util.SparkFileUtils.deleteRecursively(SparkFileUtils.scala:121) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.util.SparkFileUtils.deleteRecursively$(SparkFileUtils.scala:120) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1126) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$14(SparkSubmit.scala:437) ~[DataAnalyticsReporting.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[DataAnalyticsReporting.jar:?]
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) ~[DataAnalyticsReporting.jar:?]
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) ~[DataAnalyticsReporting.jar:?]
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) ~[DataAnalyticsReporting.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[DataAnalyticsReporting.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[DataAnalyticsReporting.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.downloadResourcesToCurrentDirectory$1(SparkSubmit.scala:429) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$16(SparkSubmit.scala:450) ~[DataAnalyticsReporting.jar:?]
at scala.Option.map(Option.scala:230) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:450) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:964) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1120) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1129) ~[DataAnalyticsReporting.jar:?]
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[DataAnalyticsReporting.jar:?]
Exception in thread "main" java.io.IOException: Failed to delete: /opt/spark-jars/spark-3-rules.yaml
at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:146)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:117)
at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:90)
at org.apache.spark.util.SparkFileUtils.deleteRecursively(SparkFileUtils.scala:121)
at org.apache.spark.util.SparkFileUtils.deleteRecursively$(SparkFileUtils.scala:120)
at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1126)
at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$14(SparkSubmit.scala:437)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at org.apache.spark.deploy.SparkSubmit.downloadResourcesToCurrentDirectory$1(SparkSubmit.scala:429)
at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$16(SparkSubmit.scala:450)
at scala.Option.map(Option.scala:230)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:450)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:964)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1120)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1129)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
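A plausible reading of this trace, offered as an assumption rather than a confirmed diagnosis: spark-submit is staging the local:// file onto itself (note the identical "from" and "to" paths) and must delete the destination first, but the official spark:3.5.0 image runs as the unprivileged spark user (UID 185), whereas the legacy gcr.io images ran as root, so a file baked into the image as root can no longer be deleted. If that is the cause, making the path writable by the runtime user in your derived image should help; the sketch below assumes UID/GID 185 and the paths from the log.

```dockerfile
# Hypothetical fix sketch: give the unprivileged runtime user ownership of
# files baked into the image, so spark-submit can overwrite/delete them when
# staging resources. UID/GID 185 and the paths are assumptions; verify them
# against your image (e.g. `docker run --rm spark:3.5.0 id`).
FROM spark:3.5.0

USER root
COPY spark-3-rules.yaml /opt/spark-jars/spark-3-rules.yaml
RUN chown -R 185:185 /opt/spark-jars
USER spark
```

Alternatively, avoid the in-place copy entirely by not listing a file in spark.files when it already sits at its final path inside the image.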
Related Issues (20)
- [BUG] spec.ipFamilies: field not declared in schema Exception
- [QUESTION] How can we monitor the Spark Operator container logs?
- [BUG] Submitting a Spark Application Fails on K8s versions < 1.21
- [FEATURE] Spark Operator Deployment Not Handling Multiple Namespaces
- [FEATURE] Prevent driver pod from being deleted before its status is processed by the operator
- [BUG] Volumes configmap not found in driver pod
- [QUESTION] How should we upgrade to newer Spark versions for the Spark Operator and long-running Spark Applications?
- [BUG] Executor run error
- [FEATURE] Reduce startup time associated with duplicated dependency downloads
- Add new Slack channel to README.md
- [BUG] Spark Operator lock identity is empty while in HA mode
- [BUG] Changed the usage of the path of an example; a strange exception occurred
- [BUG] A strange ClassCastException
- [SECURITY ISSUE] A potential risk in spark-operator which can be leveraged to make cluster-level privilege escalation
- [BUG] Env vars are randomly dropped
- [FEATURE] Add readiness and liveness probes
- [FEATURE] Add topologySpreadConstraints if .Values.replicaCount greater than 1
- [QUESTION] My SparkApplication won't start after migrating to spark:3.5.0
- [BUG] SparkApplication immediately deleted if an ownerReference is set (Argo Workflows)
- [FEATURE] Support Spark 4.0.0