Prevent jobs from being killed from the Web UI

0 votes
Hi. I have created a Spark application for my project and it comes with a Web UI. The problem is that I am new to Spark, and while experimenting either I or my teammates sometimes kill the jobs by accident. Is there any way I can take away the privilege to kill jobs?
Mar 6, 2019 in Apache Spark by Madhu
672 views

1 answer to this question.

0 votes

You need to be careful with this. I am not sure about taking away the kill permission for specific users, but you can disable killing of jobs and stages from the Web UI entirely. Pass the following property to spark-submit and the jobs can no longer be killed from the Web UI:

./bin/spark-submit <all your existing options> --conf spark.ui.killEnabled=false
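
If you would rather enforce this from inside the application, the same property can be set on the SparkConf before the context is created. A minimal Scala sketch, assuming a placeholder app name:

import org.apache.spark.{SparkConf, SparkContext}

// Disable the kill links in the Web UI for this application.
// "my-spark-app" is just a placeholder name for this sketch.
val conf = new SparkConf()
  .setAppName("my-spark-app")
  .set("spark.ui.killEnabled", "false")
val sc = new SparkContext(conf)

To apply it cluster-wide so nobody has to remember the flag, the property can also go into conf/spark-defaults.conf:

spark.ui.killEnabled false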
answered Mar 6, 2019 by Rohit
