By default, the cleanup TTL (spark.worker.cleanup.appDataTtl) is set to 604800 seconds (7 days). You can override it when submitting your application:

./bin/spark-submit <all your existing options> --conf spark.worker.cleanup.appDataTtl=<time in seconds>
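Note that this TTL only takes effect when the worker's cleanup feature is turned on (spark.worker.cleanup.enabled, which is false by default). A sketch of how you might enable it cluster-wide on each standalone worker, assuming a 1-day TTL for illustration:

```shell
# conf/spark-env.sh on each worker node (standalone mode).
# Enable periodic cleanup of finished applications' work directories
# and expire application data after 1 day (86400 seconds).
SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.appDataTtl=86400"
```

Restart the workers after changing spark-env.sh for the new options to apply; the cleanup only affects directories of stopped applications.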