How to use Spark jars for YARN distribution

0 votes
Hello. I have an archive that contains all the jars needed for the YARN cache. I want to know how to use this archive in my application. Where should I store it, and how do I make the application use it?
Mar 28, 2019 in Apache Spark by Siri
1,977 views

1 answer to this question.

0 votes

First, upload this archive to HDFS and note its path there. Then pass that path to spark-submit through the spark.yarn.archive configuration property:

./bin/spark-submit <all your existing options> --conf spark.yarn.archive=<hdfs path to archive>
answered Mar 28, 2019 by Raj
