Hi, RDD in Spark stands for Resilient Distributed ...READ MORE
Can anyone explain what immutability is in ...READ MORE
A web crawler is a program or automated ...READ MORE
Hi, You can use a simple mathematical calculation ...READ MORE
Dataframe creation commands: Now we will register them ...READ MORE
Can anyone suggest when we create an ...READ MORE
Hi, No, not mandatory, but there is no ...READ MORE
Hey, Spark Core is the base engine of ...READ MORE
Hi, No. An RDD is made up of ...READ MORE
peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string] The above that ...READ MORE
Can anyone suggest how to create RDD ...READ MORE
Hi, You can use for loop in scala using ...READ MORE
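Independent of the truncated answer above, a minimal sketch of a Scala for loop (names are illustrative):

```scala
object ForLoopDemo extends App {
  // Loop over an inclusive range
  for (i <- 1 to 3) {
    println(s"iteration $i")
  }

  // Loop over the elements of a collection
  for (word <- Seq("spark", "scala")) {
    println(word)
  }
}
```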
Hi, There is a term in Scala that is ...READ MORE
Hi, You can follow this example to know ...READ MORE
Hi, Yield keyword can be used either before ...READ MORE
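As a quick illustration of `yield` in a for comprehension:

```scala
// With yield, the for comprehension builds and returns a new collection
val doubled = for (i <- 1 to 3) yield i * 2
println(doubled) // Vector(2, 4, 6)
```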
If you need a single output file ...READ MORE
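One common way to get a single output file is to coalesce the RDD to one partition before saving; a minimal sketch, with an illustrative output path:

```scala
import org.apache.spark.sql.SparkSession

object SingleOutputFile extends App {
  val spark = SparkSession.builder()
    .appName("single-output-file")
    .master("local[*]")
    .getOrCreate()

  val rdd = spark.sparkContext.parallelize(Seq("a", "b", "c"))

  // coalesce(1) merges all partitions, so saveAsTextFile writes one part file
  rdd.coalesce(1).saveAsTextFile("/tmp/single-output") // illustrative path
  spark.stop()
}
```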
Hi, You can use two level loops using the ...READ MORE
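Two generators in one comprehension behave like nested (two-level) loops; a small sketch:

```scala
// The second generator is iterated fully for each value of the first
for (i <- 1 to 2; j <- 1 to 2) {
  println(s"i=$i, j=$j")
}
```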
Hi, The transformations are the functions that are ...READ MORE
Some of the issues I have faced ...READ MORE
Variable declaration can be done in two ...READ MORE
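The two declaration forms in Scala are `val` and `var`; a minimal sketch:

```scala
val immutable = 10  // 'val' cannot be reassigned
var mutable = 20    // 'var' can be reassigned
mutable = 25
// immutable = 15   // would not compile: reassignment to val
```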
Hey, Reduce action converts an RDD to a ...READ MORE
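A minimal sketch of the `reduce` action (run locally; the app name is illustrative):

```scala
import org.apache.spark.sql.SparkSession

object ReduceDemo extends App {
  val spark = SparkSession.builder()
    .appName("reduce-demo")
    .master("local[*]")
    .getOrCreate()

  val rdd = spark.sparkContext.parallelize(1 to 5)
  // reduce is an action: it collapses the RDD to a single value on the driver
  val sum = rdd.reduce(_ + _) // 15
  println(sum)
  spark.stop()
}
```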
Hi, Yes, in scala there is a guard condition where ...READ MORE
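A guard is the `if` clause inside a for comprehension; a small sketch:

```scala
// The 'if' clause filters the generator's values before the yield runs
val evens = for (i <- 1 to 10 if i % 2 == 0) yield i
println(evens) // Vector(2, 4, 6, 8, 10)
```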
Hi, Spark’s RDDs are by default recomputed each ...READ MORE
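Because RDDs are recomputed on each action by default, caching is the usual remedy; a minimal sketch:

```scala
import org.apache.spark.sql.SparkSession

object CacheDemo extends App {
  val spark = SparkSession.builder()
    .appName("cache-demo")
    .master("local[*]")
    .getOrCreate()

  val rdd = spark.sparkContext.parallelize(1 to 100).map(_ * 2)
  rdd.cache()   // persist in memory so later actions reuse it instead of recomputing
  rdd.count()   // first action materializes the cache
  rdd.take(5)   // served from the cached data
  spark.stop()
}
```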
Hi, Spark ecosystem libraries are composed of various ...READ MORE
There is a difference between the two: mapValues ...READ MORE
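A side-by-side sketch of `mapValues` versus `map` on a pair RDD:

```scala
import org.apache.spark.sql.SparkSession

object MapValuesDemo extends App {
  val spark = SparkSession.builder()
    .appName("mapvalues-demo")
    .master("local[*]")
    .getOrCreate()

  val pairs = spark.sparkContext.parallelize(Seq(("a", 1), ("b", 2)))

  // mapValues touches only the values; keys (and any partitioner) are kept
  val bumped = pairs.mapValues(_ + 1)               // ("a",2), ("b",3)

  // map sees the whole tuple and may change the keys as well
  val swapped = pairs.map { case (k, v) => (v, k) } // (1,"a"), (2,"b")

  println(bumped.collect().toList)
  println(swapped.collect().toList)
  spark.stop()
}
```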
For spark.read.textFile we need spark-2.x. Please try ...READ MORE
The statement display(id, name, salary) is written before the display function ...READ MORE
You have to specify a comma-separated list ...READ MORE
You can set extra JVM options that ...READ MORE
By default, each task is allocated with ...READ MORE
You can change the location where you ...READ MORE
To generate the output file, you can ...READ MORE
Run the below commands: spark-class org.apache.spark.deploy.master.Master and spark-class org.apache.spark.deploy.worker.Worker spark://192.168.254.1:7077 NOTE: The ...READ MORE
First create a Spark session like this: val ...READ MORE
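A typical SparkSession setup looks like this (the app name and master are illustrative):

```scala
import org.apache.spark.sql.SparkSession

object SessionDemo extends App {
  val spark = SparkSession.builder()
    .appName("my-app")    // illustrative name
    .master("local[*]")   // use all local cores; omit when submitting to a cluster
    .getOrCreate()

  println(spark.version)
  spark.stop()
}
```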
To get command prompt for Scala open ...READ MORE
Every Spark application has the same fixed heap ...READ MORE
Fold in Spark: Fold is a very powerful ...READ MORE
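A minimal sketch of `fold` on an RDD:

```scala
import org.apache.spark.sql.SparkSession

object FoldDemo extends App {
  val spark = SparkSession.builder()
    .appName("fold-demo")
    .master("local[*]")
    .getOrCreate()

  val rdd = spark.sparkContext.parallelize(1 to 4)
  // fold is like reduce but takes a neutral "zero" value,
  // applied once per partition and once when combining results
  val total = rdd.fold(0)(_ + _) // 10
  println(total)
  spark.stop()
}
```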
Spark by default won't let you overwrite ...READ MORE
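The usual way around the no-overwrite default is an explicit save mode; a sketch with an illustrative path:

```scala
import org.apache.spark.sql.SparkSession

object OverwriteDemo extends App {
  val spark = SparkSession.builder()
    .appName("overwrite-demo")
    .master("local[*]")
    .getOrCreate()

  val df = spark.range(5).toDF("id")
  // The default SaveMode errors out if the path exists; "overwrite" replaces it
  df.write.mode("overwrite").parquet("/tmp/overwrite-demo") // illustrative path
  spark.stop()
}
```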
The default time that the Yarn application waits ...READ MORE
It is not like a CPU to ...READ MORE
To enable cleanup, open the spark shell ...READ MORE
To change to version 2, run the ...READ MORE
You can disable it like this: val sc ...READ MORE
You can do it dynamically like this: val ...READ MORE
You can run the Spark shell for ...READ MORE
By default, the maximum number of times ...READ MORE
I don't think you can copy and ...READ MORE
If you are running history server and ...READ MORE
There are different methods to achieve optimization ...READ MORE
To make Spark store the event logs, ...READ MORE
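For reference, event logging is usually enabled through spark-defaults.conf; the log directory below is illustrative:

```
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs:///spark-logs
```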