SparkContext sets up internal services and establishes ...READ MORE
First, reboot the system. After the reboot, ...READ MORE
This should work: def readExcel(file: String): DataFrame = ...READ MORE
Source tags are different: { x : [ { ...READ MORE
Hey, Here is an example which will return ...READ MORE
Assuming your RDD[row] is called rdd, you ...READ MORE
Hey, You can use the subtractByKey() function to ...READ MORE
You can try the below code: df.registerTempTable("airports") sqlContext.sql(" create ...READ MORE
Converting text file to Orc: Using Spark, the ...READ MORE
By default, the timeout is set to ...READ MORE
After downloading Spark, you need to set ...READ MORE
Refer to the below code: import org.apache.hadoop.conf.Configuration import org.apache.hadoop.fs.FileSystem import ...READ MORE
Hey, you can use the "contains" filter to extract ...READ MORE
Try this code: val rdd = sc.textFile("file.txt", 5) rdd.partitions.size Output ...READ MORE
Hey, Jobs- to view all the spark jobs Stages- ...READ MORE
Hi, Regarding this error, you just need to change ...READ MORE
Hey, There are a few methods provided by the ...READ MORE
Start the Spark shell using the below line of ...READ MORE
I found the following solution to be ...READ MORE
Hey, Lineage is an RDD process to reconstruct ...READ MORE
The reason you are able to load ...READ MORE
Did you find any documents or example ...READ MORE
All prefix operators' symbols are predefined: +, -, ...READ MORE
You can try this: object printarray { ...READ MORE
Well, it depends on the block of ...READ MORE
The error message you have shared with ...READ MORE
spark.read.csv is used when loading into a ...READ MORE
Hi, If you have a file with id ...READ MORE
Hey, You can use this command to start ...READ MORE
There are a few reasons for keeping RDD ...READ MORE
Hi, SparkSQL is a special component on the ...READ MORE
What is the major use case for ...READ MORE
As is widely used, and has different ...READ MORE
Below is an example of reading data ...READ MORE
Hey, You can concatenate/join two Maps in more than ...READ MORE
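The answer above is truncated before it shows any of the ways to join two Maps. As a rough, hedged sketch of the same idea (shown here in plain Python dicts rather than Scala Maps, with assumed sample data; on duplicate keys the right-hand map wins):

```python
# Sketch: two ways to merge ("concatenate") two maps, here as Python dicts.
# The sample data is assumed, not from the original answer.
a = {"spark": 1, "scala": 2}
b = {"scala": 3, "hadoop": 4}

merged_unpack = {**a, **b}   # dict unpacking; b's values win on shared keys
merged_update = dict(a)      # copy a, then merge b in place
merged_update.update(b)

print(merged_unpack)  # {'spark': 1, 'scala': 3, 'hadoop': 4}
```

In Scala the analogous operation would be `map1 ++ map2`, which likewise prefers the right-hand map's values for duplicate keys.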
Hi, You can either declare an empty Scala ...READ MORE
Hi, This error is generated only when you ...READ MORE
Yes, we can work with Avro files ...READ MORE
Hey, I guess the only problem with the ...READ MORE
You can try this: d.filter(col("value").isin(desiredThings: _*)) and if you ...READ MORE
df = spark.createDataFrame([("A", 2000), ("A", 2002), ("A", ...READ MORE
The problem is probably with the command. ...READ MORE
Hi, This happens in Scala whenever you don't ...READ MORE
Try this and see if this does ...READ MORE
Hey, Yes, there are two ways of doing ...READ MORE
Hi, Spark provides a pipe() method on RDDs. ...READ MORE
Hey, Java's "if-else": In Java, "if-else" is a statement, ...READ MORE
You can use this: lines = sc.textFile("hdfs://path/to/file/filename.txt"); def isFound(line): if ...READ MORE
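The PySpark snippet above is cut off before the body of `isFound`. Outside Spark, the same "does any line match" check can be sketched in plain Python; the helper name, sample lines, and search terms below are assumptions for illustration, not part of the original answer:

```python
# Sketch: check whether any line of a text file contains a search term.
# Mirrors the isFound(line) predicate idea from the truncated Spark answer.
def is_found(lines, term):
    # True if the term occurs in at least one line.
    return any(term in line for line in lines)

sample_lines = ["spark is fast", "hadoop stores data"]
print(is_found(sample_lines, "spark"))   # True
print(is_found(sample_lines, "flink"))   # False
```

In the Spark version, the equivalent step would be filtering the `lines` RDD with the predicate and checking whether the result is non-empty.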
Try this code, it worked for me: val ...READ MORE