What is Spark Core?

0 votes
Is Spark Core the main processing unit of Spark, the way a CPU is to a computer? Please explain.
Mar 8, 2019 in Apache Spark by Ritu
3,520 views

1 answer to this question.

0 votes

It is not exactly like a CPU to a computer, but the analogy is close. Spark Core is the heart of Apache Spark: it is the base engine that handles essential functions such as fault tolerance, memory management, job and task scheduling, and interaction with storage systems. Spark Core's functionality is exposed through the Scala, Java, and Python APIs, and higher-level modules such as Spark SQL, Spark Streaming, and MLlib are built on top of it.
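To make that concrete, below is a minimal Scala sketch (the local[*] master and the app name are just assumptions for running it on a single machine with the Spark libraries on the classpath). The SparkContext, the RDD, and the job it triggers are all provided by Spark Core:

import org.apache.spark.{SparkConf, SparkContext}

object SparkCoreDemo {
  def main(args: Array[String]): Unit = {
    // SparkConf and SparkContext are Spark Core classes; local[*] just runs
    // the sketch on all cores of a single machine (an assumption for this demo).
    val conf = new SparkConf().setAppName("spark-core-demo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // parallelize() creates an RDD, Spark Core's fault-tolerant data structure.
    val numbers = sc.parallelize(1 to 100)

    // reduce() triggers a job: Spark Core schedules the tasks across partitions,
    // manages memory, and re-runs any task that fails.
    val sum = numbers.map(_ * 2).reduce(_ + _)
    println(s"Sum of doubled numbers: $sum")

    sc.stop()
  }
}

Everything in this example (scheduling the map and reduce tasks, keeping the partitions in memory, recovering a lost task from the RDD's lineage) is handled by Spark Core underneath the API calls.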

answered Mar 8, 2019 by Raj
