In what kinds of use cases has Spark outperformed Hadoop in processing?

0 votes
Any ideas, anyone?
Sep 19, 2018 in Apache Spark by shams
• 3,670 points
1,131 views

1 answer to this question.

0 votes

I can list a few, though there are certainly more:

  1. Sensor data processing: Apache Spark's in-memory computing works best here, since data has to be retrieved and combined from different sources with low latency.
  2. Interactive, real-time querying of data: Spark is preferred over Hadoop MapReduce when query results are needed quickly rather than in batch.
  3. Stream processing: for processing logs and detecting fraud in live streams in order to raise alerts, Apache Spark is a strong fit (see the sketch after this list).
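
To make the stream-processing point concrete, here is a minimal Structured Streaming sketch in Scala. It is only an illustration of the idea, assuming Spark 2.x or later running in local mode and a plain socket source on localhost:9999 feeding log lines; the "FRAUD" marker and the one-minute window are hypothetical choices, not part of the original answer.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions._

  object FraudAlertStream {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("FraudAlertStream")
        .master("local[*]")   // local mode, just for this sketch
        .getOrCreate()

      import spark.implicits._

      // Read log lines as an unbounded streaming DataFrame; the socket source
      // can attach a processing-time timestamp to each line.
      val logs = spark.readStream
        .format("socket")
        .option("host", "localhost")
        .option("port", 9999)
        .option("includeTimestamp", true)
        .load()               // columns: value (String), timestamp (Timestamp)

      // Keep only lines flagged as suspicious ("FRAUD" is a made-up marker here)
      // and count them per one-minute window.
      val alerts = logs
        .filter($"value".contains("FRAUD"))
        .groupBy(window($"timestamp", "1 minute"))
        .count()

      // Print running counts to the console; a real deployment would write
      // to an alerting sink (Kafka, a database, etc.) instead.
      val query = alerts.writeStream
        .outputMode("complete")
        .format("console")
        .start()

      query.awaitTermination()
    }
  }

Because the filtered stream stays in memory across micro-batches, the same pipeline also illustrates why Spark's in-memory model suits the sensor-data and real-time-querying cases above.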
answered Sep 19, 2018 by zombie
• 3,790 points
