Hi, I have installed Hadoop on a single node and run a job on one file in HDFS, but I want to run it on more files. I know how to upload the files; what I don't know is how to run the job on all of them.

0 votes
Apr 19, 2019 in Big Data Hadoop by حسين
• 200 points

edited Apr 29, 2019 by Gitika • 1,073 views

1 answer to this question.

0 votes

If you are talking about running multiple jobs, you can do it by submitting your jobs through the JobClient.runJob() API.

public static RunningJob runJob(JobConf job) throws IOException

There are different ways of using this: you can submit one job per input file, or add several input paths (or a whole input directory) to a single job before submitting it.
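As an illustration, here is a minimal sketch using the old org.apache.hadoop.mapred API that runJob() belongs to. The class name MultiFileJob, the commented-out mapper/reducer setup, and the HDFS paths are placeholders of my own, not part of the original answer.

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;

public class MultiFileJob {
    public static void main(String[] args) throws IOException {
        JobConf job = new JobConf(MultiFileJob.class);
        job.setJobName("multi-file-job");

        // Process every file under this HDFS directory...
        FileInputFormat.addInputPath(job, new Path("/user/hadoop/input"));
        // ...and/or add individual files, one addInputPath() call each:
        // FileInputFormat.addInputPath(job, new Path("/user/hadoop/more/file2.txt"));

        // The output directory must not exist yet.
        FileOutputFormat.setOutputPath(job, new Path("/user/hadoop/output"));

        // Set your own mapper/reducer classes here if you have them, e.g.
        // job.setMapperClass(MyMapper.class);
        // job.setReducerClass(MyReducer.class);

        // Submit the job and block until it finishes.
        RunningJob running = JobClient.runJob(job);
        System.out.println("Job successful: " + running.isSuccessful());
    }
}

Package the class into a jar and submit it with the hadoop jar command, e.g. hadoop jar multifilejob.jar MultiFileJob (the jar and class names are just examples).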

answered Apr 22, 2019 by Tina

Related Questions In Big Data Hadoop

0 votes
2 answers

Hey all, how do I get a large dataset to use with Hadoop?

Hi, To work with Hadoop you can also ...READ MORE

answered Jul 30, 2019 in Big Data Hadoop by Sunny
1,200 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
11,028 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the Old API  org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,536 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
108,832 views
–1 vote
1 answer

Hadoop dfs -ls command?

In your case there is no difference ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by kurt_cobain
• 9,350 points
4,612 views