Hadoop MapReduce: java.lang.reflect.InvocationTargetException

+1 vote

I am getting the above-mentioned error when I run my code. The code is as follows:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class MyTest {

    MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setup() {
        MyMapper mr = new MyMapper();
        mapDriver = MapDriver.newMapDriver(mr);
        System.out.println("mapdriver: " + mapDriver);
    }

    @Test
    public void resultSuccess() throws IOException {
        mapDriver.withInput(new LongWritable(), new Text("655209;1;796764372490213;804422938115889;6"));
        mapDriver.withOutput(new Text("6"), new IntWritable(1));
        mapDriver.runTest();
    }
}



Mapper class
============


import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Mapper.Context;

public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    Text word = new Text();
    IntWritable one = new IntWritable(1);

    public void map(LongWritable key, Text value, Context con)
            throws IOException, InterruptedException {
        String strValu = value.toString();
        String[] words = strValu.split(":");
        if (Integer.parseInt(words[1]) == 1) {
            word.set(words[4]);
            con.write(word, one);
        }
    }
}

Dec 17, 2018 in Big Data Hadoop by slayer
• 29,370 points
1,699 views

1 answer to this question.

0 votes
I executed the same code and it worked; I got no errors. Create a new project, add the code there, and execute it from that project. Hopefully it will run.
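
That said, one thing worth double-checking in the code you posted: the test input ("655209;1;796764372490213;804422938115889;6") is delimited by semicolons, but the mapper splits on a colon (split(":")). With that delimiter the words array has only one element, so words[1] throws an ArrayIndexOutOfBoundsException inside map(), which may be what ends up wrapped in the InvocationTargetException you are seeing. A minimal sketch of the map() body, assuming ';' is the intended field separator:

    // Split on ';' to match the test input; guard against short records
    // before indexing into the array.
    String[] words = value.toString().split(";");
    if (words.length >= 5 && Integer.parseInt(words[1]) == 1) {
        word.set(words[4]);   // fifth field becomes the output key
        con.write(word, one); // emit (field, 1)
    }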
answered Dec 17, 2018 by Omkar
• 69,220 points
