You have to add the partition before ...READ MORE
Incremental append or load in Sqoop will ...READ MORE
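For reference, a minimal sketch of what an incremental-append import can look like; the database shop, the table orders and its id column are hypothetical placeholders:

sqoop import \
  --connect jdbc:mysql://localhost/shop \
  --username dbuser -P \
  --table orders \
  --target-dir /user/hadoop/orders \
  --incremental append \
  --check-column id \
  --last-value 100

On the next run, Sqoop imports only the rows whose id is greater than the last value it recorded.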
There are many sites you can get ...READ MORE
In HA (High Availability) architecture, we have ...READ MORE
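As a quick illustration of the active/standby pair, each NameNode's state can be queried from the shell; nn1 and nn2 are placeholder NameNode IDs taken from dfs.ha.namenodes in hdfs-site.xml:

hdfs haadmin -getServiceState nn1   # typically prints "active"
hdfs haadmin -getServiceState nn2   # typically prints "standby"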
The error which you are getting can ...READ MORE
Never mind. I forgot to run hadoop namenode ...READ MORE
Seems like the firewall is blocking the connection. ...READ MORE
Try this: stop all the daemons with ./stop-all.sh, then format the namenode: cd ...READ MORE
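For context, the usual restart-and-format sequence looks roughly like this; it assumes the scripts are in Hadoop's sbin directory, and note that formatting wipes the existing HDFS metadata:

./stop-all.sh           # stop all Hadoop daemons
hdfs namenode -format   # re-initialize the NameNode (destroys current HDFS metadata)
./start-all.sh          # bring the daemons back up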
Logs are distributed across your cluster, but ...READ MORE
You can use the FileUtil API to do this. Example: Configuration ...READ MORE
Seems like your system does not have ...READ MORE
Try adding <property> <name>dfs.name.dir</name> <value>/path/to/hdfs/dir</value> ...READ MORE
Run the command as sudo or add the ...READ MORE
It can be controlled by setting the ...READ MORE
While running Scala, Scala objects are translated ...READ MORE
You have forgotten to include the package name ...READ MORE
First, format the namenode and then try ...READ MORE
There are two possible reasons for this: Wrong ...READ MORE
You can see the free available space ...READ MORE
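For example, either of these commands reports free HDFS capacity (the second one needs HDFS superuser rights):

hdfs dfs -df -h /       # human-readable used and available space for the filesystem
hdfs dfsadmin -report   # per-DataNode capacity, used and remaining space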
You can use the SPARK_MAJOR_VERSION for this. Suppose ...READ MORE
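A short sketch of that approach on an HDP-style install where both Spark 1.x and 2.x clients are present:

export SPARK_MAJOR_VERSION=2   # select the Spark 2 client for this session
spark-shell                    # now launches against Spark 2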
1) Family Delete Marker - This marker marks all ...READ MORE
Make the following changes to the hadoop-env.sh ...READ MORE
You can increase the threshold in yarn-site.xml <property> ...READ MORE
Type jps and check whether NameNode and DataNode are ...READ MORE
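For instance, a healthy single-node setup shows something like this (the process IDs are illustrative):

$ jps
4721 NameNode
4903 DataNode
5112 SecondaryNameNode
5344 ResourceManager
5530 NodeManager

If NameNode or DataNode is missing from the list, that daemon did not come up.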
To rectify this error, you need to ...READ MORE
Make sure you have built Nutch from ...READ MORE
You have to write this directory in ...READ MORE
Below are the versions which can be used ...READ MORE
In Hadoop, we do not create different ...READ MORE
Hi. This is the code I used ...READ MORE
It seems HBase did not start properly ...READ MORE
Yes, you can update the data before ...READ MORE
First run sudo service mysqld restart, then mysql -u <username> root ...READ MORE
You can use the DESCRIBE command to ...READ MORE
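Depending on which tool the question is about, two hedged examples; my_table is a placeholder name:

hive -e "DESCRIBE FORMATTED my_table;"     # Hive: columns, types and storage details
echo "describe 'my_table'" | hbase shell   # HBase: column families and their settings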
You can use the hdfs command: hdfs dfs ...READ MORE
Hello. The -m or --num-mappers is just a ...READ MORE
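For example, this hypothetical import asks Sqoop for four parallel map tasks; Sqoop treats the value as a hint rather than a guarantee:

sqoop import \
  --connect jdbc:mysql://localhost/shop \
  --table orders \
  --num-mappers 4   # equivalent to -m 4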
The error which you are getting, i.e. ...READ MORE
You can try the code mentioned below ...READ MORE
Hey, try this code import java.io.IOException; import java.util.Iterator; import java.util.StringTokenizer; import ...READ MORE
Seems like the jar file was not ...READ MORE
Hadoop has a special file system called ...READ MORE
In HDFS, data and metadata are decoupled. ...READ MORE
Please refer to the commands below: student = ...READ MORE
The error you are getting is AvroWrapper class not ...READ MORE
Try this: val text = sc.wholeTextFiles("student/*"); text.collect() ...READ MORE
import commands; hdir_list = commands.getoutput('hadoop fs -ls hdfs: ...READ MORE
Yes, you can check the free space using ...READ MORE
The status of Hue is shown in ...READ MORE
Try this: First, click on File > Import Appliance. Now ...READ MORE
The combiner function is not preferred because when ...READ MORE