HDFS: Upload a file from the local file system to the Edureka Cloud Lab

Hi. I am using the Edureka Cloud Lab for HDFS practice. I want to upload a file for a wordcount problem from my local system to the lab. How do I do this? Can you please mention the steps?
Jul 11, 2019 in Big Data Hadoop by Rehan

1 answer to this question.


To upload a file from your local system to Edureka's Cloud Lab for practicing an HDFS wordcount problem, follow these steps:

1. Log in to Edureka Cloud Lab:
   - Access the Edureka Cloud Lab using your login credentials.

2. Open the Terminal in the Lab:
   - Once logged in, access the terminal from the lab environment where you will interact with HDFS.

3. Navigate to the Appropriate HDFS Directory:
   - Use the `hadoop fs -ls` command to check directories and decide where you want to upload the file.
   - If needed, create a new directory using `hadoop fs -mkdir /path/to/directory`.
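   - For example, a minimal sketch (the directory path below is only a placeholder; use whatever path your lab account provides):

     ```bash
     hadoop fs -ls /user/username/
     # -p also creates any missing parent directories
     hadoop fs -mkdir -p /user/username/wordcount_input
     ```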

4. Upload the File from Your Local System:
   - Look for a "File Upload" or "Upload" option in the Cloud Lab interface; most labs provide a GUI-based upload tool. Click it, select the file from your local system, and upload it.

5. Confirm the File's Location in HDFS:
   - After uploading, list the target HDFS directory to check whether the file is already there:

     ```bash
     hadoop fs -ls /path/to/directory
     ```

   - If the file is not listed, it is still on the Cloud Lab's local filesystem; see the next step.

6. Move the File into HDFS (if required):
   - If the upload placed the file in the Cloud Lab's local filesystem rather than in HDFS, copy it into HDFS with:

     ```bash
     hadoop fs -put /local/path/to/file /hdfs/path/
     ```

   - For example:

     ```bash
     hadoop fs -put wordcount.txt /user/username/hdfsdir/
     ```
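   - If you would rather not keep the local copy in the lab's home directory once the file is in HDFS, `hadoop fs -moveFromLocal` works like `-put` but deletes the local file after copying (the paths below are just placeholders):

     ```bash
     # like -put, but removes the local copy once the upload succeeds
     hadoop fs -moveFromLocal wordcount.txt /user/username/hdfsdir/
     ```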

You should now have your wordcount file uploaded and ready for use with HDFS operations! For quick reference, the full terminal sequence is summarized below. If you encounter any specific issues, let me know.
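A minimal end-to-end sketch of the terminal part of the workflow, assuming a hypothetical user directory and file name (substitute your own paths):

```bash
# Create a target directory in HDFS (path is a placeholder)
hadoop fs -mkdir -p /user/username/wordcount_input

# Copy the uploaded file from the lab's local home directory into HDFS
hadoop fs -put wordcount.txt /user/username/wordcount_input/

# Verify that the file now appears in HDFS
hadoop fs -ls /user/username/wordcount_input/
```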

answered Jul 11, 2019 by Zora
