What is the command to check the number of cores in Spark?
My spark.cores.max property is 24 and I have 3 worker nodes. When I log into a worker node, I can see only one process running, and it is the one consuming CPU. I don't think it is using all 8 cores. How can I check the number of cores actually in use?
Go to your Spark Web UI; it lists the number of cores available and in use on each worker.
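The same worker/core information shown in the Web UI is also exposed by the standalone master as JSON (typically at http://&lt;master&gt;:8080/json). Below is a minimal sketch of summing total and used cores from that payload; the helper name `total_cores` and the worker hostnames in the sample are my own illustrative choices, not part of Spark.

```python
import json

def total_cores(master_json: str) -> dict:
    """Sum total and in-use cores across all workers reported by the
    standalone master's JSON endpoint (http://<master>:8080/json)."""
    info = json.loads(master_json)
    total = sum(w["cores"] for w in info["workers"])
    used = sum(w["coresused"] for w in info["workers"])
    return {"total": total, "used": used}

# Illustrative payload shaped like the master's response; hostnames
# and numbers are made up to match the question (3 workers x 8 cores).
sample = json.dumps({
    "workers": [
        {"host": "worker-1", "cores": 8, "coresused": 8},
        {"host": "worker-2", "cores": 8, "coresused": 8},
        {"host": "worker-3", "cores": 8, "coresused": 8},
    ]
})
print(total_cores(sample))  # {'total': 24, 'used': 24}
```

If `used` is well below `total`, the application is not being granted (or not requesting) all the cores that `spark.cores.max` allows.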