
You are given the task of configuring a Hadoop cluster with two NameNodes and five DataNodes to handle data generated by web crawlers.

Which of the following statements are correct for the given cluster and for MapReduce jobs run on it?

(Select all acceptable answers.)

In case of a NameNode failure, any of the five DataNodes can act as a failover.
Each of the five DataNodes will maintain its own EditLog.
For every job, five mappers will be spawned, because the number of mappers depends on the number of DataNodes.
Reducers will always run on one or more of the five DataNodes.
In a highly available cluster built from the given nodes, only one NameNode will be active.
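The last option refers to HDFS high availability, where the two NameNodes form an active/standby pair under a single logical nameservice and a ZooKeeper-based coordinator ensures that exactly one of them is active at a time. The sketch below is a minimal, hypothetical rendering of that setup using Hadoop's Configuration API; the nameservice ID `webcrawl`, the hostnames, and the ZooKeeper quorum are illustrative assumptions, not values taken from the question.

```java
import org.apache.hadoop.conf.Configuration;

/**
 * Minimal sketch of an HDFS high-availability configuration for a cluster
 * with two NameNodes (one active, one standby) and five DataNodes.
 * The nameservice ID and all hostnames are hypothetical placeholders.
 */
public class HaConfigSketch {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // One logical nameservice backed by two NameNodes, nn1 and nn2.
        conf.set("dfs.nameservices", "webcrawl");
        conf.set("dfs.ha.namenodes.webcrawl", "nn1,nn2");
        conf.set("dfs.namenode.rpc-address.webcrawl.nn1", "namenode1.example.com:8020");
        conf.set("dfs.namenode.rpc-address.webcrawl.nn2", "namenode2.example.com:8020");

        // Clients ask this proxy provider which NameNode is currently active.
        conf.set("dfs.client.failover.proxy.provider.webcrawl",
                 "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");

        // Automatic failover: ZooKeeper elects exactly one active NameNode;
        // the other stays in standby until a failover is triggered.
        conf.setBoolean("dfs.ha.automatic-failover.enabled", true);
        conf.set("ha.zookeeper.quorum",
                 "zk1.example.com:2181,zk2.example.com:2181,zk3.example.com:2181");

        System.out.println("NameNode pair for nameservice webcrawl: "
                + conf.get("dfs.ha.namenodes.webcrawl"));
    }
}
```

Note that none of the five DataNodes appears in this failover setup: DataNodes store blocks and report them to both NameNodes, while the EditLog is maintained by the NameNodes (and shared through JournalNodes or an NFS mount in an HA deployment), not by the DataNodes.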

Tags: Hadoop, HDFS DataNode, HDFS NameNode, MapReduce
Difficulty: Easy
Time: 3 min