Splunk Exam SPLK-2002 Topic 1 Question 78 Discussion

Actual exam question for Splunk's SPLK-2002 exam
Question #: 78
Topic #: 1

What is the best method for sizing or scaling a search head cluster?

A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads.
C) Divide the number of indexers by three.
D) Estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head.

Suggested Answer: B
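
For anyone who wants to see the arithmetic behind the two most-discussed options, here is a minimal sketch in Python. All workload and hardware numbers are hypothetical placeholders chosen only to illustrate the division, not figures from Splunk documentation; real sizing should follow Splunk's own capacity planning guidance.

import math

# Hypothetical inputs -- placeholders, not numbers from Splunk docs.
total_searches_per_day = 12_000   # estimated total daily searches (option B input)
max_concurrent_searches = 48      # estimated peak concurrent searches (option D input)
cores_per_search_head = 16        # CPU cores on each search head

# Option B: total searches per day divided by the cores available.
# With N search heads the cluster offers N * cores_per_search_head cores,
# so this shows the average per-core daily load at each cluster size.
for n_heads in (1, 2, 3):
    load = total_searches_per_day / (n_heads * cores_per_search_head)
    print(f"{n_heads} head(s): {load:.0f} searches per core per day")

# Option D: peak concurrent searches divided by cores per search head.
# Assuming roughly one core per concurrent search, this yields a head count.
heads_for_peak = math.ceil(max_concurrent_searches / cores_per_search_head)
print(f"Heads needed for peak concurrency: {heads_for_peak}")  # ceil(48/16) = 3

Note the contrast the commenters raise below: B averages the load over a whole day, while D sizes for the peak; which figure should dominate depends on how bursty the search workload is.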

Contribute your Thoughts:

Filiberto
2 months ago
Hmm, that's an interesting perspective. I can see how that method could also work well for sizing a search head cluster.
upvoted 0 times
...
Theron
2 months ago
I disagree; I believe option D) Estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head is the way to go.
upvoted 0 times
...
Filiberto
2 months ago
I think the best method is A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times
...
Adelina
2 months ago
I personally think option B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads makes the most sense.
upvoted 0 times
...
Tom
2 months ago
Haha, C is just ridiculous. Might as well just roll a die to determine the number of search heads.
upvoted 0 times
Krissy
25 days ago
B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads.
upvoted 0 times
...
Harley
27 days ago
A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times
...
Amber
1 month ago
B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads.
upvoted 0 times
...
Sina
1 month ago
A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times
...
...
Fletcher
2 months ago
Hey, I was going to pick C, but that sounds like a total guess. Dividing indexers by 3? What kind of magic number is that?
upvoted 0 times
...
Sherly
2 months ago
I disagree; I believe option D) Estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head is the way to go.
upvoted 0 times
...
Michell
2 months ago
I was about to choose B, but D makes more sense. Gotta account for those peak concurrency numbers, not just total volume.
upvoted 0 times
...
Abel
2 months ago
Hmm, D seems like the most reasonable approach. Sizing the search head cluster based on the maximum concurrent searches makes the most sense to me.
upvoted 0 times
Bulah
25 days ago
D does seem like a logical choice for scaling the search head cluster.
upvoted 0 times
...
Margot
1 month ago
I would go with D as well. Sizing based on concurrent searches is practical.
upvoted 0 times
...
Silvana
1 month ago
I agree; it seems like the most reasonable approach.
upvoted 0 times
...
Emerson
2 months ago
I think D is the way to go. Sizing based on maximum concurrent searches makes sense.
upvoted 0 times
...
...
Viki
2 months ago
I think the best method is A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times
...
