
NVIDIA Exam NCA-GENL Topic 1 Question 1 Discussion

Actual exam question for NVIDIA's NCA-GENL exam
Question #: 1
Topic #: 1

[Fundamentals of Machine Learning and Neural Networks]

When comparing and contrasting the ReLU and sigmoid activation functions, which statement is true?

A. ReLU is a linear function while sigmoid is non-linear.
B. ReLU is less computationally efficient than sigmoid, but it is more accurate than sigmoid.
C. ReLU and sigmoid both have a range of [0, 1].
D. ReLU is more computationally efficient, but sigmoid is better for predicting probabilities.

Suggested Answer: D

ReLU (Rectified Linear Unit) and sigmoid are activation functions used in neural networks. According to NVIDIA's deep learning documentation (e.g., cuDNN and TensorRT), ReLU, defined as f(x) = max(0, x), is computationally efficient because it involves simple thresholding, avoiding the expensive exponential calculation required by sigmoid, f(x) = 1/(1 + e^(-x)). Sigmoid outputs values in the range (0, 1), making it suitable for predicting probabilities in binary classification tasks. ReLU, with an unbounded positive range, is less suited to direct probability prediction, but it accelerates training by mitigating the vanishing gradient problem. Option A is incorrect: ReLU is non-linear (piecewise linear). Option B is false: ReLU is more efficient than sigmoid and not inherently more accurate. Option C is wrong: ReLU's range is [0, ∞), not [0, 1].
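To make the comparison concrete, here is a minimal NumPy sketch (not part of the exam question; the function names are ours) that evaluates both activations and their gradients on a few sample inputs:

```python
import numpy as np

def relu(x):
    # ReLU: simple thresholding, no exponentials -> cheap to compute
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: requires an exponential per element
    return 1.0 / (1.0 + np.exp(-x))

def relu_grad(x):
    # Gradient is 1 for x > 0 and 0 otherwise -- it never shrinks for positive inputs
    return (x > 0).astype(float)

def sigmoid_grad(x):
    # Gradient s(x) * (1 - s(x)) peaks at 0.25 (at x = 0) and vanishes for large |x|
    s = sigmoid(x)
    return s * (1.0 - s)

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print("relu(x)        :", relu(x))          # range [0, inf)
print("sigmoid(x)     :", sigmoid(x))       # range (0, 1)
print("relu_grad(x)   :", relu_grad(x))     # exactly 0 or 1
print("sigmoid_grad(x):", sigmoid_grad(x))  # <= 0.25, near 0 for large |x|
```

Running this shows sigmoid confined to (0, 1) with a gradient that never exceeds 0.25 and is nearly zero for |x| ≥ 5 (the vanishing-gradient effect the explanation refers to), while ReLU is unbounded above and keeps a gradient of exactly 1 for any positive input.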


NVIDIA cuDNN Documentation: https://docs.nvidia.com/deeplearning/cudnn/developer-guide/index.html

Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press.

Contribute your Thoughts:

Dana
5 days ago
I think the answer is A) ReLU is a linear function while sigmoid is non-linear.
upvoted 0 times
Ryan
10 days ago
I think the answer is D. ReLU is more efficient, but sigmoid is better for probabilities.
upvoted 0 times
