
Amazon Exam MLS-C01 Topic 4 Question 98 Discussion

Actual exam question for Amazon's MLS-C01 exam
Question #: 98
Topic #: 4
[All MLS-C01 Questions]

This graph shows the training and validation loss against epochs for a neural network.

The network being trained is as follows:

* Two dense layers plus one output neuron

* 100 neurons in each layer

* 100 epochs

* Random initialization of weights

Which technique can be used to improve model performance in terms of accuracy in the validation set?

Suggested Answer: A

Stratified sampling is a technique that preserves the class distribution of the original dataset when creating a smaller or split dataset. This means that the proportion of examples from each class in the original dataset is maintained in the smaller or split dataset. Stratified sampling can help improve the validation accuracy of the model by ensuring that the validation dataset is representative of the original dataset and not biased towards any class. This can reduce the variance and overfitting of the model and increase its generalization ability. Stratified sampling can be applied to both oversampling and undersampling methods, depending on whether the goal is to increase or decrease the size of the dataset.

The other options are not effective ways to improve the validation accuracy of the model. Acquiring additional data about the majority classes in the original dataset will only increase the imbalance and make the model more biased towards the majority classes. Using a smaller, randomly sampled version of the training dataset will not guarantee that the class distribution is preserved and may result in losing important information from the minority classes. Performing systematic sampling on the original dataset will also not ensure that the class distribution is preserved and may introduce sampling bias if the original dataset is ordered or grouped by class.
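As a sketch of the stratified sampling described above (the toy dataset here is invented for illustration; the actual exam scenario's data is not shown), scikit-learn's `train_test_split` accepts a `stratify` argument that preserves the class ratio in both splits:

```python
# Illustrative only: a 9:1 imbalanced toy dataset, not the exam's data.
from collections import Counter
from sklearn.model_selection import train_test_split

X = [[i] for i in range(100)]
y = [0] * 90 + [1] * 10  # 90 majority-class samples, 10 minority

# stratify=y keeps the 9:1 class ratio in both the train and validation sets
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

print(Counter(y_train))  # Counter({0: 72, 1: 8})
print(Counter(y_val))    # Counter({0: 18, 1: 2})
```

Without `stratify=y`, a random 20% split of this dataset could easily contain zero or one minority-class example, making the validation accuracy a poor estimate of generalization.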

References:

* Stratified Sampling for Imbalanced Datasets

* Imbalanced Data

* Tour of Data Sampling Methods for Imbalanced Classification


Contribute your Thoughts:

Tamra
1 month ago
If this was a cake-baking exam, I'd say the solution is to add more sugar. But since it's a neural network, I guess I'll have to use my brain instead of my sweet tooth.
upvoted 0 times
Lavera
14 days ago
C) Increasing the number of epochs
upvoted 0 times
...
Mariann
16 days ago
B) Random initialization of weights with appropriate seed
upvoted 0 times
...
Ming
1 month ago
A) Early stopping
upvoted 0 times
...
...
Rosina
2 months ago
Random initialization of weights with an appropriate seed? Sounds like a job for the weights and biases fairy. I wonder if they take résumés.
upvoted 0 times
Torie
13 days ago
Random initialization of weights with an appropriate seed is crucial for training neural networks effectively.
upvoted 0 times
...
Raylene
24 days ago
Adding another layer with 100 neurons might also improve the model performance.
upvoted 0 times
...
Esteban
1 month ago
Early stopping could help improve accuracy in the validation set.
upvoted 0 times
...
...
Alecia
2 months ago
Adding another layer with 100 neurons? Seriously, that's like throwing more spaghetti at the wall, hoping it sticks. Not a very strategic approach.
upvoted 0 times
Alpha
9 days ago
C) Increasing the number of epochs
upvoted 0 times
...
Erick
18 days ago
B) Random initialization of weights with appropriate seed
upvoted 0 times
...
Caprice
1 month ago
A) Early stopping
upvoted 0 times
...
...
Sena
2 months ago
Increasing the number of epochs won't help here. The model has already converged, and continuing to train would just lead to more overfitting.
upvoted 0 times
...
Gail
2 months ago
I agree with Martina; increasing the number of epochs can help improve model performance.
upvoted 0 times
...
Lilli
2 months ago
The training and validation loss curves indicate that the model is overfitting. Early stopping would be the best choice to prevent overfitting and improve validation accuracy.
upvoted 0 times
Eleonore
21 days ago
Random initialization of weights with appropriate seed could also help in preventing overfitting and improving model performance.
upvoted 0 times
...
Joana
21 days ago
Increasing the number of epochs might not necessarily improve validation accuracy, early stopping is a better approach.
upvoted 0 times
...
Glen
29 days ago
I agree, adding another layer with 100 neurons might make the overfitting issue worse.
upvoted 0 times
...
Nancey
1 month ago
Early stopping would definitely help prevent overfitting in this case.
upvoted 0 times
...
Wenona
1 month ago
C) Increasing the number of epochs
upvoted 0 times
...
Lashandra
2 months ago
A) Early stopping
upvoted 0 times
...
...
Martina
2 months ago
I disagree, I believe the answer is C) Increasing the number of epochs.
upvoted 0 times
...
Launa
3 months ago
I think the answer is A) Early stopping.
upvoted 0 times
...
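Several commenters above point to early stopping as the remedy for the overfitting visible in the loss curves. A minimal sketch using scikit-learn's `MLPClassifier` (a stand-in for the question's two-hidden-layer network; the synthetic dataset and the patience setting are assumptions, not taken from the exam):

```python
# Sketch only: synthetic data in place of the exam's (unshown) dataset.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Two hidden layers of 100 neurons, mirroring the question's network.
# early_stopping=True holds out a validation fraction and halts training
# when the validation score stops improving for n_iter_no_change epochs.
clf = MLPClassifier(
    hidden_layer_sizes=(100, 100),
    max_iter=100,
    early_stopping=True,
    n_iter_no_change=5,
    random_state=0,
)
clf.fit(X, y)
print(clf.n_iter_)  # typically stops before the max of 100 epochs
```

The design point matches the comments: rather than training for all 100 epochs while validation loss climbs, training halts near the epoch where validation performance peaked.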
