CertNexus Exam AIP-210 Topic 6 Question 24 Discussion

Actual exam question for CertNexus's AIP-210 exam
Question #: 24
Topic #: 6

A company is developing a merchandise sales application. The product team uses training data to train an AI model to predict sales, and discovers emergent bias. What caused the biased results?

Suggested Answer: B

Workflow design patterns for machine learning pipelines are common solutions to recurring problems in building and managing machine learning workflows. One of these patterns is to represent a pipeline with a directed acyclic graph (DAG), which is a graph that consists of nodes and edges, where each node represents a step or task in the pipeline, and each edge represents a dependency or order between the tasks. A DAG has no cycles, meaning there is no way to start at one node and return to it by following the edges. A DAG can help visualize and organize the pipeline, as well as facilitate parallel execution, fault tolerance, and reproducibility.
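To make the idea concrete, here is a minimal, illustrative Python sketch of a pipeline represented as a DAG, with an execution order derived via a topological sort (Kahn's algorithm). The task names (ingest, clean, features, validate_schema, train, evaluate) are hypothetical and not tied to any particular orchestration tool:

    # Minimal sketch: an ML pipeline as a directed acyclic graph (DAG).
    # Keys are pipeline steps; values list the steps each one depends on.
    from collections import deque

    pipeline = {
        "ingest":          [],
        "clean":           ["ingest"],
        "features":        ["clean"],
        "validate_schema": ["clean"],
        "train":           ["features", "validate_schema"],
        "evaluate":        ["train"],
    }

    def topological_order(dag):
        """Return an execution order that respects every dependency (Kahn's algorithm)."""
        indegree = {task: len(deps) for task, deps in dag.items()}
        dependents = {task: [] for task in dag}
        for task, deps in dag.items():
            for dep in deps:
                dependents[dep].append(task)
        ready = deque(task for task, d in indegree.items() if d == 0)
        order = []
        while ready:
            task = ready.popleft()
            order.append(task)
            for nxt in dependents[task]:
                indegree[nxt] -= 1
                if indegree[nxt] == 0:
                    ready.append(nxt)
        if len(order) != len(dag):
            raise ValueError("Cycle detected: not a valid DAG")
        return order

    print(topological_order(pipeline))
    # One valid order: ['ingest', 'clean', 'features', 'validate_schema', 'train', 'evaluate']

In the same spirit, orchestrators such as Apache Airflow and Kubeflow Pipelines model workflows as DAGs; because "features" and "validate_schema" above share no dependency on each other, a scheduler could run them in parallel once "clean" finishes.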


Contribute your Thoughts:

Dottie
2 months ago
I'm going to have to go with C. Flawed expectations? Sounds like the team was playing a game of 'Guess the Bias' instead of 'Predict the Sales'.
upvoted 0 times
Nguyet
23 days ago
C) The team set flawed expectations when training the model.
upvoted 0 times
...
Jenelle
24 days ago
B) The application was migrated from on-premise to a public cloud.
upvoted 0 times
...
Bettyann
1 month ago
A) The AI model was trained in winter and applied in summer.
upvoted 0 times
...
...
Kelvin
2 months ago
Nah, I'm sticking with option A. Training in winter and applying in summer? That's a recipe for disaster. Looks like the team needed to invest in a seasonal wardrobe for their AI model.
upvoted 0 times
...
Janessa
2 months ago
Oh, I'm feeling lucky with B. Migrating to the cloud? That's bound to introduce all kinds of unexpected biases. Gotta love technology, am I right?
upvoted 0 times
Kattie
13 days ago
C) The team set flawed expectations when training the model.
upvoted 0 times
...
Garry
18 days ago
B) The application was migrated from on-premise to a public cloud.
upvoted 0 times
...
Brandon
1 month ago
A) The AI model was trained in winter and applied in summer.
upvoted 0 times
...
...
Lashanda
2 months ago
I don't know, D seems like the obvious choice to me. Inaccurate training data is a surefire way to get biased predictions. Maybe the team should have used a crystal ball instead?
upvoted 0 times
...
Valentine
2 months ago
Hmm, I'm gonna go with option C. Flawed expectations when training the model could definitely lead to biased results. Rookie mistake, but it happens.
upvoted 0 times
...
Gilma
2 months ago
Maybe the team should have set better expectations during training to avoid bias.
upvoted 0 times
...
Lera
3 months ago
I agree with Ernest, using inaccurate data can definitely lead to biased results.
upvoted 0 times
...
Ernest
3 months ago
I think the biased results were caused by inaccurate training data.
upvoted 0 times
...
