Which of the following is NOT an activation function?
An activation function determines the output of a neuron in a neural network based on its input. By introducing non-linearity, it allows the network to model complex, non-linear relationships between inputs and outputs. Some common activation functions are:
Sigmoid: Maps any real value to a value between 0 and 1. It has an S-shaped curve and is often used for binary classification or probability estimation.
Hyperbolic tangent (tanh): Maps any real value to a value between -1 and 1. Its shape is similar to the sigmoid's, but it is symmetric around the origin. It is often used for regression or classification problems.
ReLU (rectified linear unit): Maps any negative value to 0 and any positive value to itself. It has a piecewise linear shape and is often used for hidden layers in deep neural networks.
Additive is not an activation function, but rather a term that describes a property of some functions: an additive function satisfies f(x + y) = f(x) + f(y) for all x and y. Such functions are linear, so they have a constant slope and do not introduce non-linearity.
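The three activation functions above can be sketched in a few lines of NumPy (a minimal illustration, not tied to any particular deep learning framework):

```python
import numpy as np

def sigmoid(x):
    # Maps any real value into (0, 1); S-shaped curve
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Maps any real value into (-1, 1); symmetric around the origin
    return np.tanh(x)

def relu(x):
    # Maps negative values to 0 and positive values to themselves
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # all values between 0 and 1
print(tanh(x))     # all values between -1 and 1
print(relu(x))     # negatives clipped to 0
```

Note that none of these are additive: for example, sigmoid(1 + 1) does not equal sigmoid(1) + sigmoid(1), which is exactly the non-linearity that makes them useful.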
Which of the following criteria are essential for machine learning models to achieve before deployment? (Select two.)
Scalability and explainability are two criteria that are essential for ML models before deployment. Scalability is the ability of a model to handle increasing amounts of data or requests without compromising its performance or quality; it helps ensure the model can meet the demand and expectations of users or customers and adapt to changing conditions or environments. Explainability is the ability of a model to provide clear, intuitive explanations for its predictions or decisions; it helps build trust and confidence among users and stakeholders and enables accountability for the model's actions and outcomes.
You have a dataset with many features that you are using to classify a dependent variable. Because the sample size is small, you are worried about overfitting. Which algorithm is ideal to prevent overfitting?
Random forest is ideal for preventing overfitting when the dataset has many features and a small sample size. It is an ensemble learning method that combines multiple decision trees into a more robust and accurate model. Random forest curbs overfitting by introducing randomness and diversity: each tree is trained on a bootstrap sample (sampling with replacement) of the data, and each node split considers only a random subset of the features.
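A hedged sketch of this setup using scikit-learn's RandomForestClassifier; the dataset here is synthetic (generated with make_blobs-style helpers), chosen only to mimic the "small sample, many features" scenario:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Small sample, many features: the setting where overfitting is a concern
X, y = make_classification(n_samples=100, n_features=50,
                           n_informative=5, random_state=0)

clf = RandomForestClassifier(
    n_estimators=200,     # many trees averaged together
    max_features="sqrt",  # random subset of features at each split
    bootstrap=True,       # sample with replacement for each tree
    random_state=0,
)

# Cross-validation gives an honest estimate of generalization
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

The `max_features="sqrt"` and `bootstrap=True` arguments correspond directly to the two sources of randomness described above.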
Which of the following regressions will help when there is the existence of near-linear relationships among the independent variables (collinearity)?
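Ridge regression is the usual remedy here: its L2 penalty shrinks correlated coefficients toward moderate values and stabilizes the estimates. A minimal sketch with scikit-learn on synthetic, deliberately collinear data (illustrative only):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.01, size=200)  # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 3 * x1 + rng.normal(scale=0.1, size=200)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# Under collinearity, OLS coefficients can swing to large opposite values;
# the L2 penalty pulls the ridge coefficients toward similar, moderate ones.
print(ols.coef_, ridge.coef_)
```

Larger values of `alpha` increase the shrinkage; `alpha=0` recovers ordinary least squares.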
The graph is an elbow plot showing the inertia (within-cluster sum of squares) on the y-axis and the number of clusters K on the x-axis, denoting how the inertia changes as K varies under the k-means algorithm.
What would be an optimal value of K to ensure a good number of clusters?
The optimal value of K is the one that minimizes the inertia (within-cluster sum of squares) while avoiding so many clusters that the model overfits the data. The elbow plot shows a sharp decrease in inertia from K = 1 to K = 2, then a more gradual decrease from K = 2 to K = 3; after K = 3, the inertia barely changes as K increases. The elbow point is therefore at K = 3, which is the optimal value of K for this data. Reference: How to Run K-Means Clustering in Python; K-means clustering, Wikipedia.
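The elbow method can be reproduced in a short scikit-learn sketch: fit k-means for a range of K values and track the inertia. The data below is synthetic with three true clusters, so the elbow should appear near K = 3 (an illustration of the technique, not the exam's actual plot):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic data with 3 well-separated clusters
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

inertias = {}
for k in range(1, 7):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    inertias[k] = km.inertia_  # within-cluster sum of squares

for k, val in inertias.items():
    print(k, round(val, 1))
```

Plotting `inertias` against K would show the steep drop up to K = 3 and the flat tail beyond it, exactly the elbow shape described above.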