Exam AI-900 topic 1 question 62 discussion

Actual exam question from Microsoft's AI-900
Question #: 62
Topic #: 1

HOTSPOT
To complete the sentence, select the appropriate option in the answer area.
Hot Area:

Suggested Answer:

Comments

Chosen Answer:
koni_
Highly Voted 3 years, 6 months ago
I think it should be feature engineering.
upvoted 89 times
mishumashu
2 years, 11 months ago
Agree. This is "normalization of data": it involves CHANGING the values of the input data and can only be done AFTER feature selection, i.e. this is feature engineering.
upvoted 20 times
rfiuvcfns
Highly Voted 3 years, 6 months ago
Based on this statement from the referenced article: "Feature selection: The process of selecting the key subset of features to reduce the dimensionality of the training problem." Feature selection is the answer, since feature engineering is the creation of NEW features from raw data to capture additional information not easily apparent in the original feature set.
upvoted 26 times
xiban
3 years, 6 months ago
And what about feature transformation?
upvoted 3 times
Ayor
2 years, 10 months ago
Or feature scaling
upvoted 1 times
rfiuvcfns
3 years, 5 months ago
Good point, and it would seem the most appropriate answer to this question would be data transformation. However, to my uneducated eye, that would still make the answer feature selection, since you would select features from the transformed data. I'm guessing the key here is to recall that feature engineering is applied first to generate additional features, and then feature selection is done to eliminate irrelevant, redundant, or highly correlated features.
upvoted 2 times
BobFar
Most Recent 9 months, 1 week ago
The answer is feature engineering! Feature engineering involves transforming raw data into features suitable for machine learning algorithms. One crucial step is to ensure that numeric variables have a similar scale. This process, often called normalization or standardization, prevents features with larger scales from dominating the model's learning process. E.g., imagine you have a dataset with two features: age (ranging from 18 to 65) and income (ranging from $10,000 to $200,000). Without scaling, the income feature might dominate the model's decision-making process, leading to biased results. By scaling both features to a similar range (e.g., 0 to 1), the model can treat both features equally.
upvoted 6 times
Mgb106
5 months, 1 week ago
Perfect explanation. From my understanding: ensuring that the numeric variables are on a similar scale is data normalization, which is one type of feature engineering.
upvoted 1 times
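The age/income example above can be sketched in plain Python; min-max scaling is one common way to bring both features into the 0-to-1 range (the values below are hypothetical, chosen only to match the ranges mentioned in the comment):

```python
# Min-max scaling sketch for the age/income example above.
# Values are hypothetical, chosen to match the ranges in the comment.
def min_max_scale(values):
    """Rescale a list of numbers linearly onto the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

ages = [18, 30, 65]                   # years
incomes = [10_000, 50_000, 200_000]   # dollars

scaled_ages = min_max_scale(ages)
scaled_incomes = min_max_scale(incomes)
# Both features now span exactly [0, 1], so neither dominates the other.
```

After scaling, distance-based or gradient-based models see both features on an equal footing, which is the "similar scale" the question is about.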
60ties
10 months, 2 weeks ago
Feature Engineering = creation of NEW features from the given raw data, e.g. creation of a "seconds" column from a "days" column. Here we are both normalizing and creating a new feature called "seconds." Feature Engineering is the best answer here. Meanwhile, Feature Selection is choosing features that have (direct & measurable) impacts on the predictions, e.g. choosing "Driver's Age" & "Number of Years of Driving" and ignoring "Driver's Name" for predicting the likelihood of having a road accident.
upvoted 2 times
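The distinction drawn in the comment above can be illustrated with a small sketch; the column names and rows below are invented for illustration and do not come from any real dataset:

```python
# Hypothetical driver records, illustrating engineering vs. selection.
rows = [
    {"driver_name": "A", "days_licensed": 2, "driver_age": 40},
    {"driver_name": "B", "days_licensed": 10, "driver_age": 25},
]

# Feature engineering: derive a NEW "seconds_licensed" column
# from the existing "days_licensed" column.
for row in rows:
    row["seconds_licensed"] = row["days_licensed"] * 24 * 60 * 60

# Feature selection: keep only the predictive columns,
# dropping "driver_name" as irrelevant to accident risk.
selected = [
    {k: row[k] for k in ("driver_age", "seconds_licensed")}
    for row in rows
]
```

Engineering adds a column; selection removes columns. The exam sentence about putting numeric variables on a similar scale belongs to the first activity.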
stepkurniawan
1 year, 3 months ago
Feature selection is a process of reducing/dropping features, so it is not the correct answer.
upvoted 1 times
kd333200
1 year, 6 months ago
This is normalization. In the context of machine learning and data preprocessing, normalization is considered a feature engineering technique, not feature selection.
upvoted 6 times
katrang
1 year, 7 months ago
Normally feature engineering is applied first to generate additional features, and then feature selection is done to eliminate irrelevant, redundant, or highly correlated features.
upvoted 1 times
mcalif
1 year, 8 months ago
The given answer is wrong. The correct one is feature engineering.
upvoted 8 times
takusui
1 year, 9 months ago
Feature engineering, as it is hinted here: https://learn.microsoft.com/en-us/azure/machine-learning/concept-automated-ml?view=azureml-api-2#feature-engineering
upvoted 5 times
mciezak
1 year, 8 months ago
I agree: "In Azure Machine Learning, scaling and normalization techniques are applied to facilitate feature engineering"
upvoted 3 times
rdemontis
1 year, 11 months ago
Given answer is wrong. Feature engineering involves transforming or manipulating the input features (variables) in a way that improves the performance or interpretability of a machine learning model. One common technique in feature engineering is scaling or normalizing numeric variables to ensure they have a similar scale or range. When the numeric variables have different scales, it can negatively impact the performance of certain machine learning algorithms. For example, algorithms that are sensitive to the scale of variables, such as gradient descent-based optimization algorithms, may converge slowly or exhibit biased results if the variables have significantly different scales. https://towardsdatascience.com/feature-engineering-for-numerical-data-e20167ec18
upvoted 6 times
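The point above about scale-sensitive, gradient-descent-based algorithms is usually addressed with z-score standardization. A minimal standard-library sketch (the income values are hypothetical):

```python
import statistics

def standardize(values):
    """Z-score standardization: shift to zero mean, scale to unit (population) stdev."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    return [(v - mean) / stdev for v in values]

incomes = [10_000, 50_000, 200_000]
z = standardize(incomes)
# z now has mean ~0 and standard deviation ~1, the kind of input on which
# gradient-descent-based optimizers tend to converge faster.
```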
leojadue
1 year, 11 months ago
Ensuring that the numeric variables in training data are on a similar scale is an example of feature engineering. Feature engineering involves transforming raw data into a format that is more suitable for modeling. It can involve a range of activities, such as dealing with missing values, creating new features from existing ones, encoding categorical variables, and normalizing numeric variables. Normalizing numeric variables, as you mentioned, is a common form of feature engineering to ensure all variables are on a similar scale. This can be particularly important for certain types of models, like neural networks and support vector machines, which can behave poorly if the features are not on similar scales.
upvoted 1 times
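Two of the other feature-engineering activities listed above, handling missing values and encoding categorical variables, can be sketched the same way; the rows and column names below are invented for illustration:

```python
# Hypothetical records with a missing numeric value and a categorical column.
rows = [
    {"color": "red", "age": 30},
    {"color": "blue", "age": None},
]

# Impute missing values: replace None ages with the mean of the known ages.
known = [r["age"] for r in rows if r["age"] is not None]
mean_age = sum(known) / len(known)
for r in rows:
    if r["age"] is None:
        r["age"] = mean_age

# One-hot encode the categorical "color" column into 0/1 indicator columns.
categories = sorted({r["color"] for r in rows})
for r in rows:
    for c in categories:
        r[f"color_{c}"] = 1 if r["color"] == c else 0
```

Both steps transform raw columns into model-ready features, i.e. they are feature engineering rather than feature selection.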
master_yoda
2 years ago
Normalization is a **feature engineering** technique used to adjust the values of features in a dataset to a common scale. This is done to facilitate data analysis and modeling, and to reduce the impact of different scales on the accuracy of machine learning models. Source: Conversation with Bing, 4/24/2023. (1) Feature Engineering: Scaling, Normalization and Standardization. https://www.analyticsvidhya.com/blog/2020/04/feature-scaling-machine-learning-normalization-standardization/ Accessed 4/24/2023.
upvoted 3 times
XtraWest
2 years ago
Feature Engineering as per Chat GPT (dealing with data on a similar scale)
upvoted 5 times
p1zz4
2 years, 2 months ago
feature engineering 100%
upvoted 5 times
Romanhuki
2 years, 6 months ago
Feature Selection is the right answer: Normally feature engineering is applied first to generate additional features, and then feature selection is done to eliminate irrelevant, redundant, or highly correlated features. (https://learn.microsoft.com/en-us/azure/architecture/data-science-process/create-features)
upvoted 2 times
Alex_W
10 months, 3 weeks ago
No, the correct answer is feature engineering. Ensuring that numeric variables are on a similar scale involves normalization, which is a kind of feature engineering.
upvoted 1 times
sriram72
2 years, 9 months ago
A confusing question (with the given answers) for sure! I am not sure whether this should be feature engineering or feature selection.
upvoted 4 times
Community vote distribution: A (35%), C (25%), B (20%), Other