- Feature Engineering - Wikipedia
- Discover Feature Engineering, How to Engineer Features and How to Get Good at It - MachineLearningMastery
- Feature Engineering Using Pandas for Beginners - Analytics Vidhya
- GroupBy in Pandas: Your Guide to Summarizing and Aggregating Data in Python - Analytics Vidhya
- feature templates - Stanford cs221
Feature Engineering — deep dive into Encoding and Binning techniques
Illustration of feature encoding and feature binning techniques
towardsdatascience.com
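Not from the linked article itself, but a minimal pandas sketch of the two techniques its title names, encoding and binning; the toy DataFrame and its column names are made up for illustration.

```python
import pandas as pd

# Toy data, made up for illustration
df = pd.DataFrame({
    "city": ["Seoul", "Busan", "Seoul", "Daegu"],
    "age": [23, 45, 31, 67],
})

# Encoding: turn the categorical 'city' column into 0/1 indicator columns
encoded = pd.get_dummies(df, columns=["city"], prefix="city")

# Binning: bucket the numeric 'age' column into labeled ranges
df["age_group"] = pd.cut(df["age"], bins=[0, 30, 50, 100],
                         labels=["young", "middle", "senior"])

print(encoded)
print(df[["age", "age_group"]])
```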
https://medium.com/machine-learning-researcher/dimensionality-reduction-pca-and-lda-6be91734f567
Dimensionality Reduction(PCA and LDA)
In this Chapter we will discuss Dimensionality Reduction Algorithms (Principal Component Analysis (PCA) and Linear Discriminant…
medium.com
Everything below is sourced from the article above.
Reduce the dimension of your feature space
-> Feature Elimination / Feature Extraction (PCA)
* Feature elimination: simply drop the "least important" variables
* PCA (feature extraction): combine the old variables into new, independent variables, then keep only the most important of those and drop the rest
When should I use PCA?
1. Do you want to reduce the number of variables, but are not able to identify variables to completely remove from consideration?
2. Do you want to ensure your variables are independent of one another?
3. Are you comfortable making your independent variables less interpretable?
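To make the notes above concrete, here is a minimal scikit-learn sketch (my own addition, not from the source article) that standardizes a made-up 4-feature dataset, keeps 2 principal components, and checks how much variance each one explains; the data and n_components=2 are arbitrary choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Made-up data: 100 samples, 4 correlated features
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))
X = np.hstack([base, base + 0.1 * rng.normal(size=(100, 2))])

# PCA is sensitive to scale, so standardize the features first
X_scaled = StandardScaler().fit_transform(X)

# Feature extraction: build new, independent variables and keep only 2 of them
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X_scaled)

print(X_reduced.shape)                 # (100, 2)
print(pca.explained_variance_ratio_)   # share of variance kept by each component
```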
Ah... what do I do, the math showed up and I don't feel like reading it. Of course eigenvalues come up... and I've already forgotten those...
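As a quick refresher (my own addition, not from the article): the principal components are just the eigenvectors of the data's covariance matrix, and each eigenvalue tells you how much variance its component explains. A small NumPy sketch:

```python
import numpy as np

# Made-up 2-feature data with some correlation
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0], [1.0, 0.5]])

# Covariance matrix of the centered features
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)

# Eigen-decomposition: eigenvectors = principal directions,
# eigenvalues = variance along each direction
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Sort from largest to smallest eigenvalue
order = np.argsort(eigenvalues)[::-1]
print(eigenvalues[order])          # variance explained by each direction
print(eigenvectors[:, order])      # columns are the principal directions
```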