
Studying Feature Engineering

https://medium.com/@rishabhrjain/linear-regrsesion-binning-and-polynomial-linear-regression-3ed62f79ce0

https://towardsdatascience.com/feature-engineering-deep-dive-into-encoding-and-binning-techniques-5618d55a6b38

Feature Engineering — deep dive into Encoding and Binning techniques (towardsdatascience.com)

https://medium.com/machine-learning-researcher/dimensionality-reduction-pca-and-lda-6be91734f567

Dimensionality Reduction (PCA and LDA) — medium.com

Everything below is taken from the articles above.

 

Reducing the dimension of your feature space

-> two approaches: feature elimination / feature extraction (PCA)

 

* Feature elimination: drop the "least important" variables. PCA, by contrast, combines all of the original variables into new, independent components rather than dropping any of them.

When should I use PCA?
1. Do you want to reduce the number of variables, but aren't able to identify variables to remove from consideration entirely?
2. Do you want to ensure your variables are independent of one another?
3. Are you comfortable making your independent variables less interpretable?
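The three points above can be seen in a small sketch using scikit-learn's `PCA` (the synthetic data and the choice of 2 components are my own illustrative assumptions, not from the article):

```python
# Minimal PCA sketch: 5 correlated features compressed to 2 independent components.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))  # 2 underlying signals
# Build 3 extra features as (noisy) linear combinations of the signals,
# so the 5 features are highly correlated -- hard to pick ones to drop (point 1).
extra = base @ rng.normal(size=(2, 3)) + 0.01 * rng.normal(size=(100, 3))
X = np.hstack([base, extra])  # shape (100, 5)

pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)  # (100, 5) -> (100, 2)

# The components are orthogonal, i.e. uncorrelated (point 2),
# but each one mixes all original variables, so interpretability drops (point 3).
print(X_reduced.shape)
print(pca.explained_variance_ratio_.sum())  # near 1.0: 2 components keep most variance
```

Since the data is essentially rank 2, the first two components capture nearly all of the variance, which is exactly the situation where PCA is a good fit.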

 

Ugh... what do I do? The math showed up and I don't want to read it. Of all things, eigenvalues... and I've forgotten those...
