
How To Run Rfecv With Svc In Sklearn

I am trying to perform Recursive Feature Elimination with Cross-Validation (RFECV) together with GridSearchCV, using SVC as the classifier. My code is as follows: X = df[my_featur

Solution 1:

For more feature selection implementations, you can have a look at:

https://scikit-learn.org/stable/modules/classes.html#module-sklearn.feature_selection

As an example, the next link combines PCA with k-best feature selection and an SVC:

https://scikit-learn.org/stable/auto_examples/compose/plot_feature_union.html#sphx-glr-auto-examples-compose-plot-feature-union-py

An example of use, modified from the previous link for simplicity:

from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

iris = load_iris()
X, y = iris.data, iris.target

# Univariate feature selection (maybe some of the original features were good, too?)
selection = SelectKBest()

# Build the SVC
svm = SVC(kernel="linear")

# Do a grid search over k and C:
pipeline = Pipeline([("features", selection), ("svm", svm)])

param_grid = dict(features__k=[1, 2],
                  svm__C=[0.1, 1, 10])

grid_search = GridSearchCV(pipeline, param_grid=param_grid, cv=5, verbose=10)
grid_search.fit(X, y)
print(grid_search.best_estimator_)
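
As a follow-up, one way to see which columns the best pipeline actually kept is to pull the SelectKBest step out of best_estimator_. This is a minimal sketch assuming the step names used above ("features", "svm"):

# Sketch: inspect the winning parameters and the columns kept by SelectKBest
best_selector = grid_search.best_estimator_.named_steps["features"]
print(grid_search.best_params_)                  # best k and C found by the search
print(best_selector.get_support(indices=True))   # indices of the selected columns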

Solution 2:

Hmm... in sklearn 0.19.2, the problem seems to have been solved. My code is similar to yours, but it works:

import numpy as np
from sklearn.feature_selection import RFECV
from sklearn.svm import SVC

svc = SVC(kernel='linear',
          probability=True,
          random_state=1)

rfecv = RFECV(estimator=svc,
              scoring='roc_auc')

rfecv.fit(train_values, train_Labels)

selecInfo = rfecv.support_              # boolean mask of the selected features
selecIndex = np.where(selecInfo == 1)   # indices of the selected features
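
Since the original question also involved GridSearchCV, one common pattern is to wrap the RFECV selector inside GridSearchCV and tune the SVC's C through the estimator__ prefix. This is a minimal sketch on a toy dataset; load_breast_cancer and the C values are placeholders, not from the original post:

from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFECV
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# RFECV exposes its inner estimator's parameters via the "estimator__" prefix,
# so GridSearchCV can tune C while RFECV handles the feature elimination.
rfecv = RFECV(estimator=SVC(kernel='linear'), scoring='roc_auc', cv=5)

param_grid = {'estimator__C': [0.1, 1, 10]}

grid_search = GridSearchCV(rfecv, param_grid=param_grid, cv=5)
grid_search.fit(X, y)

print(grid_search.best_params_)
print(grid_search.best_estimator_.n_features_)  # number of features kept by the best model

Note that this nests cross-validation (RFECV's own CV inside GridSearchCV's CV), so it can get slow on larger grids or datasets.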
