Coefficient is more than 1?

Screen Link: https://app.dataquest.io/m/186/feature-preparation%2C-selection-and-engineering/3/determining-the-most-relevant-features

Could someone tell me why there are some coefficients that are more than 1 or less than -1?

My Code:

import matplotlib.pyplot as plt
import pandas as pd
from sklearn.linear_model import LogisticRegression

columns = ['Age_categories_Missing', 'Age_categories_Infant',
       'Age_categories_Child', 'Age_categories_Teenager',
       'Age_categories_Young Adult', 'Age_categories_Adult',
       'Age_categories_Senior', 'Pclass_1', 'Pclass_2', 'Pclass_3',
       'Sex_female', 'Sex_male', 'Embarked_C', 'Embarked_Q', 'Embarked_S',
       'SibSp_scaled', 'Parch_scaled', 'Fare_scaled']


lr = LogisticRegression()
lr.fit(train[columns], train['Survived'])
coefficients = lr.coef_
feature_importance = pd.Series(coefficients[0], index=columns)

feature_importance.plot.barh()
plt.show()

What I expected to happen:
I expected the coefficient values to fall in the range between -1 and 1, since that is how I understood coefficients to be defined. However, some of the coefficients are greater than 1 or less than -1.

The variable coefficients looks like this (output truncated):

        -0.52620202, -0.90049959,  1.04515623,  0.13729476, -0.94467395,
         1.45610934, -1.2183323 ,  0.25010253,  0.24374319, -0.25606868,
        -1.74775712, -0.77650208,  0.54308487]])

Any particular reason you expect them to be between -1 and 1?

Thank you so much for your reply!
I thought the coefficients indicated Pearson's correlation coefficient, which takes values between -1 and 1. However, I realized that scikit-learn's logistic regression coefficients are something different (they are log-odds weights, which are not bounded). I apologize for the confusion.
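
For anyone else who lands here: logistic regression coefficients are weights on the log-odds scale, not correlations, so there is no [-1, 1] bound. Here is a small sketch using synthetic data (not the Titanic data from the mission) showing that a fitted coefficient can easily exceed 1 when a feature separates the classes sharply:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data: one feature that separates the two classes almost cleanly.
# The sharper the separation, the larger the fitted log-odds coefficient.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)

lr = LogisticRegression()
lr.fit(X, y)

coef = lr.coef_[0, 0]
print(coef)          # well above 1 -- log-odds weights are unbounded
print(np.exp(coef))  # exp(coef) is the odds ratio per unit increase in X
```

A useful way to read a coefficient b is that a one-unit increase in that feature multiplies the odds of the positive class by exp(b), which is why the magnitudes can be arbitrarily large.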