Error in any() fn

Screen Link:

My Code:

import numpy as np
import pandas as pd

def update_vals(x):
    if pd.isnull(x):
        return np.nan
    elif x == '-':
        return False
    else:
        return True

tafe_resignation['dissatisfied'] = tafe_resignation[tafe_col].applymap(update_vals).any(axis=1, skipna=False)
tafe_resignation['dissatisfied'].value_counts(dropna=False)

What I expected to happen:

False    241
True      91
NaN        8
Name: dissatisfied, dtype: int64

What actually happened:

False    241
True      91
True       8
Name: dissatisfied, dtype: int64

The code somehow displays the NaN values as True. However, it does not count them as True.

tafe_resignation['dissatisfied'].isnull().sum()

Gives output 8.
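As a side note, the discrepancy can be isolated with a toy Series (the data below is made up for illustration): with a correctly working pandas install, value_counts(dropna=False) lists the NaN bucket as its own row, and isnull().sum() gives the same count regardless of how the values are displayed.

```python
import numpy as np
import pandas as pd

# Toy Series mimicking the 'dissatisfied' column: True/False plus missing values
s = pd.Series([False, False, True, np.nan, True, np.nan], dtype=object)

# With dropna=False, NaN should appear as its own row in the counts
print(s.value_counts(dropna=False))

# Independent check: count of missing values, regardless of how they display
print(s.isnull().sum())  # 2
```

If value_counts shows a second True (or False) row where NaN should be, but isnull().sum() still reports the missing values, the data is intact and only the display is wrong.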

Hi @bhavya,

If you still need help with this, can you please share with us the jupyter notebook file? I would like to check it on my end.

Thanks,
Sahil

Hi @Sahil
I have completed the project, and those values are counted as NaN by the system, but the output still shows True/False instead of NaN. However, this is not generating any erratic results in the following cells.
You can check my notebook here. The problem is in cell 21.
Thanks,
Bhavya

Hi @bhavya,

When I run your code from the start, I see NaN in value_counts instead of True. Can you try the same?

Best,
Sahil

Hi @Sahil
I tried that, and I had done the same in the past as well. I am still getting the same results.


I guess it is some glitch in the system.
Regards,
Bhavya

Yep, that’s possible.

I am actually facing this problem with all my projects.
Every time the values are mapped to True/False/NaN, the value_counts() output displays NaN as either True or False.
Is there a way I can fix this?
Thanks,
Bhavya

@bhavya Does this happen in the Dataquest Jupyter environment or in your local Jupyter environment?

It happens in my local Jupyter environment.

In that case, it could be due to library version differences. Can you try using pandas version 0.22.0 and numpy version 1.14.2 in your local environment?

Hi @Sahil
That worked out.
Thank you so much for helping me out.

Update:

This is due to a bug in pandas version 1.0.1. Please update the pandas version in your local environment if you are experiencing this issue.
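To check which versions you have installed locally before upgrading, a minimal snippet like this works:

```python
import numpy as np
import pandas as pd

# Print the installed versions; if pandas reports 1.0.1, upgrade it
print("pandas:", pd.__version__)
print("numpy:", np.__version__)
```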

Thank you, @ebuschang, for figuring this out.

Best,
Sahil