sklearn.metrics.roc_auc_score input preparation

In the docs, the description of the y_score parameter says:

"For binary y_true, y_score is supposed to be the score of the class with greater label."

This seems to say we should always take predict_proba column index [1] (the second column) when preparing y_score, assuming the class labels are {0, 1}.

This makes me wonder: when is predict_proba column index [0] useful?

Is index [0] useful when we want to measure roc_auc_score with respect to class 0 instead? To do that, must we also invert the labels, passing the boolean y_true == 0 as the y_true argument? If that is correct, the documentation statement above seems inaccurate (biased toward measuring roc_auc_score for class label 1 only)?
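To make the question concrete, here is a small sketch (the probability matrix below is made up for illustration, arranged the way predict_proba returns it). It compares the standard call using column 1 against the "class 0" version that inverts both the labels and the column:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical predict_proba output for 5 samples:
# column 0 = P(class 0), column 1 = P(class 1); each row sums to 1.
proba = np.array([
    [0.9, 0.1],
    [0.4, 0.6],
    [0.2, 0.8],
    [0.7, 0.3],
    [0.1, 0.9],
])
y_true = np.array([0, 1, 1, 0, 1])

# Standard usage per the docs: score of the greater label (class 1).
auc_pos = roc_auc_score(y_true, proba[:, 1])

# "Class 0" view: invert the labels AND take column 0.
auc_neg = roc_auc_score(y_true == 0, proba[:, 0])

# Since column 0 is exactly 1 - column 1, inverting both labels and
# scores preserves the ranking, so the two AUCs are identical.
print(auc_pos, auc_neg)  # → 1.0 1.0

# By contrast, using column 0 WITHOUT inverting y_true flips the
# ranking and yields 1 - AUC.
print(roc_auc_score(y_true, proba[:, 0]))  # → 0.0
```

So in the binary {0, 1} case, measuring "AUC of class 0" this way just reproduces the class-1 AUC, which may be why the docs only describe the greater-label column.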