Neural Network hyperparameters

What does the max_iter hyperparameter mean?
What is the difference if I set it to 200 vs 1000?
I want to understand what is happening behind the scenes.

There is insufficient information. Please provide more context or code.

Here’s a guide on how to ask a good question to improve your original post.

```python
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

nn_accuracies = []

for n in neurons:
    # Two hidden layers of n neurons each; training runs for at most
    # 1000 epochs, stopping earlier if the loss stops improving.
    mlp = MLPClassifier(hidden_layer_sizes=(n, n), activation='relu',
                        max_iter=1000)
    mlp.fit(train_features, train_labels)
    nn_predictions = mlp.predict(test_features)

    accuracy = accuracy_score(test_labels, nn_predictions)
    nn_accuracies.append(accuracy)
```

This is a piece of code from the neural network mission. What does the max_iter parameter do here? Also, I am not clear what this line means: "gradient descent occurred but the model hasn't converged yet."

To find out what max_iter does, you have to read the documentation. The implementation details of the MLPClassifier object may vary between versions.

At the interactive Python prompt, run the following to read the documentation:

```python
>>> help(MLPClassifier)
```

Or visit the documentation website for your installed version.

> Maximum number of iterations. The solver iterates until convergence (determined by 'tol') or this number of iterations. For stochastic solvers ('sgd', 'adam'), note that this determines the number of epochs (how many times each data point will be used), not the number of gradient steps.
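To make that concrete, here is a minimal sketch (synthetic data and hypothetical layer sizes, not the mission's dataset) showing what the cap does for a stochastic solver like the default adam: with a very small max_iter the solver stops after exactly that many epochs and raises a ConvergenceWarning, while a generous cap lets it keep going until the loss stops improving.

```python
import warnings

from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning
from sklearn.neural_network import MLPClassifier

# Synthetic data purely for illustration.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

epochs_run = {}
for cap in (5, 1000):
    mlp = MLPClassifier(hidden_layer_sizes=(10, 10), max_iter=cap,
                        random_state=0)
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always", category=ConvergenceWarning)
        mlp.fit(X, y)
    hit_cap = any(issubclass(w.category, ConvergenceWarning) for w in caught)
    # n_iter_ reports the epochs actually performed; it never exceeds max_iter.
    epochs_run[cap] = mlp.n_iter_
    print(f"max_iter={cap}: ran {mlp.n_iter_} epochs, "
          f"convergence warning: {hit_cap}")
```

The warning in your quote means exactly this: the solver hit the max_iter cap and stopped while the loss was still improving, so the model "hasn't converged yet". Raising max_iter from 200 to 1000 just gives it more epochs to reach convergence; if it already converged before 200, the extra budget changes nothing.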

Yeah, I read that, but I was hoping for something simpler that could build more intuition. Thanks!
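One way to build that intuition (a sketch with made-up data; loss_curve_ is only recorded for the stochastic solvers 'sgd' and 'adam') is to inspect the model's loss_curve_ attribute after fitting. It holds one training-loss value per epoch, so its length is exactly the number of epochs that actually ran, and max_iter is simply an upper bound on that length.

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Made-up data just to have something to fit.
X, y = make_classification(n_samples=400, n_features=10, random_state=1)

mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=300, random_state=1)
mlp.fit(X, y)

# One loss value per epoch: the curve's length equals n_iter_ and can
# never exceed max_iter; the loss should fall as training proceeds.
print(len(mlp.loss_curve_), mlp.loss_curve_[0], mlp.loss_curve_[-1])
```

Plotting this curve (e.g. with matplotlib) makes it easy to see whether the loss has flattened out; if it is still falling when the curve ends, raising max_iter gives the solver more epochs to keep improving.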

Here is a similar question asked on Stack Overflow.