Add activation function #299

Open · wants to merge 1 commit into master
Conversation


@YF-Tung YF-Tung commented Feb 18, 2019

An NN should always have activation functions (like ReLU); otherwise it is just a trivial linear model.
Applying ReLU raises accuracy for neural_network.py from 92% to 95%, and for neural_network_raw.py from 92% to 94%.
The learning rate is also too large, even for an example. In practice it is typically set somewhere between 1e-2 and 1e-4. A sketch of what the change amounts to follows below.
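
For readers skimming without the diff, here is a minimal sketch of the kind of change described, assuming a TensorFlow 1.x-style `neural_net` function like the one in neural_network_raw.py; the dictionary keys (`h1`, `h2`, `out`, `b1`, `b2`), the layer count, and the exact learning-rate value are illustrative assumptions, not the literal diff:

```python
import tensorflow as tf  # assumes the TensorFlow 1.x graph-style API


def neural_net(x, weights, biases):
    # Hidden layer 1: affine transform followed by ReLU.
    # Without the nonlinearity, stacking layers collapses into a single
    # linear map, since a composition of affine maps is itself affine.
    layer_1 = tf.nn.relu(tf.add(tf.matmul(x, weights['h1']), biases['b1']))
    # Hidden layer 2, again with ReLU applied to the pre-activation.
    layer_2 = tf.nn.relu(tf.add(tf.matmul(layer_1, weights['h2']), biases['b2']))
    # Output layer is left linear: the softmax is typically folded into the
    # cross-entropy loss, so no activation is applied here.
    return tf.add(tf.matmul(layer_2, weights['out']), biases['out'])


# A learning rate in the suggested 1e-2 to 1e-4 range (illustrative value).
learning_rate = 0.01
```

Note the output layer stays linear on purpose: applying ReLU there would clip negative logits and interact badly with a softmax cross-entropy loss.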
