
cross entropy is correct? #98

Open
liyunrui opened this issue Oct 9, 2021 · 2 comments


liyunrui commented Oct 9, 2021

No description provided.


liyunrui commented Oct 9, 2021

import numpy as np
from sklearn.metrics import accuracy_score

class CrossEntropy(Loss):  # Loss: base class defined elsewhere in the repository
    def __init__(self): pass

    def loss(self, y, p):
        # Avoid division by zero
        p = np.clip(p, 1e-15, 1 - 1e-15)
        return - y * np.log(p) - (1 - y) * np.log(1 - p)

    def acc(self, y, p):
        return accuracy_score(np.argmax(y, axis=1), np.argmax(p, axis=1))

    def gradient(self, y, p):
        # Avoid division by zero
        p = np.clip(p, 1e-15, 1 - 1e-15)
        return - (y / p) + (1 - y) / (1 - p)

loss = CrossEntropy()
y = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
p = np.random.uniform(0, 1, size=10)
loss.loss(y, p)
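
Note on the snippet above: loss() computes element-wise binary cross-entropy, so it expects y to be a one-hot (0/1) array with the same shape as p, not integer class labels as in the test. A minimal sketch of a shape-consistent call (the label values and uniform predictions here are illustrative only):

import numpy as np

labels = np.array([0, 1, 2])                     # integer class labels
y = np.eye(4)[labels]                            # one-hot targets, shape (3, 4)
p = np.full((3, 4), 0.25)                        # uniform predicted probabilities

p = np.clip(p, 1e-15, 1 - 1e-15)                 # same clipping as loss()
bce = - y * np.log(p) - (1 - y) * np.log(1 - p)  # element-wise, as in loss()
print(bce.sum(axis=1))                           # per-sample cross-entropy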


zpengc commented Nov 1, 2021

def acc(self, y, p):
    return accuracy_score(y, p)

Is this ok?
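
For context, assuming y is one-hot and p holds predicted probabilities (as the original acc() implies), accuracy_score needs 1-D label vectors; passing the 2-D arrays directly raises a ValueError because sklearn sees a mix of multilabel-indicator and continuous-multioutput targets. A short sketch of the difference:

import numpy as np
from sklearn.metrics import accuracy_score

y = np.eye(3)[[0, 1, 2]]                 # one-hot targets, shape (3, 3)
p = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.3, 0.3, 0.4]])          # predicted probabilities

# accuracy_score(y, p) raises a ValueError (multilabel-indicator vs.
# continuous-multioutput), so reduce both to class labels first:
print(accuracy_score(np.argmax(y, axis=1), np.argmax(p, axis=1)))  # 1.0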
