
Issue with training model error: list indices must be integers or slices, not Loss. #42

Open
sonfack opened this issue Aug 18, 2022 · 1 comment
Labels
enhancement New feature or request

Comments

@sonfack

sonfack commented Aug 18, 2022

I am trying to train a model using:

model.train(losses=Loss.SUPERVISED)

The call fails with the error message:

list indices must be integers or slices, not Loss
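The error message suggests that, somewhere inside train, a plain Python list is being indexed with a Loss enum member instead of an integer. A minimal standalone sketch (using a stand-in Loss enum, not LNN's actual internals) reproduces the same TypeError:

```python
from enum import Enum

class Loss(Enum):
    # stand-in for lnn.Loss; only here to reproduce the error
    SUPERVISED = "supervised"

losses = ["loss_a", "loss_b"]
message = ""
try:
    # indexing a list with an enum member raises TypeError
    losses[Loss.SUPERVISED]
except TypeError as err:
    message = str(err)

print(message)  # list indices must be integers or slices, not Loss
```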

My code:

from lnn import And, Loss, Implies, Or, Equivalent, Fact, Predicates, Model, Variables, World
model = Model()

AgeGreat60 = Predicates('AgeGreat60')
IncomeLess400 = Predicates('IncomeLessthan400')
Risk = Predicates('Risk', arity=2)

# Variables
x = Variables('x')
y = Variables('y')
z = Variables('z')

# Risk(x,y) ----> AgeGreat60(x) AND IncomeLess400(y)

Root = Implies(Risk(x,y), And(AgeGreat60(x), IncomeLess400(y)))

formulae = [
    Root
]
model.add_knowledge(*formulae, world=World.OPEN)
# model.add_knowledge(Root)


# Data
model.add_data({
    AgeGreat60: {
        '70': Fact.TRUE,
        '30': Fact.FALSE,
        '20': Fact.FALSE,
    },
    IncomeLess400: {
        '300': Fact.TRUE,
        '600': Fact.FALSE,
    },
    Risk: {
        ('70', '300'): Fact.TRUE,
        ('30', '600'): Fact.FALSE,
        ('60', '500'): (0.6, 0.2),
        ('65', '450'): (0.8, 1),
    }
})


model.add_labels({
    Risk: {
        ('70', '300'): Fact.TRUE,
        ('30', '600'): Fact.FALSE,
    }
})

# train the model and output results
model.train(losses=Loss.SUPERVISED)
model.print(params=True)
@NaweedAghmad NaweedAghmad added the enhancement New feature or request label Sep 5, 2022
@NaweedAghmad
Collaborator

The losses argument of model.train(losses) assumes that multiple losses are given, i.e. a list of losses.
Passing losses=[Loss.SUPERVISED] solves this issue, but we should add an additional kwarg loss
that only accepts a single loss function.
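So the immediate fix is to wrap the loss in a list: model.train(losses=[Loss.SUPERVISED]). The proposed additional kwarg could normalize its input along these lines; this is a hypothetical sketch using a stand-in Loss enum, not LNN's actual implementation:

```python
from enum import Enum

class Loss(Enum):
    # stand-in for lnn.Loss
    SUPERVISED = "supervised"

def normalize_losses(losses=None, loss=None):
    # Hypothetical normalization for Model.train:
    # `loss` accepts a single loss function, `losses` accepts a list;
    # a bare enum member passed to `losses` is coerced into a list
    # so it can no longer be used (incorrectly) as a list index.
    if loss is not None:
        return [loss]
    if isinstance(losses, Loss):
        return [losses]
    return list(losses or [])

print(normalize_losses(losses=Loss.SUPERVISED))  # [<Loss.SUPERVISED: 'supervised'>]
print(normalize_losses(loss=Loss.SUPERVISED))    # [<Loss.SUPERVISED: 'supervised'>]
```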
