Keras: combine the values of two loss functions
I have a model with one encoder and two decoders, and one loss function per decoder output:
from keras.models import Model
from keras.optimizers import Adam

input_shape = (384, 512, 3)

# build_model constructs the network (one encoder feeding two decoder
# branches) and returns Model(inputs=inputs, outputs=[features_1, features_2]),
# where the two outputs are named "loss1_output" and "loss2_output".
model = build_model(input_shape, 3)

losses = {
    "loss1_output": "categorical_crossentropy",
    "loss2_output": "categorical_crossentropy"}
lossWeights = {"loss1_output": 1.0, "loss2_output": 1.0}

EPOCHS = 50
INIT_LR = 1e-3
opt = Adam(lr=INIT_LR, decay=INIT_LR / EPOCHS)

model.compile(optimizer=opt, loss=losses, loss_weights=lossWeights,
              metrics=["accuracy"])
I would like to combine the values of these two losses into a single loss value and backpropagate the result of that combination.
My question is close to this one, which I read and tried; what I found is that the model calls each loss function once per branch (output).
python tensorflow keras conv-neural-network semantic-segmentation
asked Nov 14 '18 at 20:01 by Zaher88abd (edited Nov 14 '18 at 20:14)
Build a custom loss function where you combine your two losses and pass that as the loss when you compile the model. Here is what the default loss functions look like: github.com/keras-team/keras/blob/master/keras/losses.py. Just build your own based on a combination of the existing ones.
– Karl
Nov 14 '18 at 20:14
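A minimal sketch of that suggestion, built from existing Keras losses. The name combined_loss, the alpha weight, and the choice of mean_squared_error as the second term are illustrative only, not from the thread:

from keras.losses import categorical_crossentropy, mean_squared_error

def combined_loss(y_true, y_pred, alpha=0.7):
    # Weighted sum of two existing Keras losses; Keras reduces the
    # returned tensor to a single scalar per output as usual.
    return (alpha * categorical_crossentropy(y_true, y_pred)
            + (1.0 - alpha) * mean_squared_error(y_true, y_pred))

# Used like a built-in loss at compile time:
# model.compile(optimizer=opt,
#               loss={"loss1_output": combined_loss,
#                     "loss2_output": combined_loss})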
That means I would need to combine the two outputs to pass them to the custom loss function. Is there another way?
– Zaher88abd
Nov 14 '18 at 20:19
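One way to read that (a sketch only; the layer name merged_output, the tensors features_1/features_2/inputs, and the helper merged_loss are hypothetical, and the training targets would have to be concatenated the same way): merge the two decoder outputs into one output and write a single loss that splits them back apart and sums the two terms.

from keras.models import Model
from keras.layers import Concatenate
from keras.losses import categorical_crossentropy

# Merge the two decoder outputs along the channel axis,
# e.g. (batch, H, W, n_classes_1 + n_classes_2).
merged = Concatenate(axis=-1, name="merged_output")([features_1, features_2])
model = Model(inputs=inputs, outputs=merged)

def merged_loss(n_classes_1):
    def loss(y_true, y_pred):
        # Split predictions and targets back into the two branches and
        # add up the per-branch cross-entropies.
        loss_1 = categorical_crossentropy(y_true[..., :n_classes_1],
                                          y_pred[..., :n_classes_1])
        loss_2 = categorical_crossentropy(y_true[..., n_classes_1:],
                                          y_pred[..., n_classes_1:])
        return loss_1 + loss_2
    return loss

# model.compile(optimizer=opt, loss=merged_loss(n_classes_1=3))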
Why wouldn't you want to do it like that?
– Karl
Nov 14 '18 at 20:21
Keras already combines the losses (that is what the loss weights are for).
– Matias Valdenegro
Nov 14 '18 at 21:23
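For reference, a simplified sketch of the objective that the compile() call in the question already builds (ignoring sample weights and regularization terms; the function name is just for illustration):

import keras.backend as K
from keras.losses import categorical_crossentropy

def effective_total_loss(y_true_1, y_pred_1, y_true_2, y_pred_2,
                         w1=1.0, w2=1.0):
    # Keras sums the loss-weighted per-output losses into one scalar and
    # computes gradients of that single combined value.
    loss_1 = K.mean(categorical_crossentropy(y_true_1, y_pred_1))
    loss_2 = K.mean(categorical_crossentropy(y_true_2, y_pred_2))
    return w1 * loss_1 + w2 * loss_2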
0 answers