How to calculate the softmax in an LSTM auto-encoder for input with zero padding
I am implementing an LSTM auto-encoder that takes an input x and should reproduce x as its output. However, I have a couple of questions about the implementation; please see my code below:
import numpy as np
from keras.layers import Input, Embedding, LSTM, Dense
from keras.models import Model
from keras.preprocessing.sequence import pad_sequences
from keras.utils import np_utils

dict_size = 10
max_sentence_length = 5
embed_dim = 20

x = Input(shape=(max_sentence_length,), dtype='int32')
encoder_input = Embedding(dict_size, embed_dim)(x)
encoder_output = LSTM(32, return_sequences=True)(encoder_input)
decoder_output = LSTM(32, return_sequences=True)(encoder_output)
y = Dense(dict_size, activation='softmax')(decoder_output)

model = Model(inputs=x, outputs=y)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
print(model.summary())

# one toy sentence of length 3, zero-padded at the end to length 5
train_x = np.array([[3, 1, 2]])
train_x = pad_sequences(train_x, max_sentence_length, padding='post')
train_y = np_utils.to_categorical(train_x, dict_size)

model.fit(train_x, train_y, batch_size=1, epochs=1)
predict_y = model.predict(train_x)
print(predict_y)
With this code, the first problem is the error "ValueError: Error when checking target: expected dense_1 to have 3 dimensions, but got array with shape (5, 10)", and I do not know how to solve it.
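The mismatch is between the 3-D output of the Dense layer, shaped (batch, timesteps, dict_size), and the 2-D targets: to_categorical here returns a (5, 10) array with the batch axis flattened away. A minimal sketch of one way to restore the batch axis (assuming the to_categorical behaviour implied by the error message; newer Keras versions already keep the batch axis, in which case the reshape is a no-op):

import numpy as np
from keras.utils import np_utils

dict_size = 10
max_sentence_length = 5

train_x = np.array([[3, 1, 2, 0, 0]])  # padded integer sequence, shape (1, 5)

# Older to_categorical versions flatten the batch axis and return (5, 10);
# reshape so the targets match the model's (batch, timesteps, dict_size) output.
train_y = np_utils.to_categorical(train_x, dict_size)
train_y = train_y.reshape((-1, max_sentence_length, dict_size))
print(train_y.shape)  # (1, 5, 10)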
The second question: the length of x is 3, but with padding it becomes 5. The final output is therefore a 5x10 matrix, and the last two (padded) positions should not take part in the softmax/loss calculation. Is there a quick way to achieve this?
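One common way to handle this in Keras is the Embedding layer's mask_zero option: the padded zero timesteps are masked, the mask propagates through the LSTM layers, and the masked positions are skipped when the loss is computed. A minimal sketch of the same model with masking (a sketch, not verified against your exact Keras version; note that index 0 is then reserved for padding):

from keras.layers import Input, Embedding, LSTM, Dense
from keras.models import Model

dict_size = 10
max_sentence_length = 5
embed_dim = 20

x = Input(shape=(max_sentence_length,), dtype='int32')
# mask_zero=True marks the trailing 0s as padding; LSTM supports masking,
# so the masked timesteps are ignored downstream and excluded from the loss.
embedded = Embedding(dict_size, embed_dim, mask_zero=True)(x)
encoded = LSTM(32, return_sequences=True)(embedded)
decoded = LSTM(32, return_sequences=True)(encoded)
y = Dense(dict_size, activation='softmax')(decoded)

model = Model(inputs=x, outputs=y)
model.compile(optimizer='adam', loss='categorical_crossentropy')

An alternative is to compile with sample_weight_mode='temporal' and pass per-timestep weights of 0 for the padded positions to model.fit.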
Many many thanks!
keras lstm
asked Nov 16 '18 at 9:51 by Kevin Sun, edited Nov 16 '18 at 10:04