OR-Lambda-Layer operation with Keras




















I'm working on a project with Keras: a neural network built from predefined knowledge (described as IF-THEN rules), called Neurules. I've created a Python module that trains each of my neurons/Neurules from a given IF-THEN logical expression; afterwards I need Keras to assemble them into a network and reuse the model.



I've already tested a small example and it worked, with all weights and biases added manually. I've now updated my script so that it produces a JSON with all the weights to load into Keras (working so far).



Here comes my problem: my first layer has 20 Neurules (neurons created from IF-THEN rules) but only 2 possible outputs. Some of the Neurules feed output[0] and some feed output[1], and I want to add an in-between layer representing the OR connections.



E.g.:



Layer 1:
NEURON1, NEURON2, NEURON3



Output[0] is formed by: NEURON1 or NEURON2



Output[1] is formed by: NEURON2 or NEURON3



What I did in my first small example: I created and trained an OR neuron with my pre-developed Python module and added it as a second layer, then connected its inputs manually to the OR Neurules (putting the weights on the correct connections and 0 where they shouldn't influence the OR). Now I have something bigger and I'm automating the whole process.



Visualization of the simple net (image omitted): the Buffer just forwards its value, and the OR node performs an OR over its inputs.



Visualization of the layers (image omitted).



How can I create a Lambda layer in Keras that takes some of the neuron outputs, computes a logical OR over them, and feeds one of the final outputs?



I've found the backend function tf.keras.backend.any, but so far I haven't managed to use it. How should I apply it? Presumably inside a Lambda layer, but how?
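For reference, here is a minimal sketch of my own (assuming tf.keras; the layer names are illustrative, not from the question's code) of K.any inside a Lambda layer. The key detail is that `axis=1` reduces over the feature axis and `keepdims=True` keeps the batch dimension, so each row collapses to a single 0/1 value of shape (batch, 1):

```python
import numpy as np
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input, Lambda
from tensorflow.keras.models import Model

def or_lambda(x):
    # True where any feature in the row is non-zero; result keeps shape (batch, 1)
    return K.cast(K.any(K.not_equal(x, 0), axis=1, keepdims=True), 'float32')

inp = Input(shape=(3,))
out = Lambda(or_lambda, output_shape=(1,), name='or_demo')(inp)
model = Model(inp, out)

# first row ORs to 0., second row to 1.
print(model.predict(np.array([[0., 0., 0.], [0., 1., 0.]], dtype='float32')))
```

Without `keepdims=True`, K.any would also collapse the batch dimension, which breaks the expected (batch, 1) output shape.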



I need to connect, for example:



(NEURON1 or NEURON4 or NEURON5) -> output[0]



(NEURON3 or NEURON6 or NEURON7) -> output[1]



In my system -1 represents False and 1 represents True.
So far I've saved which neurons feed each of the 2 outputs as arrays in a JSON, like:



    "secondLayerDescription": [
        [0, 1, 4, 5, 6, 8, 12, 13, 14, 16, 18],
        [2, 3, 7, 9, 10, 11, 15, 17, 19]
    ]
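One way to use these index lists directly (a sketch of my own, not the author's module; `second_layer_description` and the layer names are illustrative) is to gather the listed columns inside a Lambda layer with tf.gather. That avoids having to sort the neurule layer so each output's neurules are contiguous:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda
from tensorflow.keras.models import Model

# mirrors the "secondLayerDescription" array from the JSON above
second_layer_description = [
    [0, 1, 4, 5, 6, 8, 12, 13, 14, 16, 18],
    [2, 3, 7, 9, 10, 11, 15, 17, 19],
]

def make_split(indices, name):
    # pick the listed columns out of the (batch, 20) neurule activations
    return Lambda(lambda t: tf.gather(t, indices, axis=1),
                  output_shape=(len(indices),), name=name)

# demo: gathering from a row containing 0..19 returns exactly the listed indices
inp = Input(shape=(20,))
split0 = make_split(second_layer_description[0], 'layer_split0')(inp)
demo = Model(inp, split0)
cols = demo.predict(np.arange(20, dtype='float32')[None, :])[0]
```

The same `make_split` would then be applied once per output group before the OR step.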


I hope someone can help me :)



EDIT: An update: after some days I found a solution. I split my layer in two and operate on each part with Lambda layers, as follows:



    def logical_or_layer(x):
        """Process a logical OR over the feature axis."""
        import keras.backend
        # normalize from -1/1 to 0/1
        aux_array = keras.backend.sign(x)
        aux_array = keras.backend.relu(aux_array)
        # OR operation; axis=1 and keepdims=True keep the batch
        # dimension, so the result has shape (batch, 1)
        aux_array = keras.backend.any(aux_array, axis=1, keepdims=True)
        # cast True/False back to 1.0/0.0
        aux_array = keras.backend.cast(aux_array, dtype='float32')
        return aux_array


    from keras.layers import Input, Dense, Lambda
    from keras.models import Model

    # inputSize, neurulesQt, signumTransform, end_output0, start_output1
    # and end_output1 come from my generated JSON / helper module

    # this is the input tensor
    inputs = Input(shape=(inputSize,))

    # this is the Neurule layer
    x = Dense(neurulesQt, activation='softsign')(inputs)

    # after each neuron layer, the outputs need to be put into SIGNUM (-1 or 1)
    x = Lambda(signumTransform, output_shape=lambda shape: shape, name='signumAfterNeurules')(x)

    # separating into 2 (2 possible outputs)
    layer_split0 = Lambda(lambda t: t[:, :end_output0], output_shape=(11,), name='layer_split0')(x)
    layer_split1 = Lambda(lambda t: t[:, start_output1:end_output1], output_shape=(9,), name='layer_split1')(x)

    # this is the OR layer
    y_0 = Lambda(logical_or_layer, output_shape=(1,), name='or0')(layer_split0)
    y_1 = Lambda(logical_or_layer, output_shape=(1,), name='or1')(layer_split1)


But I'm still having problems: I can't merge the two OR outputs back together, so I've raised a new question on that topic.
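On the merge problem: provided each OR Lambda keeps a (batch, 1) shape (K.any with axis=1, keepdims=True), the two outputs can usually be joined with a Concatenate layer. A self-contained sketch of my own (assuming tf.keras; the 4-feature input and the 2/2 split are illustrative stand-ins for the 20-neurule layer):

```python
import numpy as np
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input, Lambda, Concatenate
from tensorflow.keras.models import Model

def logical_or_keepdims(t):
    # reduce along the feature axis but keep the batch dimension: shape (batch, 1)
    return K.cast(K.any(K.not_equal(t, 0), axis=1, keepdims=True), 'float32')

inp = Input(shape=(4,))
part0 = Lambda(lambda t: t[:, :2], output_shape=(2,), name='split0')(inp)
part1 = Lambda(lambda t: t[:, 2:], output_shape=(2,), name='split1')(inp)
y0 = Lambda(logical_or_keepdims, output_shape=(1,), name='or0')(part0)
y1 = Lambda(logical_or_keepdims, output_shape=(1,), name='or1')(part1)

# joining the two (batch, 1) OR results into a single (batch, 2) output
merged = Concatenate(axis=1)([y0, y1])
merge_model = Model(inp, merged)
```

With a row like [1, 0, 0, 0], the first group ORs to 1 and the second to 0, giving [1, 0] as the merged output.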










Tags: python, tensorflow, keras, keras-layer






edited Nov 20 '18 at 11:48 by Vinicius

asked Nov 16 '18 at 15:15 by Vinicius

1 Answer
I found a way to do it: I sort my Neurules layer, split it, and then, with one Lambda layer per split, do the processing shown in the EDIT of the question: normalize the inputs, apply backend.any, and cast the resulting True/False back to float.
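To illustrate the normalisation step described above (my own check, assuming tf.keras rather than standalone keras): sign() followed by relu() maps the -1/1 encoding to 0/1, after which K.any acts as a logical OR per row:

```python
from tensorflow.keras import backend as K

# two rows in the -1/1 encoding: one with a True neurule, one all False
x = K.constant([[-1., -1., 1.], [-1., -1., -1.]])

norm = K.relu(K.sign(x))   # -1 -> 0, 1 -> 1
ored = K.cast(K.any(norm, axis=1, keepdims=True), 'float32')

print(K.eval(ored))        # [[1.], [0.]]
```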






answered Nov 20 '18 at 11:47 by Vinicius































