inception v3 using tf.data?

I'm using a bit of code that is derived from inception v3 as distributed by the Google folks, but it's now complaining that the queue runners used to read the data are deprecated (tf.train.string_input_producer in image_processing.py, and similar). Apparently I'm supposed to switch to tf.data for this kind of stuff.



Unfortunately, the documentation on tf.data isn't doing much to relieve my concern that I've got too much data to fit in memory, especially given that I want to batch it in a reusable way, etc. I'm confident that the tf.data stuff can do this; I just don't know how to do it. Can anyone point me to a full example of code that uses tf.data to deal with batches of data that won't all fit in memory? Ideally, it would simply be an updated version of the inception-v3 code, but I'd be happy to try and work with anything. Thanks!
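From the docs, I gather the pipeline is supposed to look roughly like the sketch below (the file pattern and feature keys are placeholders I made up, not the actual Inception v3 input schema), but I haven't managed to piece together a complete working example:

    import tensorflow as tf

    # Placeholder file pattern; the real data lives in sharded TFRecord files on disk,
    # so nothing here requires loading the whole dataset into memory.
    filenames = tf.data.Dataset.list_files("/data/train-*.tfrecord")

    def parse_example(serialized):
        # Placeholder feature keys, not necessarily the Inception v3 schema.
        features = tf.parse_single_example(serialized, {
            "image/encoded": tf.FixedLenFeature([], tf.string),
            "image/class/label": tf.FixedLenFeature([], tf.int64),
        })
        image = tf.image.decode_jpeg(features["image/encoded"], channels=3)
        image = tf.image.resize_images(image, [299, 299]) / 255.0  # Inception v3 input size
        return image, features["image/class/label"]

    dataset = (tf.data.TFRecordDataset(filenames)
               .shuffle(buffer_size=10000)  # shuffles a bounded buffer of records, not the whole set
               .repeat()                    # cycle through the data for multiple epochs
               .map(parse_example, num_parallel_calls=4)
               .batch(32)
               .prefetch(1))                # overlap preprocessing with training

    images, labels = dataset.make_one_shot_iterator().get_next()
    # images and labels are ordinary tensors that can be fed into the model graph.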

python tensorflow

asked Nov 13 '18 at 1:35
Matt Ginsberg

  • when I first started reading about the tf.data API, I found these few links to be very useful for me outside of the official API documentation: the data pipeline tutorial from Stanford, the MNIST with tf.data example, and the input pipeline performance guide. Hope you find any of these useful!
    – kvish
    Nov 13 '18 at 2:13

  • Very helpful; thanks!
    – Matt Ginsberg
    Nov 13 '18 at 23:33

1 Answer

Well, I eventually got this working. The various documents referenced in the comment on my question had what I needed, and I gradually figured out which parameters passed to the queue runners corresponded to which parameters in the tf.data pipeline.



There was one gotcha that took a while for me to sort out. In the Inception implementation, the number of examples used for validation is rounded up to a multiple of the batch size; presumably the validation set is reshuffled and some examples are used more than once. (This doesn't strike me as great practice, but the number of validation instances is generally much larger than the batch size, so only a relative few are double-counted: with, say, 50,000 validation images and a batch size of 32, you get 1,563 batches covering 50,016 examples, so just 16 repeats.)



With tf.data, shuffling and repetition have to be turned on explicitly (Dataset.shuffle() and Dataset.repeat()), and I hadn't done that for the validation data. Things then broke because there weren't enough unique validation instances to fill the rounded-up number of batches, and it took me a while to track that down.
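The shape of the fix is roughly the following (just a sketch with placeholder names, not my actual code; parse_example stands for a parsing function like the one sketched in the question):

    import tensorflow as tf

    def make_dataset(file_pattern, batch_size, parse_fn, training):
        # parse_fn is a placeholder for whatever decoding/preprocessing you use.
        files = tf.data.Dataset.list_files(file_pattern, shuffle=training)
        dataset = tf.data.TFRecordDataset(files)
        if training:
            dataset = dataset.shuffle(buffer_size=10000)
        # The old queue-runner pipeline effectively cycled the validation set too,
        # so repeat() is needed here as well; otherwise the rounded-up number of
        # validation batches runs off the end of the data.
        dataset = dataset.repeat()
        dataset = dataset.map(parse_fn, num_parallel_calls=4)
        return dataset.batch(batch_size).prefetch(1)

    # Hypothetical file patterns.
    train_ds = make_dataset("/data/train-*", 32, parse_example, training=True)
    valid_ds = make_dataset("/data/validation-*", 32, parse_example, training=False)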



I hope this helps the next person with this issue. Unfortunately, my code has drifted quite far from Inception v3 and I doubt that it would be helpful for me to post my modification. Thanks!

answered Nov 23 '18 at 20:09
Matt Ginsberg