Can Python generators be invoked non-lazily?



























I know that in Python, generators are invoked lazily. For example:



>>> def G():
...     print('this was evaluated now 1')
...     yield 1
...     print('this was evaluated now 2')
...     yield 2
...
>>> g = G()
>>> next(g)
this was evaluated now 1
1
>>> next(g)
this was evaluated now 2
2


The line print('this was evaluated now 1') was evaluated only after the first next(g) was called.



I wonder whether there is a simple way to invoke the generator non-lazily. This means that when calling g = G(), the function would calculate everything up to and including the first yield result, without actually yielding. Then, on the first call to next(g), the already-calculated result will be yielded, and also everything up to and including the second yield result would be calculated. And so on.



How can this be achieved?





Here is the expected behavior under this non-lazy scheme:



>>> g = G()
this was evaluated now 1
>>> next(g)
1
this was evaluated now 2
>>> next(g)
2




Here is a solution attempt, which does not work:



>>> class NonLazyGenerator():
...     def __init__(self, G):
...         self.g = G()
...         self.next_value = next(self.g)
...
...     def __next__(self):
...         current_value = self.next_value
...         try:
...             self.next_value = next(self.g)
...         except StopIteration:
...             pass
...         return current_value
...
>>> g = NonLazyGenerator(G)
this was evaluated now 1
>>> next(g)
this was evaluated now 2
1
>>> next(g)
2


This fails because the stored value is handed back only when __next__ returns, while the calculation of everything up to the next yield happens before the return statement. This example made me realize that it may not be possible to perform what I am seeking, since it would require doing work after the function has returned (which might require multi-threading).




























  • Why would you want that? Yielding would still do calculation work for the next result, so you are not gaining anything.

    – schwobaseggl
    Nov 16 '18 at 8:23











  • I can give an explanation for why I would want this, but the purpose of my question is not to find justifications for or against doing this, but rather to understand how to do this.

    – Lior
    Nov 17 '18 at 11:09











  • The reason is that I have a function which has a yield statement in its body, and this turns the function into a generator. However, this yield is only reachable depending on the value of an argument to the function. I want the function to behave as a regular function when this argument is False, and as a generator otherwise. If generators were evaluated non-lazily, this would solve the problem. (Of course, there are other ways to solve this problem, and of course, this isn't necessarily good programming, but as I said, I'm currently only interested in an answer to the question.)

    – Lior
    Nov 17 '18 at 11:40
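As a side note, the "other ways" alluded to in this comment can be sketched by moving the yield into an inner generator, so the outer function stays an ordinary function and can decide what to return. This is only an illustrative sketch; process, gen, data, and the lazy flag are hypothetical names, not from the question:

```python
def process(data, lazy=True):
    # Hypothetical sketch: the yield lives in the inner generator, so
    # process itself is a regular function and can branch on the flag.
    def gen():
        for x in data:
            yield x * 2

    if lazy:
        return gen()        # caller receives a generator
    return list(gen())      # caller receives a fully computed list

print(process([1, 2], lazy=False))  # [2, 4]
print(list(process([1, 2])))        # [2, 4], computed lazily
```

The key point is that a def containing yield always produces a generator when called, so the branching has to happen in a wrapper that contains no yield of its own.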


















Tags: python, generator






asked Nov 16 '18 at 7:45 by Lior








2 Answers
































You could probably write some kind of decorator for it, such as:



def eagergenerator(mygen):
    class GeneratorWrapper:
        def __init__(self, *args, **kwargs):
            self.g = mygen(*args, **kwargs)
            self.last = next(self.g)
        def __iter__(self):
            return self
        def __next__(self):
            if self.last is self:
                raise StopIteration
            fake_yield = self.last
            try:
                self.last = next(self.g)
                return fake_yield
            except StopIteration:
                self.last = self
                return fake_yield
    return GeneratorWrapper


Then you can simply decorate your normal generators:



@eagergenerator
def G():
    print("one")
    yield 1
    print("two")
    yield 2


which will work as follows:



>>> g = G()
one
>>> next(g)
two
1
>>> next(g)
2
>>> next(g)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "eagergen.py", line 10, in __next__
    raise StopIteration
StopIteration
>>>





answered Nov 16 '18 at 7:58 by L3viathan



















  • A nice example (+1). I'd like to add a few notes in case somebody wants to base their own code on it. I did not try it, but I think an "empty" generator needs to be handled as a special case with try/next/except StopIteration in __init__. Also, the value of StopIteration (i.e. the generator's return value) should be preserved by the wrapper.

    – VPfB
    Nov 16 '18 at 8:36
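The empty-generator case mentioned in this comment can be demonstrated with a short standalone sketch; it shows why an eager wrapper's __init__ would need its own try/except (the empty function below is illustrative, not from the answer):

```python
def empty():
    # A generator function that yields nothing: the unreachable yield
    # below is enough to make calling empty() return a generator object.
    return
    yield

g = empty()
try:
    next(g)  # this is exactly the call an eager __init__ performs up front
except StopIteration:
    print("StopIteration on the very first next() -- __init__ must handle it")
```

Without that handling, constructing the wrapper around an empty generator would let StopIteration escape from __init__ instead of producing an already-exhausted iterator.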






  • This was just a proof-of-concept, but feel free to edit my answer!

    – L3viathan
    Nov 16 '18 at 8:47

































Credit: this was inspired by @L3viathan's answer.



In this version, itertools.tee is used to buffer the one yielded value by which the wrapper lags behind the original generator.



import itertools

def eagergenerator(mygen):
    class GeneratorWrapper:
        def __init__(self, *args, **kwargs):
            self.g0, self.g1 = itertools.tee(mygen(*args, **kwargs))
            self._next0()
        def _next0(self):
            try:
                next(self.g0)
            except StopIteration:
                pass
        def __iter__(self):
            return self
        def __next__(self):
            self._next0()
            return next(self.g1)
    return GeneratorWrapper
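For completeness, here is a quick demonstration of this wrapper on the generator from the question (the decorator is repeated so the snippet runs standalone):

```python
import itertools

def eagergenerator(mygen):
    class GeneratorWrapper:
        def __init__(self, *args, **kwargs):
            # tee gives two independent views; g0 always runs one step
            # ahead of g1, which is what makes the evaluation eager.
            self.g0, self.g1 = itertools.tee(mygen(*args, **kwargs))
            self._next0()
        def _next0(self):
            try:
                next(self.g0)
            except StopIteration:
                pass
        def __iter__(self):
            return self
        def __next__(self):
            self._next0()
            return next(self.g1)
    return GeneratorWrapper

@eagergenerator
def G():
    print("one")
    yield 1
    print("two")
    yield 2

g = G()         # prints "one" immediately, at construction time
print(next(g))  # prints "two", then 1
print(next(g))  # prints 2
```

Note that tee also quietly handles the empty-generator case here: _next0 swallows the StopIteration in __init__, and the first call to next(g1) re-raises it for the caller.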





answered Nov 16 '18 at 8:49 by VPfB






















