Python3 - TypeError: Oracle() takes 1 positional argument but 2 were given [closed]
The program I am working on has a class whose constructor is defined as follows:



def Oracle(object) :
    Agold = None
    sentence = None

    def __init__(self, sentence, Agold):
        self.Agold = Agold
        self.sentence = sentence


But when I call the constructor in my main method, as follows:



oracle = Oracle(words, ref_tree)


Python 3 gives me this error:



Traceback (most recent call last):
  File "oracle_test.py", line 52, in test_exemple
    oracle = Oracle(words, ref_tree)
TypeError: Oracle() takes 1 positional argument but 2 were given


I don't understand the origin of this problem, and I don't see what went wrong.



Can someone give me an explanation?
Thanks
closed as off-topic by user2357112, Unheilig, Shiladitya, Billal Begueradj, V-rund Puro-hit Nov 15 '18 at 7:16


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question was caused by a problem that can no longer be reproduced or a simple typographical error. While similar questions may be on-topic here, this one was resolved in a manner unlikely to help future readers. This can often be avoided by identifying and closely inspecting the shortest program necessary to reproduce the problem before posting." – user2357112, Unheilig, Shiladitya, Billal Begueradj, V-rund Puro-hit

If this question can be reworded to fit the rules in the help center, please edit the question.
      python arguments
      asked Nov 15 '18 at 2:24
red lantern
1 Answer
You defined Oracle as a function instead of a class: with def Oracle(object) :, Oracle is an ordinary function with a single parameter named object, which is why calling it with two arguments raises the TypeError. Use class instead of def. Also, assuming Agold and sentence are supposed to be instance variables rather than class variables, the Agold = None and sentence = None lines are not needed (see this).



class Oracle(object):
    def __init__(self, sentence, Agold):
        self.Agold = Agold
        self.sentence = sentence
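
For illustration, here is a minimal sketch contrasting the two definitions; the string arguments are hypothetical stand-ins for the question's words and ref_tree:

# With `def`, Oracle is an ordinary function whose one parameter
# happens to be named `object`, so a two-argument call fails.
def Oracle(object):
    pass

try:
    Oracle("some words", "some tree")
except TypeError as e:
    print(e)  # Oracle() takes 1 positional argument but 2 were given

# With `class`, Oracle(...) creates an instance and forwards the
# arguments to __init__, so the same call works.
class Oracle(object):
    def __init__(self, sentence, Agold):
        self.Agold = Agold
        self.sentence = sentence

oracle = Oracle("some words", "some tree")
print(oracle.sentence)  # some words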
                answered Nov 15 '18 at 2:29
Tomothy32