Python 3 - TypeError: takes 1 positional argument but 2 were given [closed]
The program I am working on has a class with a constructor defined as follows:
def Oracle(object) :
    Agold = None
    sentence = None

    def __init__(self, sentence, Agold):
        self.Agold = Agold
        self.sentence = sentence
but when I call the constructor in my main method, as follows:
oracle = Oracle(words, ref_tree)
Python 3 gives me this error:
Traceback (most recent call last):
  File "oracle_test.py", line 52, in test_exemple
    oracle = Oracle(words, ref_tree)
TypeError: Oracle() takes 1 positional argument but 2 were given
I don't understand the origin of this problem, and I don't see what went wrong.
Can someone give me an explanation?
Thanks
python arguments
asked Nov 15 '18 at 2:24
red lantern
82
closed as off-topic by user2357112, Unheilig, Shiladitya, Billal Begueradj, V-rund Puro-hit Nov 15 '18 at 7:16
This question appears to be off-topic. The users who voted to close gave this specific reason:
- "This question was caused by a problem that can no longer be reproduced or a simple typographical error. While similar questions may be on-topic here, this one was resolved in a manner unlikely to help future readers. This can often be avoided by identifying and closely inspecting the shortest program necessary to reproduce the problem before posting." – user2357112, Unheilig, Shiladitya, Billal Begueradj, V-rund Puro-hit
If this question can be reworded to fit the rules in the help center, please edit the question.
1 Answer
You defined Oracle as a function instead of a class: def Oracle(object) : creates a function with a single parameter named object, so the call Oracle(words, ref_tree) passes two positional arguments to a one-parameter function, which is exactly what the error message reports. Use class instead of def. Also, assuming Agold and sentence are supposed to be instance variables rather than class variables, the Agold = None and sentence = None lines are not needed (see this).
class Oracle(object):
    def __init__(self, sentence, Agold):
        self.Agold = Agold
        self.sentence = sentence
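For completeness, here is a minimal runnable sketch of the corrected class in use. The words and ref_tree values below are placeholders made up for illustration, since the question does not show how they are actually built:

class Oracle(object):
    def __init__(self, sentence, Agold):
        # Store the constructor arguments as instance attributes.
        self.Agold = Agold
        self.sentence = sentence

words = ["the", "cat", "sat"]      # placeholder tokens; the real data is not shown in the question
ref_tree = ("S", ("NP", "VP"))     # placeholder gold analysis; the real structure is not shown
oracle = Oracle(words, ref_tree)   # two arguments bind to sentence and Agold, so no TypeError
print(oracle.sentence, oracle.Agold)

Because __init__ is now a method of a class, Python supplies the new instance as self automatically, and the two explicit arguments fill sentence and Agold.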
answered Nov 15 '18 at 2:29
Tomothy32
7,008