Forward pass output of a pretrained network changes without backpropagation
I am using Chainer's pretrained VGG model (here named net). Every time I run the following code, I get a different result:

    from PIL import Image
    from chainer import Variable
    from chainer.links.model.vision import vgg

    img = Image.open("/Users/macintosh/Desktop/Code/Ger.jpg")
    img = Variable(vgg.prepare(img))
    img = img.reshape((1,) + img.shape)  # add a batch dimension
    print(net(img, layers=['prob'])['prob'])

I have checked vgg.prepare() several times, and its output is always the same. There is no random initialization here either, since net is a pretrained VGG network. So why is this happening?
python neural-network pre-trained-model chainer vgg-net
asked Nov 15 '18 at 17:09
sama...