Validation: why are the model embeddings different? (TensorFlow)

I have trained a convolutional neural network and I am now analyzing the results on the validation set.

I used a model inspired by VGG-16 to generate image embeddings (it is similar to a classifier, but instead of applying softmax on the last layer I apply tf.nn.l2_normalize and output its result).
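
Roughly, the output head looks like this (a simplified sketch, not my exact code; fc_out stands for the activations of the last fully connected layer and 128 for the embedding size):

# Instead of softmax, L2-normalize the last layer row-wise so that every
# image embedding has unit length (shape: [batch_size, 128]).
embedding = tf.nn.l2_normalize(fc_out, dim=1)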

This is how I evaluate the results on the validation set:

for q in range(total_batch):

        s  = q * batch_size
        e = (q+1) *batch_size

        input1,input2, input3 = training.next_batch(s,e)

        m1 = sess.run([model1], feed_dict={x_anchor:input1})
When I feed 2 or more images to the network, I get the following results (they seem correct):

But when I set the batch size to a single image, the embedding changes:

model 1[[ 1.         -0.99999994  1.         -1.         -1.         -1.         -1.
   1.         -1.         -1.         -0.99999994  1.          1.         -1.
   0.99999994  1.          1.          1.         -1.          1.
   0.99999994  1.          1.          1.         -0.99999994 -1.         -1.
   1.00000012 -1.          1.         -1.         -1.         -1.
   0.99999994 -1.         -1.          1.         -1.          1.         -1.
  -0.99999994 -0.99999994  1.         -0.99999994  1.          1.         -1.
   1.         -1.          0.99999994  1.         -1.          0.99999994
   1.          1.          0.99999994  1.         -1.         -1.         -1.
   1.         -1.          1.          1.         -1.          1.          1.
  -1.          1.         -1.         -0.99999994  1.          1.         -1.
  -1.         -1.          1.         -1.          1.         -0.99999994
  -0.99999994 -0.99999994 -1.          1.          1.         -1.          1.
  -1.         -0.99999994 -1.         -0.99999994 -1.         -0.99999994
  -1.         -1.         -1.          1.          1.          1.
  -0.99999994 -1.          1.          0.99999994  0.99999994 -1.
   0.99999994  1.         -1.          1.          0.99999994  1.
  -0.99999994 -1.          1.         -0.99999994  1.00000012 -1.         -1.
  -1.          1.          1.          1.          0.99999994 -1.         -1.
   1.          0.99999994 -1.00000012]]
--------------------------------------
Why does this happen?
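
For reference, a quick way to check this kind of batch dependence (a sketch only; it assumes the x_anchor placeholder and the model1 tensor from above, plus a NumPy array images of shape [N, 4900]) is to embed the same image once on its own and once inside a larger batch, then compare the corresponding rows:

import numpy as np

# Embedding of the first image when it is fed alone (batch of 1)
emb_single = sess.run(model1, feed_dict={x_anchor: images[0:1]})

# Embedding of the same image when it is part of a bigger batch
emb_batch = sess.run(model1, feed_dict={x_anchor: images[0:10]})

# If the network is purely feed-forward (no batch statistics involved),
# the first row should match in both runs.
print(np.allclose(emb_single[0], emb_batch[0], atol=1e-5))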

Later edit:

This is what happens when I change the number of images in a batch: for 2, 3 and 10 images nothing changes.

model1 - batch of 2 images:

[[ 0.2743271   0.91178268  0.5872615  -0.98093092 -0.02889113 -0.03134303
  -0.04661543  0.45045683 -0.72907752 -0.99990839 -0.98905557 -0.9919185
  -0.91167277  0.30989948  0.04958323 -0.21191983  0.04454518 -0.44197813
  -0.99413317  0.03212294 -0.1868417  -0.94774926  0.55885661 -0.9531188
   0.29837319 -0.91574109 -0.02016532 -0.87102389 -0.74571055  0.03648382
   0.88016605 -0.0329459  -0.82466573  0.49874297 -0.91681188 -0.75305611
  -0.9985494  -0.71474195  0.0363951  -0.87884218  0.27167588  0.28420082
  -0.22817914 -0.04758676  0.84340817  0.99999797  0.13699238  0.03059645
  -0.90965801  0.41122833 -0.40550923 -0.0283178   0.02512571  0.3185125
  -0.9999544  -0.89122361  0.03199656  0.02893824  0.4112269  -0.38217896
  -0.85480034  0.97177809 -0.6597178  -0.04604623  0.03689161  0.78157973
  -0.91014808 -0.90671444  0.93194991  0.60108137  0.90480918  0.6574111
  -0.18326542 -0.02942787 -0.04679498 -0.04116671 -0.05752649 -0.02889678
   0.04707722  0.14425533 -0.93019849 -0.9645502  -0.66847634 -0.06463299
   0.01547062 -0.49338332 -0.41203031 -0.33844921 -0.35335046  0.7529974
   0.39456338 -0.52115846  0.08408639  0.04551683  0.22299762 -0.02990589
   0.96985286  0.44175237 -0.9540993  -0.47797546  0.92866713 -0.86449015
   0.72105265 -0.26022291  0.56368029  0.91930032  0.02831106  0.14027278
  -0.74462092 -0.01777472 -0.25965354  0.08585165  0.41533408 -0.73962849
  -0.41705173  0.96994138 -0.04474245 -0.37005508  0.5702228   0.01939926
   0.91649407 -0.08130386 -0.82676262 -0.6780622   0.02080963  0.02526239
  -0.21303178 -0.85304713]
 [ 0.96163648 -0.41067314  0.8093974  -0.19435681 -0.99958253 -0.99950862
  -0.99891287  0.89279819 -0.68443102  0.01353077 -0.1475431   0.12687652
   0.41091695 -0.95076936  0.99877     0.97728705  0.99900734  0.89702582
   0.10816276  0.99948382  0.98239011  0.31901628  0.82926422  0.30259636
  -0.9544493  -0.40176904 -0.99979657  0.49124065 -0.66626996  0.99933416
  -0.47466597 -0.99945712 -0.56562042  0.86675    -0.39931941 -0.6579563
   0.05384279 -0.69938827  0.99933743 -0.47711253 -0.96238875 -0.95876473
   0.97361922 -0.99886709  0.53727323 -0.00204876 -0.99057204  0.99953175
  -0.41535798  0.91153234  0.91409093 -0.99959892  0.99968427  0.94791865
   0.00955276  0.45356417  0.99948794 -0.99958116 -0.91153306 -0.92408824
   0.51895714 -0.23589683  0.75151342  0.99893922 -0.99931931  0.62380534
   0.4142831  -0.42174494  0.36258689 -0.79918784 -0.42581734  0.75353211
   0.98306352 -0.99956691 -0.99890453 -0.99915218  0.99834388 -0.99958235
   0.99889123 -0.98954052 -0.36705688 -0.26389948 -0.74373341  0.99790913
   0.99988031 -0.86981201  0.91117018 -0.94098461 -0.93549109 -0.65802348
  -0.91886866 -0.85345995 -0.99645847 -0.99896359 -0.974819   -0.99955267
  -0.2436911   0.89713699  0.29949048 -0.87837315 -0.37091413  0.50264978
   0.69288033  0.96554852 -0.82599306  0.39355665  0.9995991  -0.9901129
   0.6674875   0.99984199  0.96570182 -0.99630791 -0.90966898  0.67301542
  -0.9088828   0.24333875 -0.99899858 -0.92900985 -0.82149017  0.99981183
   0.40004814  0.99668926  0.56255096 -0.73500454 -0.99978352  0.99968088
   0.97704524 -0.52183366]]
--------------------------------------
model1 - batch of 3 images:

[[ 0.0844328  -0.77254468  0.02411784 -0.00955349 -0.94160694 -0.94066721
  -0.88625979  0.1149781  -0.02488928 -0.0051736  -0.01160398 -0.02201086
  -0.03521669  0.06318482  0.90330708  0.22742757  0.88465506  0.69219536
  -0.09456988  0.94272465  0.78211629 -0.54738432  0.03007991 -0.02421509
   0.76738203 -0.00945781 -0.94134593 -0.07806421 -0.02830387  0.93667924
  -0.77421671 -0.9394843  -0.01889145  0.02541017 -0.12410384 -0.02550597
  -0.01388813 -0.05055251  0.93260431 -0.01248641 -0.81171089  0.32357386
  -0.58824182 -0.89974731  0.12902646  0.7322554  -0.73108768  0.94067121
  -0.11807644  0.07763585 -0.55410898 -0.94702822  0.91589779  0.060298
  -0.01905362 -0.09003559  0.93983525 -0.01288702  0.73827815 -0.13281474
   0.7777003   0.04422731  0.87029737  0.92503053 -0.90938175  0.11651902
   0.59449238 -0.21909271  0.00718945  0.65837044 -0.48126209  0.0279669
   0.58864224 -0.94854271 -0.90761161 -0.92175764  0.15994184 -0.94678712
   0.90896451 -0.48565978 -0.02508452 -0.0084288  -0.02446353  0.92655313
   0.94136518 -0.12520707 -0.5488351  -0.13468495 -0.10819445 -0.28640082
   0.04190208 -0.07546353 -0.88280475 -0.90112931 -0.04243152 -0.94591689
   0.09673197  0.11114503 -0.66887909 -0.11806588  0.01580797  0.66838497
   0.0267245  -0.30286619 -0.9696154   0.03818087  0.94815379 -0.49568826
  -0.06035382  0.89603657 -0.20980616 -0.14883731  0.36203983  0.26767451
  -0.0712135   0.01647491 -0.91310835 -0.07197624 -0.11620539  0.94761276
   0.11829913  0.0154587   0.25997716 -0.02925223 -0.88819623  0.94844997
   0.89705831 -0.01577406]
 [ 0.06992867 -0.31909466  0.02536829 -0.01694431 -0.33658543 -0.3355417
  -0.43915471  0.08860384 -0.02602066 -0.00991058 -0.01751714 -0.0475461
  -0.17316175  0.9174453   0.42861098 -0.88639027  0.43813875  0.29763162
  -0.08426359  0.33345035 -0.4970766  -0.83511752  0.02875736 -0.21122056
   0.60217178 -0.01281438 -0.31258523 -0.07853186 -0.03347619  0.34996024
   0.09653269 -0.34235266 -0.02185564  0.03394587 -0.11823756 -0.02901799
  -0.02814217 -0.06023972  0.35179645 -0.01470418 -0.44020525  0.92780703
  -0.60654044 -0.43617141  0.12288523  0.61880648  0.57744491  0.33927616
  -0.11079701  0.05827573 -0.8260603  -0.32088026  0.32848966  0.04612691
  -0.04382927 -0.13040958  0.34031361 -0.00834278  0.61152399 -0.09501366
   0.32608342  0.21010639  0.21165863  0.28613895 -0.31460357  0.09254558
  -0.34285307 -0.94639134  0.00946684  0.71495456  0.84588212  0.02944838
  -0.62024397 -0.31660575 -0.41654193 -0.38240901  0.0965206  -0.32049972
   0.412752    0.77851194 -0.03856483 -0.01161524 -0.02505984 -0.11366697
   0.30903593 -0.09858466 -0.82802993 -0.09703995 -0.07281385  0.05652994
   0.11177502 -0.0600725   0.05701264 -0.33766046 -0.00946549 -0.32439664
   0.09326869  0.08744934 -0.73073256 -0.09487063  0.03088553  0.32274199
   0.02751617 -0.90462279 -0.17905805  0.05551227  0.31582361  0.7906518
  -0.11476897  0.36962783 -0.91509163 -0.08848038  0.92485541 -0.07423788
  -0.05653551  0.0251725  -0.40672073 -0.05931272  0.09049221  0.29042748
   0.1437183   0.01071607 -0.03970633 -0.03280446 -0.3763141   0.30833092
   0.41030872 -0.01896038]
 [-0.99397242  0.54895651  0.9993872  -0.99981076 -0.00930807 -0.05056711
   0.14726442 -0.98940867 -0.9993515  -0.99993747 -0.99977916 -0.99862647
   0.9842636  -0.39281276  0.01813768  0.40322316 -0.15943591  0.65748084
   0.99194562  0.00900941  0.37577793  0.05430508  0.99913377  0.97713834
  -0.22025895 -0.9998731   0.12711462  0.99385047 -0.99903864  0.01264404
  -0.62551564 -0.01279757 -0.99958265  0.99910063  0.98519951 -0.99925339
  -0.99950749 -0.996903   -0.0805502  -0.99981397  0.38385507 -0.18567207
   0.5348646  -0.01446672 -0.9839977  -0.28439516  0.36341196  0.00541824
   0.98680389 -0.99527711  0.10289539 -0.01316902 -0.23070674 -0.99711412
  -0.99885732  0.98736358  0.02993833  0.9998821   0.28457668  0.9865762
  -0.53744942 -0.97667766  0.44472823 -0.24988614  0.27212018 -0.9888674
  -0.72734493 -0.23736458  0.99992937  0.23534702 -0.22993524  0.99917501
   0.51845711 -0.00524166 -0.05228702 -0.06423581 -0.98239625 -0.02956322
   0.0584744   0.39755991 -0.99894112 -0.99989706 -0.99938661 -0.3585794
  -0.13538241  0.98722047  0.11465792  0.98612529  0.99145955 -0.95644069
   0.99284977  0.99533743  0.46626735  0.27194008  0.99905449 -0.00283846
  -0.9909308  -0.98994917 -0.13649452  0.98846346  0.99939793 -0.67014861
   0.99926412  0.29988277 -0.16668546  0.99772769  0.03549499  0.3593924
   0.99155712 -0.24595471  0.34436712  0.98489523 -0.11648855  0.96064514
   0.99585754  0.9995473  -0.0284851   0.99564117 -0.98909426 -0.13297313
  -0.98252243 -0.99982309  0.96479797 -0.99903357  0.26361924  0.07331257
   0.16411291 -0.99969572]]
model1 - batch of 10 images:

 [[ 0.04818952 -0.05918095  0.01802165 ...,  0.05679128  0.11442375
  -0.01418886]
 [ 0.05519219 -0.1301478   0.01969874 ...,  0.07767208  0.18997246
  -0.01425523]
 [ 0.08626937 -0.25595629  0.02016677 ...,  0.63242364  0.43534788
  -0.01190863]
 ..., 
 [-0.14989793  0.49290144  0.18958355 ...,  0.01846622 -0.25266901
  -0.22219713]
 [ 0.09586276 -0.28965887  0.02260818 ...,  0.69641638  0.4886305
  -0.01328262]
 [-0.01896885  0.5706526   0.03627656 ...,  0.0253126  -0.36303344
  -0.05107375]]
model1[0] for a batch of 10 images:

model 1[  5.86070597e-01   1.95135430e-01   2.46683449e-01  -1.15100272e-01
  -8.81946564e-01  -8.91003788e-01  -9.79653239e-01   4.03030038e-01
  -3.37121964e-01  -8.15378204e-02  -1.51473030e-01  -6.80846691e-01
  -9.61061358e-01   9.99767303e-01   9.17174935e-01  -9.18523490e-01
   4.61293280e-01  -2.48836592e-01  -6.79437101e-01   8.90483081e-01
  -9.21952069e-01  -9.33306396e-01   3.34582895e-01  -9.64680552e-01
   2.17209637e-01  -9.24504027e-02  -8.05327177e-01  -9.81654227e-01
  -5.13453305e-01   8.98176014e-01   4.65111285e-01  -8.81239653e-01
  -2.21239150e-01   1.82802275e-01  -9.89428520e-01  -3.78754020e-01
  -1.94648057e-01  -9.60758328e-01   9.99629676e-01  -1.34037957e-01
   5.10413982e-02   9.51475441e-01  -1.16250336e-01  -9.02551413e-01
   9.75457489e-01   3.81219059e-01   6.40182137e-01   8.98657262e-01
  -9.81229663e-01   3.48009765e-01  -8.25251520e-01  -8.78372252e-01
   7.62746096e-01   3.04450452e-01  -3.40841383e-01  -9.91277814e-01
   8.71233225e-01  -5.17561100e-04   4.97255594e-01  -1.69827744e-01
  -1.96575001e-01   9.88305330e-01  -3.83750439e-01  -1.71618521e-01
   1.40758827e-01   7.19896436e-01  -3.79732788e-01  -5.24435103e-01
   7.77314454e-02   8.57556820e-01   5.83529353e-01   3.34895849e-01
  -8.37150455e-01  -8.86918783e-01  -8.40985894e-01  -8.68736804e-01
  -1.96579075e-03  -8.52129221e-01   8.46933126e-01   6.71221852e-01
  -6.03814125e-01  -1.12122051e-01  -2.26175249e-01  -5.59274018e-01
   5.39124966e-01  -3.51777464e-01  -8.13132882e-01  -1.37575284e-01
  -2.99482316e-01   7.41403461e-01   8.68456364e-01  -4.78453755e-01
   3.74983132e-01   1.13467500e-01   1.33464202e-01  -8.81694794e-01
   7.77389646e-01   3.82896155e-01  -9.17372942e-01  -3.26944888e-01
   3.77652109e-01  -1.51620105e-01   2.89691836e-01  -9.90572333e-01
   3.18472356e-01   8.29179347e-01   8.44251931e-01   7.09410369e-01
  -9.93203878e-01  -4.48296890e-02  -9.96632695e-01   8.17295071e-03
   9.04701591e-01  -7.95541167e-01  -4.05060053e-01   1.68647662e-01
  -9.02341783e-01  -8.14558685e-01   9.71956789e-01   9.01448905e-01
   9.96791184e-01  -5.61384344e-03  -7.10371912e-01  -4.87548530e-01
   4.36763577e-02   8.43983173e-01  -1.29987687e-01  -1.70727238e-01]
Later edit: this is my train/evaluation function. The only thing that changes is the batch size:

def train(x_anchor, x_positive, x_negative, idx_model):
    global weights, bias, batch_size, keep_rate, suma, nr

    sess =  tf.Session()
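    # Create the session and restore the trained graph and its weights from the checkpoint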

    saver = tf.train.import_meta_graph('/home/bogdan/triplet/model_85_la_suta/my-model_'+str(idx_model)+'.meta')
    saver.restore(sess,tf.train.latest_checkpoint('./triplet/model_85_la_suta/'))



    graph = tf.get_default_graph()

    with tf.device('/gpu:1'):
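        # Three branches (anchor / positive / negative) that share the same 'siamese' weights via scope.reuse_variables()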
        with tf.variable_scope("siamese") as scope:
            model1 = siamese_convnet(x_anchor, graph)
            scope.reuse_variables()
            model2 = siamese_convnet(x_positive, graph)
            scope.reuse_variables()
            model3 = siamese_convnet(x_negative, graph)


    eps = 1e-10
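    # Euclidean distances between the anchor-positive and anchor-negative embeddings (eps keeps sqrt away from 0)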
    d_pos = tf.sqrt(tf.reduce_sum(tf.square(model1 - model2), 1) + eps)
    d_neg = tf.sqrt(tf.reduce_sum(tf.square(model1 - model3), 1) + eps)


    training = lfw_generated_test.inputData()
    training.shuffle_epoca();



    nr_training_examples = training.get_nr_training()
    print ("nr_training_examples "+str(nr_training_examples))

    total_batch = int(nr_training_examples/batch_size)

    print("tb "+str(total_batch))

    avg_acc_test = 0;

    for q in range(total_batch):

        s  = q * batch_size
        e = (q+1) *batch_size

        input1,input2, input3 = training.next_batch(s,e)

        distance1, distance2, m1, m2, m3 = sess.run([d_pos, d_neg, model1, model2, model3], feed_dict={x_anchor:input1, x_positive:input2, x_negative:input3})

        '''print("input 1"+str(input1))
        print("--------------------------------------")'''

        '''print("dist1 = "+str(distance1))
        print("--------------------------------------")
        print("dist2 = "+str(distance2))
        print("--------------------------------------")
        print("--------------------------------------")'''
        ''''print(np.shape(distance1))
        print(np.shape(distance2)) '''

        print("model 1"+str(m1[0]))
        print("--------------------------------------")
        ''''print("--------------------------------------")
        print("model 2"+str(m2))
        print("--------------------------------------")
        print("--------------------------------------")
        print("model 3"+str(m3))
        print("--------------------------------------")
        print("--------------------------------------")'''
        ''''print(m2)
        print(m3)'''

        ''''print(np.shape(model1))
        print(np.shape(model2))
        print(np.shape(model3))'''

        #print(str(q)+" ----------------------------------------------------------------")

        test_acc = compute_accuracy(distance1, distance2)



        avg_acc_test +=test_acc*100

    print('Accuracy TEST set %0.2f' % (avg_acc_test/(total_batch)))


batch_size = 2

#70x70 images

x_anchor = tf.placeholder('float', [None, 4900])
x_positive = tf.placeholder('float', [None, 4900])
x_negative = tf.placeholder('float', [None, 4900])
labels = tf.placeholder(tf.float32, [None, 1]) # 0 or 1 (impostor or genuine)

train(x_anchor, x_positive, x_negative, 99);

Are you using batch normalization? Are you sure there is no randomness in your model?

@nessuno I am not using batch normalization. @AvijitDasgupta I am using a saved model (already trained); the weights and biases are already computed and I retrieve them with graph.get_tensor_by_name, so no, there is no randomness.

@Helloli When you add more than one image to the batch, does the output change? From 2 to 3?
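
As background for the batch-normalization question in the comments: any op that computes statistics over the batch axis makes the output for one image depend on the other images in the batch. A minimal, self-contained sketch of that effect (illustrative only, not code from my model):

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 4])
mean, var = tf.nn.moments(x, axes=[0])      # statistics over the batch axis
y = (x - mean) / tf.sqrt(var + 1e-8)        # batch-dependent normalization

with tf.Session() as sess:
    data = np.random.rand(3, 4).astype(np.float32)
    alone = sess.run(y, feed_dict={x: data[0:1]})   # first image alone
    batched = sess.run(y, feed_dict={x: data})      # same image inside a batch
    print(alone[0])     # all zeros: a single sample is normalized by itself
    print(batched[0])   # different values: the statistics include the other rows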