Logistic regression: gradient computation error

I'm currently working through Andrew Ng's machine learning course. In the second exercise, on logistic regression, I've run into a problem I haven't been able to solve so far: an error raised in the cost function. (This is the Python version of the exercise.)

The gradient itself seems to compute (I've also tried several different ways of computing it), but when I run this cell the error below pops up. Can anyone help me solve this?
initial_theta = np.zeros((n+1,1))
cost, grad = costFunction(initial_theta, X, y)
print('Cost at initial theta (zeros): {:.3f}'.format(cost))
print('Expected cost (approx): 0.693\n')
print('Gradient at initial theta (zeros):')
print('\t[{:.4f}, {:.4f}, {:.4f}]'.format(*grad))
print('Expected gradients (approx):\n\t[-0.1000, -12.0092, -11.2628]\n')
# Compute and display cost and gradient with non-zero theta
test_theta = np.array([-24, 0.2, 0.2])
cost, grad = costFunction(test_theta, X, y)
print('Cost at test theta: {:.3f}'.format(cost))
print('Expected cost (approx): 0.218\n')
print('Gradient at test theta:')
print('\t[{:.3f}, {:.3f}, {:.3f}]'.format(*grad))
print('Expected gradients (approx):\n\t[0.043, 2.566, 2.647]')
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-43-4242e4a47821> in <module>
7 print('Expected cost (approx): 0.693\n')
8 print('Gradient at initial theta (zeros):')
----> 9 print('\t[{:.4f}, {:.4f}, {:.4f}]'.format(*grad))
10 print('Expected gradients (approx):\n\t[-0.1000, -12.0092, -11.2628]\n')
11
TypeError: unsupported format string passed to numpy.ndarray.__format__
(Printing `grad` shows that instead of three scalars it is a large 2-D array of shape (3, 100): a row of ±0.5 values, a row of ±32.82213703, and a row of ±33.11099904, with one entry per training example.)
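The format error and the (3, 100) gradient both point at a broadcasting problem: mixing a (m, 1) column vector with a (m,) row produces a (m, m) intermediate, so each "gradient" entry ends up an array rather than a scalar, and `'{:.4f}'.format(...)` then fails on an `ndarray`. For reference, here is a minimal vectorized `costFunction` sketch that flattens its inputs and returns a scalar cost plus a flat length-(n+1) gradient, so the `print` calls above work (assumes `X` already has the intercept column of ones):

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic function."""
    return 1.0 / (1.0 + np.exp(-z))

def costFunction(theta, X, y):
    # Flatten theta and y so every intermediate stays 1-D; mixing (m, 1)
    # and (m,) arrays is what broadcasts the gradient up to (n+1, m).
    theta = np.asarray(theta).ravel()
    y = np.asarray(y).ravel()
    m = y.size

    h = sigmoid(X @ theta)                              # shape (m,)
    cost = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m                            # shape (n+1,)
    return cost, grad
```

With `theta` all zeros, `h` is 0.5 everywhere and the cost reduces to `log(2) ≈ 0.693`, matching the expected value printed in the exercise; `grad` is a flat vector, so `'{:.4f}'.format(*grad)` receives plain floats.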