Python: I just defined a new variable, and now the program is stuck in an infinite loop


I am a beginner at programming, and I wrote a script that implements an optimization algorithm. It worked fine at first; but then I tried to speed it up by defining a new variable, and now for some reason it seems to be stuck in an infinite loop. Here is the relevant part of the first version (a comment marks where the change will happen):

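        # This loop selects an alpha which satisfies the Armijo condition.
        # (The right-hand side of this condition is what I change below.)
        while f(x_k + alpha * p_k) > f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0]:
            alpha = ratio * alpha
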
Now, the program is very inefficient, because the condition of the second while loop computes the quantity
f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0]
on every iteration, even though it is constant. So I decided to give this quantity a name,
RHS = f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0]
and compute it once, before the while loop. The new code is below. All I did was pull this quantity out into a variable, and now the program is stuck in an infinite loop. Any help is greatly appreciated.

# This program uses the Steepest Descent Method to 
# minimize the Rosenbrock function
import numpy as np
import time

# Define the Rosenbrock Function
def f(x_k):
    x, y = x_k[0, 0], x_k[0, 1] 
    return 100 * (y - x**2)**2 + (1 - x)**2

# Gradient of f 
def gradient(x_k):
    x, y = x_k[0, 0], x_k[0, 1] 
    return  np.array([[-400*x*(y-x**2)-2*(1-x), 200*(y-x**2)]])




def main():
    start = time.time()
    # Define the starting guess
    x_k = np.array([[2, 2]])
    # Define counter for number of steps
    numSteps = 0

    # Keep iterating until both components of the gradient are less than 0.1 in absolute value
    while abs((gradient(x_k)[0, 0])) > 0.1 or abs((gradient(x_k))[0, 1]) > 0.1:
        numSteps = numSteps + 1

        # Step direction
        p_k = - gradient(x_k)
        gradTrans = - p_k.T

        # Now we use a backtracking algorithm to find a step length
        alpha = 1.0
        ratio = 0.8
        c = 0.01 # This is just a constant that is used in the algorithm

        # This loop selects an alpha which satisfies the Armijo condition  
        RHS = f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0]

        #####################################
        ###### CHANGE HAS OCCURRED ##########
        #####################################

        while f(x_k + alpha * p_k) > RHS:
            alpha = ratio * alpha

        x_k = x_k + alpha * p_k
    end = time.time()
    print("The number of steps is: ", numSteps)
    print("The final step is:", x_k)
    print("The gradient is: ", gradient(x_k))
    print("The elapsed time is:", round(end - start), "seconds.")



main()

RHS needs to be recalculated inside the loop, using the new value of alpha. As written, RHS is computed once with alpha = 1.0 and then frozen; at your starting guess the (gradTrans @ p_k)[0, 0] term is so large and negative that the frozen RHS is itself negative, while the Rosenbrock function is never negative, so f(x_k + alpha * p_k) > RHS holds for every alpha and the loop never exits. (Not that hoisting it out was going to speed anything up.)
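
In code, a minimal sketch of that fix, reusing the question's variable names (RHS is now recomputed every time alpha changes):

        while True:
            # Recompute the right-hand side with the current alpha
            RHS = f(x_k) + (alpha * c * (gradTrans @ p_k))[0, 0]
            if f(x_k + alpha * p_k) <= RHS:
                break
            alpha = ratio * alpha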

Oh gosh, thank you, I feel so stupid. To actually speed it up, what I should do is compute f(x_k) and (gradTrans @ p_k)[0, 0] once before the loop and then use those values inside the loop.
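
A minimal sketch of that idea (fx and slope are hypothetical names for the two pieces that stay constant while alpha shrinks):

        fx = f(x_k)                      # does not depend on alpha
        slope = (gradTrans @ p_k)[0, 0]  # does not depend on alpha
        while f(x_k + alpha * p_k) > fx + alpha * c * slope:
            alpha = ratio * alpha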