Excel: How do I automatically "click" a URL using VBA?

I have a list of thousands of project IDs in Excel. I need VBA code that inserts each project ID into a preset internal HTML link and then "clicks" that link (shown below as "example.com"). Clicking the HTML link in a browser starts an automatic file download. I have copied my code below, but I keep getting various errors. Any thoughts on this are appreciated.

Sub followWebsiteLink()

    Dim ie As InternetExplorer
    Dim html As HTMLDocument
    Dim Link As String
    Dim Data As Worksheet
    Dim startRow, endRow As Integer

    Set Data = Sheets("Sheet1")
    startRow = 34
    endRow = 3574

    Application.ScreenUpdating = False

    Set ie = New InternetExplorer
    ie.Visible = True

    With Data
        For i = startRow To endRow
            Link = "https://www.example.com/" & .Range("C" & i).Value
            Link.Click
            startRow = startRow + 1
        Next i
    End With

End Sub
Try this:

Option Explicit

' Note: on 64-bit Office this Declare needs the PtrSafe keyword,
' with LongPtr for the hWnd parameter and the return type.
Private Declare Function ShellExecute Lib "shell32.dll" Alias "ShellExecuteA" ( _
  ByVal hWnd As Long, _
  ByVal Operation As String, _
  ByVal Filename As String, _
  Optional ByVal Parameters As String, _
  Optional ByVal Directory As String, _
  Optional ByVal WindowStyle As Long = vbMinimizedFocus) As Long

Public Sub OpenUrl()
    Dim lngSuccess As Long
    lngSuccess = ShellExecute(0, "Open", "https://www.google.com")
End Sub

... adjust it accordingly to fit your code.
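For example, adapting it to the loop from the question might look roughly like this. This is only a sketch: it assumes the ShellExecute declaration above is in the same module, and that the IDs sit in column C of Sheet1, rows 34 to 3574, as in the question.

' Sketch only: requires the ShellExecute Declare shown above.
Public Sub OpenAllProjectUrls()
    Dim ws As Worksheet
    Dim i As Long
    Set ws = Sheets("Sheet1")
    For i = 34 To 3574
        ' Hand each URL to the default browser instead of "clicking" a String
        ShellExecute 0, "Open", "https://www.example.com/" & ws.Range("C" & i).Value
    Next i
End Sub

Note that this opens each URL in the default browser; with thousands of rows you would almost certainly want to throttle the loop or batch the work.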

A String has no Click method, so this won't even compile. (You are looking for IE's Navigate method.) As general advice: don't go down this road. IE has been deprecated, and even if you get the code technically working, navigating to >3000 URLs and expecting file downloads is bound to fail. Have a look at this link; it may give you a starting point:
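To illustrate that advice, downloading the files directly, without driving a browser at all, can be sketched with the Windows urlmon API. This is an assumption on my part: it only works if each URL serves the file as a plain HTTP GET with no login or JavaScript involved, and the target folder ("C:\Downloads\") and file extension (".dat") below are hypothetical placeholders.

Option Explicit

' 32-bit Declare shown; on 64-bit Office use Declare PtrSafe with LongPtr
' for the pCaller and lpfnCB parameters.
Private Declare Function URLDownloadToFile Lib "urlmon" _
    Alias "URLDownloadToFileA" ( _
    ByVal pCaller As Long, _
    ByVal szURL As String, _
    ByVal szFileName As String, _
    ByVal dwReserved As Long, _
    ByVal lpfnCB As Long) As Long

Public Sub DownloadAllFiles()
    Dim ws As Worksheet
    Dim i As Long
    Dim id As String
    Set ws = Sheets("Sheet1")
    For i = 34 To 3574
        id = ws.Range("C" & i).Value
        ' URLDownloadToFile returns 0 (S_OK) on success
        If URLDownloadToFile(0, "https://www.example.com/" & id, _
                             "C:\Downloads\" & id & ".dat", 0, 0) <> 0 Then
            Debug.Print "Failed: " & id
        End If
    Next i
End Sub

This skips the browser entirely, so there is no window churn, and failures can be logged and retried row by row.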