
F#: Why is the member dictionary in my F# code always empty?


I want to scrape a page for all of its URLs and put them into a dictionary. I created a class with a dictionary, but I can't seem to add any elements to it.

type crawler =

     new()= {}
     member this.urls  = new Dictionary<string,string>()
     member this.start (url : string)=
        let hw = new HtmlWeb()
        let doc = hw.Load(url)
        let docNode = doc.DocumentNode
        let links = docNode.SelectNodes(".//a")

        for aLink in links do
            let href = aLink.GetAttributeValue("href"," ")
            if href.StartsWith("http://")  && href.EndsWith(".html") then
              this.urls.Add(href, href)

Why does the urls dictionary stay empty?

Because urls here is a property whose getter creates and returns a brand-new dictionary on every access, each call to this.urls.Add(...) adds to a temporary dictionary that is immediately thrown away. Define the dictionary once, as a field, instead:

open System.Collections.Generic
open HtmlAgilityPack

type Crawler() =
    // Created once when the object is constructed; every caller sees the same instance.
    let urls = new Dictionary<string,string>()
    member this.Urls = urls
    member this.Start (url : string) =
        let hw = new HtmlWeb()
        let doc = hw.Load(url)
        let docNode = doc.DocumentNode
        let links = docNode.SelectNodes(".//a")
        for aLink in links do
            let href = aLink.GetAttributeValue("href", " ")
            if href.StartsWith("http://") && href.EndsWith(".html") then
                urls.Add(href, href)
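
To see the difference in isolation, here is a minimal sketch (the type names are made up for illustration) contrasting a property getter that builds a fresh dictionary on every access with a field that is created once per instance:

open System.Collections.Generic

type PropertyEachTime() =
    // The getter body runs on EVERY access, so each read yields a new, empty dictionary.
    member this.Urls = new Dictionary<string,string>()

type FieldOnce() =
    // Evaluated once, when the object is constructed; every access sees the same instance.
    let urls = new Dictionary<string,string>()
    member this.Urls = urls

let a = PropertyEachTime()
a.Urls.Add("k", "v")          // adds to a throwaway dictionary
printfn "%d" a.Urls.Count     // prints 0

let b = FieldOnce()
b.Urls.Add("k", "v")
printfn "%d" b.Urls.Count     // prints 1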

This isn't what you asked, but in case you're interested in a more functional approach, here's one way to do it:

open HtmlAgilityPack

type Crawler = 
  { Urls : Set<string> }

[<CompilationRepresentation(CompilationRepresentationFlags.ModuleSuffix)>]
module Crawler =

  [<CompiledName("Start")>]
  let start crawler (url:string) = 
    let { Urls = oldUrls } = crawler
    let newUrls =
      HtmlWeb().Load(url).DocumentNode.SelectNodes(".//a")
      |> Seq.cast<HtmlNode>
      |> Seq.choose (fun link ->
        match link.GetAttributeValue("href"," ") with
        | href when href.StartsWith("http://") && href.EndsWith(".html") -> Some href
        | _ -> None)
      |> Set.ofSeq
      |> Set.union oldUrls
    { crawler with Urls = newUrls }

Your data and behavior are now separated. Crawler is an immutable record type, and start takes a crawler and returns a new crawler with an updated set of URLs. I replaced the Dictionary with a Set, since the keys and values were identical, removed the unused let binding, and snuck in a bit of pattern matching. This should also present a relatively friendly interface to C#.
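
As a hedged usage sketch (the seed URLs below are made up, and Crawler.start issues real HTTP requests through HtmlAgilityPack when it runs), the immutable design composes naturally with a fold over a list of pages:

// Hypothetical seed URLs, purely for illustration.
let seeds = [ "http://example.com/index.html"; "http://example.com/news.html" ]

// Thread the immutable crawler state through each page.
let finished =
    seeds |> List.fold Crawler.start { Urls = Set.empty }

finished.Urls |> Set.iter (printfn "%s")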

Which is the best option? His code is the solution: it uses a field, effectively a class-level value, instead of calling the property getter on every access. It would also work without the Urls property at all; note that he never actually uses it. Wow, I didn't know you could extract values from a record type with pattern matching like that!
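
For reference, the record pattern used in let { Urls = oldUrls } = crawler works in any let binding; a tiny illustrative sketch (the Point type is made up):

type Point = { X : int; Y : int }

let p = { X = 3; Y = 4 }

// A record pattern in a let binding extracts just the fields you name.
let { X = px } = p
printfn "%d" px          // prints 3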