Handling rjson errors when parsing the Twitter API in R

Tags: json, r, twitter

When using the code below to read a text file of raw JSON responses from the Twitter streaming API, I get the following error:

Error in FUN(c("{\"geo\":null,\"retweet_count\":0,\"favorited\":false,\"text\":\"\\u663c\\u98ef\\u2026\\u9762\\u5012\\u304f\\u305b\\u3047\\u3001\\uff11\\uff10\\u79d2\\u30c1\\u30e3\\u30fc\\u30b8\\u3067\\u826f\\u3044\\u306a\",\"in_reply_to_status_id_str\":null,\"in_reply_to_screen_name\":null,\"in_reply_to_user_id_str\":null,\"retweeted\":false,\"in_reply_to_status_id\":null,\"source\":\"\\u003Ca href=\\\"http:\\/\\/twittbot.net\\/\\\" rel=\\\"nofollow\\\"\\u003Etwittbot.net\\u003C\\/a\\u003E\",\"id_str\":\"158028263334219776\",\"entities\":{\"hashtags\":[],\"urls\":[],\"user_mentions\":[]},\"contributors\":null,\"place\":null,\"truncated\":false,\"created_at\":\"Sat Jan 14 03:30:48 +0000 2012\",\"coordinates\":null,\"user\":{\"is_translator\":false,\"time_zone\":\"Tokyo\",\"profile_background_color\":\"000000\",\"followers_count\":559,\"profile_image_url\":\"http:\\/\\/a3.twimg.com\\/profile_images\\/1499599901\\/_____normal.jpg\",\"default_profile\":false,\"contributors_enabled\":false,\"profile_background_tile\":false,\"profile_image_url_https\":\"https:\\/\\/si0.twimg.com\\/profile_images\\/1499599901\\/_____normal.jpg\",\"profile_sidebar_fill_color\":\"9fcca8\",\"location\":\"\\u771f\\u30fb\\u5e1d\\u56fd\\u5b66\\u5712\",\"description\":\"\\u30a4\\u30ca\\u30ba\\u30de\\u30a4\\u30ec\\u30d6\\u30f3\\u306b\\u767b\\u5834\\u3059\\u308b\\u4e0d\\u52d5\\u660e\\u738b\\u306e\\u975e\\u516c\\u5f0fbot\\u3067\\u3059\\u3002\\u516c\\u5f0f\\u3068\\u306f\\u95a2\\u4fc2\\u3042\\u308a\\u307e\\u305b\\u3093\\uff01\\u30b2\\u30fc\\u30e0\\u3068\\u30a2\\u30cb\\u30e1\\uff12\\u671f\\u306e\\u53f0\\u8a5e\\u3092\\u545f\\u304d\\u307e\\u3059\\u3001\\u634f\\u9020\\u3082\\u6709\\u308a\\uff01\\uff12\\u671f\\u8a2d\\u5b9a\\u306a\\u306e\\u3067\\u512a\\u3057\\u3055\\u304c\\u6b20\\u7247\\u3082\\u3042\\u308a\\u307e\\u305b\\u3093\\u3002\\u3053\\u308c\\u3089\\u304c\\u8a31\\u305b\\u308b\\u65b9\\u306e\\u307f\\u30d5\\u30a9\\u30ed\\u30fc\\u304a\\u9858\\u3044\\u3057\\u307e\\u3059\\u3002\",\"screen_name\":\"2nd_akio
_bot\",\"verified\":false,\"profile_sidebar_border_color\":\"181A1E\",\"id_str\":\"267521818\",\"default_profile_image\":false,\"lang\":\"ja\",\"statuses_count\":23384,\"notifications\":null,\"profile_use_background_image\":false,\"favourites_count\":1,\"geo_enabled\":false,\"created_at\":\"Thu Mar 17 02:43:11 +0000 2011\",\"profile_text_color\":\"2e1208\",\"protected\":false,\"following\":null,\"profile_background_image_url\":\"http:\\/\\/a1.twimg.com\\/images\\/themes\\/theme9\\/bg.gif\",\"name\":\"\\u4e0d\\u52d5\\u660e\\u738b\",\"show_all_inline_media\":false,\"follow_request_sent\":null,\"profile_background_image_url_https\":\"https:\\/\\/si0.twimg.com\\/images\\/themes\\/theme9\\/bg.gif\",\"friends_count\":516,\"profile_link_color\":\"004217\",\"id\":267521818,\"listed_count\":59,\"utc_offset\":32400,\"url\":\"http:\\/\\/m-pe.tv\\/u\\/page.php?uid=botsetsumei&id=2\"},\"id\":158028263334219776,\"in_reply_to_user_id\":null}",  : 
  no data to parse
Here is the relevant snippet from inside my for loop:

results.list <- lapply(readLines(f, n=-1, warn=FALSE), fromJSON)

Here is the hacky solution I was able to piece together. Given the huge volume of data (125K lines per file × 25+ files), I'm not sure it's the best approach, but at least R no longer stops:

convertTwitter <- function(x) {
  ## adapted from http://biocodenv.com/wordpress/?p=15
  ## ?Control
  z <- try(fromJSON(x), silent = TRUE)
  ## on success return the parsed object; on a parse failure,
  ## fall through and return NULL invisibly so the loop keeps going
  if (!inherits(z, "try-error")) {
    return(z)
  }
}
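For completeness, a minimal sketch of how the helper might be used; the sample `lines` vector and the `Filter` step are my own illustration, not from the original post. Lines that fail to parse (for example, a tweet truncated when the stream was cut off) yield `NULL`, which can be dropped afterwards:

```r
library(rjson)

convertTwitter <- function(x) {
  z <- try(fromJSON(x), silent = TRUE)
  if (!inherits(z, "try-error")) {
    return(z)  # parsed list on success; NULL (invisibly) on failure
  }
}

## Hypothetical input: one valid JSON line and one line cut off
## mid-object, mimicking a truncated streaming response.
lines <- c('{"id":1,"text":"ok"}', '{"id":2,"text":"trunc')

results.list <- lapply(lines, convertTwitter)
tweets <- Filter(Negate(is.null), results.list)  # drop failed parses
```

With real files, `lines` would come from `readLines(f, n = -1, warn = FALSE)` as in the question.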
