R Question

Scraping a table on an HTTPS site in R

I am trying to scrape the report log table from https://www.heritageunits.com/Locomotive/Detail/NS8098 using the code below (GET() is from the httr package). It pulls in elements from the page, but when I scroll through the ten items stored in "page", none of them includes the table.

library(httr)  # GET() comes from httr, not RCurl

# Read page (SSL verification relaxed)
page <- GET(
  url = "https://www.heritageunits.com/Locomotive/Detail/NS8098",
  config(ssl_verifypeer = FALSE, ssl_verifyhost = FALSE)
)


I would also like to scrape the tables for the previous days' reports (the pages you can toggle to below the table), but I am not sure how to select those earlier report pages in R. Any help would be appreciated. Thanks.
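One way to see why a plain HTTP fetch comes up empty: the report log is rendered client-side by JavaScript, so it is absent from the HTML the server actually sends. A quick diagnostic sketch (assuming the httr and XML packages; "history" is the table's element id used elsewhere on that page):

```r
library(httr)
library(XML)

# Download the raw HTML exactly as served -- no JavaScript runs here
resp <- GET("https://www.heritageunits.com/Locomotive/Detail/NS8098")
raw_html <- content(resp, as = "text")

# If the report rows were in the static page, readHTMLTable() would list a
# "history" table here; its absence means the data arrives via JavaScript
names(readHTMLTable(htmlParse(raw_html, encoding = "UTF-8")))
```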

Answer

Occasionally you can find a JSON file in the page source that you can hit directly, but I couldn't find one here. I went with RSelenium and had it click the Next button to cycle through the pages. This method is fragile, so pay attention when you run it: if the data table has not fully loaded when the HTML is grabbed, it will duplicate the last page, so I used a short Sys.sleep() to make sure each page had time to load. I would also recommend checking for duplicate rows at the end to catch this. Again, it is fragile, but it works.

library(RSelenium)
library(XML)
library(foreach)


# Start the Selenium server (note: checkForServer()/startServer() are defunct
# in newer RSelenium releases, which use rsDriver() instead)
checkForServer()
startServer()

remDr <- 
  remoteDriver(
    remoteServerAddr = "localhost" 
    , port = 4444
    , browserName = "chrome"
)

remDr$open()

# Navigate to page
remDr$navigate("https://www.heritageunits.com/Locomotive/Detail/NS8098")

# Snag the rendered HTML
outhtml <- remDr$findElement(using = 'xpath', "//*")
out <- outhtml$getElementAttribute("outerHTML")[[1]]

# Parse with the XML package
doc <- htmlParse(out, encoding = "UTF-8")

# Get the last page number so we can cycle through
PageNodes <- getNodeSet(doc, '//*[(@id = "history_paginate")]')
Pages <- sapply(X = PageNodes, FUN = xmlValue)
LastPage <- as.numeric(gsub('Previous12345\\…(.*)Next', '\\1', Pages))
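# For example, if the pager's text were "Previous12345…12Next" (hypothetical),
# the capture group pulls out the trailing page count:
#   gsub('Previous12345\\…(.*)Next', '\\1', "Previous12345…12Next")  # "12"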


# loop through one click at a time
Locomotive <- foreach(i = 1:(LastPage-1), .combine = 'rbind', .verbose = TRUE) %do% {

  if(i == 1){

    readHTMLTable(doc)$history

  } else {

    nextpage <- remDr$findElement("css selector", '#history_next')
    nextpage$sendKeysToElement(list(key ="enter"))

    # Take it slow so it gets each page
    Sys.sleep(.50)

    outhtml <- remDr$findElement(using = 'xpath', "//*")
    out <- outhtml$getElementAttribute("outerHTML")[[1]]

    # Re-parse the refreshed page with the XML package
    doc <- htmlParse(out, encoding = "UTF-8")
    readHTMLTable(doc)$history
  }


}
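As mentioned above, if a page had not finished loading before its HTML was grabbed, the previous page's rows get captured twice. A minimal dedupe check with base R (the data frame below is a made-up stand-in for the scraped result):

```r
# Toy stand-in for the scraped result: the second "page" was captured twice
reports <- data.frame(
  Date     = c("2016-05-01", "2016-05-02", "2016-05-02"),
  Location = c("Atlanta", "Macon", "Macon"),
  stringsAsFactors = FALSE
)

# duplicated() flags exact repeats of earlier rows; keep only first occurrences
deduped <- reports[!duplicated(reports), ]
nrow(deduped)  # 2
```

Applied to the scraped table, that is simply `Locomotive <- Locomotive[!duplicated(Locomotive), ]`.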