t0b4cc0 - 25 days ago
R Question

Error in open.connection Timeout was reached

I have this currently working. I'm new to R, so I don't know how connections work, and it's one thing Google didn't help with.

My code fetches market numbers from the CREST API:

library(XLConnect)  # readWorksheetFromFile
library(jsonlite)   # fromJSON

df <- readWorksheetFromFile("marketTypeIDS.xls", sheet = 1, startRow = 1, endCol = 2)
typeIDs <- unname(unlist(df[, 1]))

monthvolumes <- numeric(11897)
baseurl <- "https://public-crest.eveonline.com/market/10000048/types/"

for (i in 1:11897) {
  # fetch the history once per item instead of re-requesting it for every field
  history <- fromJSON(paste0(baseurl, typeIDs[i], "/history/"), flatten = TRUE)
  itemcount <- history$totalCount
  if (itemcount == 0) {
    monthvolumes[i] <- 0
    next
  }
  monthlyvolume <- 0
  # walk backwards from the newest record, summing volume (column 6)
  # for entries dated before calcday (defined earlier in the script)
  while (itemcount > 0) {
    currentdate <- as.Date(history$items[itemcount, 8])
    if (as.numeric(currentdate) < calcday) {
      monthlyvolume <- monthlyvolume + history$items[itemcount, 6]
    }
    itemcount <- itemcount - 1
  }
  monthvolumes[i] <- monthlyvolume
}

It stops after ~700 iterations (it should run over 11,000 times) and then gives me this error:

Error in open.connection(con, "rb") : Timeout was reached
6 open.connection(con, "rb")
5 open(con, "rb")
4 parse_con(txt, 1024^2, bigint_as_char)
3 parseJSON(txt, bigint_as_char)
2 fromJSON_string(txt = txt, simplifyVector = simplifyVector, simplifyDataFrame = simplifyDataFrame,
simplifyMatrix = simplifyMatrix, flatten = flatten, ...)
1 fromJSON(paste0(baseurl, typeIDs[i], "/history/"), flatten = TRUE)
In addition: Warning message:
closing unused connection 3 (https://public-crest.eveonline.com/market/10000048/types/18/history/)

This connection was created on the first run of the for loop (the URL contains the 18).
How can I close this connection beforehand so it doesn't break the loop? (The loop just ran for about an hour, so it's hard to test by trial and error.)

Thanks in advance for any help!
If you have any other suggestions, my ears are open!

Answer

I solved this so that it doesn't matter at what point it breaks: it always restarts at the last written record, if there is one.

So instead of looping over all 11,897 records from the start, I read the last written record out of the output file.

I wrapped all of it in a tryCatch, as @hrbrmstr suggested, and put everything in an endless loop that only breaks once the last record has been written.

Not a beautiful solution, but one that worked perfectly and was easy to implement: it only restarted a few times, ran through to the end, and I only needed it a few times.
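A minimal sketch of that restart logic, assuming the results are appended row by row to a hypothetical monthvolumes.csv (the file name, column names, and the simplified sum over all history rows are illustrative, not from the original script):

```r
library(jsonlite)

baseurl <- "https://public-crest.eveonline.com/market/10000048/types/"
outfile <- "monthvolumes.csv"  # hypothetical output file

repeat {
  # resume after the last record that made it to disk, if any
  start <- if (file.exists(outfile)) nrow(read.csv(outfile)) + 1 else 1
  if (start > length(typeIDs)) break  # last record written: done

  tryCatch({
    for (i in start:length(typeIDs)) {
      history <- fromJSON(paste0(baseurl, typeIDs[i], "/history/"),
                          flatten = TRUE)
      vol <- if (history$totalCount == 0) 0 else sum(history$items[, 6])
      # append immediately, so a timeout never loses finished work
      write.table(data.frame(typeID = typeIDs[i], volume = vol),
                  outfile, sep = ",", row.names = FALSE,
                  col.names = !file.exists(outfile), append = TRUE)
    }
  }, error = function(e) {
    # a timeout lands here; the repeat loop resumes from the file
    message("Stopped at record ", i, ": ", conditionMessage(e))
  })
}
```

Because each row is written as soon as it is computed, a timeout at iteration 700 costs at most one record: the next pass of the repeat loop picks up at 701.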