hunter-stanke / rFIA

Home Page: https://rfia.netlify.com/

Issue with getFIA - URL connects to 404 page

sofikru opened this issue · comments

fia_states <- c('ID', 'MT', 'WY')
fia.dwnld <- getFIA(states = fia_states,
                    tables = c('PLOT', 'COND', 'SOILS_LAB', 'SOILS_SAMPLE_LOC'))

I am a very green beginner with rFIA and cannot find any other online resources to help with this issue. When I run the code above, I get the following error:

Error in download.file(urls[n], temp, timeout = 3600) :
cannot open URL 'https://apps.fs.usda.gov/fia/datamart/CSV/REF_PLOT.zip'

In addition: Warning message:
In download.file(urls[n], temp, timeout = 3600) :
cannot open URL 'https://apps.fs.usda.gov/fia/datamart/CSV/REF_PLOT.zip': HTTP status was '404 Not Found'

Send an email here letting them know which files are inaccessible: SM.FS.fia@usda.gov
A workaround is to download the SQLite databases and then extract the tables manually from there.
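
For anyone who wants the short version of that workaround, here is a minimal sketch using DBI and RSQLite (the database file name is a placeholder for whichever state you downloaded; a fuller script from FIA appears further down in this thread):

library(DBI)

# Connect to a state SQLite database downloaded and unzipped from the datamart
con <- dbConnect(RSQLite::SQLite(), "FIADB_ID.db")

# Pull a single table out as a data frame
plot_tab <- dbReadTable(con, "PLOT")

dbDisconnect(con)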

That's helpful, thank you!!

All CSV files have been unavailable from the FIADB data mart for about the last month -- my understanding is that it's not localized to specific tables. Good luck!

@daniel-perret & @sofikru that was also my experience when I ran into this about a month ago: all the CSVs were inaccessible. Karin Kralicek, with the FIA program, sent me the script below for saving the individual tables from the SQLite database as CSVs.

I'm seeing and hearing more support and focus on delivering FIA via SQLite for the broader user audience, so @hunter-stanke would it make sense to implement SQLite as an alternative to, or a replacement for, the CSVs in getFIA?
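
As a rough sketch of what that could look like, here is a hypothetical helper (not an rFIA function) that downloads a state's SQLite database and reads tables from it. The zip URL pattern is an assumption inferred from the datamart page linked in the script below, so verify it before relying on it:

library(DBI)

# Hypothetical helper, NOT part of rFIA: download a state's SQLite database
# from the datamart and read tables from it.
getFIA_sqlite <- function(state, tables = NULL, dir = tempdir()) {
  # ASSUMED URL pattern -- check it against
  # https://apps.fs.usda.gov/fia/datamart/datamart_sqlite.html
  url <- paste0("https://apps.fs.usda.gov/fia/datamart/Databases/SQLite_FIADB_",
                state, ".zip")
  zipfile <- file.path(dir, paste0("SQLite_FIADB_", state, ".zip"))

  # These files are large, so raise the download timeout temporarily
  op <- options(timeout = 3600)
  on.exit(options(op), add = TRUE)
  download.file(url, zipfile, mode = "wb")
  unzip(zipfile, exdir = dir)

  # Open the unzipped .db file and read the requested tables
  db <- list.files(dir, pattern = "\\.db$", full.names = TRUE)[1]
  con <- dbConnect(RSQLite::SQLite(), db)
  on.exit(dbDisconnect(con), add = TRUE)
  if (is.null(tables)) tables <- dbListTables(con)
  setNames(lapply(tables, function(x) dbReadTable(con, x)), tables)
}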

###############################################################################
## Title: How to save tables from a SQLite database as individual CSVs
## Author: Karin Kralicek (karin.kralicek@usda.gov)
## Date: 04/11/2022
##
## About: 
## - Workaround for the issue with Datamart CSV downloads, which are giving a
##   404 error at present.
###############################################################################
library(DBI)  # RSQLite must also be installed; it is used via RSQLite::SQLite() below

# Before running the code below download SQLite db for a particular state from
# the Datamart & unzip:
# - https://apps.fs.usda.gov/fia/datamart/datamart_sqlite.html
# - This script uses Colorado as an example.

# Set path to where the SQLite is located
path_db <- "C:/Users/karinkralicek/Downloads/SQLite_FIADB_CO/"

# Set path to where you want the CSVs to appear
path_out <- "C:/Users/karinkralicek/Downloads/"

# Open a connection to the db (again, using CO as the example)
con <- dbConnect(RSQLite::SQLite(), 
                 paste0(path_db, "FIADB_CO.db"))

# grab all the table names
# - alternatively, subset this to only those tables you want
#   e.g. `db_table_names <- c("SURVEY", "PLOT")`
db_table_names <- dbListTables(con)

# iterate through the table names and write out a csv for each table in the db
lapply(db_table_names, function(x) {
  write.csv(dbReadTable(x, conn = con),
            file = paste0(path_out, "CO_", x, ".csv"))
})

# close connection
dbDisconnect(con)
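
If the end goal is to get these tables back into rFIA, the exported files follow the datamart's STATE_TABLE.csv naming used above, so rFIA's readFIA() should be able to load them from the output directory. A minimal sketch, assuming the CSVs were written to path_out as intended:

library(rFIA)

# Read the exported CSVs back in as an FIA.Database object
co <- readFIA(dir = path_out, tables = c('PLOT', 'COND'))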

^^This worked, thank you!! One thing, though: for the last step, there's a typo:
lapply(db_table_names, function(x) {
  write.csv(dbReadTable(conn = con, x),
            file = paste0(path_out, "PNW_", x, ".csv"))
})

Swap x and conn (put the connection argument first) and the script runs with no errors!
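
For context on why the order matters: DBI defines the generic as dbReadTable(conn, name, ...), so the connection should come first positionally, or both arguments can be named explicitly:

# Equivalent, unambiguous forms of the corrected call
dbReadTable(con, x)                # positional: connection first, table name second
dbReadTable(conn = con, name = x)  # fully named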

Thanks to everybody for the quick and helpful responses here! I'm in a new position now, and as such my response time has taken a hit. For the immediate future (i.e., the next week or two), the above solution from Karin Kralicek at FIA is going to be our best bet, assuming the data mart isn't back online in that time frame.

However, I really like @whalend's idea about accommodating the SQLite state-subsets. This has been on my mind for a while, and would solve a number of issues. I'll work on it this weekend and follow up.