I am a new R user and I am interested in downloading a particular set of data using rgbif. I did:
occ_search(scientificName = "Lupinus", hasCoordinate = TRUE, continent = c('africa', 'europe', 'asia'),
basisOfRecord = "PRESERVED_SPECIMEN", decimalLatitude = '-1.50750278, 47.62390556',
decimalLongitude = '-18.75250000, 52.85138889',
fields = c('scientificName', 'decimalLatitude', 'decimalLongitude', 'country'), return = 'data')
This works fine for the search itself. However, I would like to save the result as an object so I can create a .csv file from it. If I go:
OS <- occ_search(scientificName = "Lupinus", hasCoordinate = TRUE, continent = c('africa', 'europe', 'asia'),
basisOfRecord = "PRESERVED_SPECIMEN", decimalLatitude = '-1.50750278, 47.62390556',
decimalLongitude = '-18.75250000, 52.85138889',
fields = c('scientificName', 'decimalLatitude', 'decimalLongitude', 'country'), return = 'data')
Then
OS1 <- as.data.frame(OS)
I get the following error:
Error in as.data.frame.default(occ) : cannot coerce class ""gbif"" to a data.frame
I also tried using occ_download this way:
OD <- occ_download("scientificName = Lupinus",
"hasCoordinate = TRUE",
"continent = africa,europe,asia",
"basisOfRecord = PRESERVED_SPECIMEN",
"decimalLatitude >= -1.50750278",
"decimalLatitude <= 47.62390556",
"decimalLongitude >= -18.75250000",
"decimalLongitude <= 52.85138889")
And all I get is a file with 0 observations and 235 columns.
Any help will be highly appreciated!
The reason your occ_search output is not compatible with as.data.frame lies in the continent = c('africa', 'europe', 'asia') argument. Because you queried three continents, the result is a list: if you look at length(OS) and str(OS), you will see that it actually contains three data frames, one per continent. You can access the individual elements like this:
OS$europe
# or
OS[['europe']]  # note the double brackets: OS['europe'] would return a one-element list, not a data.frame
# or
OS[[1]]
You can combine these into a single data frame like this:
OS1 <- rbind(OS$africa, OS$europe, OS$asia)
Or, if you won't know all the element names in advance, like this:
OS1 <- do.call(rbind, OS)
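From there, writing the .csv you were after is one more step with write.csv. A minimal, self-contained sketch (using toy stand-in data frames in place of the live occ_search result, and a hypothetical output filename):

```r
# Toy stand-ins for the per-continent data frames occ_search returns;
# your real OS list will have the same shape, with the columns you
# requested via the fields argument.
OS <- list(
  africa = data.frame(scientificName = "Lupinus albus",
                      decimalLatitude = 10.1, decimalLongitude = 20.2,
                      country = "Kenya"),
  europe = data.frame(scientificName = "Lupinus luteus",
                      decimalLatitude = 45.0, decimalLongitude = 5.0,
                      country = "France")
)

# Stack the per-continent data frames into one, then write it out
OS1 <- do.call(rbind, OS)
write.csv(OS1, file = "lupinus_occurrences.csv", row.names = FALSE)
```

row.names = FALSE keeps the africa/europe row labels that rbind creates out of the file, so the .csv contains only your requested columns.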