I'm running R on Fedora 31 on a Dell XPS laptop with 8 GB of RAM. I'm attempting to plot this GeoTIFF using ggplot2, so that I can overlay other data using code I've already written for ggplot2. I've been roughly following this lesson on working with raster data in R. After converting the TIFF into a RasterLayer and then into a data frame, the R process is killed while rendering the data frame with ggplot2: it simply prints "Killed" and exits.
Here is a minimal code sample that produces this error:
library(tidyverse)
library(raster)
library(rgdal)

# Read the GeoTIFF as a RasterLayer, then convert it to a data frame
# with one row per cell (x, y, value)
afg_pop <- raster("afg_ppp_2020.tif")
pop_df <- as.data.frame(afg_pop, xy = TRUE)

ggplot() +
  # This is the line that results in the error: "Killed"
  geom_raster(data = pop_df, aes(x = x, y = y, fill = afg_ppp_2020))
Running dmesg reveals that the kernel's OOM killer terminated R after it ran out of memory:
[20563.603882] Out of memory: Killed process 42316 (R) total-vm:11845908kB, anon-rss:6878420kB, file-rss:4kB, shmem-rss:0kB, UID:1000 pgtables:19984kB oom_score_adj:0
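To sanity-check the memory usage, I tried estimating how large the data frame should be. My (possibly wrong) understanding is that as.data.frame(afg_pop, xy = TRUE) materialises one row per raster cell, with three double-precision columns (x, y, and the population value):

n_cells <- ncell(afg_pop)
# Three numeric (double) columns at 8 bytes each, one row per cell,
# ignoring R's per-object overhead and any temporary copies
est_gb <- n_cells * 3 * 8 / 1024^3
est_gb

I assume ggplot2 makes at least one more copy of the data frame while building the plot, which might account for some of the nearly 12 GB of virtual memory dmesg reports, but I don't know the details.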
Even with a data file this large, it's hard for me to believe that R is running out of the memory needed to handle it. Why does R need so much memory for this task, and, more importantly, what other method can I use to plot this data, preferably still with ggplot2?
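The only workaround I've come up with so far is downsampling the raster before converting it, along these lines (an untested sketch; I'm assuming aggregate() keeps the layer name afg_ppp_2020 so the fill aesthetic still resolves):

# Merge each 10x10 block of cells into one, shrinking the data frame
# by a factor of ~100; fun = mean averages the values in each block
afg_small <- aggregate(afg_pop, fact = 10, fun = mean)
pop_small_df <- as.data.frame(afg_small, xy = TRUE)

ggplot() +
  geom_raster(data = pop_small_df, aes(x = x, y = y, fill = afg_ppp_2020))

But I'd rather not throw away resolution if there's a proper way to plot the full dataset.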
I'm relatively new to R, so please forgive me if I'm missing something obvious here. Any help would be appreciated!