diff --git a/2023-08-delta/search.json b/2023-08-delta/search.json index 8362b31f..4e7f9ff0 100644 --- a/2023-08-delta/search.json +++ b/2023-08-delta/search.json @@ -298,7 +298,7 @@ "href": "session_10.html#about-the-data", "title": "10  Using sf for Spatia Data & Intro to Making Maps", "section": "10.2 About the data", - "text": "10.2 About the data\nAll of the data used in this tutorial are simplified versions of real datasets available on the KNB Data Repository. We are using simplified datasets to ease the processing burden on all our computers since the original geospatial datasets are high-resolution. These simplified versions of the datasets may contain topological errors.\nThe spatial data we will be using to create the map are:\n\n\n\nData\nOriginal datasets\n\n\n\n\nAlaska regional boundaries\nJared Kibele and Jeanette Clark. 2018. State of Alaska’s Salmon and People Regional Boundaries. Knowledge Network for Biocomplexity. doi:10.5063/F1125QWP.\n\n\nCommunity locations and population\nJeanette Clark, Sharis Ochs, Derek Strong, and National Historic Geographic Information System. 2018. Languages used in Alaskan households, 1990-2015. Knowledge Network for Biocomplexity. doi:10.5063/F11G0JHX.\n\n\nAlaska rivers\nThe rivers shapefile is a simplified version of Jared Kibele and Jeanette Clark. Rivers of Alaska grouped by SASAP region, 2018. Knowledge Network for Biocomplexity. doi:10.5063/F1SJ1HVW.\n\n\n\n\n\n\n\n\n\nSetup\n\n\n\n\nNavigate to this dataset on KNB’s test site and download the zip folder.\nUpload the zip folder to the data folder in the training_{USERNAME} project. You don’t need to unzip the folder ahead of time, uploading will automatically unzip the folder.\nCreate a new R Markdown file.\n\nTitle it “Intro to sf package for Spatial Data and Making Maps”\nSave the file and name it “intro-sf-spatial-data-maps”.\n\nLoad the following libraries at the top of your R Markdown file.\n\n\nlibrary(readr)\nlibrary(sf)\nlibrary(ggplot2)\nlibrary(leaflet)\nlibrary(scales)\nlibrary(ggmap)\nlibrary(dplyr)" + "text": "10.2 About the data\nAll of the data used in this tutorial are simplified versions of real datasets available on the KNB Data Repository. We are using simplified datasets to ease the processing burden on all our computers since the original geospatial datasets are high-resolution. These simplified versions of the datasets may contain topological errors.\nThe spatial data we will be using to create the map are:\n\n\n\nData\nOriginal datasets\n\n\n\n\nAlaska regional boundaries\nJared Kibele and Jeanette Clark. 2018. State of Alaska’s Salmon and People Regional Boundaries. Knowledge Network for Biocomplexity. doi:10.5063/F1125QWP.\n\n\nCommunity locations and population\nJeanette Clark, Sharis Ochs, Derek Strong, and National Historic Geographic Information System. 2018. Languages used in Alaskan households, 1990-2015. Knowledge Network for Biocomplexity. doi:10.5063/F11G0JHX.\n\n\nAlaska rivers\nThe rivers shapefile is a simplified version of Jared Kibele and Jeanette Clark. Rivers of Alaska grouped by SASAP region, 2018. Knowledge Network for Biocomplexity. doi:10.5063/F1SJ1HVW.\n\n\n\n\n\n\n\n\n\nSetup\n\n\n\n\nNavigate to this dataset on KNB’s test site and download the zip folder.\nUpload the zip folder to the data folder in the training_{USERNAME} project. 
You don’t need to unzip the folder ahead of time, uploading will automatically unzip the folder.\n\nAlternatively, programatically download and extract the demo data with:\n\n\n\nknb_url <- 'https://dev.nceas.ucsb.edu/knb/d1/mn/v2/object/urn%3Auuid%3Aaceaecb2-1ce0-4d41-a839-d3607d32bb58'\ndownload.file(url = knb_url, destfile = 'demo_data.zip')\nunzip('demo_data.zip', exdir = 'data')\nfile.remove('demo_data.zip')\n\n\nCreate a new R Markdown file.\n\nTitle it “Intro to sf package for Spatial Data and Making Maps”\nSave the file and name it “intro-sf-spatial-data-maps”.\n\nLoad the following libraries at the top of your R Markdown file.\n\n\nlibrary(readr)\nlibrary(sf)\nlibrary(ggplot2)\nlibrary(leaflet)\nlibrary(scales)\nlibrary(ggmap)\nlibrary(dplyr)" }, { "objectID": "session_10.html#exploring-the-data-using-plot-and-st_crs", diff --git a/2023-08-delta/session_03.html b/2023-08-delta/session_03.html index 66966317..fb773f92 100644 --- a/2023-08-delta/session_03.html +++ b/2023-08-delta/session_03.html @@ -917,8 +917,8 @@

datatable(locations)
-
- +
+
@@ -935,8 +935,8 @@

popup = ~ restore_loc )
-
- +
+


@@ -961,8 +961,8 @@

color = "white", opacity = 1)
-
- +
+


@@ -991,8 +991,8 @@

color = "white", opacity = 1)
-
- +
+


diff --git a/2023-08-delta/session_10.html b/2023-08-delta/session_10.html index 988d5455..70326d7e 100644 --- a/2023-08-delta/session_10.html +++ b/2023-08-delta/session_10.html @@ -369,7 +369,18 @@

  1. Navigate to this dataset on KNB’s test site and download the zip folder.
  2. -
  3. Upload the zip folder to the data folder in the training_{USERNAME} project. You don’t need to unzip the folder ahead of time; uploading will automatically unzip the folder.
  4. +
  5. Upload the zip folder to the data folder in the training_{USERNAME} project. You don’t need to unzip the folder ahead of time; uploading will automatically unzip the folder. +
      +
    1. Alternatively, programmatically download and extract the demo data with:
    2. +
  6. +
+
+
knb_url <- 'https://dev.nceas.ucsb.edu/knb/d1/mn/v2/object/urn%3Auuid%3Aaceaecb2-1ce0-4d41-a839-d3607d32bb58'
+download.file(url = knb_url, destfile = 'demo_data.zip')
+unzip('demo_data.zip', exdir = 'data')
+file.remove('demo_data.zip')
+
+
  1. Create a new R Markdown file.
    1. Title it “Intro to sf package for Spatial Data and Making Maps”
    2. @@ -378,13 +389,13 @@

      Load the following libraries at the top of your R Markdown file.

    -
    library(readr)
    -library(sf)
    -library(ggplot2)
    -library(leaflet)
    -library(scales)
    -library(ggmap)
    -library(dplyr)
    +
    library(readr)
    +library(sf)
    +library(ggplot2)
    +library(leaflet)
    +library(scales)
    +library(ggmap)
    +library(dplyr)
    @@ -393,19 +404,19 @@

    10.3 Exploring the data using plot() and st_crs()

    First, let’s read in the shapefile of regional boundaries in Alaska using read_sf() and then create a basic plot of the data with plot().

    -
    # read in shapefile using read_sf()
    -ak_regions <- read_sf("data/ak_regions_simp.shp")
    +
    # read in shapefile using read_sf()
    +ak_regions <- read_sf("data/ak_regions_simp.shp")
    -
    # quick plot
    -plot(ak_regions)
    +
    # quick plot
    +plot(ak_regions)
    -

    +

    We can also examine it’s class using class().

    -
    class(ak_regions)
    +
    class(ak_regions)
    [1] "sf"         "tbl_df"     "tbl"        "data.frame"
    @@ -414,7 +425,7 @@

    -
    head(ak_regions)
    +
    head(ak_regions)
    Simple feature collection with 6 features and 3 fields
     Geometry type: MULTIPOLYGON
    @@ -431,7 +442,7 @@ 

    glimpse(ak_regions)

    +
    glimpse(ak_regions)
    Rows: 13
     Columns: 4
    @@ -453,7 +464,7 @@ 

    blog post that explains these concepts in more detail with very helpful diagrams and examples.

    You can view what crs is set by using the function st_crs().

    -
    st_crs(ak_regions)
    +
    st_crs(ak_regions)
    Coordinate Reference System:
       User input: WGS 84 
    @@ -483,10 +494,10 @@ 

    3338.

    -
    ak_regions_3338 <- ak_regions %>%
    -    st_transform(crs = 3338)
    -
    -st_crs(ak_regions_3338)
    +
    ak_regions_3338 <- ak_regions %>%
    +    st_transform(crs = 3338)
    +
    +st_crs(ak_regions_3338)
    Coordinate Reference System:
       User input: EPSG:3338 
    @@ -535,9 +546,9 @@ 

    -
    plot(ak_regions_3338)
    +
    plot(ak_regions_3338)
    -

    +

    Much better!

    @@ -550,15 +561,15 @@

    10.4.1 select()

    -
    # returns the names of all the columns in the dataset
    -colnames(ak_regions_3338)
    +
    # returns the names of all the columns in the dataset
    +colnames(ak_regions_3338)
    [1] "region_id" "region"    "mgmt_area" "geometry" 
    -
    ak_regions_3338 %>%
    -    select(region)
    +
    ak_regions_3338 %>%
    +    select(region)
    Simple feature collection with 13 features and 1 field
     Geometry type: MULTIPOLYGON
    @@ -588,7 +599,7 @@ 

    10.4.2 filter()

    -
    unique(ak_regions_3338$region)
    +
    unique(ak_regions_3338$region)
     [1] "Aleutian Islands"     "Arctic"               "Bristol Bay"         
      [4] "Chignik"              "Copper River"         "Kodiak"              
    @@ -598,8 +609,8 @@ 

    -
    ak_regions_3338 %>%
    -    filter(region == "Southeast")
    +
    ak_regions_3338 %>%
    +    filter(region == "Southeast")
    Simple feature collection with 1 feature and 3 fields
     Geometry type: MULTIPOLYGON
    @@ -638,19 +649,19 @@ 

    1. Read in alaska_population.csv using read_csv()

    -
    # read in population data
    -pop <- read_csv("data/alaska_population.csv")
    +
    # read in population data
    +pop <- read_csv("data/alaska_population.csv")

    Turn pop into a spatial object

    The st_join() function is a spatial left join. The arguments for both the left and right tables are objects of class sf, which means we will first need to turn our population data.frame with latitude and longitude coordinates into an sf object.

    We can do this easily using the st_as_sf() function, which takes as arguments the coordinates and the crs. The remove = F specification here ensures that when we create our geometry column, we retain our original lat and lng columns, which we will need later for plotting. Although it isn’t stated explicitly anywhere in the file, let’s assume that the coordinate system used to reference the latitude and longitude coordinates is WGS84, which has a crs number of 4326.

    -
    pop_4326 <- st_as_sf(pop,
    -                     coords = c('lng', 'lat'),
    -                     crs = 4326,
    -                     remove = F)
    -
    -head(pop_4326)
    +
    pop_4326 <- st_as_sf(pop,
    +                     coords = c('lng', 'lat'),
    +                     crs = 4326,
    +                     remove = F)
    +
    +head(pop_4326)
    Simple feature collection with 6 features and 5 fields
     Geometry type: POINT
    @@ -672,18 +683,18 @@ 

    Now we can do our spatial join! You can specify what geometry function the join uses (st_intersects, st_within, st_crosses, st_is_within_distance…) in the join argument. The geometry function you use will depend on what kind of operation you want to do, and the geometries of your shapefiles.

    In this case, we want to find what region each city falls within, so we will use st_within.

    -
    pop_joined <- st_join(pop_4326, ak_regions_3338, join = st_within)
    +
    pop_joined <- st_join(pop_4326, ak_regions_3338, join = st_within)

    This gives an error!

    -
    Error: st_crs(x) == st_crs(y) is not TRUE
    +
    Error: st_crs(x) == st_crs(y) is not TRUE

    It turns out this won’t work right now because our coordinate reference systems are not the same. Luckily, this is easily resolved by using st_transform() to project our population object into Alaska Albers.

    -
    pop_3338 <- st_transform(pop_4326, crs = 3338)
    +
    pop_3338 <- st_transform(pop_4326, crs = 3338)
    -
    pop_joined <- st_join(pop_3338, ak_regions_3338, join = st_within)
    -
    -head(pop_joined)
    +
    pop_joined <- st_join(pop_3338, ak_regions_3338, join = st_within)
    +
    +head(pop_joined)
    Simple feature collection with 6 features and 8 fields
     Geometry type: POINT
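    As an aside (a sketch, not part of the original lesson, assuming the pop_3338 and ak_regions_3338 objects created above): st_join() is a left join by default (left = TRUE), and any extra arguments are passed on to the join predicate. For example, st_is_within_distance takes a dist argument, so you could match each community to every region within an arbitrary 10 km threshold:

    library(sf)

    # dist is in meters because EPSG:3338 uses meter units;
    # a community near a boundary can match more than one region here
    pop_near_regions <- st_join(pop_3338,
                                ak_regions_3338,
                                join = st_is_within_distance,
                                dist = 10000)

    # left = FALSE would instead keep only the communities that matched a region
    head(pop_near_regions)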
    @@ -718,12 +729,12 @@ 

    3. Calculate the total population by region using group_by() and summarize()

    Next we compute the total population for each region. In this case, we want to do a group_by() and summarise() as if this were a regular data.frame. Otherwise all of our point geometries would be included in the aggregation, which is not what we want. Our goal is just to get the total population by region. We remove the sticky geometry using as.data.frame(), on the advice of the sf::tidyverse help page.

    -
    pop_region <- pop_joined %>%
    -    as.data.frame() %>%
    -    group_by(region) %>%
    -    summarise(total_pop = sum(population))
    -
    -head(pop_region)
    +
    pop_region <- pop_joined %>%
    +    as.data.frame() %>%
    +    group_by(region) %>%
    +    summarise(total_pop = sum(population))
    +
    +head(pop_region)
    # A tibble: 6 × 2
       region           total_pop
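    As an alternative sketch (not from the original lesson, assuming pop_joined exists as above), sf also provides st_drop_geometry(), which removes the sticky geometry column directly and gives the same totals; pop_region_alt is just an illustrative name:

    library(sf)
    library(dplyr)

    pop_region_alt <- pop_joined %>%
        st_drop_geometry() %>%   # drop the sticky geometry column
        group_by(region) %>%
        summarise(total_pop = sum(population))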
    @@ -738,12 +749,12 @@ 

    And use a regular left_join() to get the information back to the Alaska region shapefile. Note that we need this step in order to regain our region geometries so that we can make some maps.

    -
    pop_region_3338 <- left_join(ak_regions_3338, pop_region, by = "region")
    -
    -# plot to check
    -plot(pop_region_3338["total_pop"])
    +
    pop_region_3338 <- left_join(ak_regions_3338, pop_region, by = "region")
    +
    +# plot to check
    +plot(pop_region_3338["total_pop"])
    -

    +

    So far, we have learned how to use sf and dplyr to perform a spatial join on two datasets and calculate a summary metric from the result of that join.

    @@ -762,46 +773,46 @@

    Say we want to calculate the population by Alaska management area, as opposed to region.

    -
    pop_mgmt_338 <- pop_region_3338 %>%
    -    group_by(mgmt_area) %>%
    -    summarize(total_pop = sum(total_pop))
    -
    -plot(pop_mgmt_338["total_pop"])
    +
    pop_mgmt_338 <- pop_region_3338 %>%
    +    group_by(mgmt_area) %>%
    +    summarize(total_pop = sum(total_pop))
    +
    +plot(pop_mgmt_338["total_pop"])
    -

    +

    Notice that the region geometries were combined into a single polygon for each management area.

    If we don’t want to combine geometries, we can specify do_union = F as an argument.

    -
    pop_mgmt_3338 <- pop_region_3338 %>%
    -    group_by(mgmt_area) %>%
    -    summarize(total_pop = sum(total_pop), do_union = F)
    -
    -plot(pop_mgmt_3338["total_pop"])
    +
    pop_mgmt_3338 <- pop_region_3338 %>%
    +    group_by(mgmt_area) %>%
    +    summarize(total_pop = sum(total_pop), do_union = F)
    +
    +plot(pop_mgmt_3338["total_pop"])
    -

    +

    4. Save the spatial object to a new file using write_sf()

    Save the spatial object to disk using write_sf() and specifying the filename. Writing your file with the extension .shp will assume an ESRI Shapefile driver, but there are many other format options available.

    -
    write_sf(pop_region_3338, "data/ak_regions_population.shp")
    +
    write_sf(pop_region_3338, "data/ak_regions_population.shp")
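    For example (a sketch, not part of the original lesson), writing the same object with a .gpkg extension selects the GeoPackage driver instead, which avoids shapefile limitations such as column names being truncated to 10 characters:

    library(sf)

    # the driver is inferred from the .gpkg file extension
    write_sf(pop_region_3338, "data/ak_regions_population.gpkg")

    # optional round-trip check
    ak_regions_population <- read_sf("data/ak_regions_population.gpkg")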

    10.5.1 Visualize with ggplot

    ggplot2 now has integrated functionality to plot sf objects using geom_sf().

    We can plot sf objects just like regular data.frames using geom_sf.

    -
    ggplot(pop_region_3338) +
    -    geom_sf(aes(fill = total_pop)) +
    -    labs(fill = "Total Population") +
    -    scale_fill_continuous(low = "khaki",
    -                          high =  "firebrick",
    -                          labels = comma) +
    -    theme_bw()
    +
    ggplot(pop_region_3338) +
    +    geom_sf(aes(fill = total_pop)) +
    +    labs(fill = "Total Population") +
    +    scale_fill_continuous(low = "khaki",
    +                          high =  "firebrick",
    +                          labels = comma) +
    +    theme_bw()
    -

    +

    We can also plot multiple shapefiles in the same plot. Say we want to visualize rivers in Alaska in addition to the location of communities, since many communities in Alaska are on rivers. We can read in a rivers shapefile, double-check the crs to make sure it is what we need, and then plot all three shapefiles - the regional population (polygons), the locations of cities (points), and the rivers (linestrings).

    @@ -851,25 +862,25 @@

    -
    rivers_3338 <- read_sf("data/ak_rivers_simp.shp")
    -st_crs(rivers_3338)
    +
    rivers_3338 <- read_sf("data/ak_rivers_simp.shp")
    +st_crs(rivers_3338)

    Note that although no EPSG code is set explicitly, with some sleuthing we can determine that this is EPSG:3338. This site is helpful for looking up EPSG codes.
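    If you want that EPSG code recorded explicitly in the object, one option (a sketch, not from the original lesson) is st_set_crs(), which only relabels the CRS metadata and does not reproject the coordinates, so use it only when you are confident the data really are in EPSG:3338:

    library(sf)

    # stamps the CRS label on rivers_3338; no coordinates are changed
    rivers_3338 <- st_set_crs(rivers_3338, 3338)
    st_crs(rivers_3338)$epsg  # should now report 3338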

    -
    ggplot() +
    -    geom_sf(data = pop_region_3338, aes(fill = total_pop)) +
    -    geom_sf(data = pop_3338, size = 0.5) +
    -    geom_sf(data = rivers_3338,
    -            aes(linewidth = StrOrder)) +
    -    scale_linewidth(range = c(0.05, 0.5), guide = "none") +
    -    labs(title = "Total Population by Alaska Region",
    -         fill = "Total Population") +
    -    scale_fill_continuous(low = "khaki",
    -                          high =  "firebrick",
    -                          labels = comma) +
    -    theme_bw() 
    +
    ggplot() +
    +    geom_sf(data = pop_region_3338, aes(fill = total_pop)) +
    +    geom_sf(data = pop_3338, size = 0.5) +
    +    geom_sf(data = rivers_3338,
    +            aes(linewidth = StrOrder)) +
    +    scale_linewidth(range = c(0.05, 0.5), guide = "none") +
    +    labs(title = "Total Population by Alaska Region",
    +         fill = "Total Population") +
    +    scale_fill_continuous(low = "khaki",
    +                          high =  "firebrick",
    +                          labels = comma) +
    +    theme_bw() 
    -

    +

    @@ -879,51 +890,51 @@

    -
    pop_3857 <- pop_3338 %>%
    -    st_transform(crs = 3857)
    +
    pop_3857 <- pop_3338 %>%
    +    st_transform(crs = 3857)

    Next, let’s grab a base map from the Stamen map tile server covering the region of interest. First, we include a function that transforms the bounding box (which starts in EPSG:4326) into the EPSG:3857 CRS, the projection in which the map raster is returned from Stamen. This is an issue with ggmap described in more detail here.

    -
    # Define a function to fix the bbox to be in EPSG:3857
    -# See https://github.com/dkahle/ggmap/issues/160#issuecomment-397055208
    -ggmap_bbox_to_3857 <- function(map) {
    -    if (!inherits(map, "ggmap"))
    -        stop("map must be a ggmap object")
    -    # Extract the bounding box (in lat/lon) from the ggmap to a numeric vector,
    -    # and set the names to what sf::st_bbox expects:
    -    map_bbox <- setNames(unlist(attr(map, "bb")),
    -                         c("ymin", "xmin", "ymax", "xmax"))
    -    
-    # Convert the bbox to an sf polygon, transform it to 3857,
    -    # and convert back to a bbox (convoluted, but it works)
    -    bbox_3857 <-
    -        st_bbox(st_transform(st_as_sfc(st_bbox(map_bbox, crs = 4326)), 3857))
    -    
    -    # Overwrite the bbox of the ggmap object with the transformed coordinates
    -    attr(map, "bb")$ll.lat <- bbox_3857["ymin"]
    -    attr(map, "bb")$ll.lon <- bbox_3857["xmin"]
    -    attr(map, "bb")$ur.lat <- bbox_3857["ymax"]
    -    attr(map, "bb")$ur.lon <- bbox_3857["xmax"]
    -    map
    -}
    +
    # Define a function to fix the bbox to be in EPSG:3857
    +# See https://github.com/dkahle/ggmap/issues/160#issuecomment-397055208
    +ggmap_bbox_to_3857 <- function(map) {
    +    if (!inherits(map, "ggmap"))
    +        stop("map must be a ggmap object")
    +    # Extract the bounding box (in lat/lon) from the ggmap to a numeric vector,
    +    # and set the names to what sf::st_bbox expects:
    +    map_bbox <- setNames(unlist(attr(map, "bb")),
    +                         c("ymin", "xmin", "ymax", "xmax"))
    +    
+    # Convert the bbox to an sf polygon, transform it to 3857,
    +    # and convert back to a bbox (convoluted, but it works)
    +    bbox_3857 <-
    +        st_bbox(st_transform(st_as_sfc(st_bbox(map_bbox, crs = 4326)), 3857))
    +    
    +    # Overwrite the bbox of the ggmap object with the transformed coordinates
    +    attr(map, "bb")$ll.lat <- bbox_3857["ymin"]
    +    attr(map, "bb")$ll.lon <- bbox_3857["xmin"]
    +    attr(map, "bb")$ur.lat <- bbox_3857["ymax"]
    +    attr(map, "bb")$ur.lon <- bbox_3857["xmax"]
    +    map
    +}

    Next, we define the bounding box of interest, and use get_stamenmap() to get the basemap. Then we run our function defined above on the result of the get_stamenmap() call.

    -
    bbox <- c(-170, 52,-130, 64) # this is roughly southern Alaska
    -ak_map <- get_stamenmap(bbox, zoom = 4) # get base map
    -ak_map_3857 <- ggmap_bbox_to_3857(ak_map) # fix the bbox to be in EPSG:3857
    +
    bbox <- c(-170, 52,-130, 64) # this is roughly southern Alaska
    +ak_map <- get_stamenmap(bbox, zoom = 4) # get base map
    +ak_map_3857 <- ggmap_bbox_to_3857(ak_map) # fix the bbox to be in EPSG:3857

    Finally, plot the base raster map with the population data overlaid, which is easy now that everything is in the same projection (3857):

    -
    ggmap(ak_map_3857) +
    -    geom_sf(data = pop_3857,
    -            aes(color = population),
    -            inherit.aes = F) +
    -    scale_color_continuous(low = "khaki",
    -                           high =  "firebrick",
    -                           labels = comma)
    +
    ggmap(ak_map_3857) +
    +    geom_sf(data = pop_3857,
    +            aes(color = population),
    +            inherit.aes = F) +
    +    scale_color_continuous(low = "khaki",
    +                           high =  "firebrick",
    +                           labels = comma)
    -

    +

    @@ -933,16 +944,16 @@

    -
    epsg3338 <- leaflet::leafletCRS(
    -    crsClass = "L.Proj.CRS",
    -    code = "EPSG:3338",
    -    proj4def =  "+proj=aea +lat_1=55 +lat_2=65 +lat_0=50 +lon_0=-154 +x_0=0 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs",
    -    resolutions = 2 ^ (16:7)
    -)
    +
    epsg3338 <- leaflet::leafletCRS(
    +    crsClass = "L.Proj.CRS",
    +    code = "EPSG:3338",
    +    proj4def =  "+proj=aea +lat_1=55 +lat_2=65 +lat_0=50 +lon_0=-154 +x_0=0 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs",
    +    resolutions = 2 ^ (16:7)
    +)

    You might notice that this looks familiar! The syntax is a bit different, but most of this information is also contained within the crs of our shapefile:

    -
    st_crs(pop_region_3338)
    +
    st_crs(pop_region_3338)
    Coordinate Reference System:
       User input: EPSG:3338 
    @@ -992,48 +1003,22 @@ 

    -
    pop_region_4326 <- pop_region_3338 %>% st_transform(crs = 4326)
    +
    pop_region_4326 <- pop_region_3338 %>% st_transform(crs = 4326)

    -
    m <- leaflet(options = leafletOptions(crs = epsg3338)) %>%
    -    addPolygons(data = pop_region_4326,
    -                fillColor = "gray",
    -                weight = 1)
    -
    -m
    +
    m <- leaflet(options = leafletOptions(crs = epsg3338)) %>%
    +    addPolygons(data = pop_region_4326,
    +                fillColor = "gray",
    +                weight = 1)
    +
    +m
    -
    - +
    +

    We can add labels, legends, and a color scale.

    -
    pal <- colorNumeric(palette = "Reds", domain = pop_region_4326$total_pop)
    -
    -m <- leaflet(options = leafletOptions(crs = epsg3338)) %>%
    -    addPolygons(
    -        data = pop_region_4326,
    -        fillColor = ~ pal(total_pop),
    -        weight = 1,
    -        color = "black",
    -        fillOpacity = 1,
    -        label = ~ region
    -    ) %>%
    -    addLegend(
    -        position = "bottomleft",
    -        pal = pal,
    -        values = range(pop_region_4326$total_pop),
    -        title = "Total Population"
    -    )
    -
    -m
    -
    -
    - -
    -
    -

    We can also add the individual communities, with popup labels showing their population, on top of that!

    -
    pal <- colorNumeric(palette = "Reds", domain = pop_region_4326$total_pop)
     
     m <- leaflet(options = leafletOptions(crs = epsg3338)) %>%
    @@ -1042,31 +1027,57 @@ 

        fillColor = ~ pal(total_pop),
        weight = 1,
        color = "black",
-        fillOpacity = 1
-    ) %>%
-    addCircleMarkers(
-        data = pop_4326,
-        lat = ~ lat,
-        lng = ~ lng,
-        radius = ~ log(population / 500),
-        # arbitrary scaling
-        fillColor = "gray",
-        fillOpacity = 1,
-        weight = 0.25,
-        color = "black",
-        label = ~ paste0(pop_4326$city, ", population ", comma(pop_4326$population))
-    ) %>%
-    addLegend(
-        position = "bottomleft",
-        pal = pal,
-        values = range(pop_region_4326$total_pop),
-        title = "Total Population"
-    )
-
-m

+        fillOpacity = 1,
+        label = ~ region
+    ) %>%
+    addLegend(
+        position = "bottomleft",
+        pal = pal,
+        values = range(pop_region_4326$total_pop),
+        title = "Total Population"
+    )
+
+m

    +
    +
    + +
    +
    +

    We can also add the individual communities, with popup labels showing their population, on top of that!

    +
    +
    pal <- colorNumeric(palette = "Reds", domain = pop_region_4326$total_pop)
    +
    +m <- leaflet(options = leafletOptions(crs = epsg3338)) %>%
    +    addPolygons(
    +        data = pop_region_4326,
    +        fillColor = ~ pal(total_pop),
    +        weight = 1,
    +        color = "black",
    +        fillOpacity = 1
    +    ) %>%
    +    addCircleMarkers(
    +        data = pop_4326,
    +        lat = ~ lat,
    +        lng = ~ lng,
    +        radius = ~ log(population / 500),
    +        # arbitrary scaling
    +        fillColor = "gray",
    +        fillOpacity = 1,
    +        weight = 0.25,
    +        color = "black",
    +        label = ~ paste0(pop_4326$city, ", population ", comma(pop_4326$population))
    +    ) %>%
    +    addLegend(
    +        position = "bottomleft",
    +        pal = pal,
    +        values = range(pop_region_4326$total_pop),
    +        title = "Total Population"
    +    )
    +
    +m
    -
    - +
    +
    diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-19-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-19-1.png index b499a675..3872dc37 100644 Binary files a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-19-1.png and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-19-1.png differ diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-20-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-20-1.png index bcda06bb..b499a675 100644 Binary files a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-20-1.png and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-20-1.png differ diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-21-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-21-1.png index 4d0d3eae..bcda06bb 100644 Binary files a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-21-1.png and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-21-1.png differ diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-22-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-22-1.png new file mode 100644 index 00000000..4d0d3eae Binary files /dev/null and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-22-1.png differ diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-25-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-25-1.png new file mode 100644 index 00000000..0113aed3 Binary files /dev/null and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-25-1.png differ diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-29-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-29-1.png new file mode 100644 index 00000000..28a68d21 Binary files /dev/null and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-29-1.png differ diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-4-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-4-1.png new file mode 100644 index 00000000..02fff39a Binary files /dev/null and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-4-1.png differ diff --git a/2023-08-delta/session_10_files/figure-html/unnamed-chunk-9-1.png b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-9-1.png new file mode 100644 index 00000000..bab99cbc Binary files /dev/null and b/2023-08-delta/session_10_files/figure-html/unnamed-chunk-9-1.png differ