16. Nighttime Lights from NASA Black Marble during Crisis#

16.1. Summary#

In a previous class, we learned how to extract baseline information about population distribution. During some crisis events, however, this baseline becomes less useful because people relocate for safety reasons. As discussed, knowing where people are is crucial for emergency management and humanitarian response, so other data sources need to be explored. Satellite images of Earth at night can detect areas where artificial lighting has increased or decreased, and these changes can be used as a proxy for population movement. Drawing on multiple data sources, NASA developed the Black Marble dataset, which provides cloud-free images corrected for atmospheric, terrain, lunar BRDF, thermal, and stray light effects. The dataset is open source and has proven valuable for studying human behavior.

This notebook teaches students how to leverage the Black Marble dataset to understand human dynamics during or after a crisis, using the BlackMarblePy package developed at the World Bank.

16.2. Learning Objectives#

16.2.1. Overall goals#

The main goal of this class is to teach students how to access and analyze Black Marble data to quantify changes in the spatial distribution of radiance related to a crisis event.

16.2.2. Specific goals#

At the end of this notebook, you should have gained an understanding and appreciation of the following:

  1. Black Marble data:

    • Understand how the data is generated.

  2. The BlackMarblePy package:

    • Learn how to use the package to obtain and analyze Black Marble data.

  3. Black Marble Data Visualization:

    • Summarize the data to show changes across time.

    • Generate timeline plots of the radiance.

    • Generate dynamic visualizations to show changes in radiance over time.

16.3. About the data#

At night, satellite images of Earth capture a uniquely human signal: artificial lighting. Remotely sensed lights at night provide a new data source for improving our understanding of interactions between human systems and the environment. NASA has developed Black Marble, a daily calibrated, corrected, and validated product suite, so that nighttime lights data can be used effectively for scientific observations. Black Marble plays a vital role in research on light pollution, illegal fishing, fires, disaster impacts and recovery, and human settlements and associated energy infrastructure.

Source.

NASA’s operational Black Marble product suite ingests multiple input datasets and ancillary data to produce the highest-quality pixel-based estimates of NTL. NASA’s Black Marble algorithm generates cloud-free images that have been corrected for atmospheric, terrain, lunar BRDF, thermal, and stray light effects. These corrections result in superior retrieval of nighttime lights at short time scales and reduced background noise, enabling quantitative detection and analysis of daily, seasonal, and annual variations.

Source.

16.3.1. NASA Black Marble using BlackMarblePy#

NASA’s Black Marble product suite includes:

  • daily at-sensor TOA nighttime radiance (VNP46A1/VJ146A1)

  • daily moonlight and atmosphere-corrected NTL (VNP46A2/VJ146A2)

  • monthly moonlight and atmosphere-corrected NTL (VNP46A3/VJ146A3)

  • yearly moonlight and atmosphere-corrected NTL (VNP46A4/VJ146A4)

All products are provided on a 15 arc-second geographic linear latitude/longitude (lat/lon) grid. The data are delivered in the standard land Hierarchical Data Format - Earth Observing System (HDF-EOS) format.

Source

Each product provides different variables to work with. For example VNP46A3 offers the following:

  • AllAngle_Composite_Snow_Covered: Temporal Radiance Composite Using All Observations During Snow-covered Period

  • AllAngle_Composite_Snow_Free: Temporal Radiance Composite Using All Observations During Snow-free Period

  • NearNadir_Composite_Snow_Free: Temporal Radiance Composite Using Near Nadir Angle Observations (View Zenith Angle 0-20 degree) During Snow-free Period

  • NearNadir_Composite_Snow_Free_Quality1: Quality Flag of the Temporal Radiance Composite Using Near Nadir Angle Observations (View Zenith Angle 0-20 degree) During Snow-free Period

Check the complete list of available variables in the User’s Guide.

In this course, we will be accessing this dataset using BlackMarblePy, a Python package that provides a simple way to use nighttime lights data from NASA’s Black Marble project. This package automates the process of downloading all relevant tiles from the NASA LAADS DAAC to cover a region of interest, converting the raw files (in HDF5 format) to georeferenced rasters, and mosaicking rasters together when needed.
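As a minimal illustration of this workflow (not part of the notebook's later analysis), the sketch below shows how bm_raster could be used to download and mosaic the daily corrected NTL rasters for an area of interest; the file path, dates, and token variable are assumptions borrowed from later sections of this notebook.

import os
import pandas as pd
import geopandas as gpd
from blackmarble.raster import bm_raster

# Illustrative area of interest: any polygon GeoDataFrame works
aoi = gpd.read_file("../../data/ntl-physical-impact/gadm41_PSE.gpkg", layer="ADM_ADM_2")

# NASA Earthdata bearer token (see the next section on how to obtain it)
bearer = os.getenv("bm_token")

# Download the VNP46A2 tiles covering the area of interest, convert the raw
# HDF5 files to georeferenced rasters, and mosaic them into one raster per date
raster = bm_raster(
    aoi,
    product_id="VNP46A2",
    date_range=pd.date_range("2023-10-01", "2023-10-07", freq="D"),
    bearer=bearer,
)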

Fig. 16.1 BlackMarblePy.#

16.3.2. Accessing the data#

BlackMarblePy requires a NASA Earthdata bearer token. Follow this step-by-step guide to obtain it.

Once you have the token, save it in a secure place. It is good practice to store it in an environment variable rather than hardcoding it in your notebooks or scripts.
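A minimal sketch of this pattern, assuming the token is stored in an environment variable named bm_token (the name this notebook uses later):

import os

# Set the token once in your shell (or in a .env file loaded by your environment), e.g.
#   export bm_token="<your NASA Earthdata token>"
# and read it at runtime instead of pasting it into the notebook:
bearer = os.getenv("bm_token")

if bearer is None:
    raise RuntimeError("NASA Earthdata token not found: set the 'bm_token' environment variable.")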

16.3.3. Obtain the data#

16.3.4. Region of interest#

The first step is to set up the region of interest for which the data needs to be downloaded. This notebook uses the 2023 Gaza-Israel conflict as the case study. Palestine’s Administrative Level 2 boundaries were downloaded from GADM.

Caution

This example uses a GeoPackage file, which stores each administrative level as a different layer.

import os
from datetime import datetime

import pandas as pd
import geopandas as gpd
import fiona

# BlackMarblePy: zonal statistics (bm_extract) and raster downloads (bm_raster)
from blackmarble.extract import bm_extract
from blackmarble.raster import bm_raster, transform

# Bokeh for the interactive time-series plots
from bokeh.models import (
    HoverTool,
    Legend,
    Range1d,
    Span,
    Title,
)
from bokeh.plotting import figure, output_notebook, show

# Categorical color palettes for the plot lines
import colorcet as cc

# List the layers in the file
gpkg_path = '../../data/ntl-physical-impact/gadm41_PSE.gpkg'
layers = fiona.listlayers(gpkg_path)
layers
['ADM_ADM_0', 'ADM_ADM_1', 'ADM_ADM_2']


# Open the admin2 layer
admin2 = gpd.read_file(gpkg_path, layer='ADM_ADM_2')
admin2.explore()

16.3.5. Extract the data#

The goal of this example is to compare the situation before and after the outbreak of the conflict on 7 October 2023.

The most relevant product for this analysis is VNP46A2, which offers daily moonlight and atmosphere-corrected NTL. By default, when downloading VNP46A2, BlackMarblePy processes the variable Gap_Filled_DNB_BRDF-Corrected_NTL; this parameter can be changed if necessary. As mentioned above, the complete list of variables by product is available in the User’s Guide.

VNP46A3 offers monthly moonlight and atmosphere-corrected NTL. This product is useful for getting a higher-level overview of the situation.

Caution

Check that all the data has been downloaded correctly.

Both products are downloaded because, later in this section, we will be comparing their results.

bearer = os.getenv('bm_token')
VNP46A2 = bm_extract(
    admin2, # Area of interest
    product_id="VNP46A2", # Product
    date_range=pd.date_range("2023-09-01", "2023-12-31", freq="D"),
    bearer=bearer,
    aggfunc=["mean", "sum"],
    variable='Gap_Filled_DNB_BRDF-Corrected_NTL', # This is the default
    output_directory="./ntl/", # Save the downloaded h5 files locally to reprocess when necessary
    output_skip_if_exists=True # Skip the download of the file if it already exists. Make sure data was downloaded correctly
    )

Note

The following table illustrates one of the most valuable features of the BlackMarblePy package. Although Black Marble data is distributed as a raster, a single command downloads the data and returns the average radiance at the chosen aggregation level (see the table and plot below).

# The resulting dataframe has one record per admin2 geography and per day
VNP46A2.head()
GID_2 GID_0 COUNTRY GID_1 NAME_1 NL_NAME_1 NAME_2 VARNAME_2 NL_NAME_2 TYPE_2 ENGTYPE_2 CC_2 HASC_2 geometry ntl_mean ntl_sum date
0 PSE.1.1_1 PSE Palestine PSE.1_1 Gaza NA Deir Al-Balah NA NA Governorate Governorate NA PS.GZ.DB MULTIPOLYGON (((34.41588 31.4272, 34.4079 31.4... 12.309211 3742.0 2023-09-01
1 PSE.1.2_1 PSE Palestine PSE.1_1 Gaza NA Gaza NA NA Governorate Governorate NA PS.GZ.GZ MULTIPOLYGON (((34.50843 31.50397, 34.50824 31... 19.254424 7181.9 2023-09-01
2 PSE.1.3_1 PSE Palestine PSE.1_1 Gaza NA Gaza ash Shamaliyah North Gaza NA Governorate Governorate NA PS.GZ.GS MULTIPOLYGON (((34.48903 31.59443, 34.48946 31... 16.156655 4733.9 2023-09-01
3 PSE.1.4_1 PSE Palestine PSE.1_1 Gaza NA Khan Yunis NA NA Governorate Governorate NA PS.GZ.KY MULTIPOLYGON (((34.37065 31.3726, 34.36583 31.... 12.572713 7832.8 2023-09-01
4 PSE.1.5_1 PSE Palestine PSE.1_1 Gaza NA Rafah NA NA Governorate Governorate NA PS.GZ.RA MULTIPOLYGON (((34.26801 31.22361, 34.26234 31... 21.198837 5469.3 2023-09-01
# Create a pivot table for plotting
pivot_day = (
    VNP46A2.pivot_table(index="date", columns=["NAME_2"], values=['ntl_mean'], aggfunc="mean")
    .resample("D", label="left")
    .mean()
)
pivot_day.head()
ntl_mean
NAME_2 Bethlehem Deir Al-Balah Gaza Gaza ash Shamaliyah Hebron Jenin Jericho Jerusalem Khan Yunis Nablus Qalqilya Rafah Ramallah and Al-Bireh Salfit Tubas Tulkarm
date
2023-09-01 6.625108 12.309211 19.254424 16.156655 10.655935 9.808780 7.049122 27.576268 12.572713 9.693076 18.557646 21.198837 11.317546 16.873524 3.941107 14.030075
2023-09-02 6.579640 10.901645 21.765416 17.723891 10.119197 9.752054 7.004479 26.390645 12.373515 10.269413 18.378591 18.443798 10.961254 16.407205 3.975395 13.727136
2023-09-03 6.049856 10.403618 20.315818 14.963481 8.639310 8.389150 5.721035 25.483358 12.410112 8.988694 16.340593 18.394574 10.243518 14.764757 3.381621 12.069651
2023-09-04 6.242422 10.479934 20.533512 15.136860 8.570817 8.288147 5.673910 25.713836 12.381380 8.943120 16.446432 19.552326 10.086064 14.664583 3.486808 11.750991
2023-09-05 6.284053 10.782566 21.183110 15.311945 9.028971 8.718470 5.965617 25.880355 12.410273 9.378469 16.628638 19.786434 10.399747 15.159896 3.738142 12.603144
def create_plot(data, agg, product, place, other=None):
    '''Creates an interactive plot of average radiance over time
            data: Data to be used (pivot table with one column per area)
            agg: Aggregation period [Daily, Weekly, Monthly]
            product: VNP46A2, VNP46A3
            place: Name of the area of interest (used in the title)
            other: Optional second dataset to overlay for comparison (e.g., the VNP46A3 monthly composite)'''
    # Create the figure
    p = figure(
        title="{}: {} Nighttime Lights".format(place, agg),
        width=800,
        height=600,
        x_axis_label="Date",
        x_axis_type="datetime",
        y_axis_label=r"Radiance [nW $$cm^{-2}$$ $$sr^{-1}$$]",
        tools="pan,wheel_zoom,box_zoom,reset,save,box_select",
        )
    # Add a title
    p.add_layout(
        Title(
            text="{} NTL Radiance Average ({}) for each second-level administrative division".format(agg, product),
            text_font_size="12pt",
            text_font_style="italic",
        ),
        "above",
    )
    # Add a subtitle
    p.add_layout(
        Title(
            text=f"Data Source: NASA Black Marble. Creation date: {datetime.today().strftime('%d %B %Y')}. Feedback: datalab@worldbank.org.",
            text_font_size="10pt",
            text_font_style="italic",
        ),
        "below",
    )
    # Add the legend
    p.add_layout(Legend(), "right")
    
    # What to show when hovering over a point
    p.add_tools(
        HoverTool(
            tooltips=[
                ("Date", "@x{%F} (@x{%W of %Y})"),
                ("Radiance", "@y{0.00}"),
            ],
            formatters={"@x": "datetime"},
        )
    )
    
    # Plot a line for each column/admin2 level
    for column, color in zip(data.columns, cc.b_glasbey_category10):
        r = p.line(
            data.index,
            data[column],
            legend_label=column[1],
            line_color=color,
            line_width=2,
        )
        r.visible = False
    
        # Except Gaza visible
        if str(column[1]) == "Gaza":
            r.visible = True
    if other is not None:
        for column, color in zip(other.columns, cc.b_cyclic_grey_15_85_c0):
            r = p.line(
                other.index,
                other[column],
                legend_label=column[1],
                line_color=color,
                line_width=2,
            )
            r.visible = False
        
            # Except Gaza visible
            if str(column[1]) == "Gaza":
                r.visible = True        
    
    p.legend.location = "bottom_left"
    p.legend.click_policy = "hide"
    p.title.text_font_size = "16pt"
    
    output_notebook()
    show(p)
create_plot(pivot_day, 'Daily', 'VNP46A2', 'Palestine')
VNP46A3 = bm_extract(
    admin2, # Area of interest
    product_id="VNP46A3", # Product
    date_range=pd.date_range("2023-09-01", "2024-06-30", freq="MS"),
    bearer=bearer,
    aggfunc=["mean", "sum"],
    variable='NearNadir_Composite_Snow_Free', # This is the default
    output_directory="./ntl/", # Save the downloaded h5 files locally to reprocess when necessary
    output_skip_if_exists=True # Skip the download of the file if it already exists
    )
# Create a pivot table for plotting
pivot_month = (
    VNP46A3.pivot_table(index="date", columns=["NAME_2"], values=['ntl_mean'], aggfunc="mean")
    .resample("MS", label="left")
    .mean()
)
create_plot(pivot_month, 'Monthly', 'VNP46A3', 'Palestine')

The following section will teach you how to create valuable indicators by building on the downloaded data.

16.4. Indicators - Methodology#

Creating a time series of weekly/monthly radiance using NASA’s Black Marble data involves several steps, including data acquisition, pre-processing, zonal statistics calculation, and time series generation. Below is a general methodology for this process.

Note

This section uses the already downloaded data.

16.4.1. Time Series Generation#

The first step is to organize the zonal statistics results in a tabular format, where each column corresponds to a specific zone and each row holds the daily radiance values. Next, the data is aggregated on a weekly basis, computing the desired statistical metric (e.g., mean radiance) for each zone and week. Finally, the time series is visualized to observe trends, patterns, and anomalies over time.

16.4.1.1. Weekly#

In this step, the zonal statistics are aggregated by week for each second-level administrative division.

weekly_agg_admin2 = (
    VNP46A2.pivot_table(values=['ntl_mean'], index="date", columns=["NAME_2"])
    .resample("W-SUN", label="right") # Aggregate by week, anchor sunday
    .mean()
)
create_plot(weekly_agg_admin2, 'Weekly', 'VNP46A2', 'Palestine')

16.4.1.2. Monthly#

This step aggregates the zonal statistics by month for each second-level administrative division. Additionally, the VNP46A3 monthly composite is shown, when available, for comparison.

monthly_agg_admin2 = (
    VNP46A2.pivot_table(values=['ntl_mean'], index="date", columns=["NAME_2"])
    .resample("MS", label="right") # Aggregate by Month, beginning of the month
    .mean()
)
create_plot(monthly_agg_admin2, 'Monthly', 'VNP46A2', 'Palestine', pivot_month)

16.4.1.3. Percentage change#

16.4.1.3.1. Benchmark Comparison#

This analysis compares the observed average radiance levels to a benchmark for each second-level administrative division. The benchmark can be defined in different ways, for example:

  • Average radiance in 2022

  • Average radiance in October 2022

  • Average radiance in the week before the outbreak

This example uses September 2023 as the benchmark, mainly to avoid downloading a large amount of data during the course.

16.4.1.3.2. Administrative Level 2#

This example calculates, for each administrative level 2 boundary, how much each week’s average radiance changed with respect to the average weekly radiance of September 2023.
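In other words, for each division and week, the percentage change relative to the benchmark is

$$\Delta_{\%}(i, t) = 100 \times \left( \frac{\bar{R}_{i,t}}{\bar{R}_{i}^{\mathrm{Sep\ 2023}}} - 1 \right)$$

where $\bar{R}_{i,t}$ is the mean weekly radiance of division $i$ in week $t$ and $\bar{R}_{i}^{\mathrm{Sep\ 2023}}$ is its average weekly radiance over September 2023.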

benchmark = weekly_agg_admin2[(weekly_agg_admin2.index >= "2023-09-01") & (weekly_agg_admin2.index < "2023-09-30")].mean()
benchmark
          NAME_2               
ntl_mean  Bethlehem                 6.621347
          Deir Al-Balah            11.841913
          Gaza                     21.680186
          Gaza ash Shamaliyah      16.425614
          Hebron                    9.656051
          Jenin                     9.339154
          Jericho                   6.581607
          Jerusalem                26.549290
          Khan Yunis               13.391600
          Nablus                    9.587927
          Qalqilya                 17.420442
          Rafah                    20.121346
          Ramallah and Al-Bireh    10.789527
          Salfit                   15.712062
          Tubas                     3.965039
          Tulkarm                  13.060293
dtype: float64
percentage_change = 100 * ((weekly_agg_admin2 / benchmark) - 1)
def create_plot_perc_change(data, agg, product, place, other=None):
    '''Creates an interactive plot of the percentage change in radiance
            data: Data to be used (pivot table with one column per area)
            agg: Aggregation period [Daily, Weekly, Monthly]
            product: VNP46A2, VNP46A3
            place: Name of the area of interest (used in the title)
            other: Optional second dataset to overlay for comparison'''
    # Create the figure
    p = figure(
        title="{}: {} Nighttime Lights Percentage Change".format(place, agg),
        width=800,
        height=600,
        x_axis_label="Date",
        x_axis_type="datetime",
        y_axis_label=r"NTL Percentage change (%)",
        tools="pan,wheel_zoom,box_zoom,reset,save,box_select",
        )
    # Add a title
    p.add_layout(
        Title(
            text="{} NTL Percentage Change ({}) for each second-level administrative division".format(agg, product),
            text_font_size="12pt",
            text_font_style="italic",
        ),
        "above",
    )
    # Add a subtitle
    p.add_layout(
        Title(
            text=f"Data Source: NASA Black Marble. Creation date: {datetime.today().strftime('%d %B %Y')}. Feedback: datalab@worldbank.org.",
            text_font_size="10pt",
            text_font_style="italic",
        ),
        "below",
    )
    # Add the legend
    p.add_layout(Legend(), "right")
    
    # What to show when hovering over a point
    p.add_tools(
        HoverTool(
            tooltips=[
                ("Date", "@x{%F} (@x{%W of %Y})"),
                ("Percentage Change", "@y{0.00}"),
            ],
            formatters={"@x": "datetime"},
        )
    )
    
    # Plot a line for each column/admin2 level
    for column, color in zip(data.columns, cc.b_glasbey_category10):
        r = p.line(
            data.index,
            data[column],
            legend_label=column[1],
            line_color=color,
            line_width=2,
        )
        r.visible = False
    
        # Except Gaza visible
        if str(column[1]) == "Gaza":
            r.visible = True
    if other is not None:
        for column, color in zip(other.columns, cc.b_cyclic_grey_15_85_c0):
            r = p.line(
                other.index,
                other[column],
                legend_label=column[1],
                line_color=color,
                line_width=2,
            )
            r.visible = False
        
            # Except Gaza visible
            if str(column[1]) == "Gaza":
                r.visible = True        
    
    p.legend.location = "bottom_left"
    p.legend.click_policy = "hide"
    p.title.text_font_size = "16pt"
    
    output_notebook()
    show(p)
create_plot_perc_change(percentage_change, 'Weekly', 'VNP46A2', 'Palestine')
16.4.1.3.3. Administrative Level 1#

This example calculates, for each administrative level 1 boundary, how much each week’s average radiance changed with respect to the average weekly radiance of September 2023.

weekly_agg_admin1 = (
    VNP46A2.pivot_table(values=['ntl_mean'], index="date", columns=["NAME_1"])
    .resample("W-SUN", label="right")
    .mean()
)
benchmark_admin1 = weekly_agg_admin1[(weekly_agg_admin1.index >= "2023-09-01") & (weekly_agg_admin1.index < "2023-09-30")].mean()
benchmark_admin1
          NAME_1   
ntl_mean  Gaza         16.692132
          West Bank    11.752976
dtype: float64
percentage_change_admin1 = 100 * ((weekly_agg_admin1 / benchmark_admin1) - 1)
create_plot_perc_change(percentage_change_admin1, 'Weekly', 'VNP46A2', 'Palestine')
16.4.1.3.4. Visualize radiance spatially over time#

This example shows how to create a map that visualizes radiance in the area of interest over time.

import json
import folium
from folium.plugins import TimestampedGeoJson, TimeSliderChoropleth
import branca
admin2.replace({'NAME_2' : {'Deir Al-Balah': 'Deir_Al-Balah',
                                           'Gaza ash Shamaliyah': 'Gaza_ash_Shamaliyah',
                                           'Khan Yunis': 'Khan_Yunis',
                                           'Ramallah and Al-Bireh': 'Ramallah_and_Al-Bireh'}
               }, inplace = True)
admin2.set_index('NAME_2', inplace = True)
admin2_json = json.loads(admin2.to_json())
weekly_agg_admin2.columns = weekly_agg_admin2.columns.droplevel()
weekly_agg_admin2.reset_index(inplace = True)
weekly_agg_admin2.rename(columns = {'Deir Al-Balah': 'Deir_Al-Balah',
                                           'Gaza ash Shamaliyah': 'Gaza_ash_Shamaliyah',
                                           'Khan Yunis': 'Khan_Yunis',
                                           'Ramallah and Al-Bireh': 'Ramallah_and_Al-Bireh'}, inplace = True)
weekly_agg_admin2_melted = weekly_agg_admin2.melt(
                                id_vars=['date'], 
                                value_vars = ['Bethlehem', 'Deir_Al-Balah', 'Gaza', 'Gaza_ash_Shamaliyah',
                                   'Hebron', 'Jenin', 'Jericho', 'Jerusalem', 'Khan_Yunis', 'Nablus',
                                   'Qalqilya', 'Rafah', 'Ramallah_and_Al-Bireh', 'Salfit', 'Tubas',
                                   'Tulkarm'],
                                var_name = 'admin_2', 
                                value_name = 'ntl_mean'
                            )
weekly_agg_admin2_melted['unixtime'] = weekly_agg_admin2_melted.date.apply(lambda x: int(x.timestamp()))
all_values = list(weekly_agg_admin2_melted['ntl_mean'].unique())
min_value = min(all_values)
max_value = max(all_values)
colormap = branca.colormap.linear.YlOrRd_09.scale(min_value, max_value)
colormap.caption = 'mean ntl'
weekly_agg_admin2_melted['color'] = weekly_agg_admin2_melted['ntl_mean'].apply(lambda x: colormap(x))
weekly_agg_admin2_melted['opacity'] = 0.8
styledata = {}
for idx in admin2.index:
    df = weekly_agg_admin2_melted[(weekly_agg_admin2_melted['admin_2']==idx)].copy()
    df.set_index('unixtime', inplace = True)
    styledata[idx] = df.drop('date', axis = 1)
styledict = {
    str(idx): data.to_dict(orient="index") for idx, data in styledata.items()
}
# Create the Folium map
m = folium.Map(location=[31.468289,34.397759], zoom_start=8)

# Add the time slider
TimeSliderChoropleth(
    admin2_json,
    styledict=styledict,
).add_to(m)
colormap.add_to(m)
m

16.4.2. Quality of the data#

The quality of nighttime lights data can be impacted by a number of factors, particularly cloud cover. To facilitate analysis using high-quality data, Black Marble (1) flags the quality of each pixel and (2), for some poor-quality pixels, fills the value with data from a previous date, producing a temporally gap-filled NTL value. If data quality is a concern, BlackMarblePy offers examples of how it can be assessed. More information can be found here.
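As a rough sketch (not part of the workflow above), the same bm_extract call can also pull a quality variable. The example below assumes the VNP46A2 Mandatory_Quality_Flag variable listed in the User’s Guide; consult the guide for the meaning of each flag value before interpreting the averages.

# Summarize the VNP46A2 quality flag per admin2 area and day (illustrative only)
quality_flags = bm_extract(
    admin2,
    product_id="VNP46A2",
    date_range=pd.date_range("2023-10-01", "2023-10-07", freq="D"),
    bearer=bearer,
    aggfunc=["mean"],  # average flag value per area and day
    variable="Mandatory_Quality_Flag",  # assumed name; check the User's Guide
    output_directory="./ntl/",
    output_skip_if_exists=True,
)

# Higher average flag values hint that more pixels were flagged as lower
# quality or gap-filled in that area on that day
quality_flags.head()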

16.5. Limitations#

Using nighttime lights to estimate macroeconomic indicators during conflict may be a valuable approach, but it comes with several assumptions and limitations. Here’s a list of some of the key assumptions and limitations:

Caution

Assumptions:

  • Luminosity Reflects Economic Activity: The approach assumes that the level of nighttime lights is a reliable proxy for economic activity. It presupposes that areas with brighter lights correspond to higher economic productivity.

  • Baseline Data Availability: It assumes the availability of baseline nighttime lights data before the onset of the conflict. The accuracy of the estimates depends on the quality and relevance of this baseline data.

  • Spatial Distribution: The method assumes that nighttime lights are evenly distributed within a given geographic area and that changes in luminosity accurately reflect changes in economic activity across all locations.

Limitations:

  • Confounding Factors and Data Interpretation: The approach may require subjective interpretation, as it may not distinguish between reduced lighting due to conflict and reduced lighting due to other factors. Changes in nighttime lights can be influenced by factors other than economic activity, such as energy conservation measures, urban development, or seasonal variations.

  • Generalization: The approach might lead to overgeneralization, as a reduction in nighttime lights can be associated with various economic outcomes, from minor disruptions to severe economic downturns.

  • Alternative Explanations: Changes in nighttime lights can result from factors other than conflict, such as urban development, changes in economic activities, or natural disasters. Therefore, it may not always be clear whether a decline in nighttime lights is solely due to conflict.

  • Geopolitical Factors: The dataset may be subject to geopolitical biases, with some areas having less comprehensive coverage due to political reasons.

  • Data Lag: There can be a significant time lag between the occurrence of a conflict event and its reflection in the nighttime lights dataset. This lag may limit the dataset’s utility for real-time conflict monitoring.

  • Resolution and Urban Bias: The dataset’s spatial resolution may not be fine enough to capture small villages or isolated conflict events. It may also have an urban bias, making it less suitable for analyzing rural or remote conflicts.

To address these assumptions and limitations, it is crucial to complement nighttime lights data analysis with other sources of information and adopt a cautious and context-aware approach when interpreting the findings.

16.6. References#

1

Miguel O. Román, Zhuosen Wang, Qingsong Sun, Virginia Kalb, Steven D. Miller, Andrew Molthan, Lori Schultz, Jordan Bell, Eleanor C. Stokes, Bhartendu Pandey, Karen C. Seto, Dorothy Hall, Tomohiro Oda, Robert E. Wolfe, Gary Lin, Navid Golpayegani, Sadashiva Devadiga, Carol Davidson, Sudipta Sarkar, Cid Praderas, Jeffrey Schmaltz, Ryan Boller, Joshua Stevens, Olga M. Ramos González, Elizabeth Padilla, José Alonso, Yasmín Detrés, Roy Armstrong, Ismael Miranda, Yasmín Conte, Nitza Marrero, Kytt MacManus, Thomas Esch, and Edward J. Masuoka. NASA's Black Marble nighttime lights product suite. Remote Sensing of Environment, 210:113–143, 2018. URL: https://www.sciencedirect.com/science/article/pii/S003442571830110X, doi:10.1016/j.rse.2018.03.017.

16.7. Practice#

Calculate the weekly percentage change in radiance across administrative level 2 boundaries after the conflict outbreak, using the mean radiance in October 2022 as the baseline. Perform the calculations for September, October, and November 2023.

Create an interactive map of the percentage change in radiance with a weekly time slider.