Hyperspectral satellite imagery for Earth observation

A hyperspectral satellite photo of an area in California, USA, showing a landscape, a lake, and human dwellings.

What is hyperspectral imagery?

Until recently, imaging satellites carried cameras that capture the Earth in only a few spectral bands. Multispectral sensors typically cover the red, green, and blue bands of visible light, plus a handful of other wavelength ranges.

But a revolution is underway. A growing number of satellites equipped with hyperspectral cameras will let us observe the Earth in extraordinary detail across a wide wavelength range, well beyond what the human eye can see and in far more detail than existing multispectral sensors such as Landsat or Sentinel-2.

Hyperspectral imagery enables applications such as tree and plant species identification, monitoring of air quality (particulate and gaseous) and water quality (e.g. algal bloom detection), tree health observation, and more.

Compare visible light to multi- and hyperspectral images

Visible light

3 spectral bands

An RGB image can be used to identify vegetation by shape and color.

Multispectral

5-35 spectral bands

A 13-band Sentinel-2 image can be used to differentiate between some crops and detect plants under stress.

Hyperspectral

100s of spectral bands

The 224-band AVIRIS sensor enables detection of plant chemical composition and physiology changes attributed to weeds, diseases, or nutrient deficiency.

The visualization above shows one spectral band at a time. The amount of data contained in a single image can also be represented as a hyperspectral cube.

Hyperspectral data cube

Three-dimensional projection of a hyperspectral image cube. The X and Y axes represent spatial dimensions; the vertical Z axis encodes 224 spectral wavelengths.
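The cube is easy to picture as a three-dimensional array. Below is a minimal NumPy sketch with a fabricated 224-band cube (real AVIRIS scenes share this layout at much larger spatial extents): slicing along the spectral axis yields one band as a grayscale image, while slicing at a spatial location yields a pixel's full spectral profile.

```python
import numpy as np

# A hyperspectral cube: one spectral dimension plus two spatial dimensions.
# The values here are random stand-ins for reflectance measurements.
rng = np.random.default_rng(0)
bands, height, width = 224, 64, 64
cube = rng.random((bands, height, width), dtype=np.float32)

# One spectral band at a time -> a 2D grayscale image,
# as in the band-by-band visualization above.
band_50 = cube[50]           # shape (64, 64)

# One pixel at a time -> a full 224-point spectral profile.
profile = cube[:, 10, 20]    # shape (224,)

print(band_50.shape, profile.shape)
```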

Different materials can often be identified by their characteristic spectral signatures. Hyperspectral imaging sensors cover a wide spectral range in narrow, contiguous bands. They enable the capture and identification of spectral profiles, such as those of healthy (green-stage) and stressed (red-stage) conifers in California.

Explore the spectral profiles in a hyperspectral image

An aerial image of a partly tree-covered, partly rocky landscape. A crosshair is overlaid on the photo, pointing at one pixel at the center of the image.


AVIRIS images contain 224 data points in each pixel. These can be compared with reference spectral profiles known from materials science, and tracking changes over time makes it possible to measure indicators such as tree health.

A two-axis chart comparing a spectral profile from the previous image with reference profiles for healthy and stressed conifers. The healthy-conifer line shows high reflectance at lower wavelengths, then stays fairly flat; the stressed-conifer line shows two sharp reflectance peaks at medium and medium-high wavelengths. The pixel's profile shares features with both reference lines.
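One standard way to make such comparisons is the spectral angle: the angle between two spectra treated as vectors, which is insensitive to overall brightness. The sketch below uses NumPy and fabricated profiles (not real conifer spectra) to show the idea.

```python
import numpy as np

def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
    """Angle in radians between two spectra; smaller means more similar.

    Because the angle ignores vector length, a darker or brighter copy of
    the same material still matches its reference profile.
    """
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Fabricated 224-band reference profiles, for illustration only.
wavelengths = np.linspace(380, 2510, 224)
healthy = np.exp(-((wavelengths - 550) / 300) ** 2) + 0.2
stressed = np.exp(-((wavelengths - 680) / 150) ** 2) + 0.2

# A pixel that is simply a brighter copy of the healthy profile
# still matches "healthy" better than "stressed".
pixel = 1.7 * healthy
print(spectral_angle(pixel, healthy), spectral_angle(pixel, stressed))
```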

Manually comparing spectral profiles is inefficient at scale. Using machine learning, we can train a model to identify the spectral characteristics of different materials. New imagery can then be classified more quickly and at scale.
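As an illustration of the idea (not Azavea's actual Raster Vision model), the sketch below trains a toy nearest-centroid classifier on fabricated 224-band spectra and uses it to label a new pixel.

```python
import numpy as np

rng = np.random.default_rng(42)
n_bands = 224

# Fabricated class "signatures" standing in for labeled training spectra.
signatures = {
    "healthy_conifer": rng.random(n_bands),
    "stressed_conifer": rng.random(n_bands),
    "other_vegetation": rng.random(n_bands),
}

# Training: average 50 noisy samples per class into a spectral centroid.
centroids = {
    name: np.mean([sig + rng.normal(0, 0.05, n_bands) for _ in range(50)], axis=0)
    for name, sig in signatures.items()
}

def classify(pixel: np.ndarray) -> str:
    """Assign the pixel to the class with the nearest spectral centroid."""
    return min(centroids, key=lambda name: np.linalg.norm(pixel - centroids[name]))

# A new, noisy "stressed conifer" pixel is recovered correctly.
test_pixel = signatures["stressed_conifer"] + rng.normal(0, 0.05, n_bands)
print(classify(test_pixel))  # -> stressed_conifer
```

A production pipeline would use a learned model over whole scenes rather than per-class centroids, but the structure is the same: learn spectral characteristics from labeled pixels, then classify new imagery at scale.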

Techniques for producing operational analyses are still at an early stage: hyperspectral imagery remains limited, and ground truth and model training data are scarce.

The image below shows early results from our tree health study area in California. Here, we used hyperspectral imagery to distinguish between healthy (green-stage) conifers and conifers under stress (red-stage). These results are from an ML model that works on 224-band AVIRIS imagery.

See how a machine learning model can detect conifer health

Hyperspectral image: what the human eye can see

An aerial image of a mountainous area with three bodies of water

Azavea R&D vegetation model: using machine learning on spectral profiles

A processed, false-color version of the previous aerial image, highlighting healthy conifers in green, conifers under stress in red, and other vegetation in blue

These are early results shown for illustration only.


Hyperspectral at Azavea

NASA expects to see a major increase in the number of hyperspectral data sets in the coming years. Supported by an innovation research grant from NASA, Azavea created a reusable, open source data processing pipeline for serving and analyzing hyperspectral imagery.

Our pipeline for hyperspectral imagery processing on the web

Image preparation

We retrieve required scenes, re-encode them to formats suitable for cloud processing, such as Cloud Optimized GeoTIFF, and place them in cloud storage on Amazon Web Services (AWS).

Cloud-optimized GeoTIFFs · Amazon Web Services (AWS) · Kubernetes

Hyperspectral STAC catalog

We catalog imagery using the SpatioTemporal Asset Catalog (STAC) specification for serving geospatial datasets on the web. We use the open source Franklin server to index imagery and serve features. Franklin lets us query and filter imagery collections by place, time, and spectral coverage.

STAC: SpatioTemporal Asset Catalogs · Franklin server
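To make the catalog concrete, here is a minimal STAC Item following the STAC 1.0.0 specification; the scene id, geometry, and asset URL are made up for illustration, not taken from Azavea's catalog.

```python
import json
from datetime import datetime, timezone

# A minimal STAC Item describing one hypothetical AVIRIS scene.
item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "aviris-example-scene",                     # hypothetical id
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [-120.5, 38.0], [-120.0, 38.0], [-120.0, 38.5],
            [-120.5, 38.5], [-120.5, 38.0],
        ]],
    },
    "bbox": [-120.5, 38.0, -120.0, 38.5],
    "properties": {
        # Acquisition time; STAC requires RFC 3339 datetimes.
        "datetime": datetime(2022, 6, 1, tzinfo=timezone.utc).isoformat(),
    },
    "assets": {
        "reflectance": {
            "href": "s3://example-bucket/aviris-example-scene.tif",  # hypothetical
            "type": "image/tiff; application=geotiff; profile=cloud-optimized",
            "roles": ["data"],
        }
    },
    "links": [],
}

print(json.dumps(item, indent=2)[:120])
```

A STAC API server such as Franklin indexes records like this one so clients can search them by bounding box, datetime, and other properties.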

Image processing & machine learning

We use the Raster Vision geospatial machine learning library to process images using Kubernetes/Argo and AWS.

Raster Vision · Kubernetes

Interactive, user-friendly web apps

We use a research-driven user experience design process to create actionable and easy-to-use web applications using React.

React · MapLibre

Work with us on your hyperspectral project

Azavea creates software and data analytics for the web. We are a mission-driven company, using our twenty years of geospatial expertise to help our clients address complex civic, social, and environmental problems.

Get in touch at www.azavea.com/contact-us

Hyperspectral Satellite Launch Tracker

See available hyperspectral imaging (HSI) data and find out when more becomes available

| Launch date | Name | Provider | Platform | Constellation size now / planned | Wavelengths (nm) | Spectral res. (nm) | Spatial res. (m) |
| N/A | AVIRIS (classic) | NASA/JPL | Airborne | | 380-2510 | 10 | Varies |
| N/A | AVIRIS-NG | NASA/JPL | Airborne | | 380-2510 | 5 | Varies |
| 2018 | DESIS | DLR | ISS | | 400-1000 | 2.55 | 30 |
| 2019 | PRISMA | ASI | Satellite | | 400-2500 | 10 | 30 |
| 2022 | EnMAP (VNIR) | DLR | Satellite | | 420-1000 | 6.5 | 30 |
| 2022 | EnMAP (SWIR) | DLR | Satellite | | 900-2450 | 10 | 30 |
| 2020-25 | Satellogic Aleph-1 | Satellogic | Satellite | 26 / 200+ | 460-830 | 14-35 | 25 |
| 2022-23 | Orbital Sidekick GHOSt | Orbital Sidekick | Satellite | 6 | 400-2500 | 3-5 | 8 |
| 2022-23 | PIXXEL | PIXXEL | Satellite | 6 | | | 5 |
| 2023 | Carbon Mapper (Tanager) | Planet/JPL/Carbon Mapper Coalition | Satellite | | | 25 | 30 |
| 2023 | Wyvern | Wyvern Space | Satellite | 3 | | 6 | <5 |
| 2023 | HySpec | HySpecIQ | Satellite | 2 | | 10 | 55 |
| 2023 | HyperSat | HyperSat | Satellite | 6 | | | |
| 2023 | Kuva Hyperfield-1 | Kuva Space | Satellite | | 450-1000 | | 25 |
| 2024 | PIXXEL | PIXXEL | Satellite | 30 | 400-2500 | | 5 |
| 2027/28 | SBG | NASA | Satellite | | | 10 | 30 |
| 2029 | CHIME (Sentinel 10) | ESA | Satellite | | 400-2500 | 10 | |

Launch dates and specifications may change or be delayed. Last updated December 12, 2022.