A conventional camera captures three numbers per pixel — red, green, and blue intensities — which is a coarse approximation of the full spectral content of the light arriving at that pixel. The visible spectrum spans roughly 400 to 700 nanometers, and many materials, tissues, and chemical compounds have spectral signatures — characteristic patterns of absorption and reflectance across this range — that are invisible to an RGB camera but are precisely what distinguishes healthy tissue from diseased, ripe fruit from unripe, or a genuine pharmaceutical from a counterfeit. A hyperspectral camera captures tens to hundreds of narrow spectral bands per pixel, recovering the full spectral signature at every spatial location. The result is a three-dimensional data cube — two spatial dimensions and one spectral dimension — that contains far more information than any RGB image.
The challenge is that acquiring this data cube conventionally is slow and expensive. A traditional scanning hyperspectral imager sweeps a narrow slit across the scene, capturing one spatial line per exposure and building the cube sequentially. This requires the scene to remain stationary for the entire acquisition time — a fundamental limitation for moving objects, dynamic biological processes, or real-time industrial inspection. My research in compressive spectral imaging addresses this challenge by asking a different question: rather than acquiring every voxel of the data cube sequentially, can we design a sensor that acquires a small number of carefully coded measurements and then recovers the full cube computationally?
This page describes the theory behind compressive spectral imaging, my group's specific contributions to its design and reconstruction, and our extensions to joint spectral and depth sensing.
Background: Compressive Sensing and Coded Apertures
Compressive sensing theory. Compressive sensing, developed in the mid-2000s by Candès, Romberg, Tao, and Donoho, established that a signal with S nonzero coefficients in some basis can be recovered exactly from as few as O(S log N) linear measurements — far fewer than the N measurements that Nyquist sampling would require — provided that the measurement matrix satisfies a condition called the Restricted Isometry Property (RIP) and that the signal is recovered by solving a convex ℓ1-minimization problem. For signals that are sparse in a transform domain (wavelets, DCT, graph Fourier basis), this enables dramatic reduction in acquisition time or sensor complexity. Spectral images are highly compressible: adjacent spectral bands are strongly correlated, and the spatial structure of natural scenes is also sparse in appropriate transform domains. This makes spectral imaging a natural application for compressive sensing.
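As a concrete illustration of the recovery guarantee, the sketch below recovers a sparse vector from far fewer random Gaussian measurements than its length by solving the ℓ1-regularized least-squares (lasso) problem with FISTA. The dimensions, step size, and regularization weight are illustrative choices for a toy problem, not parameters from any of our systems.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, S = 256, 96, 8               # ambient dimension, measurements, sparsity

# S-sparse ground truth and a random Gaussian sensing matrix (satisfies RIP
# with high probability at this sparsity level)
x = np.zeros(N)
x[rng.choice(N, S, replace=False)] = rng.standard_normal(S)
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ x                          # M << N compressive measurements

# FISTA for the lasso: min_f 0.5 * ||A f - y||^2 + lam * ||f||_1
lam = 0.02
t = 1.0 / np.linalg.norm(A, 2) ** 2     # step = 1/L, L = largest eigenvalue of A^T A
xh = z = np.zeros(N)
s = 1.0
for _ in range(1000):
    g = z - t * (A.T @ (A @ z - y))                            # gradient step
    x_new = np.sign(g) * np.maximum(np.abs(g) - t * lam, 0.0)  # soft threshold
    s_new = (1.0 + np.sqrt(1.0 + 4.0 * s * s)) / 2.0
    z = x_new + ((s - 1.0) / s_new) * (x_new - xh)             # momentum
    xh, s = x_new, s_new

print(np.linalg.norm(xh - x) / np.linalg.norm(x))   # small relative error
```

With only 96 of 256 samples, the 8-sparse signal is recovered to within a small lasso shrinkage bias; least-squares alone would fail, since the system is underdetermined.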
From US Patent 10,151,629: optical layout of a coded aperture snapshot spectral+ToF imager.
Coded aperture spectral imagers. The physical implementation of compressive spectral sensing that my group works with is the Coded Aperture Snapshot Spectral Imager (CASSI), originally developed by Brady and Gehm at Duke University. In a CASSI system, a spatial light modulator (SLM) — typically a digital micromirror device (DMD) — is placed at an intermediate focal plane in the optical path. The SLM applies a binary spatial mask to the scene, blocking or passing light at each spatial location according to a coded pattern. A dispersive element (prism or diffraction grating) then spectrally shears the masked image onto a focal plane array, so that different spectral bands land at different spatial positions on the detector. A single detector image thus contains a superposition of spatially shifted, spectrally coded versions of the scene — a single-shot compressive measurement of the full spectral cube.
Recovering the spectral cube from this single measurement is an ill-posed inverse problem. The key insight from compressive sensing is that the problem becomes well-posed when the coded aperture pattern is designed so that the measurement matrix satisfies RIP, and when the spectral cube is sparse or smooth in an appropriate basis. The design of the coded aperture — the spatial pattern on the SLM — is therefore a critical degree of freedom that directly determines reconstruction quality.
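The CASSI forward model described above can be sketched numerically. The toy simulation below applies a binary aperture mask to every band of a synthetic cube, shears each band horizontally by one pixel per band index (the dispersive element), and integrates onto a single detector. The cube contents and the uniformly random mask are placeholders; as discussed below, our work designs the mask's spatial pattern rather than drawing it at random.

```python
import numpy as np

rng = np.random.default_rng(1)
H, W, L = 64, 64, 16                  # spatial size and number of spectral bands

cube = rng.random((H, W, L))          # toy spectral data cube (the unknown)
mask = rng.random((H, W)) > 0.5       # binary coded aperture (placeholder pattern)

# Dispersion shifts band l by l pixels; the detector sums over all bands,
# so one 2-D exposure encodes the whole 3-D cube.
det = np.zeros((H, W + L - 1))
for l in range(L):
    det[:, l:l + W] += mask * cube[:, :, l]

print(det.shape)    # (64, 79): far fewer measurements than the 64x64x16 cube
```

The detector records 64 × 79 values for a cube of 64 × 64 × 16 voxels, which is exactly the compressive regime in which the RIP-based recovery guarantees apply.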
Phase 1: Coded Aperture Design
Blue-noise coded apertures. My group's first contribution to compressive spectral imaging was the application of blue-noise theory to coded aperture design. The connection is direct: a coded aperture is a binary spatial mask, and the design problem — place a fixed fraction of “open” pixels on the aperture so that the resulting measurement matrix has good RIP properties — is mathematically equivalent to the halftoning problem of placing a fixed number of printed dots so that their spatial distribution is spectrally optimal. Blue-noise distributions, which suppress low-frequency energy and distribute power uniformly in the mid-frequency band, are known to produce near-optimal RIP matrices for compressive sensing. We demonstrated that blue-noise coded apertures achieve reconstruction quality superior to random binary apertures and to regular grid apertures across a wide range of spectral scenes:
- H. Zhang, X. Ma, D. L. Lau, J. Zhu, and G. R. Arce, “Compressive Spectral Imaging Based on Hexagonal Blue-Noise Coded Apertures,” IEEE Transactions on Computational Imaging, vol. 6, pp. 749–763, 2020.
- L. Galvis, D. L. Lau, X. Ma, H. Arguello, and G. R. Arce, “Coded Aperture Design in Compressive Spectral Imaging Based on Side Information,” Applied Optics, vol. 56, no. 22, pp. 6332–6340, 2017.
- J. F. Florez-Ospina, D. L. Lau, D. Guillot, K. Barner, and G. R. Arce, “Smoothness on Rank-Order Path Graphs and its Use in Compressive Spectral Imaging with Side Information,” Signal Processing, vol. 196, art. no. 108707, 2022.
- A. Aguirre, A. Alrushud, G. R. Arce, and D. L. Lau, “Sudoku Multispectral Filter Arrays for Spectral Snapshot Cameras,” Optics Continuum, vol. 4, no. 9, pp. 2035–2052, 2025.

Phase 2: Joint Spectral and Depth Sensing

With the first academically owned Zmini time-of-flight camera from 3DVSystems in North America.

Standard CASSI systems capture spectral information but discard depth — every pixel in the detector image is a superposition of contributions from surfaces at different distances, and there is no mechanism to separate them. Time-of-flight (ToF) cameras, conversely, capture accurate per-pixel depth but have no spectral resolution beyond a single amplitude measurement. My group developed the first compressive imaging architecture that captures both spectral and depth information simultaneously from a single sensor, by combining a coded aperture spectral imager with a time-of-flight modulation scheme.

The physical principle. In a ToF camera, the scene is illuminated with an amplitude-modulated light source at a known frequency, and the detector measures the phase delay of the reflected modulation — which is directly proportional to the round-trip distance to the surface. We recognized that the ToF modulation scheme is compatible with CASSI: by modulating the illumination at a ToF frequency while simultaneously applying a spatial coded aperture, a single detector exposure captures a measurement that is jointly coded in the spectral, spatial, and depth dimensions. Recovering the full spectral-depth data cube from this single exposure requires solving a higher-dimensional inverse problem, but the same compressive sensing and sparsity-based reconstruction framework applies:

- H. Rueda, C. Fu, D. L. Lau, and G. R. Arce, “Single Aperture Spectral+ToF Compressive Camera: Toward Hyperspectral+Depth Imagery,” IEEE Journal of Selected Topics in Signal Processing, vol. 11, no. 7, pp. 992–1003, 2017.
- H. Rueda-Chacon, J. F. Florez, D. L. Lau, and G. R. Arce, “Snapshot Compressive ToF+Spectral Imaging via Optimized Color-Coded Apertures,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 42, no. 10, pp. 2346–2360, October 2020.
- H. Rueda, D. L. Lau, and G. R. Arce, “Multi-Spectral Compressive Snapshot Imaging Using RGB Image Sensors,” Optics Express, vol. 23, no. 9, pp. 12207–12221, 2015.
- D. L. Lau, Y. Zhang, T. Hastings, H. Rueda, and G. R. Arce, “Light Field Modeling for Coded Aperture Systems,” OSA Imaging and Applied Optics Congress, 2017.
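The phase-to-depth principle behind the ToF half of the architecture can be sketched with the standard four-sample phase-stepping scheme: four correlation samples taken at 0°, 90°, 180°, and 270° modulation offsets determine the phase delay, which converts to distance. The 20 MHz modulation frequency is a typical ToF value, not the one used in our prototype.

```python
import numpy as np

C = 2.998e8        # speed of light, m/s
F_MOD = 20e6       # amplitude-modulation frequency, Hz (typical ToF value)

def tof_depth(s0, s90, s180, s270):
    """Depth from the standard four phase-stepped correlation samples."""
    # s(theta) = A + B*cos(phi - theta), so the differences isolate sin/cos of phi
    phase = np.arctan2(s90 - s270, s0 - s180) % (2 * np.pi)
    return C * phase / (4 * np.pi * F_MOD)   # factor 4*pi: round trip halves distance

# Simulate a surface at 3.0 m (within the c / (2*F_MOD) ~ 7.5 m unambiguous range)
true_d = 3.0
phi = 4 * np.pi * F_MOD * true_d / C
samples = [1 + 0.5 * np.cos(phi - off) for off in (0, np.pi/2, np.pi, 3*np.pi/2)]
print(tof_depth(*samples))   # ≈ 3.0
```

Because the phase wraps at 2π, depths are only unambiguous up to c/(2f); this is why the modulation frequency is a design trade-off between range and depth resolution.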
Phase 3: Graph-Based Reconstruction

A central challenge in compressive spectral imaging is reconstruction quality: given the compressed measurements, how accurately and efficiently can the full spectral cube be recovered? Early CASSI reconstruction algorithms used standard ℓ1 minimization with wavelet sparsity priors, which treat each spectral band independently and ignore inter-band correlations. My group developed reconstruction algorithms that exploit the joint spatial-spectral smoothness of natural spectral scenes using graph-based regularization.

Block-based graph reconstruction. We partitioned the spectral image into spatial blocks and modeled each block as a graph signal, where nodes represent pixels and edge weights reflect spatial and spectral similarity. Graph-Laplacian regularization within each block enforces smoothness while allowing sharp edges to be preserved across block boundaries:
- J. F. Florez-Ospina, A. K. M. Alrushud, D. L. Lau, and G. R. Arce, “Block-Based Spectral Image Reconstruction for Compressive Spectral Imaging Using Smoothness on Graphs,” Optics Express, vol. 30, pp. 7187–7209, 2022.
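A minimal analogue of graph-Laplacian regularization, shrunk to a one-dimensional path graph: observe a subset of nodes of a smooth signal and recover the rest by penalizing the Laplacian quadratic form fᵀLf. The graph, the sampling pattern, and the weight γ are illustrative and much simpler than the block-based spatial-spectral graphs used in the paper above.

```python
import numpy as np

n = 32
f_true = np.sin(np.linspace(0, np.pi, n))     # smooth signal on a path graph

# Path-graph Laplacian L = D - W
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
Lap = np.diag(W.sum(axis=1)) - W

# Observe every 4th node plus the last one; A selects the observed rows
idx = np.r_[np.arange(0, n, 4), n - 1]
A = np.eye(n)[idx]
y = A @ f_true

gamma = 0.1
# Closed-form minimizer of ||A f - y||^2 + gamma * f^T L f:
# the normal equations (A^T A + gamma L) f = A^T y
f_hat = np.linalg.solve(A.T @ A + gamma * Lap, A.T @ y)
print(np.max(np.abs(f_hat - f_true)))         # small interpolation error
```

The Laplacian term propagates the observed values to the unobserved nodes along graph edges; in the spectral-imaging setting the same quadratic form, built from spatial and spectral affinities, plays the role of the smoothness prior.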
Phase 4: Applications in Precision Agriculture and Energy

A natural and timely application of compressive spectral imaging is precision agriculture, where UAS (unmanned aircraft systems) equipped with multispectral cameras are used to monitor crop health, detect disease, assess water stress, and guide variable-rate application of fertilizers and pesticides. Spectral reflectance indices — combinations of reflectance values at specific wavelengths — are established proxies for plant health parameters including chlorophyll content, leaf area index, and canopy nitrogen. However, the spectral and spatial calibration of UAS-mounted cameras is technically challenging: sensor characteristics vary with temperature and illumination angle, and the geometric distortions introduced by the UAS platform require careful correction.

A current USDA/NIFA grant ($613K, 2023–2027) supports my group's work on improving the spectral and spatial calibration of remote sensing imagery from UAS platforms, in collaboration with agricultural scientists in the University of Kentucky's College of Agriculture. This project connects the coded aperture design and calibration methods developed in the laboratory setting to the practical constraints of field deployment on a UAS — smaller sensors, wider illumination variation, faster acquisition, and the need for robust real-time processing. A separate collaboration supported by the U.S. Department of Energy ($1M, 2025–2027) applies spectral sensing and signal processing methods to utility asset monitoring — using sensor data from electrical grid infrastructure to enable more accurate load modeling, capacity utilization assessment, and fault detection.
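For reference, the most widely used reflectance index has the normalized-difference form below. NDVI itself is standard; the exact band centers used for "red" and "near-infrared" depend on the camera, so the reflectance values here are illustrative.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)   # guard against divide-by-zero

# Healthy vegetation reflects strongly in the NIR and absorbs red (chlorophyll):
print(ndvi(0.45, 0.05))   # ≈ 0.8, vigorous canopy
print(ndvi(0.25, 0.20))   # ≈ 0.11, stressed or sparse vegetation
```

Indices like this are exactly where calibration matters: a small radiometric bias in either band shifts the ratio, which is why the UAS work above focuses on spectral and spatial calibration rather than on new indices.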
Funding Summary

| Sponsor | Program | Amount | Period |
|---|---|---|---|
| NSF VEC Small Collaborative Research | Joint Compressive Spectral Imaging and 3D Range Sensing | $860K | 2015–2019 |
| NSF CIF: Small | Blue-Noise Graph Sampling (partial support) | $500K | 2018–2021 |
| NSF CIF: Small | Hypergraph Signal Processing (partial support) | $600K | 2023–2026 |
| USDA/NIFA | UAS Remote Sensing Spectral and Spatial Calibration | $613K | 2023–2027 |
| U.S. Department of Energy | Utility Asset Load Modeling and Event Detection | $1.0M | 2025–2027 |
Graduate Alumni from This Thrust

| Student | Degree | Institution | Year | Current Position |
|---|---|---|---|---|
| Hoover Rueda | Ph.D. | University of Delaware | 2018 | Universidad Industrial de Santander |
| Juan Felipe Florez-Ospina | Ph.D. | University of Delaware | 2022 | Paul Scherrer Institute |
Connection to Other Research Thrusts

This thrust sits at the intersection of the other three areas of my research program:
- Thrust 1 (Structured Light): Time-of-flight depth sensing, which features prominently in Thrust 3b, shares hardware and calibration infrastructure with structured light systems. The joint spectral-depth camera architecture is a natural extension of the structured light 3D scanner toward richer scene understanding.
- Thrust 2 (Graph/Hypergraph SP): The graph-based smoothness priors used in spectral image reconstruction (Thrust 3c) are direct applications of the graph signal processing theory developed in Thrust 2. The Sudoku filter array design (Thrust 3a) draws on the same blue-noise sampling theory that underlies blue-noise graph sampling.
- Thrust 4 (Halftoning): The coded aperture design problem — place a binary mask on a spatial array to optimize measurement quality — is mathematically equivalent to the halftoning problem. Blue-noise coded apertures are direct applications of blue-noise halftone mask theory to the optics domain.