Laplacian Based Image Fusion: Jesse Scott and Michael A. Pusateri
useful pixel at every pixel index. In the fusion image, the visible band image provides the foreground car and interior of the garage while seamlessly transitioning between the thermal content in the background of the outdoor portion of the scene.

This type of image is representative of the large dynamic range that occurs when looking at a transition between sunlit outdoor and electrically lit indoor content for NIR imagery.
Figure 7 - Example 3 Results – NIR (Top Left), LWIR (Top Right), and Laplacian Fusion (Bottom)
V. ANALYSIS

The preliminary results are predictable and consistent. As shown in Example 1, the fusion of multiple visible spectral channels results in imagery similar to grayscale imaging over all of the visible channels. The fusion result has increased detail in multiple regions, but generally has the same level of detail as the MATLAB grayscale result.
Table 1 – Execution times for each example image set

Example   Matrix Dimensions (pixels)   Building Sparse Matrix (sec)   Building Gradient Field (sec)   Solving Linear System (sec)
   1      193929 x 193929              0.677151                       0.373219                         1.536592
   2      303849 x 303849              0.956812                       0.398306                         2.627780
   3      1784322 x 1784322            5.831647                       1.780139                        24.698355
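The "Matrix Dimensions" column grows with pixel count because the discrete Poisson equation couples every pixel to its four neighbors: for an H x W image, the 5-point Laplacian is an (H*W) x (H*W) sparse operator. A minimal sketch of that construction (illustrative only; poisson_matrix is not code from the paper):

```python
# Illustrative helper (not from the paper): the 5-point discrete
# Laplacian over an H x W grid, built as a Kronecker sum of 1-D
# second-difference operators.  Its dimensions are (#pixels x #pixels),
# which is why the system matrices in Table 1 are so large.
import numpy as np
import scipy.sparse as sp

def poisson_matrix(h, w):
    """5-point discrete Laplacian on an h x w grid."""
    d2_h = sp.diags([1, -2, 1], [-1, 0, 1], shape=(h, h))
    d2_w = sp.diags([1, -2, 1], [-1, 0, 1], shape=(w, w))
    # 2-D operator as a Kronecker sum of the two 1-D operators
    return sp.kron(sp.identity(w), d2_h) + sp.kron(d2_w, sp.identity(h))

A = poisson_matrix(480, 640)
print(A.shape)  # (307200, 307200) for a 480 x 640 image
```

The matrix is extremely sparse (at most five nonzeros per row), which is what makes the direct solves in Table 1 feasible at all.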
When fusing discontinuous and/or non-visible channels, the resulting image is subtly mixed and intuitive to understand. The technique is capable of seamlessly fusing imagery that has regions that are improperly exposed or washed out, without allowing this to affect the output image. Example 2 and Example 3 illustrate very common and recurring situations where multiple parts of the image fail to capture information efficiently. However, the idea that there is always some information in an image is important to the premise that image fusion is valuable as a mechanism to enhance information density in images. Visual analysis of the results in Figure 6 shows that the small amount of information in the LWIR channel was integrated into the visible image, even though there was sufficient information in the NIR channel and only slightly more information in the LWIR channel.

The Poisson solver is not a low-latency mathematical method, but low-latency approximation techniques are currently being reviewed. All other mathematics in this design have clear low-latency implementations with exact calculations. The MATLAB PC execution times for each example are shown in Table 1.

The current method for solving the Poisson equation is not acceptable for FPGA implementation in a real-time system. The numerical manipulation requires the entire image to be formed into a large matrix, which would result in a minimum of one frame of latency. Approximations to the integration of the derivative image must be developed to make this fusion approach viable for night vision goggles.
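The three stages timed in Table 1 can be sketched end to end. The Python below is illustrative only, not the authors' MATLAB implementation: the per-pixel maximum-magnitude Laplacian selection rule and the use of the first channel's one-pixel border as a Dirichlet boundary are assumptions, not details fixed by the text here.

```python
# Hypothetical sketch of the three Table 1 stages (assumed fusion rule:
# per-pixel max-|Laplacian|; assumed boundary: image `a`'s one-pixel
# border as Dirichlet values).
import numpy as np
import scipy.ndimage as ndi
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def laplacian_fuse(a, b):
    h, w = a.shape

    # Stage 1 -- "Building gradient field": fused Laplacian, keeping
    # whichever channel has the stronger second derivative per pixel.
    la = ndi.laplace(a.astype(float))
    lb = ndi.laplace(b.astype(float))
    f = np.where(np.abs(la) >= np.abs(lb), la, lb)

    # Stage 2 -- "Building sparse matrix": 5-point Laplacian over the
    # interior pixels; border pixel values move to the right-hand side.
    n = (h - 2) * (w - 2)
    A = sp.lil_matrix((n, n))
    rhs = f[1:-1, 1:-1].ravel().copy()
    idx = lambda i, j: (i - 1) * (w - 2) + (j - 1)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            k = idx(i, j)
            A[k, k] = -4.0
            for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                if 1 <= ni <= h - 2 and 1 <= nj <= w - 2:
                    A[k, idx(ni, nj)] = 1.0
                else:
                    rhs[k] -= float(a[ni, nj])  # Dirichlet border value

    # Stage 3 -- "Solving linear system": direct sparse solve.
    out = a.astype(float).copy()
    out[1:-1, 1:-1] = spla.spsolve(A.tocsc(), rhs).reshape(h - 2, w - 2)
    return out
```

Fusing an image with itself reconstructs that image exactly in the interior, which is a useful sanity check on the solver; the direct solve in stage 3 is the step whose cost dominates Table 1 as the matrix grows.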
VI. CONCLUSIONS

The proposed application of the Laplacian blending method appears to provide excellent performance under a variety of conditions. The method automatically adapts to the scene content of the input images, adding information from one and/or the other image depending on the Laplacian content.

As imaging moves to longer wavelengths, the photon measurement is dominated more by emitted than by reflected photons. Laplacian fusion depends on the basic premise that information is stored in the derivative of the image. While this dependence is somewhat obvious for visible wavelengths, it is not as clear for longer wavelengths. Emissive imagers may not store their information in the derivatives at the same level as reflective imagers. As a result, the Laplacian fusion technique may mix channels with unequal weighting.
VII. FUTURE WORK

This technique relies on boundary conditions to provide initial conditions in the Poisson equation. To accomplish full-image Laplacian fusion, boundary conditions must be eliminated, made constant, or their effects must be mitigated. Currently, the implementation uses a single-pixel border to create a boundary condition. Each input channel produces a different visual result, primarily at the boundary.
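One candidate direction, not proposed in the text, is a DCT-based Poisson solver: the DCT-II diagonalizes the 5-point Laplacian under reflective (Neumann) boundaries, so no per-channel border values are needed, the boundary condition is constant by construction, and the solve costs O(N log N) rather than a sparse direct factorization. A hedged sketch:

```python
# Hypothetical alternative solver (not from the paper): invert the
# 5-point Laplacian with reflective (Neumann) boundaries via the DCT,
# removing the dependence on any single channel's one-pixel border.
import numpy as np
from scipy.fft import dctn, idctn

def dct_poisson(f):
    """Solve lap(u) = f (reflective boundaries) up to an additive constant."""
    h, w = f.shape
    fh = dctn(f, type=2, norm='ortho')
    # Eigenvalues of the 1-D second-difference operator in the DCT-II basis
    ky = 2.0 * (np.cos(np.pi * np.arange(h) / h) - 1.0)
    kx = 2.0 * (np.cos(np.pi * np.arange(w) / w) - 1.0)
    denom = ky[:, None] + kx[None, :]
    denom[0, 0] = 1.0      # avoid 0/0 at the DC term
    uh = fh / denom
    uh[0, 0] = 0.0         # pin the free additive constant (zero-mean result)
    return idctn(uh, type=2, norm='ortho')
```

The result is defined only up to an additive constant, so an implementation must still choose an overall brightness offset for the fused image.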