Nowadays we are witnessing an explosive spread of digital cameras embedding significant digital computation capabilities. The digital images obtained by these devices (e.g., smartphones) are not just optically formed and recorded by the sensor; after capture they are further transformed through digital image processing. The conventional strategy for designing digital cameras follows a sequential procedure: first the optics are designed to optimize the image quality on the sensor plane, and then the digital capabilities are used to further process and enhance the recorded image (e.g., demosaicing, color balance, denoising).
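For illustration only, such a conventional post-capture chain can be sketched in a few lines of Python. The gray-world color balance, the Gaussian denoising step, and the function name conventional_pipeline are assumptions made for this example, and demosaicing is left out for brevity; this is a toy stand-in, not any particular camera's pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def conventional_pipeline(rgb):
    """Toy post-capture chain: gray-world color balance followed by a mild
    per-channel Gaussian denoise (demosaicing is assumed to be done already)."""
    # gray-world balance: scale each channel so its mean matches the global mean
    balanced = rgb * (rgb.mean() / rgb.mean(axis=(0, 1)))
    # mild denoising, applied independently to each color channel
    return np.stack(
        [gaussian_filter(balanced[..., c], sigma=0.8) for c in range(3)], axis=-1
    )


# example: a random RGB frame standing in for the (already demosaiced) sensor output
frame = np.random.default_rng(0).uniform(0.0, 1.0, (64, 64, 3))
print(conventional_pipeline(frame).shape)
```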
In this project we propose to test an innovative hypothesis: whether it is possible to improve the current quality/cost ratio of digital imaging devices by using an integrated hybrid optical-digital design strategy. Such an approach takes the image quality improvement provided by digital image processing into account right from the optical design stage. Rigorously testing this hypothesis implies an ambitious multidisciplinary project of scientific and technical exploration of the properties of the degradation caused by optical systems, on the one hand, and of the potential of digital restoration, on the other. It therefore requires full expertise in both optical design and digital image processing.
Previous work on this subject has addressed very particular goals, such as digitally compensating for the degradation induced by a specific simple optical configuration, or for conventional camera optics with the blur estimated through experimental measurements. However, no general framework for integrated optical-digital design has been provided.
This proposal establishes a rigorous methodology comprising a set of logically connected goals. Given an image-forming optical system: (a) to perform a thorough and efficient optical characterization; (b) to estimate, from (a), a field of Point Spread Functions (PSFs) which describes the blurring and distortion effects of the optical system over extended objects; (c) to build a computational model combining the field of PSFs and the sensor noise, describing how an ideal reference image is degraded while being acquired by the imaging device (up to the sensor output); (d) to implement computationally efficient image restoration algorithms that account for the major degradation effects in real sensor images: space-variant blur, image distortion, signal-dependent noise, aliasing, chromatic effects, and boundary effects of the blur at the image borders; (e) to establish a performance metric, and a method to optimize it by modifying the physical configuration of the system. This metric both quantifies the image degradation for each configuration and estimates the expected gain from the digital image restoration stage.
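To make goals (c)-(e) concrete, the following is a minimal, self-contained sketch of how the degradation/restoration chain could be prototyped, not the project's actual model: a tile-wise constant field of illustrative Gaussian PSFs, a Gaussian approximation of signal-dependent (Poisson-Gaussian) sensor noise, a simple per-tile Wiener restoration, and a PSNR-style figure of merit that could be compared across candidate optical configurations. All kernel sizes, noise parameters, and function names are assumptions made for this example.

```python
import numpy as np
from scipy.signal import fftconvolve


def gaussian_psf(sigma, size=15):
    """Isotropic Gaussian kernel standing in for an estimated PSF (step (b))."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()


def degrade(image, psf_grid, gain=0.05, read_sigma=1.0, rng=None):
    """Step (c): tile-wise space-variant blur plus signal-dependent noise
    (Gaussian approximation of a Poisson-Gaussian sensor model)."""
    rng = np.random.default_rng(0) if rng is None else rng
    out = np.zeros_like(image, dtype=float)
    n_ty, n_tx = len(psf_grid), len(psf_grid[0])
    h, w = image.shape[0] // n_ty, image.shape[1] // n_tx
    for i in range(n_ty):
        for j in range(n_tx):
            ys, xs = slice(i * h, (i + 1) * h), slice(j * w, (j + 1) * w)
            out[ys, xs] = fftconvolve(image[ys, xs], psf_grid[i][j], mode="same")
    noise_std = np.sqrt(gain * np.clip(out, 0, None) + read_sigma ** 2)
    return out + rng.normal(0.0, 1.0, image.shape) * noise_std


def wiener_tile(tile, psf, nsr=1e-2):
    """Step (d), simplified: frequency-domain Wiener deconvolution of one tile,
    assuming the blur is roughly space-invariant inside the tile."""
    H = np.fft.fft2(psf, s=tile.shape)                       # zero-padded transfer function
    X = np.conj(H) * np.fft.fft2(tile) / (np.abs(H) ** 2 + nsr)
    x = np.real(np.fft.ifft2(X))
    # compensate the shift introduced by the non-centred zero-padded PSF
    return np.roll(x, (psf.shape[0] // 2, psf.shape[1] // 2), axis=(0, 1))


def restore(degraded, psf_grid, nsr=1e-2):
    """Apply the per-tile Wiener filter over the whole frame."""
    out = np.zeros_like(degraded)
    n_ty, n_tx = len(psf_grid), len(psf_grid[0])
    h, w = degraded.shape[0] // n_ty, degraded.shape[1] // n_tx
    for i in range(n_ty):
        for j in range(n_tx):
            ys, xs = slice(i * h, (i + 1) * h), slice(j * w, (j + 1) * w)
            out[ys, xs] = wiener_tile(degraded[ys, xs], psf_grid[i][j], nsr)
    return out


def score(reference, estimate):
    """Step (e): a PSNR-style figure of merit for one optical configuration,
    evaluated after the digital restoration stage."""
    mse = np.mean((reference - estimate) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.uniform(0, 255, (128, 128))              # placeholder "ideal" image
    # 2x2 PSF grid: blur grows towards the image corners (an illustrative field)
    psf_grid = [[gaussian_psf(1.0), gaussian_psf(1.5)],
                [gaussian_psf(1.5), gaussian_psf(2.0)]]
    degraded = degrade(reference, psf_grid, rng=rng)
    restored = restore(degraded, psf_grid)
    print(f"degraded: {score(reference, degraded):.1f} dB, "
          f"restored: {score(reference, restored):.1f} dB")
```

In the actual project, the PSF field would come from the optical characterization of goals (a)-(b), and the restoration stage would additionally handle distortion, aliasing, chromatic effects, and image-border behavior, which this sketch deliberately ignores.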
We note that the general goals of this project are not only applicable to the design of integrated optical-digital cameras. They are also useful in themselves in a number of research fields: optical characterization, computational modeling of imaging systems, digital simulation of imaging systems, and image restoration.