Research Article

Miniature curved artificial compound eyes

Dario Floreano, Ramon Pericet-Camara, Stéphane Viollet, Franck Ruffier, Andreas Brückner, Robert Leitel, Wolfgang Buss, Mohsine Menouni, Fabien Expert, Raphaël Juston, Michal Karol Dobrzynski, Geraud L’Eplattenier, Fabian Recktenwald, Hanspeter A. Mallot, and Nicolas Franceschini
PNAS June 4, 2013 110 (23) 9267-9272; https://doi.org/10.1073/pnas.1219068110
Author affiliations:
  • a Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, CH-1015 Lausanne, Switzerland (D. Floreano, R. Pericet-Camara, M. K. Dobrzynski, G. L’Eplattenier);
  • b Aix-Marseille Université, Centre National de la Recherche Scientifique, Institut des Sciences du Mouvement, Unité Mixte de Recherche 7287, 13288 Marseille Cedex 09, France (S. Viollet, F. Ruffier, F. Expert, R. Juston, N. Franceschini);
  • c Fraunhofer Institute for Applied Optics and Precision Engineering, 07745 Jena, Germany (A. Brückner, R. Leitel, W. Buss);
  • d Aix-Marseille Université, Centre National de la Recherche Scientifique, Centre de Physique des Particules de Marseille, Unité Mixte de Recherche 7346, 13288 Marseille Cedex 09, France (M. Menouni);
  • e Laboratory of Cognitive Neuroscience, Department of Biology, University of Tübingen, 72076 Tübingen, Germany (F. Recktenwald, H. A. Mallot)

For correspondence: dario.floreano@epfl.ch
Edited by Wilson S. Geisler, The University of Texas at Austin, Austin, TX, and approved April 23, 2013 (received for review November 7, 2012)


Abstract

In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, as well as high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Engineering a miniature artificial compound eye is challenging because it requires accurate alignment of photoreceptive and optical components on a curved surface. Here, we describe a unique design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The design consists of three planar layers of separately produced arrays, namely, a microlens array, a neuromorphic photodetector array, and a flexible printed circuit board that are stacked, cut, and curved to produce a mechanically flexible imager. Following this method, we have prototyped and characterized an artificial compound eye bearing a hemispherical field of view with embedded and programmable low-power signal processing, high temporal resolution, and local adaptation to illumination. The prototyped artificial compound eye possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up additional vistas for a broad range of applications in which wide field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories.

  • bioinspired robotics
  • wide-angle vision
  • optic flow sensor
  • micro-opto-electromechanical systems

Insect compound eyes consist of a mosaic of tiny optical units, or ommatidia (1). Compared with vertebrate single-lens eyes, compound eyes offer a versatile morphology with panoramic field of view (FOV), negligible distortion and aberration, and high temporal resolution, while trading high spatial resolution for diminutive size (2). These features are particularly beneficial for visually controlled navigation, including tasks like collision avoidance, take-off, landing, and other optomotor responses that do not require a high density of photoreceptors. Insect compound eyes possess local sensory adaptation mechanisms capable of compensating for large changes in light intensity right at the photoreceptor level (3, 4), and they wrap around highly distributed neuronal circuitry, allowing for fast and low-power integrated signal processing (5) while minimizing the overall size of the insect head. An artificial compound eye exhibiting all these properties would represent an ideal miniature sensor for numerous situations in which fast motion detection across wide FOVs is required (6–9).

Attempts have recently been made to develop miniature compound eyes. Both planar (10) and curved (11–14) microlens arrays have been fabricated. In some cases, these were interfaced with conventional flat CMOS arrays, but this resulted in off-axis aberrations, crosstalk between neighboring ommatidia, or limited FOV. At first sight, recent developments in flexible sensors (15–18) could represent a promising avenue for curved vision sensors (19, 20). However, adapting those flexible technologies to the design of curved compound eyes is challenging due to the recurring problem of precisely aligning a curved photodetector array with a curved microlens array. None of these previous methods or other available omnidirectional camera systems (21) display important features of biological compound eyes, such as embedded data processing, versatile morphologies, high temporal resolution, and local light adaptation in a miniature package.

Here, we describe a unique approach to the design of a curved artificial compound eye, named CurvACE. We validate this approach by characterizing the sensitivity, angular resolution, and motion extraction capabilities of a prototype bearing a semicylindrical morphology (Fig. 1A) and a hemispherical FOV of 180° × 60° (Fig. 1B). This prototype is a self-contained, integrated, curved artificial compound eye system with morphology and properties resembling those of primitive (22) (Fig. 1C) and modern (1) (Fig. 1D) arthropods. In particular, the CurvACE prototype features several similarities with the eye of the fruit fly Drosophila, namely, spatial resolution, acceptance angle, number of ommatidia, local light adaptation, crosstalk prevention, and signal acquisition bandwidth, as well as a smaller but comparable FOV (Table 1). Such curved visual sensors may be useful for terrestrial and aerial vehicles, medical instruments, prosthetic devices, home automation, surveillance, motion capture systems, and smart clothing. Artificial compound eyes may also foster the development of alternative visual algorithms, and when fitted on physical robots, they could help explore fundamental principles in animal sensory-motor control (6, 8, 9, 23).

Fig. 1.

Artificial and natural curved compound eyes. (A) Image of the CurvACE prototype. The entire device occupies a volume of 2.2 cm3, weighs 1.75 g, and consumes 0.9 W at maximum power. (B) Illustration of the panoramic FOV of the fabricated prototype. The dots and circles represent the angular orientation and acceptance angle Δρ of every ommatidium, respectively. Compound eye of the extinct trilobite Erbenochile erbeni (22) (C) and of the fruit fly Drosophila melanogaster (D). [(C) Reprinted from ref. 22 with permission from AAAS; (D) Reprinted from ref. 44 with permission from AAAS.]

Table 1.

Specifications of CurvACE prototype compared with the characteristics of the Drosophila melanogaster compound eye

Fabrication Process

Design Method.

As with biological compound eyes, CurvACE artificial ommatidia consist of three materially and functionally different layers (Fig. 2A): (i) an optical layer composed of an array of highly transparent polymer microlenses molded on a glass carrier (Fig. S1), which focus light precisely onto (ii) the sensitive areas of a silicon-based photodetector layer. This layer contains an array of analog very-large-scale integration (VLSI) photodetectors as well as additional circuitry to condition the signal for processing (Fig. S2). Finally, (iii) a flexible electromechanical interconnection layer, formed by a polyimide printed circuit board (PCB), physically supports the ensemble and transfers the output signals from the individual ommatidia (Fig. 2B) to the processing units. With thicknesses of 550 μm, 300 μm, and 100 μm, respectively, the total height of the three assembled layers is less than 1 mm in the prototype presented here.

Fig. 2.

CurvACE design and assembly. (A) Scheme of the three layers that compose the CurvACE artificial ommatidia: optical (microlenses and apertures), photodetector (CMOS chip), and interconnection (PCB). (B) Accurate alignment and assembly process of the artificial ommatidia layers in planar configuration. (C) Dicing of the assembled array in columns down to the flexible interconnection layer, which remains intact. (D) Curving of the ommatidial array along the bendable direction and attachment to a rigid semicylindrical substrate with a radius of curvature of 6.4 mm to build the CurvACE prototype. Two rigid circuit boards containing two microcontrollers, one three-axis accelerometer, and one three-axis rate gyroscope are inserted into the rigid substrate concavity and soldered to the sides of the ommatidia through dedicated pads (Figs. S3D and S4).

The apposition and neural superposition compound eyes of many arthropod species contain pigmented sidewalls that contribute to reducing optical crosstalk between ommatidia (1). Our solution to suppress optical crosstalk makes use of two low-reflective opaque metal layers with matching pinhole patterns: one subjacent to the microlens array and the other one close to the focal plane, ahead of the photodetector layer (10) (Fig. 2 A and B and Fig. S1C).

The proposed design method is based on a planar fabrication technology for each of the three layers of the artificial ommatidia array, followed by high-precision cutting (dicing) of the rigid ommatidia layers to add bendability. Specifically, each of the three layers is fabricated at first with wafer-level (optical and photodetector layer) or batch-level (interconnection layer) processes using standard microengineering technologies (Figs. S1 and S2). Next, the optical and photodetector layers are aligned at micrometer accuracy and glued chip-wise (Fig. 2B). Subsequently, the ensemble is fixed and wire-bonded to the electromechanical interconnection layer. Finally, the rigid optical and photodetector layer stack is precisely separated with a chip dicing saw in columns of ommatidia down to the flexible interconnection layer, which remains intact (Fig. 2C and Fig. S3A). This procedure ensures accurate and reproducible alignment of the optical and photosensitive elements across the array while providing electronic accessibility to the output signals of the individual artificial ommatidia. It results in a very thin and light package, less than 1 mm and 0.36 g in the prototype presented here, and ensures mechanically safe bending of the interconnection layer down to a small radius of curvature (Fig. 2D). Free space on the backside of the artificial ommatidia permits attachment to curved rigid or flexible substrates and incorporation of additional electronics for signal processing in the resulting concavity (Fig. 2D).

Fabrication of a CurvACE Prototype.

We fabricated a CurvACE prototype by bending a rectangular array of 42 columns of 15 artificial ommatidia (microlens diameter = 172 μm) down to a curvature radius of 6.4 mm along its longer direction to yield a 180° FOV in the horizontal plane (Fig. 1 A and B and Fig. S3 B and C). This curvature should nominally yield an interommatidial angle Δφh of 4.3° in the equatorial row along the bent direction. Although there is no mechanical bending along the vertical direction, it is possible to make the visual axes of the 15 ommatidia in each column fan out in the vertical plane by making the vertical pitch between the photodetectors stepwise smaller than the vertical pitch between the microlenses (10) (Fig. S1C). In the prototype, the photodetector pitch was calculated so as to obtain a similar value for the interommatidial angle Δφv along the vertical unbent direction, which results in a total vertical FOV of 60° (Fig. 1B). To avoid spatial aliasing or blind spots in the visual field, the acceptance angle Δρ of each ommatidium must closely approach the interommatidial angle Δφ (1, 24) (Fig. 3C). Therefore, the ommatidial lenslets, diaphragms, and photodetectors were designed using an optical ray tracing technique (Zemax; Radiant Zemax, LLC) to produce an acceptance angle Δρ of 4.3°.
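The nominal sampling geometry described above can be sketched numerically: the horizontal interommatidial angle follows from dividing the 180° FOV among the 42 bent columns, and the vertical fan-out follows from offsetting the photodetector pitch relative to the microlens pitch. This is an illustrative sketch; the focal length value used below is an assumption for demonstration, not a figure from the paper.

```python
import math

def horizontal_interommatidial_angle(fov_deg: float, n_columns: int) -> float:
    """Bending the ommatidia columns over the FOV shares the FOV
    evenly among the columns."""
    return fov_deg / n_columns

def vertical_pitch_offset_um(focal_length_um: float, dphi_v_deg: float) -> float:
    """Photodetector-to-microlens pitch reduction per row needed to tilt
    each visual axis by dphi_v without mechanical bending: f * tan(dphi_v)."""
    return focal_length_um * math.tan(math.radians(dphi_v_deg))

# 42 columns bent over 180 deg -> ~4.3 deg, as stated for the prototype
print(round(horizontal_interommatidial_angle(180.0, 42), 2))  # 4.29 deg

# per-row pitch offset for a 4.3 deg vertical fan-out, assuming an
# illustrative focal length of 500 um (hypothetical value)
print(round(vertical_pitch_offset_um(500.0, 4.3), 1))
```

This makes explicit why 42 columns over a semicircle yield the stated 4.3° horizontal spacing, and how a purely planar pitch offset emulates curvature in the unbent direction.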

Fig. 3.

Characterization of CurvACE angular sensitivity. (A) Measured ASF along the middle (equatorial) row (black curves), with the corresponding interommatidial angles Δφh eq (red triangles) and mean acceptance angles Δρh (blue circles) of the CurvACE ommatidia averaged along every column. Error bars display SDs. (B) Mean horizontal interommatidial and acceptance angles averaged along every row of artificial ommatidia as a function of the elevation angle α. The black curve shows the theoretical Δφh values obtained using Eq. S10 with a constant Δφh max of 4.2°. (C) Schematic representation of the acceptance angle Δρ of an ommatidium and the interommatidial angle Δφ calculated from the peak ASFs of two neighboring ommatidia. (D) Measured ASFs along a single column of artificial ommatidia (black curves), mean vertical interommatidial (red triangles), and acceptance angles (blue circles) averaged along every row of artificial ommatidia. a.u., arbitrary units; deg., degree.

The resulting concavity on the backside of the prototype after the mechanical bending along the horizontal direction is used to host two microcontrollers, two inertial sensors, and other electronic components that are fitted and soldered on two rigid PCBs (Fig. 2D and Figs. S3C and S4). In the experiments described below, the embedded microcontrollers are programmed to operate the visual data read-out and communicate with an external computer for analysis; in a stand-alone application, these microcontrollers can be used to process visual data onboard the prototype without any external computer.

Results

Characterization of Visual Sampling.

To characterize the visual sampling of the environment by the fabricated CurvACE prototype, we measured the angular sensitivity function (ASF) of each of the 630 artificial ommatidia (Fig. S5). Fig. 3 A and D shows representative examples of ASFs measured along a single row and a single column, respectively. Most ASFs display the expected Gaussian distribution with respect to the light incidence angle, which validates both the microoptical design and the precise alignment with each individual photodetector. We derived the experimental acceptance angles and interommatidial angles from the measured ASFs. The acceptance angle Δρ of an ommatidium is defined as the full width at half maximum (FWHM) of its Gaussian-like ASF. The horizontal and vertical interommatidial angles Δφh and Δφv were assessed from the angular position of the peak of the ASFs of two adjacent ommatidia (Fig. 3C). The measured acceptance angles yielded an average Δρ of 4.2° ± 0.3° (SD) for both horizontal (Fig. 3A) and vertical (Fig. 3D) directions. The vertical interommatidial angles averaged Δφv = 4.26° ± 0.16° (SD) (Fig. 3D), and the horizontal ones ranged from Δφh = 4.2° ± 0.8° (SD) in the middle row (Fig. 3A) to 3.7° ± 0.7° (SD) in the top and bottom rows (Fig. 3B). The close match between the experimentally measured acceptance angles and interommatidial angles validates both the ray-tracing design and the fabrication process while indicating that the CurvACE prototype, like the fruit fly compound eye (24), performs an adequate sampling of its entire FOV (Fig. 1B). The observed spread in the values of the horizontal interommatidial angles Δφh (Fig. 3 A and B) is probably due to the manual process used to mechanically fix the flexible PCB supporting the artificial ommatidia array onto the rigid curved substrate.
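The relation between a Gaussian ASF and its FWHM-based acceptance angle can be illustrated with a short numerical sketch. The Gaussian width below is chosen so that the synthetic ASF reproduces the measured 4.2° acceptance angle; the real ASFs in the paper were measured, not synthesized.

```python
import numpy as np

# For a true Gaussian, FWHM = 2*sqrt(2*ln 2)*sigma ~ 2.355*sigma.
# Pick sigma so the synthetic ASF has the prototype's 4.2 deg FWHM.
sigma = 4.2 / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # ~1.78 deg

angles = np.linspace(-10.0, 10.0, 2001)            # light incidence angle, deg
asf = np.exp(-angles**2 / (2.0 * sigma**2))        # synthetic Gaussian ASF

# Numerical FWHM: width of the region at or above half maximum
above = angles[asf >= 0.5 * asf.max()]
fwhm = above[-1] - above[0]
print(round(fwhm, 1))   # ~4.2 deg, i.e., the acceptance angle
```

The same half-maximum procedure applied to two neighboring measured ASFs (peak-to-peak separation) gives the interommatidial angle Δφ shown in Fig. 3C.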

Characterization of Ommatidium Light Adaptation.

In the natural world, visual sensors must cope with a wide dynamic range of irradiance, which can span on the order of 8 decades over the course of a day. Light variations within a scene are particularly challenging because they can make part of the visual field nonresponsive due to photoreceptor saturation. Animal retinae partly solve this crucial problem by means of a local light adaptation mechanism integrated within each photoreceptor (3, 4, 25, 26). Similarly, we have equipped each prototype ommatidium with a neuromorphic adaptation circuit (Fig. S2D) that operates independently of its 629 neighbors. The neuromorphic circuit originally proposed by Delbrück and Mead (27) was modified here by cascading a first-order, low-pass filter (Fig. S2D). This modification prevents temporal aliasing and keeps the photodetector bandwidth of 300 Hz practically constant across the entire studied range of ambient lighting conditions. The circuit design was further optimized (SI Text, Photodetector Layer) to minimize the transient gain dispersion of each autoadaptive circuit. Fig. 4 shows the mean steady state and transient responses of 11 artificial ommatidia (photodetectors with optics) in one column to light step increments and decrements presented at four different steady light levels (Fig. S6). At each of these four levels (red circles in Fig. 4), the output response of the individual ommatidia to light steps yields an S-shaped operating curve in a semilog plot. Adaptation to a novel steady irradiance level essentially produces a horizontal shift of the curve without markedly changing its slope, which represents a dynamic sensitivity of about 1,300 mV per decade in the linear part. The steady operating curve (shown in red in Fig. 4) is also a logarithmic function of the adapting light, but with a slope (steady sensitivity) about 12-fold smaller. Thanks to the optimized design of the adaptive photodetector layout, the averaged dispersion of the sensitivity over the four operating curves is as small as 11 mV, that is, only about 2% of the total 600-mV dynamic range.
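The interplay between the steep transient slope and the shallow adapted slope can be captured in a minimal numerical model. This is an illustration only: the real response comes from an analog VLSI circuit, and the reference irradiance used for normalization is an arbitrary assumption.

```python
import math

# Illustrative model of the adaptive photodetector response, using the
# two sensitivities reported in the text.
DYNAMIC_SENS_MV = 1300.0                   # mV per decade, transient response
STEADY_SENS_MV = DYNAMIC_SENS_MV / 12.0    # ~12-fold smaller steady slope

def steady_response_mv(irradiance: float, i_ref: float = 1.0) -> float:
    """Adapted operating point: shallow logarithmic curve (red in Fig. 4)."""
    return STEADY_SENS_MV * math.log10(irradiance / i_ref)

def step_response_mv(i_new: float, i_adapted: float) -> float:
    """Instantaneous response to a step away from the adapted level: the
    steep V(log I) curve, shifted to the current operating point."""
    return steady_response_mv(i_adapted) + DYNAMIC_SENS_MV * math.log10(i_new / i_adapted)

# A 1-decade step up swings the output by ~1300 mV transiently, but once
# re-adapted the steady output has shifted by only ~108 mV.
print(round(step_response_mv(10.0, 1.0) - steady_response_mv(1.0)))   # 1300
print(round(steady_response_mv(10.0) - steady_response_mv(1.0)))      # 108
```

The horizontal shifting of a fixed-slope curve with the operating point is exactly the behavior visible in the four green curves of Fig. 4.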

Fig. 4.

CurvACE autoadaptation to ambient light at the single ommatidium level. Steady-state (red dots) and transient (green dots) responses of the adaptive analog VLSI photodetectors [design based on a circuit proposed by Delbrück and Mead (27)]. Each of the four dynamic operating curves (in green) shows the V(log I) response, averaged over 11 ommatidia (photodetectors with optics) of one column, to step increments and decrements of irradiance (Fig. S6) about four steady levels (red circles).

The four operating curves demonstrate not only the high sensitivity of the prototype ommatidia but their relative invariance in sensitivity to the ambient light. These V(log I) curves shifting with the operating points are reminiscent of those obtained in analogous experiments carried out on single vertebrate (25) and invertebrate (26) photoreceptors. This local adaptation is essential for efficient sampling of natural environments because it prevents saturation of the photoreceptors by bright spots in the visual scene while allowing them to adapt quickly to untoward illumination changes, such as transitions from a shaded area to a sunny place.

Characterization of Motion Extraction.

In addition to an extensive FOV (Fig. 3) and local adaptation to illuminance (Fig. 4), the CurvACE prototype ommatidia yield a signal acquisition bandwidth of 300 Hz, which is threefold higher than that measured in the ommatidia of fast-flying insects (28). A high bandwidth contributes to the reduction of motion aliasing during fast locomotion. Furthermore, the implemented read-out protocol (Fig. S7) allows a maximum frame rate of 1.5 kfps, which permits frame averaging to improve the signal-to-noise ratio. We experimentally tested CurvACE motion detection capabilities by computing optic flow vectors from visual signals resulting from different types of motion in the presence of random black and white patterns on a wall (Fig. S8). In this first experiment, we used a modified version of the Lucas–Kanade method (29, 30) (SI Text, Optic Flow Characterization and Eqs. S1–S9), which is a particularly efficient image-based processing algorithm used to calculate optic flow vectors in two dimensions. The optic flow vectors measured during roll rotation (Fig. 5A) and linear translation toward a textured wall 0.3 s before collision (Fig. 5B) show coherent patterns of visual rotation and expansion, respectively. The center of rotation and focus of expansion can be clearly identified (red dots in Fig. 5), allowing for estimation of the axis of rotation and of the direction of translation, respectively. The sensor egomotion can be estimated from these flow fields, for instance, by implementing matched filters (31) analogous to the directionally selective, motion-sensitive neurons found in some insect visual systems (5). Furthermore, the embedded inertial sensors can be used for cancelling the rotational component of the measured optic flow, assessing only the translational component. Because this component is related to distance from objects, the optic flow data provided by a CurvACE prototype could assist mobile platforms to perform collision-free navigation (9).
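The per-patch least-squares step at the core of the Lucas–Kanade method can be sketched as follows. This is a textbook single-patch version on synthetic data, not the modified multi-ommatidia implementation used in the paper (SI Text, Eqs. S1–S9).

```python
import numpy as np

def lucas_kanade_patch(frame1, frame2):
    """Single-patch Lucas-Kanade: least-squares solve of the brightness
    constancy constraint Ix*u + Iy*v + It = 0 over all pixels."""
    Iy, Ix = np.gradient(frame1)          # spatial gradients (rows, cols)
    It = frame2 - frame1                  # temporal gradient
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)          # (u, v) in pixels per frame

# Synthetic textured patch shifted right by one pixel between frames
i, j = np.mgrid[0:16, 0:32].astype(float)
frame1 = np.sin(0.4 * j) + np.cos(0.3 * i)
frame2 = np.sin(0.4 * (j - 1.0)) + np.cos(0.3 * i)   # moved +1 px in x
u, v = lucas_kanade_patch(frame1, frame2)
print(round(u, 2), round(v, 2))   # u close to 1, v close to 0
```

Applied to pairs of neighboring ommatidia signals across the array, such local estimates produce the rotation and expansion flow fields of Fig. 5, whose center of rotation and focus of expansion reveal the sensor egomotion.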

Fig. 5.

Optic flow fields from the CurvACE prototype. Cylindrical equidistant projections of the optic flow field calculated with a modified version of the Lucas–Kanade method (29, 30) from the visual signals obtained by the CurvACE prototype subjected to roll motion (Fig. S8B) at 32° per second and at a distance of about 1 m to a wall displaying random black and white patterns (A) or to linear translation (Fig. S8C) at 3 cm/s toward the patterned wall at a distance of 1 cm (B). The red spot displays the center of rotation (A) or the focus of expansion (B).

We also characterized motion detection quantitatively at different ambient light levels with a bioinspired local visual processing algorithm based on the “time-of-travel” scheme (32) (Fig. S9). Fig. 6 shows the angular speed ωmedian obtained by measuring the rotational optic flow as a function of the yaw rotational speed of CurvACE surrounded by a natural pattern. The experimental data show a good match between the rotational speed perceived by CurvACE and the true rotational speed. The error in the regression coefficient (linearity error) ranges from 5 to 10% (Fig. 6) at the three illumination levels, indicating that the CurvACE sensor takes full advantage of its autoadaptive analog VLSI photodetectors to make motion detection largely invariant to different illumination conditions. With the time-of-travel scheme, any pair of neighboring ommatidia driving a “local motion sensor” is able to measure angular velocities ranging from 50° to 358° per second for the interommatidial angle of 4.2° with sufficient accuracy. Measurement limitation at lower speeds is due to the signal attenuation brought about by the spatiotemporal processing present in each artificial ommatidium (Fig. S9A).
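At its core, the time-of-travel scheme divides the known interommatidial angle by the time a contrast feature takes to cross from one ommatidium to its neighbor. A minimal sketch using the prototype's figures:

```python
# Time-of-travel: a contrast feature seen successively by two adjacent
# ommatidia separated by delta_phi yields omega = delta_phi / delta_t.

def angular_speed_deg_s(delta_phi_deg: float, travel_time_s: float) -> float:
    """Local motion sensor output for one ommatidia pair."""
    return delta_phi_deg / travel_time_s

# With the prototype's 4.2 deg interommatidial angle, the measurable
# 50-358 deg/s range corresponds to travel times of ~84 ms down to ~12 ms.
print(round(angular_speed_deg_s(4.2, 0.084)))   # 50 deg/s
print(round(4.2 / 358.0 * 1000.0, 1))           # 11.7 ms at the 358 deg/s limit
```

Long travel times at low speeds explain the lower measurement bound: the slower the motion, the more the spatiotemporal filtering in each ommatidium attenuates the signal before the second ommatidium responds.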

Fig. 6.

Characterization of CurvACE motion detection capabilities. (A–C) Angular speed characteristics of CurvACE calculated with a method based on the time-of-travel scheme (32) (Fig. S9) assessed by applying steps of yaw rotational speed Ωyaw to the sensor at 10° per second, lasting 10 s each, with the prototype placed at the center of a 105-cm diameter arena lined with prints of a natural image. The dashed line displays the theoretical trend.

Discussion

The prototype presented here represents one of many possible manifestations of the CurvACE design principle. It yields a compact, lightweight, energy-efficient, miniature vision sensor that suits a broad range of applications requiring fast motion detection across a panoramic FOV. The applied optical and electronic parameters enable this prototype to measure optic flow patterns caused by sensor egomotion within a contrasted environment. A prototype with these characteristics could be used for autonomous terrestrial navigation, in analogy with some crab species (33) that use quasicylindrical compound eyes to navigate in flat environments. Furthermore, the hemispherical FOV of the prototype obtained by horizontal bending and by the longer microlens vertical pitch distance resembles the FOV of flying insects (1). Thus, such a prototype could also be used in micro air vehicles (MAVs) to support a large number of navigation tasks, such as egomotion estimation (34), collision avoidance (6, 7), and flight control (8, 9, 35), at low and high speeds, even in complex indoor and outdoor environments.

The CurvACE design principle also allows for flexible customization of artificial ommatidia in terms of their number, size, focal length, and interommatidial and acceptance angles, according to the requirements of the intended use. The artificial ommatidia could be further tailored by taking inspiration from the extraordinary eye regionalization found in insects and crustaceans, where specific parts of the compound eye serve specific functions. For example, higher acuity (36) may be obtained by increasing ommatidial resolution in defined areas, which could be achieved by decreasing both the acceptance angle and the interommatidial angle through redesigned microlenses and a reduced photodetector size with a consequent loss of signal-to-noise ratio. Design variations in the ommatidial optics or photodetector characteristics could yield regions of higher light capture (37) or different spectral (38) or polarization sensitivity (39).

The size of the CurvACE prototype described here is comparable to that of some trilobite eyes (22) (Fig. 1C) and some crab eyes (33), but reaching the diminutive size of insect eyes is challenging because it implies various tradeoffs. Increasing the surface density of artificial ommatidia requires decreasing photosensor size, chip circuitry, and microlens diameter at the cost of lower sensitivity and signal-to-noise ratio. Considering state-of-the-art technologies, we have estimated that the CurvACE prototype could be further reduced by a factor of 2. Further increments of surface density via hexagonal arrangement of ommatidia, similar to that found in many insect eyes, may be possible but would require different cutting methods. In the future, the development of vertical integration of 3D electronic circuits could further reduce the footprint size at the cost of chip thickness.

The CurvACE design opens up new avenues for vision sensors with alternative morphologies and FOVs of up to 360° in small, compact packages. In particular, the realization of a fully cylindrical CurvACE with a 360° FOV in the horizontal plane is relatively straightforward, either by attaching two semicylindrical prototypes (Fig. S4D) or by fabricating a wider array with a larger number of ommatidia. A CurvACE prototype with a truly omnidirectional FOV, reminiscent of the eye morphology of most flying insects, would be especially interesting for egomotion estimation and better navigational support in three dimensions in a minimal package, providing an advantageous alternative to current cumbersome arrangements based on catadioptric or fish-eye lenses (9). A spherical CurvACE could be realized by fabricating and individually bending several ommatidial arrays with one ommatidium per column along the meridians of a sphere to measure optic flow omnidirectionally.
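The scale of such a spherical sensor is easy to estimate: if each ommatidium samples a solid angle of roughly Δφ² steradian, an omnidirectional FOV requires on the order of 4π/Δφ² ommatidia. A back-of-envelope sketch in Python (the 4.2° spacing is an illustrative value, not a specification):

```python
import math

def ommatidia_for_sphere(dphi_deg):
    """Approximate number of ommatidia needed to cover the full 4*pi sr
    sphere when neighboring optical axes are dphi_deg apart (square
    sampling grid; a hexagonal grid would need ~15% fewer)."""
    dphi = math.radians(dphi_deg)
    return 4.0 * math.pi / dphi ** 2

n = ommatidia_for_sphere(4.2)   # on the order of 2,300 ommatidia
```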

The CurvACE design is expected to foster further research and applications on fully flexible vision sensors (40, 41) that can adapt to rigid or unsteady surfaces of arbitrary shapes. Such devices could function as thin wearable sensors on smart clothing, serve as sensors for intelligent homes, or be integrated into the artificial skin of soft robots. Toward these applications, future work could devise methods for cost-effective mass production of artificial ommatidia, which would also allow more complex dicing to achieve alternative bending patterns. Such production methods may include performing all ommatidia alignment and assembly processes at the wafer level with the help of robotic platforms for automated pick-and-place, bonding, and dicing.

Acknowledgments

We thank Jean-Christophe Zufferey and Jacques Duparré for conceptual suggestions; Stéphanie Godiot, Patrick Breugnon, Alain Calzas, Rémy Potheau, and Marc Boyron for their help in VLSI circuit design and tests; Felix Kraze for assisting in prototype assembly; Julien Diperi for the test bench realization; David O’Carroll and Russell Brinkworth for the natural photograph used in the optic flow characterization setup; Antoine Beyeler for the CurvACE prototype photograph; and Claudio Bruschini for project management. We also thank the anonymous reviewers, whose comments contributed greatly to improving the manuscript. The CURVACE project acknowledges the financial support of the Future and Emerging Technologies (FET) program within the Seventh Framework Programme for Research of the European Commission, under FET-Open Grant 237940. This work also received financial support from the Swiss National Centre of Competence in Research Robotics of the Swiss National Science Foundation, the French National Center for Scientific Research and Aix-Marseille University, the French National Research Agency, and the German Federal Ministry of Education and Research.

Footnotes

  • 1To whom correspondence should be addressed. E-mail: dario.floreano@epfl.ch.
  • Author contributions: D.F., S.V., A.B., and N.F. designed research; R.P.-C., R.L., W.B., M.M., F.E., R.J., M.K.D., G.L., and F. Recktenwald performed research; S.V., F. Ruffier, R.L., W.B., M.M., F.E., R.J., G.L., and F. Recktenwald contributed with technical and analytic tools; D.F., R.P.-C., S.V., F. Ruffier, A.B., R.L., W.B., M.M., F.E., R.J., F. Recktenwald, H.A.M., and N.F. analyzed data; and D.F., R.P.-C., S.V., F. Ruffier, A.B., R.L., F. Recktenwald, H.A.M., and N.F. wrote the paper.

  • The authors declare no conflict of interest.

  • This article is a PNAS Direct Submission.

  • This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1219068110/-/DCSupplemental.

References

  1. Land MF, Nilsson D-E (2002) Animal Eyes (Oxford Univ Press, Oxford).
  2. Kirschfeld K (1976) The resolution of lens and compound eyes. Neural Principles in Vision, eds Zettler F, Weiler R (Springer, Berlin), pp 354–370.
  3. Laughlin SB (1989) The role of sensory adaptation in the retina. J Exp Biol 146(1):39–62.
  4. Gu Y, Oberwinkler J, Postma M, Hardie RC (2005) Mechanisms of light adaptation in Drosophila photoreceptors. Curr Biol 15(13):1228–1234.
  5. Krapp HG, Hengstenberg R (1996) Estimation of self-motion by optic flow processing in single visual interneurons. Nature 384(6608):463–466.
  6. Franceschini N, Pichon JM, Blanes C (1992) From insect vision to robot vision. Philos Trans R Soc Lond B Biol Sci 337(1281):283–294.
  7. Blanchard M, Rind FC, Verschure PFMJ (2000) Collision avoidance using a model of the locust LGMD neuron. Robot Auton Syst 30(1–2):17–38.
  8. Franceschini N, Ruffier F, Serres J (2007) A bio-inspired flying robot sheds light on insect piloting abilities. Curr Biol 17(4):329–335.
  9. Floreano D, Zufferey J-C, Srinivasan MV, Ellington C (2009) Flying Insects and Robots (Springer, Berlin).
  10. Duparré J, Dannberg P, Schreiber P, Bräuer A, Tünnermann A (2005) Thin compound-eye camera. Appl Opt 44(15):2949–2956.
  11. Jeong K-H, Kim J, Lee LP (2006) Biologically inspired artificial compound eyes. Science 312(5773):557–561.
  12. Radtke D, Duparré J, Zeitner UD, Tünnermann A (2007) Laser lithographic fabrication and characterization of a spherical artificial compound eye. Opt Express 15(6):3067–3077.
  13. Pulsifer DP, Lakhtakia A, Martín-Palma RJ, Pantano CG (2010) Mass fabrication technique for polymeric replicas of arrays of insect corneas. Bioinspir Biomim 5(3):036001.
  14. Qu P, et al. (2012) A simple route to fabricate artificial compound eye structures. Opt Express 20(5):5775–5782.
  15. Khang D-Y, Jiang H, Huang Y, Rogers JA (2006) A stretchable form of single-crystal silicon for high-performance electronics on rubber substrates. Science 311(5758):208–212.
  16. Dinyari R, Rim S-B, Huang K, Catrysse PB, Peumans P (2008) Curving monolithic silicon for nonplanar focal plane array applications. Appl Phys Lett 92(9):091114.
  17. Xu X, Davanco M, Qi X, Forrest SR (2008) Direct transfer patterning on three dimensionally deformed surfaces at micrometer resolutions and its application to hemispherical focal plane detector arrays. Org Electron 9(6):1122–1127.
  18. Lee J, Kim J (2011) Fabrication of strongly anchored, high aspect ratio elastomeric microwires for mechanical and optical applications. J Micromech Microeng 21(8):085016–085024.
  19. Jung I, et al. (2011) Dynamically tunable hemispherical electronic eye camera system with adjustable zoom capability. Proc Natl Acad Sci USA 108(5):1788–1793.
  20. Dumas D, et al. (2012) Infrared camera based on a curved retina. Opt Lett 37(4):653–655.
  21. Ferrat P, et al. (2008) Ultra-miniature omni-directional camera for an autonomous flying micro-robot. Conference on Optical and Digital Image Processing, eds Schelkens P, Ebrahimi T, Cristobal G, Truchetet F (Proc SPIE, Bellingham, WA), Vol 7000, pp 70000M-1–70000M-10.
  22. Fortey R, Chatterton B (2003) A Devonian trilobite with an eyeshade. Science 301(5640):1689.
  23. Webb B (2002) Robots in invertebrate neuroscience. Nature 417(6886):359–363.
  24. Götz KG (1965) Die optischen Übertragungseigenschaften der Komplexaugen von Drosophila [The optical transfer properties of the compound eyes of Drosophila]. Biol Cybern 2(5):215–221. German.
  25. Normann RA, Perlman I (1979) The effects of background illumination on the photoresponses of red and green cones. J Physiol 286(1):491–507.
  26. Matić T, Laughlin SB (1981) Changes in the intensity-response function of an insect’s photoreceptors due to light adaptation. J Comp Physiol 145(2):169–177.
  27. Delbrück T, Mead CA (1994) Adaptive photoreceptor with wide dynamic range. IEEE International Symposium on Circuits and Systems (IEEE, New York), pp 339–342.
  28. Laughlin SB, Weckström M (1993) Fast and slow photoreceptors—A comparative study of the functional diversity of coding and conductances in the Diptera. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 172(5):593–609.
  29. Lucas BD, Kanade T (1981) An iterative image registration technique with an application to stereo vision. Proceedings of the Seventh International Joint Conference on Artificial Intelligence, ed Hayes PJ (William Kaufmann, Los Altos, CA), pp 674–679.
  30. Fleet DJ, Langley K (1995) Recursive filters for optical flow. IEEE Trans Pattern Anal Mach Intell 17(1):61–67.
  31. Franz MO, Krapp HG (2000) Wide-field, motion-sensitive neurons and matched filters for optic flow fields. Biol Cybern 83(3):185–197.
  32. Pichon J-M, Blanes C, Franceschini N (1989) Visual guidance of a mobile robot equipped with a network of self-motion sensors. Mobile Robots IV, eds Chun WH, Wolfe WJ (Proc SPIE), Vol 1195, pp 44–55.
  33. Zeil J, Al-Mutairi M (1996) The variation of resolution and of ommatidial dimensions in the compound eyes of the fiddler crab Uca lactea annulipes (Ocypodidae, Brachyura, Decapoda). J Exp Biol 199(Pt 7):1569–1577.
  34. Plett J, Bahl A, Buss M, Kühnlenz K, Borst A (2012) Bio-inspired visual ego-rotation sensor for MAVs. Biol Cybern 106(1):51–63.
  35. Kerhuel L, Viollet S, Franceschini N (2010) Steering by gazing: An efficient biomimetic control strategy for visually guided micro aerial vehicles. IEEE Trans Robot 26(2):307–319.
  36. Horridge GA (1978) The separation of visual axes in apposition compound eyes. Philos Trans R Soc Lond B Biol Sci 285(1003):1–59.
  37. Hateren JH, Hardie RC, Rudolph A, Laughlin SB, Stavenga DG (1989) The bright zone, a specialized dorsal eye region in the male blowfly Chrysomyia megacephala. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 164(3):297–308.
  38. Franceschini N, Hardie R, Ribi W, Kirschfeld K (1981) Sexual dimorphism in a photoreceptor. Nature 291(5812):241–244.
  39. Labhart T (1980) Specialized photoreceptors at the dorsal rim of the honeybee’s compound eye: Polarizational and angular sensitivity. J Comp Physiol 141(1):19–30.
  40. Daneshpanah M, Javidi B (2011) Three-dimensional imaging with detector arrays on arbitrarily shaped surfaces. Opt Lett 36(5):600–602.
  41. Dobrzynski MK, Pericet-Camara R, Floreano D (2012) Vision Tape—A flexible compound vision sensor for motion detection and proximity estimation. IEEE Sens J 12(5):1131–1139.
  • Franceschini N, Kirschfeld K (1971) Les phénomènes de pseudopupille dans l’œil composé de Drosophila [Pseudopupil phenomena in the compound eye of Drosophila]. Biol Cybern 9(5):159–182. French.
  • Heisenberg M, Wolf R (1984) Vision in Drosophila: Genetics of Microbehavior (Springer, Berlin).
  42. Lee JH, et al. (2010) Sestrin as a feedback inhibitor of TOR that prevents age-related pathologies. Science 327(5970):1223–1228.
Miniature curved artificial compound eyes
Dario Floreano, Ramon Pericet-Camara, Stéphane Viollet, Franck Ruffier, Andreas Brückner, Robert Leitel, Wolfgang Buss, Mohsine Menouni, Fabien Expert, Raphaël Juston, Michal Karol Dobrzynski, Geraud L’Eplattenier, Fabian Recktenwald, Hanspeter A. Mallot, Nicolas Franceschini
Proceedings of the National Academy of Sciences Jun 2013, 110 (23) 9267-9272; DOI: 10.1073/pnas.1219068110
