Tokyo, Japan—Canon announced it has developed the world’s first single photon avalanche diode (SPAD) image sensor whose signal-amplifying pixels can capture 1-megapixel images.
SPAD image sensors are used in applications such as 2-dimensional cameras, which capture and develop still images and video in an extremely short time span. The sensors also hold potential for use in 3-dimensional cameras, thanks to their ability to obtain the distance between sensor and subject as image data.
A single photon avalanche diode (SPAD) sensor is a specially configured sensor in which each pixel contains its own diode. When a diode receives a single incoming photon of light, it multiplies the resulting charge into an “avalanche” of electrons, producing one large electrical pulse signal. Because it converts a single photon into many electrons, the sensor can realize greater sensitivity during photography as well as more accurate distance measurement.
Canon’s SPAD Image Sensor
Canon says the SPAD image sensor it developed overcomes the longstanding difficulties of achieving this effect with high pixel counts. By adopting new circuit technology, its sensor uses a method known as photon counting to achieve a digital image resolution of 1 megapixel.
What’s more, the sensor employs a global shutter that controls exposure for every pixel simultaneously. Exposure time can be shortened to as little as 3.8 nanoseconds, making distortion-free image capture possible. In addition, the sensor is capable of up to 24,000 frames per second (fps) with 1-bit output, enabling slow-motion capture of fast movement within a short time frame.
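As a rough illustration of what these figures mean, the sketch below works out the frame interval at 24,000 fps and the motion blur during a 3.8-nanosecond exposure. The 100 m/s object speed is a hypothetical example, not part of Canon’s specifications.

```python
# Illustrative arithmetic for the quoted specifications.
exposure_s = 3.8e-9   # 3.8 ns minimum exposure time
fps = 24_000          # maximum frame rate (1-bit output)

frame_interval_s = 1 / fps                 # time between successive frames
object_speed_m_s = 100                     # hypothetical fast-moving object
blur_m = object_speed_m_s * exposure_s     # distance moved during one exposure

print(f"frame interval: {frame_interval_s * 1e6:.1f} microseconds")
print(f"motion blur during exposure: {blur_m * 1e6:.2f} micrometers")
```

Even an object moving at 100 m/s travels well under a micrometer during the exposure, which is why such short exposures yield effectively distortion-free frames.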
Furthermore, because of the SPAD sensor’s ability to capture fine details for the entirety of events and phenomena, it holds potential for use in a variety of applications. These may include slow-motion analysis of chemical reactions, natural phenomena such as lightning strikes, falling objects, impact damage and other events the human eye can’t precisely observe.
The sensor also features a high time resolution as precise as 100 picoseconds. Consequently, it can determine the exact timing at which a photon reaches a pixel with ultrahigh accuracy. Leveraging this functionality, the sensor is capable of Time of Flight distance measurement.
What’s more, with 1MP resolution and high-speed image capture, it also accurately performs 3D distance measurements where multiple subjects overlap. Canon’s SPAD sensor enables 3D cameras that recognize depth information at 1MP resolution, which is expected to expand the use of such cameras as the “eyes” of robotic devices.
CMOS and SPAD Sensor Pixel Structures
SPAD sensors output a signal based on the number of pulses they count. Although each pixel can detect a single photon, it requires its own memory or counter. In addition, converting a photon into multiple electrons requires a high voltage, and consequently a high-voltage-resistant, insulated structure. Such requirements inevitably lead to larger pixels, so miniaturization and increased pixel counts have thus far proven difficult. In recent years, however, great progress in 3D stacking technologies has helped ease these constraints.
Comparison of CMOS and SPAD Sensor Pixel Structures
In recent years, various devices and equipment have used SPAD sensors. For example, smartphones currently use them as proximity sensors to determine the distance between the device and physical objects around it.
Photon counting refers to the technique by which optical sensors count the number of photons, the smallest unit of light. This determines such parameters of the signal light as intensity and time distribution.
With conventional photodetectors, analog signals are detected via electrical currents and voltages. Photon counting, however, treats light signals as discrete digital signals. As a result, it can eliminate interference from electronic noise, enabling highly accurate detection of weak signals.
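The contrast above can be sketched in a few lines. The sample values below are made up for illustration: small values stand in for electronic noise, large values for avalanche pulses triggered by single photons. An analog readout integrates everything, noise included, while photon counting registers only discrete pulses above a detection threshold.

```python
# Hypothetical readout samples: small values are electronic noise,
# large values are avalanche pulses triggered by single photons.
samples = [0.02, 0.9, 0.03, 1.1, 0.01, 0.95, 0.04]

# Analog detection: integrate the whole signal, noise included.
analog_signal = sum(samples)

# Photon counting: treat the signal as digital and count discrete
# pulses above a threshold, ignoring the noise floor entirely.
threshold = 0.5
photon_count = sum(1 for s in samples if s > threshold)

print(f"analog integral: {analog_signal:.2f}")  # noise contaminates this value
print(f"photons counted: {photon_count}")       # prints 3
```

The counted result is unaffected by the noise samples, which is the essence of why photon counting enables accurate detection of weak signals.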
Time of Flight Measurement
Time of Flight (ToF) measurement is a method for determining the distance between a sensor and another object. Distance is measured based on how long light emitted from a light source, traveling at light speed (approximately 300,000 km/s), takes to reflect off the target object and return to the sensor.
Because light travels at such incredible speed, timing for the ToF method must resolve intervals between 1 nanosecond (one billionth of a second) and 1 picosecond (one trillionth of a second). It therefore requires an optical sensor capable of responding in such a high-speed regime with precision.
The SPAD sensor also features a high time resolution as precise as 100 picoseconds, allowing it to determine the exact timing at which a photon reaches a pixel with ultrahigh accuracy. In ToF measurement, a light-emitting device (or a camera using a specialized sensor) is paired with the SPAD image sensor: the time between pulsed light being directed at an object and the reflected light returning to the sensor determines the physical distance between sensor and subject.
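The calculation described above reduces to distance = (speed of light × round-trip time) / 2, since the light travels to the subject and back. A minimal sketch, which also shows the depth step implied by a 100-picosecond time resolution:

```python
C = 299_792_458  # speed of light in a vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the subject given the round-trip time of a light pulse."""
    return C * round_trip_s / 2

# A pulse returning after 20 ns corresponds to a subject about 3 m away.
print(f"{tof_distance(20e-9):.3f} m")   # prints 2.998 m

# Depth step corresponding to a 100 ps timing resolution: about 1.5 cm.
print(f"{tof_distance(100e-12) * 100:.1f} cm")
```

The division by two is easy to overlook: the measured time covers the outbound and return trips, so using the full round-trip time without halving would double every distance.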
Moreover, SPAD image sensors utilizing the ToF method are easily installed in various devices and can measure depth information with extraordinary precision, even in dark environments.