Stanford University researchers develop an inexpensive way to equip smartphone cameras with 3D imaging


EMGblog.com: A standard image sensor – of which billions are found in smartphones today – records light intensity and color. Built on the common CMOS technology, these sensors have grown smaller and more capable over time and now reach resolutions of tens of megapixels. Despite these advances, today’s sensors see only two dimensions, producing flat photos, much like a painting. But Stanford University researchers have recently announced a method that gives these same image sensors three-dimensional vision; in other words, they can measure the distance between the camera and the subject.

Measuring the distance to objects with light is currently possible only with special, expensive systems known as lidar. Lidar is short for “light detection and ranging,” and it is sometimes described as laser radar. A familiar example is the self-driving car, where the lidar unit sits on the roof like a hump. It is central to the car’s collision-avoidance system, using a laser to determine the distance to surrounding objects.

In fact, lidar works much like radar, except that it uses light instead of radio waves to measure distance. By shining a laser at an object and measuring how long the light takes to return, lidar can determine the object’s distance and speed and whether it is approaching or receding; more importantly, it can calculate whether the trajectories of two moving objects will intersect. It is worth noting that Apple has used lidar in the iPhone 12 Pro, iPhone 12 Pro Max, iPhone 13 Pro, and iPhone 13 Pro Max to improve focusing in low light and the performance of night mode.
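As a rough illustration of the direct time-of-flight principle described above, the distance is simply half the laser pulse’s round-trip time multiplied by the speed of light. The sketch below is not the Stanford system, just this textbook relation, and the pulse timing value is made up for the example.

```python
# Minimal sketch of direct time-of-flight ranging:
# distance = (speed of light * round-trip time) / 2.
# The 200 ns round-trip time below is purely illustrative.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance to a target, given the laser's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection returning after 200 nanoseconds corresponds to roughly 30 meters.
print(distance_from_round_trip(200e-9))
```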

iPhone 12 Pro

According to Okan Atalar, a doctoral candidate in electrical engineering at Stanford University, current lidar systems are large and heavy. But if lidar is to be used in autonomous drones or lightweight robotic vehicles in the future, these systems will have to be very small and low-power while still performing well.

If such progress is achieved, it would create two attractive opportunities for engineers. First, it could enable high-resolution lidar, which is not possible today. Higher resolution lets a lidar detect objects at a greater distance; a self-driving car, for example, could distinguish a pedestrian from a cyclist from farther away, making it easier to avoid accidents. The second opportunity – the main subject here – is that the ordinary image sensors already active in billions of smartphones could capture 3D images with only a few minor hardware additions.

3D imaging has a variety of uses. 3D cameras can capture images with depth that let viewers feel connected to the scene, as if they were physically there. This has many applications, including remote work, education, and keeping one’s distance during an epidemic. Diagnostic, therapeutic, and repair tasks in healthcare, technology, and manufacturing are among the other uses of 3D images.

Another benefit of 3D imaging in smartphones is security. Face recognition based on 3D face mapping is generally more reliable than conventional approaches: the more depth information is collected, the more data about your face is available to the phone, which reduces the number of times facial recognition fails to recognize you. The question is how to implement such a system.

One way to add 3D imaging to conventional sensors is to add a light source and a modulator that turns the light on and off at very high speed – millions of times per second. Engineers can then calculate distance by measuring the fluctuations of the returning light. Current modulators can do this, but their energy consumption is so high that they are unsuitable for everyday use.
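To give a sense of how distance can be recovered from those light fluctuations, the sketch below uses the standard indirect time-of-flight relation, in which the phase lag of the returning amplitude-modulated light encodes distance. The modulation frequency and phase values are illustrative assumptions, not parameters of the Stanford prototype.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_phase_shift(phase_shift_rad: float, modulation_hz: float) -> float:
    """Distance implied by the phase lag of returning amplitude-modulated light.

    The result is unambiguous only up to half the modulation wavelength,
    i.e. SPEED_OF_LIGHT / (2 * modulation_hz).
    """
    return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * modulation_hz)

# Example: 5 MHz modulation and a quarter-cycle (pi/2) phase lag -> ~7.5 meters
print(distance_from_phase_shift(math.pi / 2, 5e6))
```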

The Stanford team’s solution for equipping ordinary sensors with 3D imaging relies on a phenomenon called “acoustic resonance.” The researchers built a simple acoustic modulator from a thin wafer of lithium niobate – a transparent crystal with good electrical, acoustic, and optical properties – coated with two transparent electrodes. The work is a collaboration between the Laboratory for Integrated Nano-Quantum Systems (LINQS) and the Arbabian Lab of Amin Arbabian, an associate professor at Stanford University.

From left to right: Amir Safavi-Naeini, Okan Atalar, and Amin Arbabian, the Stanford University researchers behind the project

What matters most, however, is that lithium niobate is piezoelectric: when an electric current is applied through the electrodes, the crystal lattice at the heart of its atomic structure deforms and vibrates at very high, predictable, and controllable frequencies. As it vibrates, the lithium niobate strongly modulates the light, and with two polarizers added, the modulator can switch light on and off several million times per second.

According to Atalar, the geometry of the wafer and the electrodes defines the frequency of the light modulation, so the frequency can be tuned: change the geometry and the modulation frequency changes with it.
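As a hedged illustration of how geometry sets the frequency, a simple thickness-mode resonance model says the fundamental acoustic frequency scales as the acoustic velocity divided by twice the wafer thickness. The velocity and thickness values below are approximate, illustrative numbers, not the actual parameters of the team’s device.

```python
# Simplified thickness-mode resonance model: f ≈ v / (2 * thickness).
# The acoustic velocity is an approximate figure for lithium niobate.

ACOUSTIC_VELOCITY = 7_300.0  # m/s, rough longitudinal velocity in lithium niobate

def fundamental_resonance_hz(thickness_m: float) -> float:
    """Fundamental thickness-mode acoustic resonance of a wafer."""
    return ACOUSTIC_VELOCITY / (2.0 * thickness_m)

# An illustrative 0.5 mm wafer would resonate in the low-megahertz range (~7.3 MHz).
print(fundamental_resonance_hz(0.5e-3) / 1e6, "MHz")
```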

In more technical terms, the piezoelectric effect creates an acoustic wave in the crystal that rotates the polarization of the light in a desirable, tunable, and usable way – the key step behind the Stanford team’s success. A precision polarizing filter placed after the modulator then converts this rotation into intensity modulation, making the light brighter and darker and, in effect, switching it on and off millions of times per second.
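The conversion of polarization rotation into brightness changes follows the textbook Malus’s law for a polarizer. The snippet below is only a generic illustration of that relation, not the team’s optical model.

```python
import math

def transmitted_intensity(i_in: float, rotation_rad: float) -> float:
    """Intensity after a polarizer when the light's polarization is rotated
    by rotation_rad relative to the polarizer axis (Malus's law)."""
    return i_in * math.cos(rotation_rad) ** 2

# As the acoustic wave sweeps the rotation between 0 and pi/2,
# the output swings between full brightness and darkness.
for angle in (0.0, math.pi / 4, math.pi / 2):
    print(round(transmitted_intensity(1.0, angle), 3))
```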

Atalar believes that although there are other ways to switch light on and off, this acoustic method is superior because it consumes far less energy.

The most appealing aspect of this solution is that the modulator’s design is simple and can easily be added to ordinary cameras, such as those found in smartphones and digital SLRs. Atalar and Amin Arbabian – his adviser, an associate professor of electrical engineering and the project’s senior author – believe the idea could form the basis of a new type of small, cheap, low-power lidar. They call this type of system “standard CMOS lidar” and believe it could be used in drones, extraterrestrial probes, and many other applications.

According to the Stanford team, the modulator could have a major impact on the industry, since it has the potential to add 3D imaging to any camera sensor. To prove this claim, the researchers built a prototype lidar system in the laboratory whose receiver was simply an ordinary digital camera. Their report states that the prototype captured depth maps at megapixel resolution and emphasizes the optical modulator’s low power consumption.

Atalar notes that the team has already cut the system’s energy consumption by at least a factor of ten and believes a reduction of several hundred times is possible. If that is achieved, the dream of small lidar systems built from standard image sensors, as well as 3D cameras for smartphones, would come true. It is also worth noting that the lidar version used by the Stanford team was cheaper than the one used by Apple, so we can expect the system to appear on more devices.
