People in an office setting during a mixed reality and virtual communications meeting.





      Eoin English, Senior Principal Engineer, Analog Devices

      Eoin English is a senior principal engineer leading the Consumer System Applications Team at Analog Devices’ European Research and Development Centre in Ireland. He joined ADI in 1997 with a first-class honors BEng in electronic engineering from University College Galway. In 2017, he completed a certificate in system design management at MIT, and he currently holds 26 granted U.S. patents. Eoin takes great satisfaction in enabling technologies that help people live more comfortably, stay healthier, be more productive, and engage more interactively through the fusion of audio, vision, and human body sensing.


      We’ve all become accustomed to using Microsoft Teams, Zoom, and Google Meet to enhance the remote meeting experience. But even so, all these interactions are still in 2D. What’s missing is a truly immersive, lifelike, 3D experience.

      Despite the hype around the Metaverse, AR/VR, and even Holoportation (high-quality 3D models of a person, reconstructed and transmitted anywhere in the world in real time), we are still a long way from the ultimate extended-reality headset.

      Transforming 2D interactions into immersive 3D ones is difficult and requires many technologies to be fused together. While 3D displays and spatial audio content have existed in isolation for some time, they have really only been static, requiring fixed viewing and listening positions. 3D time of flight technology goes to the next level in bridging the physical and digital worlds by transforming static 3D interactions into contextually aware, immersive interactions that dynamically adjust to the user’s and the machine’s context in all types of environments—human to remote human, human to machine, or machine to machine.

      From Grade School to Grad School—The Advancement of Smart Devices

      Smart home devices like Amazon Alexa and Google Home are a normal part of life. Worldwide shipments of smart home devices grew nearly 12% in 2021 according to analysts at IDC, and the same study predicts double-digit growth through 2026. Consumers now expect everything to evolve and become smart and touchless, including light bulbs, home appliances, televisions, cars, and more.


      As the smart home has evolved over the last decade, so, too, has the definition of what it means to be smart. Basic smart devices are typically straightforward—they’re internet connected and controlled or monitored, like a robot vacuum. More advanced smart devices are context aware. They might also be GPS enabled, knowing, for example, to turn the heat on in your house when you come within a certain radius, without you ever having to touch a button or screen.

      Inside a family home with visualization of smart home technology.

      Even smarter devices add a level of personalization by listening to your voice and understanding you when you speak. For example, your home’s smart speaker can double as a house manager that can prepare for bedtime by lowering your blinds, dimming your lights, adjusting the temperature, and playing mellow music.

      The next shift in the smart revolution adds another human sense to our devices—vision. These devices can examine and analyze the world around us, then make decisions based on that analysis. The technology behind those devices is depth sensing, of which a key enabler is time of flight.


      A time of flight camera measures distance by bouncing a beam of light off an object and back to a sensor, then measuring the time delay between when the light is emitted and when the reflected light is received by the sensor. The process is similar to ultrasound, which uses sound rather than light to measure distance, and to radar, which uses radio waves. Time of flight cameras can generate high resolution depth maps (with spatial resolutions similar to those of RGB cameras) and precise depth accuracy, and they operate faster and over a greater range than ultrasound—after all, the speed of light is far greater than the speed of sound. While radar has a longer range, time of flight is more accurate, with higher resolution.
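The round-trip principle described above boils down to one formula: distance is half of the speed of light multiplied by the measured delay. A minimal Python sketch (a hypothetical illustration of the principle, not any vendor's implementation; the function name and the 10 ns example are assumptions):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the object: the light travels out and back,
    so the one-way distance is half of (speed x elapsed time)."""
    return C * round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of depth,
# which hints at the timing precision a ToF sensor needs.
print(round(tof_distance_m(10e-9), 3))  # -> 1.499
```

The tiny time scales involved are why ToF imagers need dedicated hardware rather than general-purpose timers.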

      There are two primary methods for measuring the time delay in time of flight: indirect time of flight (iToF) and direct time of flight (dToF):

      Indirect ToF and Direct ToF

      • In an iToF system, a continuous-wave (CW) method is used, which measures the phase shift between the emitted and received light.
      • In a dToF system, a pulse-based method is used, which measures the elapsed time between the emitted light pulse and the received light pulse.

      One benefit of CW iToF image sensors is that they are mass produced on traditional semiconductor infrastructure, achieving high pixel densities for short range imaging at an affordable cost.
      Graphic depicts Time of Flight 3D mapping technology.
      ToF empowers contextual awareness for better informed decisions—from adjusting room temperature due to the number of people in the room, to identifying exhibits of interest in shopping experiences.
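To make the iToF method above concrete: a CW system converts the measured phase shift into distance, and its modulation frequency sets the maximum unambiguous range. A short sketch under stated assumptions (the standard CW-ToF relations d = c·φ/(4π·f_mod) and range = c/(2·f_mod); the 100 MHz figure and function names are illustrative, not from the article):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance_m(phase_rad: float, f_mod_hz: float) -> float:
    """iToF: distance inferred from the phase shift of the
    CW-modulated light: d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

def unambiguous_range_m(f_mod_hz: float) -> float:
    """Past this range the phase wraps beyond 2*pi and distances alias."""
    return C / (2.0 * f_mod_hz)

# At 100 MHz modulation, the unambiguous range is about 1.5 m,
# and a phase shift of pi maps to roughly 0.75 m.
print(round(unambiguous_range_m(100e6), 3))       # -> 1.499
print(round(itof_distance_m(math.pi, 100e6), 3))  # -> 0.749
```

This trade-off is why iToF favors short range, high resolution imaging: a higher modulation frequency improves depth precision but shrinks the unambiguous range, whereas dToF simply times the pulse and scales more naturally to longer distances.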

      The choice between iToF and dToF depth imaging systems comes down to the application need and the use case context. iToF is suited to short range imaging use cases (roughly 0.5 m, 5 m, and 10 m) that require high spatial resolution, while dToF is more applicable to longer range imaging requiring lower spatial resolution. Artificial intelligence (AI) and optical system design are blurring the boundaries between the two technologies. In context-aware intelligent edge systems, both iToF and dToF sensors are fused with RGB images and inertial sensors, with the assistance of AI, to enhance performance and remove artifacts.

      Just like the mouse revolutionized computer interaction and touch screen technologies drove the adoption of smartphones and tablets, time of flight is empowering contactless 3D interaction. Time of flight technology is having a similar effect in advancing Industry 4.0. From industrial machine vision for quality inspection, to volumetric detection for asset management, to navigation for autonomous manufacturing, the manufacturing industry is adopting these sensing technologies and moving toward high resolution systems designed for harsh industrial environments.

      But where is time of flight revolutionizing our daily lives? What is today’s breakthrough that would be equivalent to the color television or personal computer?


      Applications that utilize time of flight technologies are continually emerging all around us—from in-cabin safety in automobiles, to home exercise equipment, to gaming and lifelike 3D remote collaborations. One future application for time of flight is in autonomous vehicles, serving as a complement to radar, LIDAR, and other depth sensors.

      Here are a few of the smartest uses of time of flight technologies, which are making life better, safer, and more enjoyable:

      People collaborating in a virtual setting.

      3D Remote Collaborations

      ToF, combined with advances in hardware and software, enables friends, families, and coworkers to feel as if they’re together, even when they’re miles apart. With immersive, 3D meetings you’re no longer constrained by the flatness of 2D images, so you see life-size, three-dimensional people, as if they’re truly in the same room with you.

      Person inside a car with in-cabin assisted automotive technologies.

      In-Cabin/Assistive Automotive Technology

      Time of flight helps you enjoy advanced driver assistance systems (ADAS) with facial and motion monitoring to detect a sleepy driver or warn you of veering out of your lane. Gesture-controlled technologies help you answer phone calls, change audio, or adjust climate control without taking your eyes off the road.

      Woman exercising with a smart technology exercise mirror.

      Home Exercise Equipment

      The home fitness industry has evolved beyond online spin and yoga classes to now include smart fitness mirrors. When the built-in virtual trainer gives you pointers on your squat form, that trainer’s brain is powered by time of flight.

      Smart Home theater applications.

      Home Theater

      Time of flight makes your home theater system smart enough to dynamically adjust the sound equalization to compensate for changes in the listener’s location or physical changes in the environment, such as a new piece of furniture.

      Woman shopping using mixed and virtual reality.

      Virtual Shopping

      Online shopping has become so easy and fast that many of us dread going into a store for the items we want to see in person. Time of flight allows you to measure a space and redesign your home using your phone, or have your avatar try on clothes before you purchase them.

      Gaming and the metaverse.

      Gaming and the Metaverse

      In AR/VR headsets, depth information acquired by the time of flight system allows people to interact via hand tracking with virtual objects placed in the real world, more accurately merging the physical and digital worlds for an immersive experience.

      Smart factory technologies utilized in machine vision.

      Smart Factories

      Machine vision empowered by ToF technologies is a key enabler of flexible, adaptable production configurations. This allows humans to work safely and collaboratively alongside machines, increasing productivity, enhancing quality, and maximizing factory utilization.


      Woman exercising in front of a smart exercise mirror.

      If innovation continues to accelerate at its current rate, there could be another life-changing innovation in the next decade or two. The proliferation of depth sensors and imaging technologies may create a smart device we haven’t even imagined yet, or an entirely new technology could emerge, making smart mirrors and AR headsets seem as simple as a robot vacuum.

      ADI helps customers co-create and leverage insights from the intelligent edge, including time of flight system solutions. ADI’s industry-leading time of flight imaging sensors and complete system solutions integrate data processing, laser driver, power management, and software/firmware into one unit to enable next-generation intelligent edge solutions.