If an ultrasonic wave travels at a speed of 343 meters per second and a receiver detects its echo 0.1 seconds after it is emitted, what is the distance between the ultrasonic sensor and the object?

Because the receiver detects an echo, the wave travels from the sensor to the object and back, so the measured time covers twice the sensor-to-object distance.

Distance = (Speed × Time) / 2
Distance = (343 m/s × 0.1 s) / 2
Distance = 34.3 m / 2
Distance = 17.15 meters

The distance between the ultrasonic sensor and the object is 17.15 meters.
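For reference, here is a minimal Python sketch of the same calculation; the function name echo_distance and the constant SPEED_OF_SOUND are illustrative, and 343 m/s is the assumed speed of sound in air at about 20 °C.

```python
SPEED_OF_SOUND = 343.0  # meters per second, assumed value for air at ~20 °C


def echo_distance(echo_time_s: float, speed_m_s: float = SPEED_OF_SOUND) -> float:
    """Return the one-way distance to the object from the echo round-trip time.

    The echo covers the round trip, so the one-way distance is half of
    speed × time.
    """
    return speed_m_s * echo_time_s / 2


print(echo_distance(0.1))  # 17.15
```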
