Extracting 3D information from the environment is an essential ability for modern robotic systems. The growing availability of 3D sensing modalities for roboticists clearly illustrates this need: many autonomous robot projects make use of 3D cameras, 3D LiDAR, or similar optical sensors. In environmental conditions where these optical techniques fail due to medium distortions (fog, dust, etc.), other sensing modalities are preferred. Such challenging environments are often encountered in applications such as automotive, construction equipment, agriculture, and mining, where sensing modalities based on acoustics (sonar) or electromagnetism (radar) can serve as a valid alternative to optical sensing techniques. Inspired by the echolocation system of bats, we previously proposed a 3D imaging sonar sensor capable of localizing objects in the full frontal hemisphere with respect to the sensor, and applied it to a variety of robotic tasks such as navigating corridor-like environments and SLAM. While the developed imaging sensor proved very capable of providing exteroceptive sensor data, its dependence on an external computational resource running dedicated signal processing algorithms hinders the real-world applicability of this technology. In this presentation we introduce our embedded real-time 3D imaging sonar (eRTIS), which encapsulates the complete sensor operation to enable real-world applications.