Sony Europe B.V. is announcing the commercialization of the AS-DT1 LiDAR Depth Sensor. The AS-DT1 is the world’s smallest and lightest LiDAR depth sensor, measuring just 29 mm x 29 mm x 31 mm (approximately 1.14 inch width x 1.14 inch height x 1.22 inch depth), excluding protrusions, and weighing only 50 g (approximately 1.76 ounces). The AS-DT1 leverages miniaturization and optical lens technologies from Sony’s machine vision industrial cameras, making it ideal for applications where space and weight constraints are paramount, including drones, robotics, and more.
Maximum range @ 15 fps, 50% reflectance, center:
Indoor 40 m, outdoor 20 m (to be confirmed in final specifications)
Accuracy @ 10 m:
Indoor & outdoor ±5 cm (to be confirmed in final specifications)
Detection points:
576 (24 x 24)
Frame rate:
30 fps, or 15 fps in maximum-range mode
Key features: compact size, SPAD sensor, 30 fps frame rate, high accuracy, long distance range.

| Specification | Value |
|---|---|
| Sensor | SPAD |
| Frame rate | 30 fps |
| Width | 29 mm |
| Height | 29 mm |
| Depth | 31 mm |
| Weight | 50 g |
| FOV | 30° |
| Accuracy | ±5 cm |
| Distance range | 0.3 m – 40 m |
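As a rough aid to interpreting the table, the 24 x 24 detection grid over a 30° field of view can be translated into a per-point angular pitch and an approximate lateral spot spacing at a given range. This sketch assumes a square field of view with evenly spaced points, which the published specifications do not state.

```python
import math

# Rough per-point resolution from the published specs: 24 x 24 points over a 30° FOV.
# Assumes a square FOV with evenly spaced points (an assumption, not a stated spec).

FOV_DEG = 30.0
POINTS_PER_AXIS = 24

angular_pitch_deg = FOV_DEG / POINTS_PER_AXIS  # 1.25° between adjacent points

def lateral_spacing_m(range_m: float) -> float:
    """Approximate centre-to-centre spot spacing at a given range (small-angle)."""
    return range_m * math.radians(angular_pitch_deg)

print(f"{angular_pitch_deg:.2f} deg/point")            # 1.25 deg/point
print(f"{lateral_spacing_m(10.0) * 100:.1f} cm @ 10 m")  # ~21.8 cm between points
```

At the sensor's 10 m accuracy test distance, adjacent detection points are therefore roughly 20 cm apart under these assumptions, which gives a feel for the obstacle sizes the grid can resolve.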
More automated systems are all around us. From ground-based robots to driverless vehicles and drones, we see the increasing application of machines that operate with an accurate, real-time understanding of their surroundings.
Light Detection and Ranging (LiDAR) has become a foundational technology for such capability. By measuring distances with laser pulses, LiDAR sensors can capture precise shapes, distances, and motion, reliably detecting obstacles day or night. This enables autonomous systems to construct ultra-detailed maps of their surroundings in an instant. It is this consistent, predictable spatial awareness that underpins safe decision-making without a human in the loop.
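The ranging principle described above reduces to a simple relation: a pulse travels to the target and back, so distance is the speed of light times the round-trip time, divided by two. A minimal sketch, with purely illustrative numbers (not AS-DT1 measurements):

```python
# Time-of-flight ranging: distance = speed of light * round-trip time / 2.
# Values below are illustrative only.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Convert a round-trip pulse time (seconds) to target distance (metres)."""
    return C * round_trip_s / 2.0

# A pulse returning after ~66.7 ns corresponds to a target roughly 10 m away.
print(round(tof_distance_m(66.7e-9), 2))
```

The nanosecond timescales involved are why direct time-of-flight sensors pair the laser with fast detectors such as SPAD arrays, which can time single returning photons.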
For autonomous systems, accurate perception is not a nice-to-have feature; it is a critical performance enabler. Take ground-based robots moving around a factory floor, for example. Their systems must continuously interpret their surroundings with certainty, often in unpredictable environments containing obstacles of many types. Depth errors of just a few centimetres can invalidate navigation paths and collision-avoidance decisions. Equally, performance must remain consistent across bright sunlight, low light, reflective surfaces, and low-contrast objects – all common in modern industrial settings.
Therefore, sensing technology must offer high accuracy, low latency, environmental robustness, and the ability to operate reliably without human involvement. And it needs to be compact and lightweight for easy integration within embedded environments. It is within this highly demanding context that advances in LiDAR architecture have become critical to the next generation of autonomous capability.
Automation depends on more intelligent machines that can perceive and understand the world around them. Advances in miniaturised LiDAR are delivering precise, real-time 3D vision across increasingly complex environments.
Sony’s AS-DT1 LiDAR depth sensor has been put through its paces in an innovative drone project focused on collision avoidance and landing.
Autonomous drone navigation in industrial environments presents a significant challenge. Facilities such as warehouses, factories and logistics hubs are often dense, fast-moving and GPS-denied, requiring systems that can interpret their surroundings instantly and reliably. Traditional single-point laser sensors offer limited spatial awareness, making precise positioning, collision avoidance and safe landing difficult. As a result, there is a growing need for more advanced sensing technologies capable of delivering detailed, real-time 3D environmental data.
To address this, UK-based industrial automation specialist OEM Automatic explored the potential of integrating Sony ISS’s AS-DT1 LiDAR depth sensor in a pilot drone project designed to test real-time 3D sensing and in-flight data streaming.
The project – the brainchild of Ibrahim Ahmethan, a Computer Vision Application Engineer at OEM Automatic – aimed not only to validate the sensor’s technical capabilities, but also to better understand how it could be integrated into autonomous platforms operating in industrial settings. By combining in-house hardware and software development, the team set out to build a fully functional proof-of-concept drone capable of capturing and processing depth data during flight.