Sony Europe B.V. is announcing the commercialization of the AS-DT1 LiDAR Depth Sensor. The AS-DT1 is the world's smallest and lightest LiDAR depth sensor, measuring just 29 mm x 29 mm x 31 mm (approximately 1.14 in width x 1.14 in height x 1.22 in depth), excluding protrusions, and weighing only 50 g (approximately 1.76 oz). The AS-DT1 leverages the miniaturization and optical lens technologies of Sony's machine vision industrial cameras, making it ideal for applications where space and weight constraints are paramount, including drones, robotics, and more.
Maximum range @ 15 fps, 50% reflectance, center of field:
Indoor 40 m, outdoor 20 m (to be confirmed in final specifications)
Accuracy @ 10 m:
Indoor & outdoor ±5 cm (to be confirmed in final specifications)
Detection points:
576 (24 x 24)
Frame rate:
30 fps, or 15 fps in maximum-range mode
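The spec figures above imply an angular resolution that can be estimated directly. A small sketch, assuming the 24 x 24 detection points are spread uniformly across the 30° field of view stated further below (whether the FOV is identical on both axes is an assumption here):

```python
import math

FOV_DEG = 30.0        # field of view per axis (assumed square FOV)
POINTS_PER_AXIS = 24  # 24 x 24 = 576 detection points
MAX_RANGE_M = 40.0    # indoor maximum range at 15 fps

# Angular pitch between adjacent detection points, assuming a uniform grid.
pitch_deg = FOV_DEG / POINTS_PER_AXIS

def point_spacing(range_m: float) -> float:
    """Approximate lateral spacing between neighbouring points at a given range."""
    return 2.0 * range_m * math.tan(math.radians(pitch_deg) / 2.0)

print(f"angular pitch: {pitch_deg:.2f} deg")           # 1.25 deg
print(f"spacing @ 10 m: {point_spacing(10.0):.3f} m")  # ~0.218 m
print(f"spacing @ 40 m: {point_spacing(MAX_RANGE_M):.3f} m")  # ~0.873 m
```

In other words, at the indoor maximum range an obstacle narrower than roughly 0.9 m could in principle fall between two detection points, which is worth keeping in mind when sizing detection targets.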
Key features: compact size, SPAD sensor, 30 fps, high accuracy, long distance range

| Specification | Value |
|---|---|
| Sensor | SPAD |
| Frame rate | 30 fps |
| Width | 29 mm |
| Height | 29 mm |
| Depth | 31 mm |
| Weight | 50 g |
| FOV | 30° |
| Accuracy | ±5 cm |
| Distance range | 0.3 m to 40 m |
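The per-point depth readings can be turned into a 3D point cloud using the geometry from the table above. A minimal sketch, assuming a uniform 24 x 24 ray grid across a square 30° FOV and that each reading is the distance along its ray (both assumptions, not confirmed by the specification):

```python
import numpy as np

FOV_DEG = 30.0  # assumed identical horizontal and vertical FOV
N = 24          # 24 x 24 detection points

def grid_to_points(depth: np.ndarray) -> np.ndarray:
    """Convert an N x N depth map (metres, distance along each ray)
    into an (N*N, 3) array of XYZ points, assuming rays are spread
    uniformly across the FOV and the sensor looks along +Z."""
    half = np.radians(FOV_DEG) / 2.0
    angles = np.linspace(-half, half, N)   # per-axis ray angles
    az, el = np.meshgrid(angles, angles)   # azimuth, elevation
    # Unit ray directions.
    x = np.sin(az) * np.cos(el)
    y = np.sin(el)
    z = np.cos(az) * np.cos(el)
    rays = np.stack([x, y, z], axis=-1)    # (N, N, 3) unit vectors
    return (rays * depth[..., None]).reshape(-1, 3)

# Readings of 10 m on every ray map to points exactly 10 m from the origin.
pts = grid_to_points(np.full((N, N), 10.0))
print(pts.shape)  # (576, 3)
```

This is only a geometric illustration; a real integration would use the calibration model provided with the sensor rather than an idealized uniform grid.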
More automated systems are all around us. From ground-based robots to driverless vehicles and drones, we see the increasing application of machines that operate with an accurate, real-time understanding of their surroundings.
Light Detection and Ranging (LiDAR) has become a foundational technology for such capability. By measuring distances with laser pulses, LiDAR sensors can be used to capture precise shapes, distances, and motion, reliably detecting obstacles, no matter whether it is day or night. This capability enables autonomous systems to construct ultra-detailed maps of their surroundings in an instant. It is this consistent, predictable spatial awareness that underpins safe decision-making without a human in the loop.
For autonomous systems, accurate perception is not a nice-to-have feature; it is a critical performance enabler. Take ground-based robots moving around a factory floor, for example. Their systems must continuously interpret their surroundings with certainty, often in unpredictable environments containing obstacles of many types. Depth errors of just a few centimetres can invalidate navigation paths and collision avoidance decisions. Equally, performance must remain consistent across bright sunlight, low light, reflective surfaces, and low-contrast objects – all common in modern industrial settings.
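One way to see why centimetre-level errors matter: a planner that budgets for the sensor's error band must treat every obstacle as potentially closer than measured. A hypothetical clearance check, using the ±5 cm accuracy figure from the specification above (the function and thresholds are illustrative, not part of any Sony API):

```python
SENSOR_ACCURACY_M = 0.05  # +/- 5 cm depth accuracy, from the spec above

def is_path_clear(measured_distance_m: float, required_clearance_m: float) -> bool:
    """Conservative obstacle check: treat the obstacle as sitting at the
    *nearest* distance consistent with the sensor's error band."""
    worst_case_m = measured_distance_m - SENSOR_ACCURACY_M
    return worst_case_m >= required_clearance_m

print(is_path_clear(0.60, 0.50))  # True: worst case is 0.55 m, still clear
print(is_path_clear(0.54, 0.50))  # False: worst case is 0.49 m, inside clearance
```

A less accurate sensor forces a larger safety margin everywhere, which in tight factory aisles directly reduces the space a robot is allowed to use.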
Therefore, sensing technology must offer high accuracy, low latency, environmental robustness, and the ability to operate reliably without human involvement. And it needs to be compact and lightweight for easy integration within embedded environments. It is within this highly demanding context that advances in LiDAR architecture have become critical to the next generation of autonomous capability.