SDK speeds development for polarized imaging applications

SDK overview

It has been several years since the research and development and market development phases of Sony’s first polarized camera module—the XCG-CP510—began, according to Clauss.

“In this time, from talking with selected key players of targeted markets, it became clear to us that the biggest barrier to adoption was the ability of system developers to work with the new sensor technology easily,” says Clauss.

As a result, Sony has for the first time developed a dedicated image processing library to speed solution development. The SDK’s functions—including stress measurement and glare reduction—run on a standard PC, with the choice of CPU or GPU architecture dictated by the processing functions, resolutions, and frame rates required. With the SDK, Sony suggests that solution development time can be cut from a range of 6 to 24 months down to 6 to 12 weeks, depending on the team.

Key features

Support functions form the first level supplied in the SDK and include demosaicing and raw extraction. The ‘Cosine fit’ function allows the developer to define a virtual polarizer angle for the whole image, while the ‘Average’ function creates a non-polarized image from the raw data, exported simultaneously for comparison with what a standard machine vision camera would see.
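The SDK’s API is not published here, but the math behind ‘Average’ and ‘Cosine fit’ can be sketched. A quad-polarized sensor such as the one in the XCG-CP510 captures four images behind 0°, 45°, 90°, and 135° micro-polarizers, and a per-pixel cosine model then yields the intensity behind any virtual polarizer angle. A minimal NumPy sketch (function names are illustrative, not Sony’s):

```python
import numpy as np

def polarization_fit(i0, i45, i90, i135):
    """Per-pixel cosine model I(theta) = a + b*cos(2*(theta - phi)),
    expressed via the linear Stokes parameters, from the four
    polarizer-angle images of a quad-polarized sensor."""
    s0 = (i0 + i45 + i90 + i135) / 2.0  # total intensity (Stokes S0)
    s1 = i0 - i90                       # Stokes S1
    s2 = i45 - i135                     # Stokes S2
    return s0, s1, s2

def virtual_polarizer(s0, s1, s2, theta_deg):
    """Intensity behind a virtual linear polarizer at angle theta."""
    t = np.deg2rad(theta_deg)
    return 0.5 * (s0 + s1 * np.cos(2 * t) + s2 * np.sin(2 * t))

def average_image(i0, i45, i90, i135):
    """Non-polarized image: what a standard camera would see."""
    return (i0 + i45 + i90 + i135) / 4.0
```

For unpolarized light all four angle images are equal and the virtual polarizer returns the same intensity at every angle; for fully polarized light the virtual angle sweeps between a bright maximum and a dark minimum.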

Pre-processing functions of the SDK calculate various polarization-specific information like the ‘Degree of Polarization’, ‘Stokes Vector’ and the ‘Surface Normal Vector.’ At the higher-end level, ‘Applications-Oriented’ functions have been implemented to manage reflections and measure stress.
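These pre-processing quantities follow directly from the linear Stokes parameters. As an illustrative sketch (again assumed math, not the SDK’s actual API), the degree and angle of linear polarization per pixel:

```python
import numpy as np

def degree_and_angle_of_polarization(i0, i45, i90, i135):
    """Compute the linear Stokes parameters from four angle images,
    then the degree of linear polarization (DoLP, 0..1) and the
    angle of linear polarization (AoLP, radians)."""
    s0 = (i0 + i45 + i90 + i135) / 2.0
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    aolp = 0.5 * np.arctan2(s2, s1)
    return dolp, aolp
```

DoLP near 1 indicates strongly polarized light (e.g. specular glare), while DoLP near 0 indicates diffuse, unpolarized light; AoLP feeds into surface-normal estimation.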

“These functions’ algorithms have been optimized for accuracy and computing speed to enable their use in real-time inspection,” says Clauss.

The Sony XCG-CP510 camera pointed at a perspex block under stress. Stress is altered via the adjustable screw (black rod above); light is shone through the block, with areas under stress altering the polarization of the transmitted light.


The use of polarized camera modules will prove vital in a wide range of applications, suggests Clauss.

“The SDK has developed models for the major subset of these, including weakness detection. In stress monitoring, measuring where stresses are occurring by how the light changes to highlight potential weaknesses—vital in industries such as glass, polyethylene terephthalate (PET), and phone displays—is possible,” he says.

Another application, Clauss says, is inspection for manufacturing and intelligent transportation systems (ITS). Models were developed to extract reflections on an object under inspection caused by unmanaged lighting. This reflection/glare management, according to Sony, will improve quality inspection not only for printed circuit boards and packaging—particularly in the pharmaceutical industry—but also for traffic monitoring, where it makes it possible to confirm the driver’s identity, to see whether a mobile device is in use, or whether a passenger is not strapped in, and so on.
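Sony does not disclose its glare-management algorithm, but a common simple approach is to take, per pixel, the minimum of the cosine fit over all virtual polarizer angles. Because specular glare is strongly polarized while diffuse reflection is largely not, this minimum suppresses glare far better than a plain average. A hypothetical sketch:

```python
import numpy as np

def glare_reduced(i0, i45, i90, i135):
    """Per-pixel minimum over all virtual polarizer angles:
    I_min = (S0 - sqrt(S1^2 + S2^2)) / 2. Strongly polarized
    specular glare is suppressed; unpolarized diffuse light
    passes through largely unchanged."""
    s0 = (i0 + i45 + i90 + i135) / 2.0
    s1 = i0 - i90
    s2 = i45 - i135
    return 0.5 * (s0 - np.sqrt(s1**2 + s2**2))
```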

“Further, scratch identification is another application where polarization technology proves useful. For transparent goods inspection, the camera and SDK make it feasible—with the reflection enhancement—to more easily undertake surface inspection and scratch detection. For the features described here, Sony developed and validated the software in collaboration with customers,” explains Clauss.

Output comparing the averaged (black-and-white, non-polarized) image with a heat map image calculated from the phase retardation correlated to the stress applied to the PET block. Note the stress clearly shown in blue.

Stress testing in glass/PET manufacturing

At VISION 2018, Sony showed two real-time demonstrations of the SDK and polarized camera in action: glare reduction for ITS applications and stress analysis.

A polarized XCG-CP510 GigE Vision camera was set up 50 cm from a PET block backlit with a monochromatic light source and polarizer. The camera was set to output an averaged (non-polarized) image alongside a heat map image, with the software calculating the phase retardation correlated to the stress applied to the PET block.
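How such a setup maps intensity to stress can be illustrated with textbook photoelasticity (not the SDK’s published algorithm): for a birefringent sample between a polarized source and an analyzer in the crossed configuration, transmitted intensity follows I = I0·sin²(δ/2), where the phase retardation δ is proportional to stress via the stress-optic law. A sketch of the per-pixel inversion, under those assumptions:

```python
import numpy as np

def retardation_from_crossed_polarizers(i, i0):
    """Invert I = I0 * sin^2(delta/2) for the phase retardation
    delta (radians, 0..pi). By the stress-optic law, delta is
    proportional to stress, so this map can be rendered directly
    as a stress heat map."""
    ratio = np.clip(i / np.maximum(i0, 1e-12), 0.0, 1.0)
    return 2.0 * np.arcsin(np.sqrt(ratio))
```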

Above the block was a screw that attendees at the Sony booth could turn to change the stress placed upon the block.

“As the images show, in the averaged version—both with no stress and under stress—it is impossible to see the effect with a normal machine vision camera,” according to Clauss. “However,” he added, “using the new software development kit’s algorithms, it is now possible to detect rapid changes.”