Drives & Controls April 2023

Identifying the top trends in machine vision technologies

Machine and embedded vision technologies are in a state of constant flux. The latest developments include faster and higher-resolution image sensors, interfaces that support increasing bandwidths, simpler integration, miniaturisation and high on-board processing power. Below we consider some of these trends in greater detail.

Ultra-high-resolution sensors

Image sensor megapixel counts are continuing to climb, allowing industrial cameras to capture more detail. This not only allows microscopic defects to be detected, but also means that a single camera can cover a larger area. Applications that have historically required several cameras may be carried out using just one high-resolution camera, with huge potential for cost savings through reduced complexity, processing, management and capital outlay. Using a single camera also eliminates the need to stitch together multiple images, improving performance by reducing image-processing times.

Larger lenses

The larger the sensor, the bigger the pixel area and the better the image quality. Higher-resolution sensors go hand in glove with larger lenses if a "tunnel effect" is to be avoided. This is a major consideration when designing machine vision systems, because C-mount lenses are designed for sensors of up to a little more than one inch. Anything bigger, and a large-diameter lens is needed.

Micro lenses

At the opposite end of the lens size spectrum, there is a surge in demand for M12 miniature lenses for use in embedded cameras, drones, robotics and autonomous vehicles. The quality of these lenses has improved dramatically in recent years, enabling them to be deployed in applications that were previously the preserve of C-mount lenses. Miniature lenses can now be incorporated into machine vision cameras, resulting in compact systems.
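The sensor-format arithmetic behind this lens choice is simple to sketch. The figures below are illustrative rather than taken from any particular product: a hypothetical 45-megapixel sensor with a 3.2 µm pixel pitch, and an assumed C-mount coverage limit of roughly 17.6 mm (about a 1.1-inch optical format).

```python
# Rough check of whether a sensor fits inside a C-mount image circle.
import math

def sensor_diagonal_mm(width_px, height_px, pixel_pitch_um):
    """Sensor diagonal in mm, from resolution and pixel pitch."""
    w = width_px * pixel_pitch_um / 1000.0   # sensor width in mm
    h = height_px * pixel_pitch_um / 1000.0  # sensor height in mm
    return math.hypot(w, h)

# Assumed coverage limit: C-mount lenses are typically designed for
# optical formats up to a little over one inch (~17.6 mm diagonal).
C_MOUNT_LIMIT_MM = 17.6

# Hypothetical 45 MP sensor (8192 x 5460 pixels, 3.2 um pitch):
diag = sensor_diagonal_mm(8192, 5460, 3.2)
print(f"Sensor diagonal: {diag:.1f} mm")
print("Large-format lens needed" if diag > C_MOUNT_LIMIT_MM else "C-mount OK")
```

At these example figures the diagonal works out to roughly 31.5 mm, well beyond what a C-mount lens is designed to cover, which is exactly the situation where a larger-format lens becomes necessary.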
Such compact systems are particularly advantageous where space is limited, and they bring a cost benefit.

Filter application technologies

The use of filters to block out certain bandwidths of light is not new, but the technologies for applying filters to industrial cameras are evolving. For example, adapters for M12 lenses are now available that allow ambient light to be filtered out in the same way as with C-mount lenses. This means that M12 lenses can be deployed in light-sensitive applications, such as robotic systems in factories where the ambient lighting can change. Filters that only pass light in near-infrared bandwidths can mitigate the effects of varying ambient light levels.

10GigE protocol

As the successor to GigE, 10GigE provides the same benefits but with a ten-fold increase in data and frame rates. Machine vision designers have historically had a choice of GigE or USB3 as protocols for transmitting high-speed video and related data from camera to host. The decision tended to hinge on the cable length required, which in turn related to the number of cameras being used. USB3 is rated for 5 m or less, while a GigE interface can function with cable lengths of up to 100 m. The trade-off with a 1GigE interface is speed: a USB3 link can transmit data five times faster than a 1GigE system. The advent of 10GigE will allow the capabilities of high-performing image sensors, until now limited by the bandwidth of the available interfaces, to be realised.

Embedded vision

These low-cost cameras are increasingly infiltrating the industrial world. The main difference between embedded vision and machine vision is that embedded camera technology is far simpler, owing to its limited data-processing capacity. This makes it better suited to gathering data that is then analysed on a cloud-based platform than to processing data and making decisions in real time.
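The interface trade-offs above are easy to quantify. A back-of-the-envelope estimate of the frame rate each link can sustain, ignoring protocol overhead and using a hypothetical 5-megapixel, 8-bit monochrome sensor, looks like this:

```python
# Rough maximum frame rate a link can sustain for a given sensor.
# Protocol overhead is ignored; all figures are illustrative.

def max_fps(width_px, height_px, bits_per_px, link_gbps):
    """Frames per second allowed by raw link bandwidth."""
    bits_per_frame = width_px * height_px * bits_per_px
    return (link_gbps * 1e9) / bits_per_frame

# Hypothetical 5 MP mono sensor (2448 x 2048, 8 bits per pixel):
for name, gbps in [("1GigE", 1.0), ("USB3 (5 Gbit/s)", 5.0), ("10GigE", 10.0)]:
    print(f"{name}: ~{max_fps(2448, 2048, 8, gbps):.0f} fps")
```

At these illustrative figures, 1GigE tops out around 25 fps, USB3 around 125 fps and 10GigE around 250 fps, mirroring the five-fold and ten-fold speed ratios quoted above.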
Embedded cameras are being used, for example, in vertical farms to monitor and adjust ambient conditions for optimum plant growth.

Wider use of stereo vision

In the past, stereo vision was seen as the domain of experts and required investment in costly software. Now, thanks to the development of low-cost stereo vision sensors by companies such as Arducam, stereo cameras can be paired with open-source stereo vision AI algorithms and 3D capabilities to create systems that are very proficient at depth sensing. One example is robots that navigate around warehouses. However, the limitations of this technology should be respected: the hardware isn't capable of tasks that require high accuracy and repeatability, which remain the domain of traditional, dedicated machine vision cameras.

Paul Wilson is managing director of Scorpion Vision.

Another popular machine vision technology is small, embedded 3D camera modules that use time-of-flight technology to create 3D point clouds in real time. These are ideal for real-time distance measurement applications.
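Point clouds such as those produced by time-of-flight modules are obtained by back-projecting a depth image through the camera's intrinsic parameters using the standard pinhole model. A minimal sketch with NumPy, using made-up intrinsics rather than values from any real module:

```python
# Convert a depth map to a 3D point cloud via the pinhole camera model.
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (metres) into an (H*W, 3) array of XYZ points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# A flat "wall" 2 m away, as seen by a hypothetical 640x480 ToF module
# with placeholder intrinsics (fx, fy, cx, cy):
depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)
```

Each row of the result is one XYZ point; the Z column is the measured range, while X and Y spread out with distance according to the focal lengths.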
