July/August 2020

www.smartmachinesandfactories.com | INTERVIEWS & REPORTS

and began developing accelerator cards dubbed PPUs (Physics Processing Units): hardware designed to offload physics calculations from the CPU. In 2008, Ageia was itself acquired by graphics technology manufacturer Nvidia, after which development of PhysX moved away from PPU expansion cards and focused instead on general-purpose computing on graphics cards (GPGPU). GPUs are very efficient at manipulating and displaying computer graphics, and their highly parallel structure makes them more effective than general-purpose CPUs at accelerating the physical simulations that PhysX performs.

PhysX SDK 3.0 was released in May 2011 and represented a significant rewrite of the SDK, with more efficient multithreading and a unified code base for all supported platforms. At GDC 2015, Nvidia made the source code for PhysX available on GitHub, though registration was required. The proprietary SDK was provided to developers free of charge for both commercial and non-commercial use on a wide range of platforms, including Windows, Linux, macOS, iOS and Android.

In late 2018, PhysX was made open source under a three-clause BSD licence, although this applied only to computer and mobile platforms. BSD licences are a family of permissive software licences, imposing minimal restrictions on the use and distribution of the software to which they apply. The original BSD licence took its name from the Berkeley Software Distribution, a UNIX-like operating system; the licence has since been revised, and its descendants are sometimes referred to as modified BSD licences.

Nvidia also offers a "unified physics solver" called FleX, a simulation technique for real-time visual effects. Traditionally, these were generated using separate elements for rigid bodies, fluids, clothing, etc.
Unified physics solvers like FleX enable new effects, in which different simulated substances can interact with each other seamlessly.

Sources claim that the first game to use PhysX was Bet On Soldier: Blood Sport, launched in 2005. As one of a handful of major physics engines, it is used in many games, such as The Witcher 3: Wild Hunt, Warframe, Killing Floor 2, Fallout 4, Batman: Arkham Knight and Borderlands 2. Most of these games use the CPU to process the physics simulations. Video games with optional support for hardware-accelerated PhysX often include additional effects such as tearable cloth, dynamic smoke or simulated particle debris; debris is simulated in Mafia II when PhysX is set to the highest level in the game settings.

Digital twins

By using digital twinning, instead of committing to a heavy upfront capital investment, prospective customers initially need to commit only around 10% of the total project value for prototyping, allocating the remainder once the project is proven.

The company has its base on a light industrial estate in Buckinghamshire, where it is currently running trials. Despite having its roots in the automotive sector, Vikaso is seeking opportunities in medium-sized companies of perhaps 100-500 employees, in any market sector. "The principles are the same in any industry, though product cycle times may vary; the core technology itself remains the same," says Boricha. "But as soon as the company size goes down to around 100, they may not be able to afford services and often prefer to do some mock-ups themselves, rather than look to external support."

Machine learning opens up entirely new possibilities for industrial and collaborative robot applications, allowing both types to perform tasks that were previously impossible. Currently, imitation learning and computer vision are the two main approaches.
With advanced versions of computer vision, complex optical equipment for image capture feeds neural networks so that a robot can "see". In most instances, this translates into robotic guidance to avoid collisions, seam tracking during welding, and ensuring parts are grasped correctly.

With imitation learning, a robot can be programmed by demonstrating how to complete a task. For example, someone could show a collaborative robot how to grasp an object by guiding the robotic arm the first few times. In this way, the robot would learn to grasp the object on its own.

Will further developments in machine learning have a major impact on robotic capabilities? Boricha thinks so. He sees robots being able to learn from their own mistakes. Currently, machine learning is limited to the initial deployment, rather than self-adapting over the lifetime of the robot. "I personally know the challenges that exist in manufacturing environments, the kind of tasks that can be automated but have not been so far," Boricha adds. "As an example, random bin picking is a focus area for us; we are trying to crack it using machine learning."

For further information please visit: https://vikaso.co.uk/virtual-development
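The hand-guided teaching described above can be illustrated with a minimal sketch. This is not code from Vikaso or any robot vendor's SDK; the function and data names are hypothetical, and real imitation-learning systems use far richer models. It simply shows the core idea: several demonstrated trajectories are combined into one learned trajectory that the robot can replay on its own.

```python
# Hypothetical sketch of imitation learning by demonstration (kinesthetic
# teaching). All names here are illustrative, not from any specific robot SDK.

def learn_trajectory(demonstrations):
    """Average several hand-guided demonstrations into one learned trajectory.

    Each demonstration is a list of (x, y, z) gripper waypoints recorded while
    an operator guides the robot arm. All demonstrations are assumed to have
    been resampled to the same number of waypoints.
    """
    n = len(demonstrations)
    learned = []
    # Step through corresponding waypoints across all demonstrations.
    for waypoints in zip(*demonstrations):
        # Average each axis (x, y, z) across the demonstrations.
        avg = tuple(sum(axis) / n for axis in zip(*waypoints))
        learned.append(avg)
    return learned

# Two hand-guided demonstrations of the same grasp, three waypoints each,
# in metres relative to the robot base:
demo_a = [(0.00, 0.00, 0.30), (0.10, 0.00, 0.20), (0.20, 0.00, 0.10)]
demo_b = [(0.00, 0.00, 0.32), (0.12, 0.00, 0.22), (0.22, 0.00, 0.12)]

trajectory = learn_trajectory([demo_a, demo_b])
# The robot would then replay `trajectory` to grasp the object on its own.
```

In practice the averaging step would be replaced by a learned policy (for example, one trained on camera images as well as waypoints), which is what lets the robot generalise beyond the exact poses it was shown.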

RkJQdWJsaXNoZXIy MjQ0NzM=