Drives & Controls Magazine November/December 2024

Where are we in the physical AI revolution?

Artificial intelligence has the potential to change the way that we automate our factories. Anders Billesø Beck, vice-president for strategy and innovation at Universal Robots, considers some of the far-reaching implications of AI and suggests how it might affect robotics in the future.

In 10, 20 or 50 years' time, if we look back at 30 November 2022, we might remember it as a historic turning-point. The launch of ChatGPT on that day may come to be seen as having started an era of the widespread use of artificial intelligence. Since then, AI and ML (machine learning) have been hot topics of conversation.

AI and ML are not new technologies. We have known them for decades, but the recent revolution is basically down to advances in computing power that are finally allowing us to handle the enormous amounts of data needed to take on the complex tasks we're starting to use AI for. The companies behind all this, such as Nvidia, are enjoying extraordinary growth – and rightly so.

Before this year's Computex technology conference, Nvidia founder and CEO Jensen Huang highlighted the transformative power of generative AI, predicting a major shift in computing. “The intersection of AI and accelerated computing is set to redefine the future,” he stated, setting the stage for discussions on cutting-edge innovations, including the emerging field of physical AI, which is poised to revolutionise robotic automation.

But here, in late 2024, what progress have we made in the physical AI revolution – on a scale from 1 to 5, say? To be honest, we haven't really got that far.

I like to compare robotics to the development of self-driving cars. The automotive industry has defined five stages for the transition from manual to fully autonomous driving. Currently, the industry is not yet on level 5, as recent experiments in the US have shown, but the upside is that there are already a lot of level 2, 3 or 4 technologies that can have a major impact – such as adaptive cruise control in cars, which has turned a manual activity into a semi-automated process, making driving smoother, easier and safer.

The same goes for robotics. One day, AI will certainly lead to humanoid robots that can think and figure out how to solve problems by themselves without prior programming – that would be level 5. But, as with self-driving cars, we will see, and are already seeing, plenty of breakthroughs on levels 2, 3 and 4 that are providing true value to businesses.

One of these breakthroughs, for example, can be seen in the field of logistics. In partnership with Siemens and Zivid, Universal Robots has developed a cobot that can perform order-picking with total autonomy, based on Siemens' Simatic Robot Pick AI software and Zivid's vision technology. Compared to manual processes, this significantly enhances the speed and accuracy of order fulfilment in warehouses, and helps logistics centres to meet growing global demand, while also dealing with the increasing difficulty of attracting labour for this kind of manual work.

Getting to a level 5 humanoid robot will rely heavily on, among many things, having outstanding vision technology and software at a level we are yet to see. But intermediate-stage technological innovations are delivering a lot of value on the way.
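To make that sense-decide-act workflow concrete, here is a minimal, hypothetical sketch of a vision-guided picking loop of the kind such systems automate. It is not the Simatic Robot Pick AI or Zivid API: the capture, grasp-selection and motion functions below are stand-ins I have invented for illustration, and the "highest point in the pile" heuristic is a deliberately simple placeholder for the learned pick-point models that commercial software provides.

```python
# Hypothetical vision-guided bin-picking loop (illustrative sketch only).
# capture_point_cloud() and move_and_pick() stand in for a real 3D camera
# SDK and robot controller; select_pick_point() uses a naive heuristic,
# not a learned model.

import numpy as np


def capture_point_cloud(num_points: int = 5000) -> np.ndarray:
    """Placeholder: return an Nx3 point cloud of the bin contents (metres)."""
    rng = np.random.default_rng()
    xy = rng.uniform(-0.2, 0.2, size=(num_points, 2))   # bin footprint
    z = rng.uniform(0.0, 0.15, size=(num_points, 1))    # pile height
    return np.hstack([xy, z])


def select_pick_point(cloud: np.ndarray) -> np.ndarray:
    """Naive grasp heuristic: pick the highest point in the pile."""
    return cloud[np.argmax(cloud[:, 2])]


def move_and_pick(target_xyz: np.ndarray) -> None:
    """Placeholder for the robot motion and gripper commands."""
    print(f"Picking at x={target_xyz[0]:.3f}, y={target_xyz[1]:.3f}, "
          f"z={target_xyz[2]:.3f} m")


def picking_cycle(max_picks: int = 3) -> None:
    """One simplified order-picking cycle: sense, decide, act, repeat."""
    for _ in range(max_picks):
        cloud = capture_point_cloud()
        target = select_pick_point(cloud)
        move_and_pick(target)


if __name__ == "__main__":
    picking_cycle()
```

The point of the sketch is the structure of the loop, not the details: in a real cell, the value added by AI sits almost entirely in the grasp-selection step, where learned models replace hand-tuned heuristics for parts the system has never seen before.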
Three impacts

Getting a group of robotics experts to agree on where we currently are on the above scale could start a lengthy discussion. But it's obvious that, when looking at the disruptive potential of physical AI, we still have much ground to cover – despite the great advances that have been made in the past couple of years.

Looking forward, let me highlight three of the impacts that I believe physical AI will have on robotics:

■ AI will largely eliminate the need for experts

We will, of course, still need robotics engineers, integrators and other skilled experts in the future, and plenty of them. But the potential of robotic automation is so large that

Micropsi Industries' Mirai is a vision-based control system that uses AI to allow industrial robots to deal with variances in position, shape or movement. With it, cobots can handle flexible components, such as cables, robustly and precisely – thus, for example, taking over end-of-line testing tasks.
