DynCoMM: Dynamic Collaborative control of Mobile Manipulators for complex picking
DynCoMM is an advanced robotic solution for external control of mobile platforms to perform dynamic operations in collaborative scenarios. The objective is to control and synchronize both the application operation and the platform dynamics simultaneously, increasing the productivity and flexibility of potential applications. Thus, three prominent aspects must be considered: the environment and process analysis, the collaborative integral control of the mobile manipulator, and the application integration. The DynCoMM solution includes the integration of computer vision algorithms for both the application and the environment reconstruction. The developed Real-Time (RT) external control system will use the obtained information to perform complex picking applications in industrial collaborative scenarios. More specifically, the demonstrator will focus on complex manufacturing processes where manual operations are still needed, addressing technological challenges in the fields of robotics and vision-based Human-Robot Collaboration (HRC).
The project presents a novel solution for mobile manipulators, embedding the robot manipulator into a mobile platform to combine the benefits of both. Combining the two devices increases productivity, as the mechanism will be synchronized to operate in RT with the manufacturing line workers, creating a human-machine collaborative environment. This high level of coordination between the mobile manipulator and the human worker is obtained by integrating a novel artificial vision framework designed to recognize the environment in RT and make decisions about the mobile platform position to maintain human safety without halting production. In this novel environment, the mobile manipulator interacts with the human worker in RT, responding dynamically to their commands and aiding them during the manufacturing process. This transformation optimizes industrial processes, increasing their productivity in three main aspects:
Cost efficiency: Factories will achieve high degrees of automation, increasing their efficiency and long-term profitability.
Ergonomics: In the collaborative environment presented, operators are aided by the mobile platform in their daily tasks. This will reduce worker absenteeism, as the mobile platform will handle the most physically intensive operations, increasing manufacturing competitiveness.
Flexibility and agility: The complexity of manufactured products has increased in recent years, while their life cycles have shortened. This situation requires developing novel technological solutions and processes in constant adaptation to maintain production stability without reducing product quality or increasing costs.
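The vision-driven safety behaviour described above, where the platform repositions itself rather than halting production, can be sketched as a simple distance-based decision policy. The thresholds, names and three-zone structure below are illustrative assumptions for a sketch, not project specifications; real values would come from a risk assessment of the specific cell.

```python
from enum import Enum


class SafetyAction(Enum):
    """Possible platform responses to a detected human (illustrative only)."""
    CONTINUE = "continue"      # human far away: full-speed operation
    SLOW_DOWN = "slow_down"    # human nearby: reduce platform speed
    REPOSITION = "reposition"  # human too close: move away, keep producing


# Hypothetical separation thresholds in metres (assumed values).
SLOW_DISTANCE = 2.0
CRITICAL_DISTANCE = 0.8


def decide_platform_action(human_distance_m: float) -> SafetyAction:
    """Map the perceived human-platform distance to a motion decision."""
    if human_distance_m < CRITICAL_DISTANCE:
        return SafetyAction.REPOSITION
    if human_distance_m < SLOW_DISTANCE:
        return SafetyAction.SLOW_DOWN
    return SafetyAction.CONTINUE
```

In practice the input distance would be estimated in RT by the perception system, and the policy would feed the platform motion planner instead of returning a symbolic action.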
In summary, the project will improve the manufacturing techniques of several industrial sectors, introducing a novel collaborative environment where robots interact with human workers.
The general objective of the project is to develop a novel computer-vision-guided collaborative control solution for mobile manipulators. This concept is represented in Figure 1, where the proposed technological modules are shown. The integration and synchronization of the control and computer vision modules will be done in an external real-time controller that will manage and control the multiple agents and systems involved in the operation (i.e., mobile platforms, collaborative manipulators, perception devices and/or robotic tools).
Therefore, a distributed external control solution that interfaces with both the mobile manipulator and the perception system will be developed. Both vision and control modules will be implemented over an extended ROS architecture as a common framework. Within this framework, specific communication and hardware interfaces will be designed and implemented between the different technological modules, and between the RT control module, the mobile manipulator and its tools.
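The decoupling that a ROS-based architecture provides between modules can be illustrated with a minimal topic-based publish/subscribe sketch. The code below is a self-contained stand-in, not ROS itself; the topic name and message fields are assumptions chosen for illustration.

```python
from collections import defaultdict


class MessageBus:
    """Minimal topic-based publish/subscribe bus, mimicking how the vision
    and control modules would exchange data over ROS topics."""

    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to be invoked for every message on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver `message` to every subscriber of `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)


# Hypothetical wiring: the vision module publishes a detected object pose,
# and the RT control module consumes it to plan a pick.
bus = MessageBus()
received = []
bus.subscribe("/vision/object_pose", received.append)
bus.publish("/vision/object_pose", {"x": 0.42, "y": -0.10, "z": 0.05})
```

In the actual system, ROS publishers and subscribers would replace this in-process bus, giving the same loose coupling plus inter-process and inter-machine communication.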
The onboard external controller will synchronize and coordinate the operation of the multi-agent system (mobile platform, collaborative manipulator and tools), controlling the whole assembly as a single system with N degrees of freedom. This will allow the system to compensate for deviations and/or dynamic changes in the environment that could not be addressed by controlling each piece of equipment individually.
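The idea of treating the platform and the arm as one N-degree-of-freedom system can be sketched by stacking the base pose and the arm joint angles into a single configuration vector and integrating one joint-space velocity command over a control period. The dimensions, names and values below are illustrative assumptions, not the project's actual interfaces.

```python
def step_unified_configuration(q, q_dot, dt):
    """Advance the combined platform+arm configuration `q` by the joint-space
    velocity `q_dot` over one control period `dt` (simple Euler integration)."""
    if len(q) != len(q_dot):
        raise ValueError("configuration and velocity dimensions must match")
    return [qi + vi * dt for qi, vi in zip(q, q_dot)]


# Example: a 3-DOF base pose plus a 6-DOF arm form a single 9-DOF system.
base = [0.0, 0.0, 0.0]                    # x [m], y [m], heading [rad]
arm = [0.0, -1.57, 1.57, 0.0, 1.0, 0.0]  # joint angles [rad]
q = base + arm

# A deviation detected by the perception system can then be compensated by
# commanding base and arm velocities together in one vector.
q_dot = [0.1, 0.0, 0.0, 0.0, 0.05, 0.0, 0.0, 0.0, 0.0]
q_next = step_unified_configuration(q, q_dot, dt=0.01)
```

Commanding both devices through one vector is what lets the controller trade base motion against arm motion when compensating a deviation, which independent per-device controllers cannot do.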