Industrial robots emerged at the dawn of Industry 3.0, along with computerized control and automation, and have evolved over many years, becoming specialized for various industries and processes. Robots are designed for mass production; they are generally solitary, working in relative isolation on specific tasks. With the emergence of Industry 4.0, cyber-physical systems, and the Internet of things (IoT), some robots evolved into collaborative robots, called cobots. Cobots interact with their environment, including people and other robots, and support flexible manufacturing and mass customization (Figure 1).
Figure 1: Conventional industrial robots operate in isolation (left) while cobots (right) are designed to interact with their environment, including people and other robots or machines. (Image source: Omron)
The evolutionary path from robot to cobot has included numerous adaptations: cobots operate differently; they are programmed differently; they tend to be smaller, simpler, and in some cases mobile; they are used for different processes than robots; and they must adhere to different safety standards. Cobots generally don’t compete with or replace robots; they expand the opportunities to employ automated processes.
This article traces the evolution of robots into cobots: it compares how robots and cobots operate differently; reviews the different programming methods used with cobots; discusses the use of artificial intelligence (AI), the IoT, and other technologies to enable cobot mobility and interaction with people; details some of the applications where cobots excel such as process finishing operations, quality control, logistics/material transport, and others; and reviews the expanded safety standards for cobots. Throughout, it paints a picture of future cyber-physical operations that merge robots, cobots, and people to maximize productivity and quality, while minimizing overall costs.
Cobots are designed to not only work with people but be moved from place to place (Figure 2). These characteristics have important implications for cobot programming, where and when cobots are used, and cobot safety requirements.
Figure 2: Cobots can be moved from place to place as needed for specific tasks. (Image source: Omron)
Industrial robots are programmed using languages such as C and C++. Cobots, by contrast, have evolved to be “taught” using various no-code tools such as pendants, tablet computers, and even manually moving the cobot arm from point to point (Figure 3). Teaching instead of traditional programming lets cobots learn new tasks more quickly, which is important when the cobot is moved from task to task. The time it takes to program an industrial robot makes economic sense because the robot runs for relatively long periods in high-production applications; cobots, on the other hand, need to learn new processes quickly to avoid extended periods of expensive downtime. Machine operators can teach cobots specific tasks without needing help from specialist programmers. Tasks such as pick-and-place, including visual inspection of the results, can be taught to a cobot in a matter of minutes.
Figure 3: A cobot can be trained by moving its arm from position to position. The right hand of the operator is on a high-resolution camera the cobot can use to see where it is and what is at that location. (Image source: Omron)
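As a concrete illustration, hand-guided teaching of the kind shown in Figure 3 can be reduced to recording joint waypoints while the operator moves the arm, then replaying them. The sketch below uses a mock arm in place of a real cobot driver; the class and method names are illustrative assumptions, not any vendor’s actual SDK.

```python
# Minimal sketch of hand-guided "teach and replay" programming, using a mock
# arm in place of a real cobot driver. All class and method names here are
# illustrative assumptions, not any vendor's actual SDK.

class MockArm:
    """Stand-in for a cobot driver: tracks six joint angles in degrees."""
    def __init__(self):
        self.joints = [0.0] * 6

    def get_joint_positions(self):
        return list(self.joints)

    def move_to_joint_positions(self, pose, speed_scale):
        self.joints = list(pose)  # a real driver would interpolate the motion

class TeachPendant:
    """Records waypoints as the operator hand-guides the arm, then replays them."""
    def __init__(self, arm):
        self.arm = arm
        self.waypoints = []

    def record_waypoint(self):
        self.waypoints.append(self.arm.get_joint_positions())

    def replay(self, speed_scale=0.5):
        for pose in self.waypoints:
            self.arm.move_to_joint_positions(pose, speed_scale)

# Teaching session: the operator moves the arm and "records" each pose.
arm = MockArm()
pendant = TeachPendant(arm)
arm.joints = [0, -45, 90, 0, 45, 0]    # operator hand-guides to the pick pose
pendant.record_waypoint()
arm.joints = [90, -30, 60, 0, 30, 0]   # operator hand-guides to the place pose
pendant.record_waypoint()
pendant.replay()                        # cobot repeats the taught motion
```

The key design point is that no code is written on the shop floor: the “program” is just the list of recorded poses, which is why a new task can be taught in minutes.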
AI plus machine vision can help improve cobot learning and functioning. Intelligent cobot vision systems provide a range of capabilities such as object identification and positioning, barcode and totem interpretation, pattern matching, and color recognition. The vision system can also enable hand gestures to guide the cobot from position to position and teach it a new process. In other cases, machine operators can quickly and efficiently teach cobots using a drag-and-drop flowchart-based system on a tablet computer (Figure 4).
Figure 4: Intuitive drag-and-drop teaching/programming maximizes cobot productivity and flexibility. (Image source: Omron)
In addition to working with people, cobots can team with autonomous mobile robots (AMRs) to move from task to task (Figure 5). AMRs are specialist cobots that work collaboratively with people, cobots, robots, and machines, performing tasks such as material handling with excellent efficiency. Like material handling, moving a cobot from place to place is not a highly skilled activity, making it well suited to AMRs. To navigate, AMRs combine onboard sensors and computing, which give them an understanding of their immediate environment, with wireless connections to centralized computing resources and facility-wide sensor networks. Together, these help an AMR locate obstacles on a planned route and navigate efficiently around fixed obstacles such as workstations, racks, and robots, as well as variable obstacles such as forklifts, other AMRs, and people.
Figure 5: A manipulator cobot (top) can be picked up and moved to a new workstation by an autonomous mobile robot (bottom). (Image source: Omron)
What are cobots good for?
The ability of cobots to work with AMRs, people, other robots, and machines opens up new opportunities for automation. Cobots are finding use in mass customization in a wide variety of industries and processes such as assembly operations, dispensing, screw driving, machine tending, palletizing, pick and place, and more, in an equally wide array of industries from automotive to food processing and semiconductor manufacturing (Figure 6).
Figure 6: Cobots are flexible and can be used in various applications. (Image source: Omron)
Repetitive or complex assembly tasks can be performed efficiently by cobots working alongside people. Paired with an AMR, a cobot can improve complex picking operations and the delivery of materials to work sites. Once material is delivered to the end of the line, a cobot can quickly palletize products for shipment. Using machine vision and AI, cobots can inspect, sort, and pick finished parts from the conveyor belt and place them in cartons, quickly adapting their behavior to accommodate new products and seasonal variations.
Cobots are adaptable to various manufacturing processes, including (as stated previously) machine tending, screwdriving, and dispensing. CNC machines, stamping and punch presses, various cutting machines, and injection molding stations are among the machine tending tasks where cobots can relieve people from repetitive and potentially dangerous activities. Screwdriving cobots deliver precision and consistent torque, resulting in higher quality than manual assembly. Cobots can also dispense materials such as glues, sealants, paints, and other finishes with high precision. Cobot end effectors are interchangeable, enabling cobots to move from task to task as needed (Figure 7).
Figure 7: Cobot end-effectors can be easily switched for any task. This provides the flexibility to switch to different production requirements with minimal downtime. The top two end-effectors include a high-resolution camera for AI-based vision systems. (Image source: Omron)
Inspecting finished parts or products is another area where cobots with machine vision can excel. If the part is complex, a thorough inspection may require high-resolution images from various angles requiring the coordination of multiple stationary cameras. Alternatively, a cobot with a single camera can identify the part being inspected and move around the part accordingly, capturing all the needed images for a complete visual inspection.
Evolving cobot safety
Safety considerations have evolved along with cobots. Compared with industrial robots, cobot safety requirements are more complex. A team consisting of a cobot and a person can combine the repetitive performance abilities of robots with people’s individual skills and flexibility. Cobots (and robots) are proficient at tasks that demand precision, endurance, and power, while people are proficient at handling imprecise situations and solving variable problems. Combining these complementary skill sets brings challenges related to safe interactions between people and cobots.
Safety standards for industrial robots are generally based on excluding operators from the workspace while the robot is active. Cobot safety, by contrast, anticipates interaction with people. Cobot safety standards center on limits for speed, torque, and force, and they distinguish between an emergency stop and a protective stop.
An emergency stop of a cobot is operator initiated; it stops all cobot motion and removes power from the cobot, and a reboot is required to recover. A protective stop occurs automatically when a person enters the protective space around the cobot (Figure 8). During a protective stop, the cobot remains powered while its motion encoders are monitored for unintended motion; if unintended motion is detected, power is removed.
Figure 8: The cartesian safety space around a cobot (blue box) can be rectangular or cylindrical and defines an exclusion zone. If a person working next to the cobot enters the exclusion zone, the cobot initiates a protective stop. (Image source: Omron)
Some cobots are designed with two operational speed settings, one for maximum performance and one for maximum safety. In the performance setting, it’s assumed that no person will enter the protected space of the cobot, and the cobot operates at high speed for maximum productivity. If a person enters the protected space, the cobot automatically switches to the human-cobot setting for maximum safety, with reduced speeds, torques, and forces.
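The stop and speed-switching behavior described above can be pictured as a simple state machine. The states, transitions, and guard conditions in the sketch below are a simplified illustration of the concepts, not a certified safety design or any specific controller’s logic.

```python
# Illustrative state machine for the safety behavior described in the text:
# an emergency stop removes power and requires a reboot; a protective stop
# keeps the cobot powered while monitoring for unintended motion; entering
# the protected zone switches the cobot to reduced-speed collaborative mode.
# This is a conceptual sketch, not a certified safety implementation.

PERFORMANCE, COLLABORATIVE, PROTECTIVE_STOP, EMERGENCY_STOP = range(4)

class CobotSafetyController:
    def __init__(self):
        self.state = PERFORMANCE
        self.powered = True

    def person_enters_zone(self):
        # Two-speed design: drop to reduced speed/torque/force limits.
        if self.state == PERFORMANCE:
            self.state = COLLABORATIVE

    def protective_stop(self):
        # Automatic stop: motion halts but the cobot stays powered.
        if self.powered:
            self.state = PROTECTIVE_STOP

    def unintended_motion_detected(self):
        # Encoders are monitored during a protective stop; any
        # unintended motion cuts power.
        if self.state == PROTECTIVE_STOP:
            self.powered = False

    def emergency_stop(self):
        # Operator-initiated: all motion stops and power is removed.
        self.state = EMERGENCY_STOP
        self.powered = False

    def reboot(self):
        # The only path back to operation after an emergency stop.
        self.state = PERFORMANCE
        self.powered = True
```

The distinction the sketch captures is that a protective stop is recoverable in place (power stays on), while an emergency stop is not recoverable without a reboot.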
There are several evolving standards and guidelines regarding cobot safety. ISO Technical Specification (TS) 15066:2016 and RIA Technical Report (TR) R15.606-2016 both describe the four collaborative techniques used to reduce risks to human workers: safety-rated monitored stops, hand guiding, speed and separation monitoring, and power and force limiting (PFL). TS 15066 is normative and details the steps required for conformance to the standard; TR R15.606 is informative and provides information and methods that can be used for standard compliance.
RIA TR R15.806-2018 describes a method for testing the forces exerted by a PFL system. Sensor systems are required for standard compliance related to speed and separation monitoring, while for PFL systems and safety-rated monitored stops, safeguarding of exclusion zones is a requirement.
ISO 13855:2010 establishes the positioning of safeguards with respect to the approach speeds of parts of the human body. It provides a methodology to determine the minimum distance from a hazard zone to the detection/exclusion zone or to the actuation of safeguarding devices.
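The minimum-distance methodology in ISO 13855 takes the general form S = (K × T) + C, where K is the approach speed of the relevant body part, T is the overall stopping time of the system, and C is an intrusion distance. The sketch below applies that general formula; the numeric parameter values are illustrative examples only and are not a substitute for the standard or a proper safety assessment.

```python
# Minimum separation distance per the general ISO 13855 form S = (K * T) + C.
# K: approach speed of the body part (mm/s)
# T: overall stopping time (s) = detection response time + machine stop time
# C: intrusion distance (mm)
# The parameter values used below are illustrative, not a safety assessment.

def minimum_distance(k_mm_per_s, t_response_s, t_stop_s, c_intrusion_mm):
    """S = K * (t_response + t_stop) + C; result in millimeters."""
    return k_mm_per_s * (t_response_s + t_stop_s) + c_intrusion_mm

# Example: walking approach at 1600 mm/s, 50 ms sensor response time,
# 300 ms cobot stopping time, and a 200 mm intrusion allowance.
s = minimum_distance(1600, 0.050, 0.300, 200)
print(round(s), "mm")   # 1600 * 0.35 + 200 = 760 mm
```

The practical implication is that a slower-stopping cobot or a slower sensor pushes the detection zone farther out, which is one reason cobot stopping performance is central to safety certification.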
Collaboration is a hallmark of Industry 4.0 and cyber-physical systems, and cobots are key participants in driving higher levels of collaboration. Cobots continue to evolve to become easier, safer, and more flexible to use. Advances in cobot teaching tools and AI are making cobots more intuitive to use, and the evolving human-machine interfaces (HMIs) of cobots lead to increased productivity and higher quality in mass-customized production. Cobots are not replacing robots; they are expanding the opportunities for automation, and the line between robots, cobots, and people is increasingly fluid. As cobots become more like colleagues and less like industrial robots, cobot safety standards are expanding and becoming increasingly important to ensure that the productivity promise of cobot-human collaboration is safely realized.