Productivity Fluctuation: Robots vs Humans
Creating a balanced and even workflow will optimise productivity for robots – in the same way as it will for human workers.
Surely robots don’t get tired, can work 24/7, are fully skilled at what they are programmed to do, and don’t have any pesky motivational issues – so their productivity must always be consistently high? Absolutely not, according to Neil Bentley, Non-Executive Director & Co-Founder of ActiveOps, a leading provider of digital operations management solutions.
To believe this would be to forget everything we have learned about Lean workflow and the way production systems work. For a processor (robot or human), productivity is best measured as a ratio of output to input: how much work did we get out for the amount of time we put in? For this to make sense we generally convert time into “capacity to do work”, based on some idea of how much work could be done in a given time.
So, if Person A completes 75 tasks in a day and they had capacity to complete 100 then their productivity was 75%. Similarly, if Robot B completes 500 tasks in a day and had capacity to do 1,000 then their productivity would be 50%.
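The arithmetic is simple enough to show directly. A minimal Python sketch (not from the article) of the same calculation:

```python
def productivity(completed: int, capacity: int) -> float:
    """Productivity as the ratio of work completed to capacity to do work."""
    if capacity <= 0:
        raise ValueError("capacity must be positive")
    return completed / capacity

# The two examples from the text:
print(f"Person A: {productivity(75, 100):.0%}")    # 75%
print(f"Robot B:  {productivity(500, 1000):.0%}")  # 50%
```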
“As we begin to increase our investment in Robotic Process Automation (RPA) and AI, the productivity of this (potentially) cheaper processing resource will matter – if not so much now, then certainly when everyone is employing RPA to do similar tasks within the same services.”
But why would Robot B only do 500 tasks? They wouldn’t dawdle because they didn’t like their boss. They wouldn’t spend hours on social media, and they would surely only be allocated tasks that they were 100% capable of processing.
Maybe Robot B could only process 500 tasks because there were only 500 available to be done. Maybe the core system was running incredibly slowly that day, or there was so much network traffic that latency was affecting cycle times. Maybe someone changed a port on a firewall and the robot needed to be reset. Or there were hundreds of exceptions and the robot had to try them multiple times before rejecting them.
“It is strange (isn’t it?) that if a person’s productivity is 50% we assume idleness, a propensity to waste time on social media, or a lack of skill, but if it is a robot we quickly understand that it is the workflow that is the problem,” he continued.
Data-focused technologies – such as Process Forensics, some digital operations management tools, and workforce optimisation (WFO) technologies that seek to improve performance through URL logging or other screen-monitoring techniques – are missing the point: people’s productivity is far more influenced by the flow of work through the system than by their willingness to work or their skill level.
“Workforce monitoring technologies seek to intimidate people into working harder, but you can’t intimidate people into having more work available to do. Equally, fluctuating demand, bottlenecks in the workflow, and variations in work complexity will all drive variations in productivity – as with people, so it is with robots,” he added.
The answer is to introduce digital operations management solutions in the back office as part of a blended human/RPA strategy made up of:
- Some degree of “demand smoothing” – using service level agreements to allow for carrying some work from one period to the next to even out resource requirements (a simple sketch follows this list).
- Some degree of cross-team collaboration – moving work or people between teams to balance demand and capacity.
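To make the demand-smoothing idea concrete, here is a minimal Python sketch (hypothetical function names, not from the article) in which work exceeding a period’s capacity is simply carried into the next period; real SLA constraints on how long work may be carried are omitted for brevity:

```python
from typing import List

def smooth_demand(arrivals: List[int], capacity: int) -> List[int]:
    """Carry work that exceeds per-period capacity into the next period,
    evening out the load on people and robots alike."""
    backlog = 0
    processed = []
    for arriving in arrivals:
        available = arriving + backlog   # new work plus carried-over work
        done = min(available, capacity)  # process no more than capacity
        backlog = available - done       # the remainder carries forward
        processed.append(done)
    return processed

# Spiky daily arrivals against a flat capacity of 100 tasks/day:
print(smooth_demand([150, 40, 120, 60], capacity=100))
# -> [100, 90, 100, 80]  (peaks are shaved, troughs are filled)
```

The effect is exactly the one the article describes: instead of productivity collapsing on quiet days and work being lost on busy ones, throughput stays close to capacity across periods.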
The plain fact of the matter is that, with humans and robots increasingly working alongside one another in service operations, a blended and balanced approach needs to be taken on the issue of productivity.