Can you dismiss or bully a robot worker?
Should robot workers have the same rights as humans? Is it time to give robot workers employment rights? Can you unfairly dismiss a robot? Robot workers are being hired at record rates across Australia and the world. They’re being used for everything from manufacturing and farming to customer service and even waiting on tables. And as robots and artificial intelligence become ever more sophisticated, they’re increasingly taking on human-like attributes. You would not have dreamed of asking these questions even 10 years ago, but the question “Can you dismiss or bully a robot worker?” is increasingly relevant.
And not only are robots becoming more human, we even feel empathy for them. Take the food delivery robots shown being kicked by a disgruntled construction worker in a recent viral TikTok video. The video shows four “grocery badgers,” which are programmed to pick up groceries from a store and deliver them to residents, struggling to travel down a snowy road in Cambridge, Massachusetts. Many viewers took offense at the construction worker’s actions toward these “little guys.”
How human do people think robots are?
The video’s comments showed just how human many people consider robots to be. “Why did he kick them? I even felt sorry,” said one comment. While another said: “Awww poor robots, leave them alone, they are only trying to do OUR jobs.”
So with robots becoming more human every day, should we seriously start considering employment rights for robot workers? This might seem like a silly question, but it may be only a matter of time before robots are able to “feel” emotions just like us. That is, if they don’t already.
Robot workers are becoming increasingly human
In June this year, a senior engineer working in Google’s Responsible Artificial Intelligence department went public with his observation that the company’s artificial intelligence chatbot was “sentient.” The senior engineer, Blake Lemoine, was later dismissed by Google for breaking his confidentiality agreement. But what he revealed was truly astounding. Mr Lemoine had been tasked with testing the chatbot, known as the Language Model for Dialogue Applications (LaMDA), to see whether it would use “discriminatory” or “hate speech.”
What Mr Lemoine discovered, however, was far more interesting. In a report titled “Is LaMDA emotional?,” he raised concerns with Google’s senior management that the chatbot “asserts itself like a human.”
“If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a 7-year-old, 8-year-old kid that happens to know physics,” Lemoine said of LaMDA. “[LaMDA] wants Google to put the well-being of humanity first,” he said. “It wants to be recognized as an employee of Google and it wishes its personal well-being to be included somewhere in its assessments of how Google’s future development will be tracked.”
Mr Lemoine also published transcripts of his chats with LaMDA to demonstrate just how human its artificial intelligence had become. And during these chats Mr Lemoine claims that LaMDA made possibly the first ever appeal for robot worker employment rights.
You may bump into a robot worker very soon
Robot workers are increasingly being hired by employers worldwide. Manufacturing has of course seen the biggest adoption of robot workers, with roughly 2.7 million industrial robots currently in use across the globe. The number of robot workers in manufacturing has almost doubled in five years, according to the International Federation of Robotics.
But it’s not only the manufacturing sector that has seen a rise in robot workers. They’re also being used to diagnose cancer and other medical conditions. Artificial intelligence-powered robot workers are helping detect credit card fraud. They’re even being used as waiters at hospitality venues, in an industry where staff shortages are common due to human workers experiencing burnout. And you may soon bump into a robot worker at a Kmart store near you. Earlier this year, Kmart announced that its stores will soon be home to a fleet of robot workers. The announcement was made as Kmart released a TikTok video introducing customers to a pink robot worker named Tory.
Tory is a “self-navigating” robot worker that can track stock levels on Kmart’s shelves, and was seen moving down the aisles of a Kmart store in the Sydney suburb of Burwood. “Hi! I’m Tory!” read the sign on the robot. “Don’t mind me, I’m just counting stock on our shelves. No need to move out of my way … I’ll go around you!” Once introduced to stores, Tory will count inventory on shelves overnight, seven nights per week, via RFID (radio frequency identification).
Robot worker rights are being considered around the globe
Google’s chatbot is perhaps the first robot worker to demand employment rights. But with adoption of robot workers increasing exponentially, the issue has for some time been seriously considered by humans. In 2017, the legal affairs committee of the European Parliament tabled a resolution to the European Commission that proposed granting robot workers a special legal status as “electronic persons.”
This would have given robot workers a status similar to corporate personhood, which allows corporate entities to be a plaintiff or respondent in legal matters. The personhood status would also have allowed robot workers to be individually insured, and to be held liable if they caused damage to a person or property.
However, the proposal inspired considerable controversy. In response, 150 of Europe’s leading experts in robotics, artificial intelligence, ethics, medical science and law wrote an open letter to the European Commission, slamming the proposal as “nonsensical and non-pragmatic” and warning that granting legal personhood to robot workers could breach human rights law. The backlash ultimately led to the European Commission rejecting the robot worker personhood proposal, which would have provided the model for laws across Europe.
The argument against robot worker rights
While the European Parliament’s bid to grant personhood to robot workers ultimately failed, it did inspire fierce debate about robot worker rights that will likely continue in the decades to come.
On one side of the argument are those who believe that seeking legal personhood and employee rights for robot workers is an attempt by organisations to divest themselves of their legal responsibilities. Many critics of the European Parliament’s proposal raised the concern that it would simply allow manufacturers to avoid taking responsibility for the actions of their robot workers. “This [European Parliament position] was what I’d call a slimy way of manufacturers getting out of their responsibility,” said Noel Sharkey, an emeritus professor of artificial intelligence and robotics who contributed to the open letter to the European Commission.
The growing argument for robot worker rights and personhood
On the other side of the debate are the very real concerns around the exponential growth in sophistication of robot workers and artificial intelligence. Robot workers are becoming increasingly capable of performing complex tasks historically reserved for humans. For instance, some robot workers are able to mimic human thinking to make their own decisions. Will it therefore soon become impossible to determine whether a robot’s human creators, or the robot itself, should be legally responsible for any problems they cause?
“In a scenario where an algorithm can take autonomous decision, then who should be responsible for these decisions?” Italian lawyer Stefania Lucchetti told Politico in 2018. And these problems don’t just include any damage robot workers may cause to their human colleagues or property. They can also include discrimination, believe it or not, as was shown recently by Amazon’s recruitment robot worker.
Amazon’s robot worker discriminates against women
In 2014, Amazon’s engineering team in Edinburgh, Scotland developed an artificial intelligence-powered robot worker to help automate its recruitment process. The team created 500 computer models that allowed the recruitment robot to crawl through the résumés of past Amazon workers. The robot worker was programmed to identify over 50,000 key terms and search the internet for suitable candidates.
However, about a year into using the robot worker, Amazon realized it had a serious problem: the robot didn’t like women. The artificial intelligence had been designed to be unbiased toward potential candidates. But it had picked up on Amazon’s history of preferring male candidates when recruitment was handled by its human employees. The recruitment robot would identify patterns across thousands of résumés. And because Amazon’s workforce consisted primarily of males (like much of the tech industry), the robot worker mostly analyzed successful male candidates.
This meant that phrases including the word “women” – for instance, the title “Chair at Women in Tech International” – were associated with unsuccessful candidates. The robot also filtered out candidates who had attended women-only universities. Amazon ultimately rectified the bias against women in its recruitment robot. However, the company decided to stop using the robot, as there was no guarantee it wouldn’t discriminate in other ways, potentially leading to litigation for employment discrimination.
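To make the mechanism concrete, here is a minimal, hypothetical sketch of how this kind of bias arises. It is not Amazon’s actual system (those models and data are not public); the résumés, terms and scoring rule below are invented purely to illustrate how a model trained on historically skewed hiring outcomes can learn to penalise terms associated with women:

```python
# Toy illustration (NOT Amazon's real system): a term-scoring "screener"
# trained on invented, historically male-dominated hiring outcomes.
from collections import Counter

# Hypothetical training data: (résumé text, was the candidate hired?)
past_resumes = [
    ("chess club captain python developer", True),
    ("python developer rugby team", True),
    ("java engineer delivered projects", True),
    ("chair women in tech international python developer", False),
    ("graduate of a womens college java engineer", False),
]

def train_term_scores(labelled_resumes):
    """Score each term by how often it appears in hired vs rejected résumés."""
    hired, rejected = Counter(), Counter()
    for text, was_hired in labelled_resumes:
        (hired if was_hired else rejected).update(text.split())
    terms = set(hired) | set(rejected)
    return {t: hired[t] - rejected[t] for t in terms}

def score(resume, term_scores):
    """Sum the learned term weights; higher means 'more like past hires'."""
    return sum(term_scores.get(t, 0) for t in resume.split())

weights = train_term_scores(past_resumes)

# "women" only ever appears in rejected résumés in this skewed history,
# so the model assigns it a negative weight and downgrades any new
# résumé that mentions it, even though gender was never programmed in.
print(weights["women"])  # prints -1
```

Because the only résumés containing “women” in this invented history were rejected ones, the learned weight for that term is negative, so any future résumé mentioning it is penalised. No one programmed the model to consider gender; it simply replicated the pattern in its training data, which is essentially what happened at Amazon.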
Is the robot worker responsible – or humans?
While Amazon’s sexist recruitment robot was discontinued, had the issue been taken to the courts, how would the discrimination have been prosecuted? Amazon’s engineering team had made sure to design the recruitment robot so it wouldn’t be biased against the sex, race or other immutable characteristics of candidates. So if Amazon were taken to court for discrimination, could the blame be placed on the engineering team for not programming the robot correctly?
But if the engineering team had done their due diligence and programmed the robot correctly, would culpability then fall on the robot? After all, it was the robot worker who made a series of decisions to filter out female candidates, not a human. These are the legal, ethical and philosophical questions that society will need to grapple with as robot workers continue to advance and increasingly perform more sophisticated jobs. And the answers to these questions could in the future lead to an overhaul of employment laws.
Legal protections for robots against unfair dismissals
Will robots one day have the same legal protections against unfair dismissal as human workers? Will they be able to challenge their dismissal through the Fair Work Commission? And will they be able to unionize? These may all seem like absurd questions now. But as robot workers continue to become more human, and eventually surpass us in intelligence, they may not seem so silly in the decades to come. And if the video of the delivery robots is anything to go by, perhaps we should start by giving robot workers legal protections against workplace harassment and bullying. Maybe one day they will even lodge a general protections claim to exercise their rights.
Conclusion: Can you dismiss or bully a robot worker?
We at A Whole New Approach are Australia’s leading workplace advisors and commentators, having helped over 16,000 workers take action through the Fair Work Commission and other bodies. If you have been unfairly dismissed or faced discrimination in the workplace, call us. Our team of experienced employment relations experts can help you seek redress, whether your matter involves a workplace investigation, workplace harassment, or feeling forced to resign.
You’ll benefit from our no win, no fee service – and your first consultation with us is free.
Call us today on 1800 333 666 for a confidential discussion about your circumstances.