Raging against the machine

Werksmans

9th May 2023

The meteoric rise of artificial intelligence (AI) is generating seemingly infinite possibilities, particularly when AI is linked to or combined with other technologies, such as neurotechnology. The United Nations Educational, Scientific and Cultural Organisation (UNESCO) describes "neurotechnology" as the use of any electronic device to read or modify the activity of neurons (the specialised cells that transmit nerve impulses throughout the body) in the nervous system.

The electronic devices typically used in neurotechnology are either invasive (such as micro‑electrodes or other neurotechnological material placed directly onto the brain) or non-invasive (such as magnetic resonance imaging, which maps brain activity to identify brain tumours, strokes and developmental problems, and wearables, including smart watches, headphones, earphones and virtual reality headsets, which monitor the user's heart rate, stress levels, physical activity and behavioural patterns). When combined with AI, these devices can collect neurological, physiological and cognitive information from which the user's personal information can be interpreted. Inferences made in the interpretation of such information may be subjective or objective, depending on the identity of the interpreter and the purpose for which the technology is being used. So, what does this mean for the user's mental privacy and other cognitive and human rights?

UNESCO, in its publication entitled 'The risks and challenges of neurotechnologies for human rights', states that neurotechnology, on the one hand, has immense potential to improve learning and cognition, facilitated by, among other things, thought-to-text creation and brain-controlled virtual and augmented reality.

On the other hand, UNESCO warns that these advancements present a novel ethical and human rights dilemma, specifically insofar as they relate to the need to introduce specific human rights to prevent impairments of the user's cognitive liberties and mental privacy. It further notes that any advances in neurotechnological applications must consider the potential consequences for the user's autonomy, privacy, consent, integrity and dignity, as users may not always be aware that –

  1. their neurological and other information is being processed; and
  2. their neurological and other information is used in conjunction with AI (underpinned by one or many algorithms) to make inferences about their behaviour, emotions, cognitive abilities and productivity, and to predict their decisions. The algorithm/s (used in conjunction with the neurological information) may also introduce biases against users, which may lead to unchecked discrimination against the user purely on the basis of the algorithm/s and for purposes unknown or undisclosed to the user, as illustrated in the sketch below.
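
To make this concern concrete, the following is a minimal, purely illustrative Python sketch (not drawn from any actual product) of how a handful of wearable-derived features might be scored by a simple algorithm to flag an employee's emotional state, and how a threshold calibrated on one group of users can skew outcomes for another. All features, weights and thresholds are hypothetical assumptions.

```python
# Illustrative only: a toy "emotional surveillance" scorer.
# All features, weights and thresholds are hypothetical assumptions,
# not taken from any real workplace system.

from dataclasses import dataclass


@dataclass
class WearableSample:
    heart_rate_bpm: float      # e.g. from a smart watch or headset sensor
    skin_conductance: float    # arbitrary units
    eeg_beta_power: float      # relative beta-band power, 0..1


def stress_score(s: WearableSample) -> float:
    """Weighted sum of sensor features; higher is treated as 'more stressed'."""
    return 0.02 * s.heart_rate_bpm + 0.5 * s.skin_conductance + 2.0 * s.eeg_beta_power


def flag_employee(s: WearableSample, threshold: float = 3.5) -> bool:
    """Flags the employee for 'intervention' (a break, leave or re-assignment)."""
    return stress_score(s) > threshold


# The bias problem: if the threshold was calibrated on users whose resting
# heart rate and skin conductance are systematically lower, other users may
# be flagged far more often for the same underlying emotional state.
calibration_group = WearableSample(heart_rate_bpm=65, skin_conductance=1.0, eeg_beta_power=0.4)
other_group = WearableSample(heart_rate_bpm=85, skin_conductance=2.5, eeg_beta_power=0.4)

print(flag_employee(calibration_group))  # False - score 2.6, below the threshold
print(flag_employee(other_group))        # True  - score 3.75, flagged despite identical EEG activity
```

The point is not the arithmetic but the opacity: the flagged employee typically has no visibility into the weights or the threshold that produced the outcome, nor into the purpose for which the score is used.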

Some companies have, from as early as 2014, integrated the use of neurotechnology in the workplace, and it is unclear whether the employees of these companies were, at all times, aware that their employers were monitoring, amongst other information, their neurological information. These companies have made, and continue to make, use of some of the following neurotechnologies –

  • wireless sensors in the hats of employees working on production lines, combined with AI algorithms to detect workplace rage, anxiety and sadness (constituting what is referred to as "emotional surveillance technology"), in order to –
    • enhance workflow,
    • suggest that the employer place an employee on a break or on leave, or
    • re‑assign the employee to a less critical task;
  • sensors built into the caps of high-speed train drivers to trigger an alarm if a driver falls asleep; and
  • SmartCaps fitted with electroencephalography (EEG) sensors, which monitor employees' brainwaves to gauge their level of fatigue and relay that neurological information to the employee in real time, potentially preventing injuries (a simplified sketch of this kind of fatigue monitoring appears below).
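
As an illustration of the general idea only (the actual SmartCap algorithm is proprietary, and the band powers, ratio and threshold below are assumptions), a short Python sketch of an EEG-style fatigue monitor might look as follows, using the commonly cited (theta + alpha) / beta drowsiness ratio smoothed over recent readings:

```python
# Illustrative only: a generic EEG fatigue monitor in the spirit of the
# wearables described above. The band powers, ratio and threshold are
# assumptions, not the algorithm of any real product.

from collections import deque
from statistics import mean

FATIGUE_THRESHOLD = 2.5   # hypothetical (theta + alpha) / beta cut-off
WINDOW = 5                # number of recent readings to smooth over


def fatigue_index(theta: float, alpha: float, beta: float) -> float:
    """Classic drowsiness heuristic: slow-wave power relative to beta power."""
    return (theta + alpha) / beta


def monitor(readings):
    """Yields (smoothed index, alert) pairs for a stream of (theta, alpha, beta) band powers."""
    recent = deque(maxlen=WINDOW)
    for theta, alpha, beta in readings:
        recent.append(fatigue_index(theta, alpha, beta))
        smoothed = mean(recent)
        yield smoothed, smoothed > FATIGUE_THRESHOLD


# Simulated band-power readings drifting towards drowsiness.
stream = [(4.0, 5.0, 6.0), (5.0, 6.0, 5.0), (6.0, 7.0, 4.0), (8.0, 8.0, 3.0)]
for index, alert in monitor(stream):
    print(f"fatigue index {index:.2f} -> {'ALERT' if alert else 'ok'}")
```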

In addition to the obvious purposes of these neurotechnologies, employers may use the neurological information of employees for more subtle purposes, such as to inform their decisions on promotions, retrenchments and dismissals. Further, the neurological information of candidates applying for an employment position may, even if not consented to, be used by prospective employers to differentiate and discriminate between applicants, which could lead to unfair discrimination or unfair labour practices.

Although the introduction of neurotechnology has the potential to transform workplaces and make them more efficient, its application in the workplace may prove problematic given the current South African labour legislative framework.

Section 7 of the Employment Equity Act 55 of 1998 (EEA) prohibits medical testing by employers of their employees unless legislation permits and/or requires the medical test to be conducted, or unless the medical test is justifiable in light of medical facts, the employee's employment conditions, social policy, the fair distribution of employee benefits and the inherent requirements of the job. For the purposes of the EEA, medical testing is defined as "any test, question, inquiry or other means designed to ascertain, or which has the effect of enabling the employer to ascertain, whether an employee has any medical condition". A medical condition is defined as "the state of a patient's physical or mental health, including the patient's illness, injury or disease".

In addition, section 8 of the EEA prohibits psychological testing and other similar assessments unless the test is, inter alia, scientifically shown to be valid and reliable, is applied fairly to all employees and is certified by the Health Professions Council of South Africa (HPCSA) or any other body which may be authorised by law to certify such tests or assessments. Despite not being defined in the EEA, psychological testing includes the conducting of psychometric tests on applicants for employment as part of the recruitment process. It is to be noted that the prohibitions on medical and psychological testing also extend to applicants for employment. Testing which is prohibited in terms of the EEA would give rise to unfair discrimination, and the applicant or employee would be entitled to approach the Labour Court for an order prohibiting the employer from using such a test, alternatively for compensation.

The application of neurotechnologies to measure and assess, inter alia, emotional wellbeing, fatigue, health and cognitive performance may, depending on the test conducted, constitute medical and/or psychological testing for the purposes of the EEA.

In the event that employers intend to implement such tests in the workplace, they would, as a first step, need to determine whether the test constitutes a medical test or a psychological test. If the AI test falls within the ambit of a psychological test, the employer would need to ensure that the test meets the standards of the HPCSA, as a minimum, and complies with the balance of the requirements in section 8 of the EEA. To the extent that it fails to comply with these requirements, the test would not be permissible in terms of the EEA.

If it is established that the AI tests constitute medical tests, then employers will be faced with a far higher hurdle to overcome before being entitled to use such tests. As a point of departure, the employer would have to show that the test is permissible in terms of legislation. For example, there are certain industries where medical testing is permitted in terms of legislation such as, inter alia, the Occupational Health and Safety Act 85 of 1993 and the Mine Health and Safety Act 29 of 1996.

In workplaces governed by this legislation, and depending on the extent of the health and safety risks identified in a workplace-specific health and safety risk assessment, employers are permitted to conduct breathalyser tests. Although a breathalyser test is geared towards determining the presence and amount of alcohol in the bloodstream, and the psychological and cognitive impact this will have on the employee's competence and/or ability to perform their duties, the actual test conducted is a medical test. Parallels could be drawn between breathalyser tests and the tests conducted using AI and neurotechnologies which measure rage, anxiety, sadness, exhaustion and fatigue; consequently, these types of tests are more likely to constitute medical tests than psychological tests.

In circumstances where an employer cannot rely on legislation to justify the use of a particular neurotechnology or AI medical test, the employer could nevertheless conduct such a medical test where it is justifiable in light of medical facts, employment conditions, social policy or the inherent requirements of the job. However, other than in respect of the inherent requirements of a job (a criterion which has been the subject of a number of matters before the Labour Courts), the remaining justification criteria in terms of the EEA remain largely untested by the Labour Court. Consequently, an employer who relies on these largely untested justifications to validate the use of neurotechnology or AI tests will not be in a position to rely on established authorities to support the use of such tests.

Assuming the employer can justify the use of the test, either in terms of section 7 or section 8 of the EEA, the employer would nevertheless also be required to obtain the employee's or applicant's consent. The employer would need to ensure that the test measures only that which the employer requires and does not extend beyond this scope by collecting information to which the employee or applicant has not consented or for which there is no work‑related justification. The employer would also need to ensure that the data acquired is applied consistently to all employees and does not discriminate against any particular group or groups, whether when conducting the tests, when interpreting the data acquired or when using the data in dealings with employees.

Over and above the employment-law hurdles arising from the use of neurotechnology in the workplace, there is the additional challenge that the processing of this personal information would invoke the Protection of Personal Information Act 4 of 2013 (POPI Act). This is because the data being collected constitutes the personal information of the employee, and specific requirements must be complied with before the employer is able to process and/or store this information in line with the POPI Act.
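
By way of illustration only, and without attempting to restate the POPI Act's conditions, the kind of gate an employer's system might place in front of any processing of neuro-data could resemble the hypothetical Python sketch below, in which a reading is processed only if consent has been recorded for that employee and for the specific, declared purpose. All field names and checks are assumptions.

```python
# Illustrative only: a hypothetical pre-processing gate for neuro-data.
# The checks below are a simplified stand-in for the idea of consent and
# purpose limitation, not a statement of what the POPI Act actually requires.

from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    employee_id: str
    purposes: set = field(default_factory=set)   # purposes the employee agreed to


@dataclass
class NeuroReading:
    employee_id: str
    purpose: str          # declared, work-related purpose for this processing
    payload: dict         # the neurological / physiological data itself


def may_process(reading: NeuroReading, consent: ConsentRecord) -> bool:
    """Process or store only if consent exists for this employee and this purpose."""
    return (
        reading.employee_id == consent.employee_id
        and reading.purpose in consent.purposes
    )


consent = ConsentRecord(employee_id="E123", purposes={"fatigue_monitoring"})
ok = NeuroReading("E123", "fatigue_monitoring", {"fatigue_index": 2.1})
not_ok = NeuroReading("E123", "promotion_scoring", {"fatigue_index": 2.1})

print(may_process(ok, consent))      # True  - consented, declared purpose
print(may_process(not_ok, consent))  # False - purpose was never consented to
```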

The potential applications of neurotechnology and AI are infinite and are not limited to employers using the neurological and other information received to assess and discriminate against their employees and/or applicants for employment. Neurotechnology is becoming more prevalent in smart watches, earphones and wireless earbuds that can measure brain and electrical activity in the human body and gather neurological information on their users, for purposes that may or may not be disclosed to the user and in respect of which the user may or may not have provided express consent to process, in part or in whole.

Given the speed with which neurotechnology and AI are taking over the day‑to‑day activities of both personal and work life, the South African legal landscape, which has traditionally been slow to respond to technological advancements, will need to evolve with greater speed in order to reform and keep up with the social and technological changes permeating the world we live in. That being said, a balance will need to be struck between protecting humans and their mental capacity, business efficiencies, neurological and cognitive advancements, and the increasingly invasive impact of these machines on society.

Written by Janice Geel, Associate, and Thembelihle Tshabalala, Candidate Attorney, Werksmans

Reviewed by Natalie Scott, Director, Banking and Finance and Head of Sustainability, and Anastasia Vatalidis, Director, Employment
