AI ‘efficiency’ or just an attempt to stay afloat? A preliminary look at the relationship between stock price and AI layoffs of tech companies.
Introduction
AI1 is rapidly changing our world. And not in the ways that CEOs of tech companies, indebted to the potential future promise of ‘solving everything’, would have you believe. To date, AI has been trained without permission on vast quantities of content and art, displaced a large number of human jobs (the subject of this post), destroyed the consumer computer hardware and PC gaming market, and increasingly damaged the environment as the noise, air pollution, and general resource consumption2 of data centres become more widely known.
While the number of questions one could ask about the impacts of AI is vast, we are going to investigate one specific area: the large number of recent layoff announcements from publicly-listed companies. While many people online have discussed the nature of these layoffs, this is a statistical blog, and so we are going to do things a little differently.
I have a hypothesis that they are not a product of some realised “AI efficiencies” nor any of the other things that are often parroted by CEOs. My hypothesis is that AI is a scapegoat for CEOs when the real issue is declining shareholder value. AI is a pretty convenient excuse, because it cannot be held accountable like a person can, and at least gives off the façade that it can write code or prose – especially when companies use the work of employees as supplementary training data for AI models in an attempt to replace developers and other workers. The crux of my thesis is this:
Software (and other) companies were over-inflated and grew too aggressively for the actual value their product(s) contributes to society and are now trying to find ways to cut costs as revenue has either stagnated or declined due to a regression towards the actual market value.
Unlike AI companies, we are in the business of transparency and open-source, so let us load all the R packages we need to do the analysis:
library(quantmod)
library(tibble)
library(dplyr)
library(ggplot2)
library(scales)
library(patchwork)
The market performance of software companies
Unfortunately, choosing which companies to investigate is a pretty subjective exercise. However, there are a few that have been on my radar from the public fallout which the announcements created:
- Atlassian
- Salesforce
- Block
- Autodesk
NOTE: I know Block is not technically a software company, but they are interesting to consider anyway.
In R, we can easily pull daily closing prices back to, say, 1 January 2023 for these companies of interest. This will give us a nice lengthy time series over which to observe the temporal dynamics.
getSymbols("TEAM", src = "yahoo", from = "2023-01-01")
## [1] "TEAM"
getSymbols("CRM", src = "yahoo", from = "2023-01-01")
## [1] "CRM"
getSymbols("XYZ", src = "yahoo", from = "2023-01-01")
## [1] "XYZ"
getSymbols("ADSK", src = "yahoo", from = "2023-01-01")
## [1] "ADSK"
We can take a quick look at one of the time series – say, Autodesk – to get a sense for what these closing prices look like:
plot(ADSK)
We can see some visual evidence of cyclic patterns as well as some spikes and dips. However, it is difficult to understand the phenomenon I hypothesised about, and so we need to include some additional information. In order to produce the plots that I had in mind from the outset of this work, we have to do some data wrangling. Some of this could definitely be functionalised, but I’ll keep it separate and complete for each company for maximum clarity. You will note that I have manually assigned labels of some local trends based on date ranges. I did this by just inspecting the raw time series plots for the closing prices prior to writing this post (sorry!).
TEAM <- TEAM %>%
as.data.frame() %>%
rownames_to_column(var = "date") %>%
mutate(date = gsub("X", "", date),
date = as.Date(date, format = "%Y.%m.%d")) %>%
mutate(flag = case_when(
date < as.Date("2025-02-10", format = "%Y-%m-%d") ~ "Historical",
date >= as.Date("2025-02-10", format = "%Y-%m-%d") &
date < as.Date("2026-01-01", format = "%Y-%m-%d") ~ "Decline",
date >= as.Date("2026-01-01", format = "%Y-%m-%d") &
date < as.Date("2026-03-12", format = "%Y-%m-%d") ~ "Steep Decline"))
CRM <- CRM %>%
as.data.frame() %>%
rownames_to_column(var = "date") %>%
mutate(date = gsub("X", "", date),
date = as.Date(date, format = "%Y.%m.%d")) %>%
mutate(flag = case_when(
date < as.Date("2025-02-10", format = "%Y-%m-%d") ~ "Historical",
date >= as.Date("2025-02-10", format = "%Y-%m-%d") &
date < as.Date("2025-12-01", format = "%Y-%m-%d") ~ "Decline",
date >= as.Date("2025-12-01", format = "%Y-%m-%d") &
date < as.Date("2026-02-01", format = "%Y-%m-%d") ~ "Steep Decline"))
Block <- XYZ %>%
as.data.frame() %>%
rownames_to_column(var = "date") %>%
mutate(date = gsub("X", "", date),
date = as.Date(date, format = "%Y.%m.%d")) %>%
mutate(flag = case_when(
date < as.Date("2024-12-05", format = "%Y-%m-%d") ~ "Historical",
date >= as.Date("2024-12-05", format = "%Y-%m-%d") &
date < as.Date("2026-02-27", format = "%Y-%m-%d") ~ "Decline")) %>%
rename(close_price = 5)
ADSK <- ADSK %>%
as.data.frame() %>%
rownames_to_column(var = "date") %>%
mutate(date = gsub("X", "", date),
date = as.Date(date, format = "%Y.%m.%d")) %>%
mutate(flag = case_when(
date < as.Date("2025-09-03", format = "%Y-%m-%d") ~ "Historical",
date >= as.Date("2025-09-03", format = "%Y-%m-%d") &
date < as.Date("2026-01-22", format = "%Y-%m-%d") ~ "Decline")) %>%
rename(close_price = 5)
Okay, now we can produce the key graphic that I envisaged from the start: a time series plot of daily closing price with the announcement dates of layoffs added, to visualise whether the price was in decline before the announcement was made. The code below draws a plot for each company and then wraps them all together at the end in a single matrix. Note that I have included a pretty simple generalised additive model (GAM) to capture the nonlinear decline preceding the layoff announcements. If this were a proper statistical modelling exercise (especially if we had more data!), I would do all of this outside of ggplot2, handle any potential autocorrelation and other modelling intricacies properly, and calculate the appropriate quantities of interest. However, for our exploratory visual purposes, the inclusion here of GAMs to guide interpretation of local window trends is okay3.
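As noted earlier, the repeated per-company wrangling could be functionalised. Here is a minimal sketch of what that might look like; the function name prep_prices and its arguments are my own invention, not part of the original analysis, and it assumes every series gets the generic close_price column name:

```r
# Hypothetical helper to replace the repeated per-company wrangling.
# `x` is an xts object from getSymbols(); `decline_start` and `announce_date`
# bound the manually-identified decline window for that company.
prep_prices <- function(x, decline_start, announce_date) {
  x %>%
    as.data.frame() %>%
    rownames_to_column(var = "date") %>%
    mutate(date = as.Date(gsub("X", "", date), format = "%Y.%m.%d")) %>%
    rename(close_price = 5) %>%
    mutate(flag = case_when(
      date < as.Date(decline_start) ~ "Historical",
      date >= as.Date(decline_start) &
        date < as.Date(announce_date) ~ "Decline"))
}

# e.g. ADSK <- prep_prices(ADSK, "2025-09-03", "2026-01-22")
```

A helper like this would also make it harder for the date windows in the wrangling and the plot annotations to drift out of sync, at the cost of the step-by-step clarity kept above.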
p <- TEAM %>%
ggplot(aes(x = date, y = TEAM.Close)) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2025-02-10", format = "%Y-%m-%d"),
xmax = as.Date("2026-01-01", format = "%Y-%m-%d"),
fill = "orange", alpha = 0.3) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2026-01-01", format = "%Y-%m-%d"),
xmax = as.Date("2026-03-12", format = "%Y-%m-%d"),
fill = "red", alpha = 0.3) +
geom_line(linewidth = 0.8) +
geom_smooth(data = TEAM %>% filter(flag == "Decline"), formula = y ~ s(x, k = 3), method = "gam", alpha = 0.4) +
geom_vline(aes(xintercept = as.Date("2026-03-12", format = "%Y-%m-%d")),
linewidth = 0.9, linetype = "dashed", colour = "red") +
labs(subtitle = "Following an extended decline in closing price, Atlassian waited 57 days of further, faster decline after its price first dipped\nbelow the most recent historical minimum before announcing layoffs.",
x = "Date",
y = "Closing price ($)") +
scale_y_continuous(labels = dollar) +
theme_bw()
p1 <- CRM %>%
ggplot(aes(x = date, y = CRM.Close)) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2025-02-10", format = "%Y-%m-%d"),
xmax = as.Date("2025-12-01", format = "%Y-%m-%d"),
fill = "orange", alpha = 0.3) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2026-01-09", format = "%Y-%m-%d"),
xmax = as.Date("2026-02-01", format = "%Y-%m-%d"),
fill = "red", alpha = 0.3) +
geom_line(linewidth = 0.8) +
geom_smooth(data = CRM %>% filter(flag == "Decline"), formula = y ~ s(x, k = 4), method = "gam", alpha = 0.4) +
geom_vline(aes(xintercept = as.Date("2026-02-01", format = "%Y-%m-%d")),
linewidth = 0.9, linetype = "dashed", colour = "red") +
labs(subtitle = "Salesforce experienced a decline in share price starting from a similar date to Atlassian, followed by a steeper decline which,\nagain, immediately preceded layoffs.",
x = "Date",
y = "Closing price ($)") +
scale_y_continuous(labels = dollar) +
theme_bw()
p3 <- Block %>%
ggplot(aes(x = date, y = close_price)) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2024-12-05", format = "%Y-%m-%d"),
xmax = as.Date("2026-02-27", format = "%Y-%m-%d"),
fill = "orange", alpha = 0.3) +
geom_line(linewidth = 0.8) +
geom_smooth(data = Block %>% filter(flag == "Decline"), formula = y ~ s(x, k = 3), method = "gam", alpha = 0.4) +
geom_vline(aes(xintercept = as.Date("2026-02-27", format = "%Y-%m-%d")),
linewidth = 0.9, linetype = "dashed", colour = "red") +
labs(subtitle = "The stock price for Block has been characterised by a macro-level stagnation. This lack of an increase in shareholder value, rather than an aggressive\ndecline like other companies, might have driven the push for AI-driven layoffs.",
x = "Date",
y = "Closing price ($)") +
scale_y_continuous(labels = dollar) +
theme_bw()
p4 <- ADSK %>%
ggplot(aes(x = date, y = close_price)) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2025-09-03", format = "%Y-%m-%d"),
xmax = as.Date("2026-01-22", format = "%Y-%m-%d"),
fill = "orange", alpha = 0.3) +
geom_line(linewidth = 0.8) +
geom_smooth(data = ADSK %>% filter(flag == "Decline"), formula = y ~ s(x, k = 4), method = "gam", alpha = 0.4) +
geom_vline(aes(xintercept = as.Date("2026-01-22", format = "%Y-%m-%d")),
linewidth = 0.9, linetype = "dashed", colour = "red") +
labs(subtitle = "Autodesk experienced a decline in share price over five months before announcing layoffs whose reasoning included AI functionality and corporate\nrealignment.",
x = "Date",
y = "Closing price ($)") +
scale_y_continuous(labels = dollar) +
theme_bw()
p_all <- wrap_plots(p, p1, p3, p4, nrow = 4) +
plot_annotation(title = "AI 'efficiency' or just an attempt to stay afloat? Software companies exhibit similar stock troubles before ultimately making the same\nannouncement.",
caption = "Dashed vertical red line indicates announcement date of 10% company layoffs due to AI.\nShaded regions added to guide the eye to particularly interesting trends in stock performance which could be used more broadly to predict likely layoffs before announcements are made.\nRegression line and 95% CI computed using a generalised additive model.")
print(p_all)
What do we see? My commentary in the plot title and subtitles gives it away, but largely, the stock prices of software companies were in pretty serious and prolonged decline prior to the announcements being made. I am not an expert in anything related to these companies, but these patterns do not really communicate to me that a company is remotely close to realising some “AI efficiency” or “leveraging AI to scale into the future”. Rather, it seems like these companies have turned to cost cutting as the way to salvage profits. And, unfortunately, labour is often the largest cost for a business.
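For anyone curious about the “proper statistical modelling exercise” flagged above, one hedged sketch is to fit the GAM directly with mgcv, letting the residuals follow an AR(1) process to absorb day-to-day autocorrelation. The object names here are illustrative, and this is one modelling choice among several, not the definitive approach:

```r
library(mgcv)  # also attaches nlme, which provides corAR1()

# Restrict to the manually-flagged decline window and use a numeric time index
decline <- ADSK %>%
  filter(flag == "Decline") %>%
  mutate(t = as.numeric(date))

# gamm() fits the smooth alongside a residual correlation structure
m <- gamm(close_price ~ s(t, k = 4), data = decline,
          correlation = corAR1(form = ~ t))

summary(m$gam)  # estimated smooth trend
summary(m$lme)  # estimated AR(1) parameter
```

Fitting outside ggplot2 like this lets us check whether the apparent decline survives once serial correlation is accounted for, and to extract quantities of interest (e.g., the estimated derivative of the trend) rather than relying on a visual guide.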
Comparator case
One thought that emerged from the analysis above was whether these effects are constrained to companies whose primary business or product is basically software (i.e., companies that believe they can salvage their business by letting people go and instead using AI models trained on their code). Systematically testing this across all publicly-listed software companies is difficult and would require much more time than I have already spent making this post. So, what can we do? For now, we will explore a case study of a non-software company that has recently announced layoffs due to AI: Commonwealth Bank (CBA).
CBA
CBA is particularly interesting because it has pushed recruitment for AI roles very hard over the last six months, yet has also recently announced layoffs due to AI. What does its stock performance look like? We can replicate the prior analysis around the date layoffs were announced, 24 February 2026:
getSymbols("CBA.AX", src = "yahoo", from = "2023-01-01")
## [1] "CBA.AX"
# Draw plot
CBA.AX <- CBA.AX %>%
as.data.frame() %>%
rownames_to_column(var = "date") %>%
mutate(date = gsub("X", "", date),
date = as.Date(date, format = "%Y-%m-%d")) %>%
rename(close_price = 5) %>%
mutate(flag = case_when(
date < as.Date("2025-06-27", format = "%Y-%m-%d") ~ "Historical",
date >= as.Date("2025-06-27", format = "%Y-%m-%d") &
date < as.Date("2026-01-23", format = "%Y-%m-%d") ~ "Decline",
date >= as.Date("2026-01-23", format = "%Y-%m-%d") ~ "Increase"))
p6 <- CBA.AX %>%
ggplot(aes(x = date, y = close_price)) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2025-06-27", format = "%Y-%m-%d"),
xmax = as.Date("2026-01-23", format = "%Y-%m-%d"),
fill = "orange", alpha = 0.3) +
annotate("rect", ymin = -Inf, ymax = Inf, xmin = as.Date("2026-01-23", format = "%Y-%m-%d"),
xmax = as.Date("2026-02-24", format = "%Y-%m-%d"),
fill = "green", alpha = 0.3) +
geom_line(linewidth = 0.8) +
geom_smooth(data = CBA.AX %>% filter(flag == "Decline"), formula = y ~ s(x, k = 5), method = "gam", alpha = 0.4) +
geom_vline(aes(xintercept = as.Date("2026-02-24", format = "%Y-%m-%d")),
linewidth = 0.9, linetype = "dashed", colour = "red") +
labs(subtitle = "Despite a similar extended decline to the tech companies, CBA's price climbed before it announced AI layoffs.\nThis suggests a fundamental difference between tech and banking.",
x = "Date",
y = "Closing price ($)") +
scale_y_continuous(labels = dollar) +
theme_bw()
print(p6)
Surprisingly, we see a steady increase in share price prior to the announcement of layoffs – the opposite trend to some of the software companies we explored previously. Connecting this finding back to the thesis of this blog post, one potential interpretation is that a bank like CBA is fundamentally different to a software company – particularly a software company whose value was likely always over-inflated – despite ultimately making basically the same announcement.
Some important caveats
I cannot stress enough that this analysis is not a robust piece of causal inference, nor was it intended to be. Ideally, we would draw a directed acyclic graph (DAG) which captures all of the relevant causal relationships around our variable of interest (stock price) and the outcome (layoff announcement), and then fit a statistical model based on the DAG which closes all of the backdoor pathways. Given the macro scale on which stock prices can be affected, I can think of numerous confounding variables that could cause changes in both stock prices and the probability that layoffs occur – things such as market sentiment, CEO public statements, government policy, and international actions such as Trump’s attack on Iran would all need to be appropriately captured.
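As a hedged sketch of what that DAG step might look like, the dagitty package can encode the qualitative causal assumptions and report which variables would need adjusting for. The variable set below is illustrative only, not a complete causal model:

```r
library(dagitty)

# Illustrative DAG: each confounder affects both stock price and layoffs
g <- dagitty('dag {
  market_sentiment -> stock_price
  market_sentiment -> layoffs
  govt_policy -> stock_price
  govt_policy -> layoffs
  ceo_statements -> stock_price
  ceo_statements -> layoffs
  stock_price -> layoffs
}')

# Minimal adjustment sets for the effect of stock_price on layoffs
adjustmentSets(g, exposure = "stock_price", outcome = "layoffs")
```

Even a toy DAG like this makes the point: without measuring and adjusting for the confounders, a decline-then-layoff pattern in the raw series cannot be read causally.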
Parting thoughts
Thank you for making it this far! Compared to my other more instructional posts, this one is definitely much more speculative, but I think it is pretty interesting regardless. At the very least, it suggests some curious future directions that someone with more time might want to explore.
Unfortunately, it appears that many companies deem investing in AI better value than investing in people. With AI yet to deliver on basically anything that was promised in the years since ChatGPT was first released publicly, it remains to be seen whether this decision to double down on data centres, power-hungry GPUs, and monetised tokenisation will ever be worth more than the people that got these companies to the point where they could make that decision.
In this post, ‘AI’ refers to large language models (LLMs) specifically. I know AI is broader than that, but this is unfortunately the common parlance.↩︎
Tao, Y., & Gao, P. (2025). Global data center expansion and human health: A call for empirical research. Eco-Environment & Health, 4(3), 100157.↩︎
Note that I chose the \(k\) value for the smooth to balance parsimony of the local trend with ensuring an adequate fit to the data.↩︎
