An article on ethics, ecology, and the art of spending your money like it means something
Not long ago, I wrote about breaking up with Clinique. Yes, Clinique, the brand that had gently promised to keep my face looking professorial and well-moisturised for the better part of a decade. The reason? Its parent company, Estée Lauder, and the financial threads connecting it to individuals whose political ambitions include, but are not limited to, the attempted purchase of a sovereign Arctic territory named after a colour. I shall not name names. You already know the names. The point is: I put down the toner, I stepped back, and I asked myself the question every conscious consumer must eventually ask: Whose world am I funding?
The moisturiser situation, while emotionally taxing, was relatively straightforward. There are other serums. There are other brands. There is, it turns out, an entire economy of people who would like to take care of your skin without geopolitical aspirations attached.
Artificial intelligence is a slightly more complex divorce.
The Inconvenient Truth About Your Favourite Chatbot
Let me be clear: ChatGPT is impressive. Genuinely impressive. It will draft your emails, summarise your lengthy board reports, brainstorm your next campaign, and occasionally write poetry that is only mildly terrible. I have used it. I have recommended it. I have probably stood on some stage, whether in Perth or Singapore, and watched people nod along to what I was saying, knowing they would never change a thing.
But here is what has given me pause, and what I believe should give you pause too.
OpenAI, the company behind ChatGPT, has undergone a quiet but seismic shift in its character. The organisation that once draped itself in the language of safety, open research, and benefit to humanity has, in recent years, become considerably more entangled with power. Its CEO has cultivated proximity to political figures whose values sit at a conspicuous distance from those of most people I know who care about equity, democracy, or the basic dignity of marginalised communities. The billions have flowed. The alignment with certain political ecosystems has deepened. And the “open” in OpenAI has become, shall we say, more of a branding choice than a philosophical commitment.
As a cultural strategist, I spend my working life helping organisations understand that culture is not decoration. It is infrastructure. It is the set of values, stories, and choices that determine who gets to thrive and who gets left behind. And when I look at the culture being built around and within some of our most powerful AI companies, I see infrastructure I am not comfortable funding.
The Moisturiser Principle
When I wrote about stepping away from Clinique, some readers were amused. Others were moved. A few were annoyed, which is, in my experience, usually a sign that you have said something worth saying.
The underlying principle was simple: money is a vote. Every transaction is a small act of cultural endorsement. When we buy, subscribe, download, or sign up, we are not merely exchanging currency for convenience. We are affirming a direction. We are saying, in the quiet language of commerce: I am with you.
Most of us do not mean to say that, of course. We mean to say: I want this product. But intention and impact are rarely the same thing. Anyone who has attended my keynotes will have heard me explain why intention is not enough.
I call this the Moisturiser Principle: the smallest, most mundane consumer choices are not separate from the larger ethical questions of our time. They are woven directly into them. The face cream and the fascism, the chatbot and the concentration of power, exist on the same continuum. Pretending otherwise is not innocence. It is convenience.
But What Will You Use Instead?
This is inevitably the question. And it is a fair one. I have no interest in performing ethical purity from a position that offers no practical alternatives. I am, if nothing else, practical.
The good news is that alternatives exist and are, frankly, rather good. Anthropic, the company behind Claude (the AI I recently switched to), was founded by researchers who left OpenAI specifically over safety and governance concerns. Its published commitments to responsible AI development are more rigorous, more transparent, and more accountable than most in the industry. Is it perfect? No technology company is perfect. But some are ethically different, and given what is currently going on in the world, that is genuinely significant.
There are others too. The AI landscape is expanding rapidly, and not all roads lead through the same Silicon Valley power centres. The point is not that you must use what I use. The point is that choice exists and choosing thoughtfully is an act of cultural agency, not inconvenience.
What We Are Really Talking About
Here is the deeper question beneath all of this, the one I would like to leave with you:
We are at an extraordinary inflection point in human history. The tools being built right now, the AI systems, the data architectures, the recommendation engines, the automated decision-makers, will shape the material conditions of billions of lives for generations. Who builds them matters. Who funds them matters. Whose values are encoded into their foundations, whose blind spots become structural, whose interests are protected and whose are discarded, all of this matters enormously.
The people building these systems are not neutral. They never were. The question is simply whether we are paying attention.
As a keynote speaker and professor, I spend considerable time teaching people how to read culture, how to look at the surface of things and understand the structures beneath. As a strategist, I help organisations align their actions with their stated values, which is harder than it sounds and more important than most people realise. And as a consumer, which I remain despite my best efforts, I try to practice what I teach.
Sometimes that means putting down the moisturiser. Sometimes it means reconsidering the chatbot.
Neither act will single-handedly redirect the course of technological civilisation. I am under no such illusion. But collective consciousness, expressed through collective action, even in the small, quiet form of a cancelled subscription, is, in fact, how cultures change. Think of it as the one degree of change that, over time, makes all the difference.
Your Move
I am not here to tell you what to do. You are a thinking person. You have your own read on the landscape, your own risk tolerance, your own relationship to convenience and compromise. Unlike my office dogs, Lucky and Khaya, who have no conscience about where their next meal comes from.
What I am asking, gently (and also very seriously), is that you sit with the question.
The tools you use daily. The companies you sustain with your data, your attention, your subscription fees. The values they hold, the political ecosystems they are embedded in, the futures they are quietly building toward.
Is this the infrastructure you want to fund?
Because if you have read this far, past the Clinique reference, the Greenland joke, and the moisturiser philosophy, some part of you already knows the answer. Now, it's your move…
