Why I’m not using generative A.I. in my business
AI isn’t inherently good or bad. But here’s why I’m not incorporating generative AI tools like ChatGPT, DALL-E, Midjourney (or even the AI tools within my beloved Notion) into my business.
Because this is the internet in the year 2024, I feel like it’s gotta be said: I’m not telling you what to do, nor am I trying to drag any business that’s been using these tools. This is my professional and personal take, the reasoning behind it, and I stand behind that.
Reason 01 — They’re way too thirsty
We tend to think of the internet as something intangible, not as something that has a carbon footprint. But websites, “the cloud,” everything on the internet lives on servers, and those servers require energy to power and cool. (Think about how your laptop can heat up — and may have fans built in to cool it. Those servers REALLY need to be kept cool to keep functioning.) Then there’s the energy required to access the internet. Powering our cell phones and computers and routers takes energy. And every time someone visits a website, data is loaded in your browser, which also requires energy — the larger and more complex the website, the more energy required. The emissions generated by a single website visit aren’t very large, but they add up over time and across the world wide web.
In general, there’s too much of a focus on our individual impacts on global warming, when the vast majority of emissions are produced by about 100 corporations. But while we continue to push for accountability and regulation of those companies, we can still work to build websites more sustainably, power server farms with renewable energy sources like wind & solar, and make conscientious choices about how we use the internet.
You might remember this discussion coming up around crypto, and then again around NFTs — both built on blockchain technology, which requires a BUNCH of energy. Here’s a Forbes article that gets into blockchain's carbon footprint in depth. Generative AI / large language models like ChatGPT are in the exact same boat. Here’s an article from Scientific American that breaks down the ecological impact of generative AI.
These models are energy-expensive to create, update, and use — and so far few, if any, companies seeking to profit in the "AI boom" have sought to mitigate or offset the impact on our climate. Right now the cost far outweighs the benefit, especially when copywriters, artists, creatives, and experts exist.
Reason 02 — Shady data scraping
As I said up top, AI isn’t inherently good or bad. What we’re calling AI right now isn’t really artificial intelligence. These are large language models or deep learning models — a series of complex calculations that produce predictive responses based on the data they’ve been trained on. Here’s a New Yorker article that breaks down how programs like ChatGPT aren’t actually A.I. — even if the ship has already sailed on the use of the term.
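To make “predictive responses based on training data” concrete, here’s a deliberately tiny toy sketch: a bigram model that learns which word tends to follow which, then “generates” text by repeatedly predicting a next word. This is nothing like a real LLM’s neural network trained on billions of tokens — it’s an illustration of the core idea, and of why the training data matters so much: the model can only ever predict from what it has seen.

```python
import random
from collections import defaultdict

def train_bigrams(corpus):
    """Count, for each word, which words follow it in the training text."""
    follows = defaultdict(list)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        follows[current].append(nxt)
    return follows

def generate(model, start, length=5, seed=0):
    """Produce text by repeatedly picking a word that followed the
    previous one in training -- prediction, not understanding."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        options = model.get(out[-1])
        if not options:  # dead end: this word never had a follower
            break
        out.append(rng.choice(options))
    return " ".join(out)

# Everything the "model" can say comes straight from this training text.
model = train_bigrams("the cat sat on the mat and the cat slept")
print(generate(model, "the"))
```

Every word the toy model emits was lifted from its training corpus, which is the crux of the data-provenance problem discussed next: scale it up, and the question becomes where all that training text came from.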
Learning models need to, well, learn. And most of these models have used shady tactics (like purchasing user-uploaded data from apps with consent buried in terms & conditions) or have outright stolen data to train on. OpenAI (the company behind ChatGPT and DALL-E) is currently being sued for stealing and using private data (including medical records) to train its programs. Midjourney is also being sued by a number of artists (and will likely be sued by more) whose names appeared on a list of artists whose work Midjourney was trained on, without permission and without compensation.
Now if someone actively consents to contribute to training AI (and ideally is also compensated for it), more power to them! But until generative AI is trained on ethically obtained data, I’m good without it.
Reason 03 — I’m a control freak
Because the use of the term AI implies intelligence, a lot of users trust AI output to be accurate — like the lawyers who cited a fake legal case in their claim after using ChatGPT to help write it.
However, as anyone who's seen the more bizarre answers Google's AI has been spouting can attest, these answers are not necessarily accurate or true. And finding the source for a ChatGPT or Google AI answer isn't straightforward.
There's an inherent risk in publishing or posting, especially as a business, an article that you can't verify. Sharing misinformation can damage your reputation.
Speaking personally, I like being in control of my work. I like doing my own research, knowing why I’m making each design choice, and I like learning something new when I don’t know the answer to something. And when it comes to writing articles, I find fact checking to be more tedious than writing something myself.
Reason 04 — I enjoy the process
I won’t lie and say I love all parts of my job. (I’m looking at you, social media copy & strategy!) But the creative process — of designing a brand identity, mapping out ideal website architecture, building the site out, mapping out workflows — it’s all fun. Even when it’s frustrating, it’s ultimately so satisfying. And for any part of my job and running a business that I don’t feel that way about — there’s someone out there who does love that process. Finding and hiring / commissioning work from / working with / collaborating with / learning from them is worth it.
Articles Referenced and Further Reading:
The New Yorker: There is no A.I.
Forbes: Why Blockchain, NFTs, and Web3 Have a Sustainability Problem
Scientific American: A Computer Scientist Breaks Down Generative AI’s Hefty Carbon Footprint
Heinrich Böll Stiftung: The Individual Carbon Footprint. How much does it actually matter?
MIT Technology Review: Deleting unethical data sets isn’t good enough
WIRED: Security News This Week: ChatGPT Spit Out Sensitive Data When Told to Repeat ‘Poem’ Forever
Business Insider: Google's water use is soaring. AI is only going to make it worse.
Medium: How Large Language Models work
Medium: Data Ethics in Artificial Intelligence & Machine Learning