Don't Let AI Take Away Your Gut
Wholly replacing critical thought with machine learning models defeats the purpose of efficiency-driven LLM use
When AI tools hit the scene over half a decade ago, prognostication about how they would help (or hurt) humanity began in earnest. They were going to revolutionize the way we worked. They would help us do our jobs better, or simply replace our jobs altogether. The spectrum of AI’s capabilities and threats was quite wide (and to be fair, it remains that way.)
We are now beginning to see some of AI’s consequences for human intelligence as studies of these machines’ effects on our intellect come to fruition. The initial results - while not necessarily surprising - are a bit disconcerting. Replacing critical thinking and research capabilities wholesale, it turns out, isn’t the greatest thing for our brains.
Trust Your Instincts
It has many names - a sixth sense, a gut, intuition, instinct. Very often, time spent thinking through a problem or issue simply leads you back to the instinctual decision you had at the beginning.
The science behind what constitutes this “gut feeling” and the insane connection between your gut and your brain is actually quite fascinating, but the key point is that it is something uniquely human. We nod to neuroscience’s role in the machine learning behind AI models by calling some of them “neural networks,” but there is no “gut feeling” when it comes to their ability to digest data.
Therefore, when we think about intuition, there is no AI equivalent. Indeed, outsourcing your intuition to AI is a terrible idea for a multitude of reasons, one of the more germane being that human cognition and machine cognition are different enough to be complementary rather than interchangeable.
This is Your Brain on AI
Tapping into your gut is intuitive, but it’s also a skill. Knowing when to listen to it, when not to, and how to read the subtlety of its messages all takes time and practice to hone.
Even though it’s not a muscle, the brain works in a very similar fashion. Chronic disuse leads to declining performance - and while this seems obvious, many have still jumped headfirst into AI tools not as complementary efficiency drivers, but as wholesale cognitive replacements.
And studies are beginning to come out showing the deleterious effects of AI usage as a replacement for good ol’ fashioned thinking. “Digital dementia” has already cemented itself as a byproduct of the dopamine-chasing behavior we find ourselves in day after day with our digital devices. The echo chambers they create and the absolute gutting of our attention spans may well turn out to be the least problematic aspects of this constantly online world we’ve created for ourselves as a species.
But as with everything else it seems to touch, artificial intelligence is supercharging this cognitive gap. Allowing AI to replace your own thought processes has long been assumed to be harmful, but just how harmful is only now coming into focus.
There have always been the anecdotal “Hey, I think I’m using too much AI” think pieces. There are the relatively scientific studies (funded by none other than AI powerhouse Microsoft) showing that high confidence in AI outputs leads to less critical thinking. And now there are robust academic studies pointing out a severe cognitive decline among those reliant on LLMs for writing purposes. To translate for people in that group: much LLM make brain stupid.
So That’s It? We’re Just Done With AI?
Hardly. First, through the lens of our favorite sociological failing called late-stage capitalism, there is simply too much money to be made. A bunch of corporate drones prompting ChatGPT simply to reply to an email with a couple of lines, like a white-collar Brave New World back and forth, is a small price to pay for “operational efficiencies” driven by a highly valued technology into which corporations are pouring billions. To put it bluntly, they’re going to need some kind of return on that investment.
Secondly, AI does have legitimate use cases that will enhance existing productivity without making us all knuckle-dragging troglodytes whose understanding of language comes in tokenized bits from a machine’s neural network. The key is to find where it is enhancing your productivity and thinking versus where it is merely replacing your cognitive function.
There will be overlap of cognitive duties between our brains and AI - this is unavoidable. The key is to minimize this overlap and ensure that the vast majority of AI work being done on your behalf is catalyzing your own thinking, not displacing it.
One key way to do this is to check the work of the AI. If I’m stumped after a couple of Google searches on a topic I need expertise on, I turn to the deep research tools of LLMs like Google’s Gemini or OpenAI’s ChatGPT.
To avoid the ever-present danger of model hallucination, these AI agents are instructed to cite as much of their work as possible. Much as I would with a Wikipedia page, if a particular nugget or insight in the research jumps out at me, I back it up by taking a quick look at the source material to ensure the model understood it correctly.
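For the spreadsheet-brained among us, here is a rough Python sketch of what that spot-check habit could look like if you wanted to automate the first pass. Everything in it - the claims, URLs, and snippets - is a placeholder rather than real research output, and a naive text match is no substitute for actually reading the source, but it captures the idea of confirming a cited passage really exists where the model says it does:

```python
# A hypothetical, minimal sketch of the "check the AI's citations" habit.
# The claims, URLs, and snippets below are placeholders, not real output.
import requests

citations = [
    {
        "claim": "A nugget from the deep research report that jumped out at me.",
        "url": "https://example.com/source-article",
        "snippet": "a short phrase the model says appears in the source",
    },
]

for c in citations:
    try:
        page = requests.get(c["url"], timeout=10).text.lower()
    except requests.RequestException as err:
        print(f"Could not fetch {c['url']}: {err}")
        continue
    # Deliberately naive: just check whether the quoted snippet appears verbatim.
    # Actually reading the source is still the real verification step.
    status = "looks cited" if c["snippet"].lower() in page else "check manually"
    print(f"[{status}] {c['claim']}")
```

Again, this is just a sketch of the habit - the point is that the verification step stays with you, not with the machine.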
It’s akin to working with a research assistant at a library - they’ll pull primary sources on a particular topic and point you in the right direction on a lot of things to speed up your research. No one would point to this practice as a cognition killer, unless of course you did it for every single thing you wanted to know about and simply took the research assistant’s word for it all without looking through the documents.
These kinds of uses are where we should be looking - what drives the efficiency of a project without taking away our ability to think critically about the work? Granted, it’s a fine line. How many uses of deep research on a particular workstream or in a finite time period are too many? How many source docs do you need to review to maintain brain function at its current level? Will TDNBW ever get ChatGPT to make a funny image? These are questions that don’t yet have answers.
Complement - Don’t Replace
Humans are very good at making decisions with limited data - that’s the “gut” part of our ability to analyze a situation, unconsciously tap into decades of experience, and receive signals from the enteric nervous system (aka the second brain) to forge a path ahead.
Artificial intelligence-powered machines are not as adept in low-data environments. To simply exist, they require reams and reams of data - not to mention the initial supervised learning to get to a place of neural network nirvana. When humans have access to this data we also make data-driven decisions, but machines are able to do it at speeds and scales humans can only dream of.
Our job as workers - and, not to be dramatic, but as a species - is to figure out how to drive efficiency out of this amazing technology while avoiding becoming enslaved by it. The title of this very newsletter may never have been more prescient than in this moment.
Grab Bag Sections
WTF Newton: Unless you’re super dialed in to village-by-village news in the relatively sleepy bedroom community of Newton, Massachusetts - which lies just outside of the greatest city on earth: Boston - you may have missed one of the most dramatic showdowns between the well-to-do city’s residents and Mayor Fuller since she picked a fight with the teachers union and lost in incredible fashion.
There are many ways to view this tony suburb. This newsletter grew up seeing it through the lens of dichotomies. It certainly is not a tale of two cities, but there are pretty major differences between the leafy estates in Chestnut Hill and Waban and the denser, formerly middle-class communities of Nonantum, Newtonville, and lower West Newton.
Newton, like many other suburbs of tier-one cities across the country, has moved to the model of being an unaffordable housing escape for all but the wealthiest of its residents. But it wasn’t always like that.
Having had the incredible privilege of growing up in such a bucolic suburb close enough to downtown Boston to ride my bike (and having access to public transit when I didn’t feel like doing that), I remember a city with a distinct north/south divide. This was socioeconomic, certainly, but also deeply political. You rarely saw northsiders win citywide elections - and I’d be lying if I said there wasn’t palpable resentment from some of the population about this.
This newsletter could go on and on about why this divide exists, how it actually used to be the northside that secured the bag compared to the south, and how the construction of the Turnpike in the 1960s threw this dichotomy on its head, leading to disruptive commercial and residential change in the northern villages and opening the door to a resurgence of the southern part of the city that continues to this day. But we don’t need to, because, in keeping with this post, Google Gemini Deep Research has done it for us.
Keeping all of this context in mind is important for understanding what this newsletter has deemed the Adams Street Massacre. Adams Street is not only home to one of the best restaurants in Newton (we need sponsors!), along with one of the best Dunks in the region, but is also one of the main thoroughfares through the storied village of Nonantum - a traditionally Italian mix of working- and middle-class families (until the most recent Boston housing boom, which further accelerated the already rising cost of its housing stock.)
Adams Street is also home to the parade for the Festa, an annual Italian-American festival thrown by the St. Mary of Carmen Society and a cannot-miss event for Nonantum-ites and neighboring villages alike. One distinctive feature of Adams Street is that its divider line is painted in the Italian tricolor as opposed to the yellow lines typically dividing US streets (minus Bristol, RI, of course.)
So when Mayor Fuller sent road crews to work through the night, under the cover of darkness and with no notice to the neighborhoods, to paint over the Italian colors and replace them with standard yellow markings only weeks before the 90th iteration of the Festa, residents were justifiably pissed off. The fact that she lives in one of the wealthiest villages of the city on the other side of the Pike and has always been viewed as a southside mayor only served to rub salt in the wound.
Arguments about safety ring hollow - you could paint that center line any color and it wouldn’t change the fact that Adams Street is congested and too narrow for the traffic it carries daily. There really is no way to read the move other than as a thumb in the eye of a village that voted overwhelmingly against her in the most recent mayoral election in favor of fellow northsider Amy Mah Sangiolo. Nonantum won’t have the chance to voice their displeasure at the polls this fall - Fuller won’t be running after the teachers’ strike debacle.
But what they can do is continue to paint Adams Street back to its original glory - hit up the newsletter, I’ll call up Swartzy’s to pitch in for materials, send some pizza from D&A (chicken cutlet is a HOF pie) and even buy a round at Tommy Doyle’s (RIP West Street.) Ora e sempre, Resistenza!
Album of the Week: We round out the Clipse love with their most recent studio album before the release of Let God Sort Em Out on Friday. Their junior album Til the Casket Drops can also be viewed through a lens of dichotomies. At times great, at other times forgettable, the album marked the clearest division between brothers Pusha T and Malice as they began to grow apart in their careers.
Some absolute classic tracks live on this album. “Freedom” showcases a vulnerability from the Thorntons that we hadn’t seen on previous albums. “Popular Demand” features Cam’ron, who drops one of his best verses.
But other songs like “Never Will It Stop” or “Counseling” are utterly forgettable. The album ends strongly with “Footsteps” and “Life Change,” so it might be fair to think of it as a sandwich with incredible bread and a filling that just didn’t leave an impression.
The other incongruity on the album is the split in lyrical content between Pusha and Malice. Pusha continues to lean into his fame and wealth, with the typical rap braggadocio of foreign cars, women, and homes. His future (excellent) solo work would continue this trend.
Malice, meanwhile, is beginning to retreat from the same things that attract his brother. He leans further into Christianity, dropping Biblical references throughout the album at a furious pace compared to Lord Willin’ and Hell Hath No Fury. His decision to split from his brother after this album came as little surprise (particularly because the album itself was not a commercial success.)
But in three days the brothers are back with their first joint album since Til the Casket Drops. Those Clipse fans upset they had to wait four years between Lord Willin’ and Hell Hath No Fury likely gave up on the group getting back together after the first decade of no new Clipse albums. Now, 16 years later, we finally have Clipse back on wax.
Quote of the Week: “Technology is a useful servant but a dangerous master.” - Christian Lange
A Note on AI Usage for TDNBW: AI Used In This Post
Generally, I do my best to avoid using artificial intelligence while writing these posts, outside of the images. I believe AI is bad for writers if they use it to actually generate text for them, even if they then shape and tweak that text into what we know as writing. The art of writing feels like a uniquely human endeavor, and it should stay that way.
But as the Venn diagram in this post shows, to maintain efficiency in a faster-paced world, writers will need to embrace AI where it can complement what they do and drive better and faster output. In that vein, I will begin explaining at the end of each post where AI was utilized and how. The text of the actual posts will remain fully human, but AI has proven to be an excellent research tool, and I began utilizing it for this post.
The research for the WTF Newton piece was largely driven by Gemini’s Deep Research tool - the prompts and output all live here for those interested in the source material. I was impressed with the output and actually ended up sharing it with some Newton folks.
For the Album of the Week, ChatGPT helped pull out some of Malice’s Biblical references throughout Til the Casket Drops. It missed a lot of the ones I knew from memory, so this kind of analysis still has a way to go for the LLMs.
See you in two weeks!