Does our technology still work for us?


“Google’s search algorithm is geared to find the best content and UX for their users. Write the best content that helps people accomplish their tasks, and you’ll succeed in an SEO race.” - Me, circa 2012

This was solid advice during that era. I gave it to many marketing and website clients.

Write for humans, and the algorithm would reward you. Make your content useful, and the algorithm would reward you. Have the best UX, and the algorithm would reward you.

The tool worked for us.

It was a symbiotic relationship. Google’s nascent ad business benefited from Google being THE place to find information. Users benefited by quickly finding what they needed. Websites and content creators benefited from larger audiences, focused and happy users, and a clear-cut path toward success. We all got what we needed out of the relationship.

In other words: the incentives matched for everyone.

AI Leaders want humans to work for the tool

The symbiosis was always delicate. If you study nature, you’ll know that any symbiotic network is a delicate balance: drift too far in one direction or another and you risk toppling the entire structure.

Tech leaders no longer seem interested in the balance. They seem to want humans to work for the tools.

“It would be impossible to train today’s leading AI models without using copyrighted materials…” - Sam Altman, CEO of OpenAI
“Platforms, tools or frameworks that are hard for large language models (LLMs) and agents to use will start feeling less powerful and require more manual intervention. In contrast, tools that are simple for agents to integrate with and well suited for the strengths and constraints of LLMs will quickly become vastly more capable, efficient and popular.” - Matt Biilmann, CEO of Netlify

Let’s break down a few propositions from these quotes.

Altman’s meaning is relatively simple: “The tool I’m building won’t work without your content. All of your content.”

Biilmann’s quote is a bit more nuanced, but no better. In essence, it says the following:

  • Agents and LLMs need to be able to use your site or tool
  • Sites and tools should be made easier for bots to use, so they become “more powerful” for end users (a sketch of what that can look like follows this list)
  • If the tools are more powerful, they will be more popular
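
What does “simple for agents to integrate with” look like in practice? Biilmann doesn’t prescribe a mechanism, but one established approach is embedding schema.org structured data as JSON-LD, so a bot can read a page without scraping its layout. A minimal sketch, using a hypothetical product page:

```html
<!-- A hypothetical product page. Embedding schema.org data as JSON-LD
     is one common way to make a page machine-readable; it's an
     illustration of "agent-friendly," not Biilmann's prescription. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Legal Pad, Letter Size",
  "offers": {
    "@type": "Offer",
    "price": "4.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Note what that enables: a bot can answer “is it in stock?” without ever rendering (or crediting) the page.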

Those propositions are stated directly. But what logically follows from them? Two quiet corollaries:

  • 1st Corollary: If the AI tools are more powerful, users won't use your tool; their bot will. Users won’t read your content; their bot will summarize it
  • 2nd Corollary: If users only use their AI tool, and it doesn’t send them to your site, app, or product, the AI tool user won’t be your audience or your user. That person will be captured in the AI company’s ecosystem (💰, 👀, 🕦)

Either way, the message is clear: Give AI your content. Give AI access to your tools. And, in the end, let the agent and AI company take credit for your work.

The incentives don’t match.

SEO worked for us… until it didn’t

I worked in newspapers for the first six years after I graduated college (2006-2012). To put this time period in context, search engines still gave ranking weight to the keywords meta tag for half that time (what a time to be alive… a time when your article on technology would have keywords about Britney Spears just to get noticed…).
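
For anyone who missed that era: the keywords meta tag let publishers declare their own relevance in a single line of HTML, which is exactly why unrelated celebrity names crept into technology articles. A hypothetical example of the stuffing (not an actual tag from our site):

```html
<!-- Hypothetical circa-2008 keyword stuffing: popular terms went in
     whether relevant or not, because the tag cost nothing to add and
     some engines still gave it ranking weight. -->
<meta name="keywords" content="technology, gadgets, reviews, britney spears">
```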

I grew very familiar with the nature of SEO while working on the web for an organization that made money off the discoverability of its content. Search (specifically Google search) was sending our news outlets millions of views each month. Do you know what the takeaway from our executives (and other executives in the industry) was?

“Google is stealing our content!”

Young and idealistic about the internet, I couldn’t handle that. It was, honestly, ridiculous. We were literally generating revenue from the views we got. Google was bringing us an audience for each story we published.

My, how the times have changed me…

In today’s ecosystem, I’d be forced to agree with those executives. Forty-year-old Bryan is mature enough to admit that, but it’s still painful.

Google at the time wanted to make information discoverable (or at least searchable… if you’ve watched my search vs. discovery video, you may know the distinction). As time marched on and Google ads became more important (and thus user engagement became more important), Google wanted fewer users leaving the search page.

It started as a boon for Google’s search users. Search for a local restaurant, get a widget in the sidebar (nice and out of the way) with important information like the phone number, address, and even a link to the website. Nice!

"Places" screenshot on Google showing 3 locations for Central BBQ in Memphis along with their phone number and whether or not they're open

But then we got featured snippets… Now, when you search “What is an API?” you get an answer (pulled and quoted from a site) right in the results… with no real benefit to the original author.

[Screenshot: a Google result for “What is an API?” showing a featured snippet from an AWS blog post, detailed enough that a user may never click the original link]

Then, we got Google Shopping results. I started writing this on a fresh legal pad acquired from a local Walgreens this morning. Walgreens was on my way to the library where I decided to write. I did a quick Google search to check stock. I wanted to quickly find the dedicated page on Walgreens’ site. Instead, I found this:

[Screenshot: Google Shopping results for “Walgreens legal pad” showing a grid of eight sponsored products (two from Walgreens, the rest from other vendors) and one regular result below]

Yes, the page I was looking for is listed three times, but it’s buried in a confusing mess of six other products. Those other products are from Walgreens’ competitors! All of these results (including the Walgreens grid entries) are sponsored! It certainly doesn’t seem to be working for Walgreens. It seems to mostly be working for the company selling the ads…

It’s harder to get real views. It’s harder to transfer engagement to the individual site or creator. There’s higher competition around products (even in branded search). And all of this was before the advent of “AI Search Summaries,” which launched with no attribution on answers and even now offer no clear path for getting users to the original sources.

The algorithm once worked for us (both the creators and the users). Then it became more important for Google to hold users’ attention to earn money, and the incentive structure flipped.

Our products, our content, our news exist to help Google’s product, not ours. Perhaps those news executives were right…

There’s no way around it: we have to talk about capitalism

In essence, the degradation of this symbiotic relationship can be laid at the feet of capitalism. The balance of power shifted in the early 2000s as small companies became able to make money on the web, largely helped by early Google. But there’s always a tipping point. And that brings us to another issue with the quotes from earlier: in this game, who benefits the most?

The last section on search issues covers one of our capitalistic problems: when multi-billion-dollar companies see a way to make more money, it’s hard for them to hold to their original values. If we help agents and invest in making things easier for them, who are we helping? The agents are not benefiting. They’re not alive, sentient, or actually intelligent. So who? The multi-billion-dollar companies and the venture capital (VC) firms behind them. They get user lock-in and, as we’ve seen with the devolution of Google search, that social contract can be renegotiated.

Let’s move, instead, back to Altman’s quote. We’ve got to give up all our copyrighted material to achieve a great and glorious GenAI world. A world where all the powers of creation are at the disposal of everyone!

Look. Creative work is hard. I get it. I’ve always wanted to be a cartoonist (and an actor… and a novelist… and a filmmaker), but I’ve never put in the work to get good (or even the effort to get mediocre). I get the allure of being able to type a prompt and get an artifact back: comic art, a landscape painting, cool background music, an amazing video of an imaginary cityscape.

But here’s the thing about capitalism and creativity: businesses don’t want to pay for creative output. Ask anyone who has tried to make a living as a commercial artist. Pay is terrible, hours are long, and the work is constantly devalued.

Imagine the joy, then, when investors learned they could do away with those tiny payments to those pesky artists. That’s where the VC fervor and hype come from: not giving these tools to artists, but giving them to low-paid prompt engineers to lower the cost of creative output.

Oh, and let’s not even mention the software development side of the “Creatives” industry. We’ve been making too much money for too long. Wouldn’t it be better if a founder, investor, or product manager could just make “the next big thing” without a team of expensive engineers? $10 million in seed funding right away!

But honestly, that’s not the worst of it. The worst is that creation is a core trait of humanity. We seek to create. As we learn and level up our creative skills, we often find bigger and better applications for them.

The AI scheme is to take the creative tasks away from humans so that we can be “more productive.” And while that may be an economic boon in the short term, it’s a real problem long term as the creative skills of humanity are slowly removed.

“That’s hyperbole!” I hear some of you cry (or at least, I hear the AI advocates cry).

Maybe.

Though, maybe not…

AI’s hidden cost to our brains

It’s perhaps too early in the current technology cycle for there to be definitive studies on AI’s effect on humanity. Even still, we can look at something very similar.

There have been many studies on “the Google effect.”

At its essence, the Google effect says that by offloading our data retention and recollection to the internet, we make our brains worse at retaining and accessing information.

“Moreover, when people search for information on the Internet while working on the Internet, they are more likely to use the Internet rather than their brain the next time they encounter that issue, and they retain pertinent information in an interesting way: they remember the Internet address where the pertinent information is stored (e.g., domain name, database, etc.). People who have searched the Internet for a solution to a problem, for example, will remember the website where they found the solution more vividly when they encounter the problem again (50), even if they have forgotten the precise essence of the problem for which they were searching.” - NIH Research paper
“People are more likely to return to the Internet and repeat the search process when confronted with a similar issue if they remember the location where the information was saved. Moreover, people’s perceptions of findability roughly predict the amount of time it will take them to find the information they need on the Internet, which increases their reliance on it laterally (38). Moreover, when searching for solutions on the Internet, rapid responses (answers that are obtained more quickly) are more likely to be convincing. People feel more confident when working in an Internet-accessible environment, and “Google effect” is stronger for those who have previously used the Internet.” - NIH Research paper

That’s certainly not great. We’ve modified our neural pathways, and not for the better.

While this hasn’t yet been studied, it stands to reason that if we outsource our artistic, writing, musical, and even coding skills, the parts of our brains that help in creation will slowly function worse and become dependent on those tools (the same way they have with Google search).

Definitely great for the person who sells what we become dependent upon. Definitely bad for core elements of our human nature.

“In addition, by examining the potential costs and limitations of the Internet, individuals are in a better position to develop and modify the technology so that it is potentially more productive, less disruptive, and more consistent with the everyday goals and functions of human cognition.” - NIH Research paper

The study also says that the more people understand this (the more informed they are), the better the decisions they can make for themselves. Since learning this, I’ll even spend a few minutes trying to remember information rather than doing an immediate search. Better to start rebuilding those synapses, after all.

But those sorts of solutions assume people are able to come to that understanding. The companies working on this have billions invested… Kind of puts the use of “dependent” in perspective…

Conclusion

My point in all of this is that when people start telling you that you need to work for your tools instead of the other way around, you need to DEEPLY inspect that.

We have very recent examples of how this has worked. It always starts with “democratizing” powerful things (creation, development, writing, videos) and, so far, ends with most of the benefits in the hands of large companies. When that happens, we lose the democratization, but with AI, we run the very real risk of losing the skills that make culture, the skills that benefit us all, the skills of creation.

I hope we don’t let them take that from us, just to make us “more productive.”