In the internet’s beginning, there was no Google. Even once the internet went mainstream it wasn’t easy to find stuff unless there was a directory or someone sent you a URL. There wasn’t that much stuff to be found, but we still preferred to have others help us get around.

The early, broadly popular search engines — Lycos, Yahoo!, AltaVista, etc. — showed up in 1994-95. Finally, if someone hadn’t sent us a link, we might be able to find it ourselves. Generally search engines weren’t great, but they were something.

Google arrived on the scene in 1998. As a search engine it was a lot better than the competition (which is why it’s the generic verb for search now). There was definitely lots more stuff to be found by then, so we needed more sophisticated assistance. Of course, it also meant we found what Google decided we should find.

In 1999, Rich Site Summary (RSS) rolled out from Netscape, and Google Reader, the most beloved of RSS readers, arrived in 2005. You no longer had to hunt down content you wanted to read or follow. You could gather it all in one place and have updates delivered right to you. You never had to browse or discover again. We had help to find stuff and to organize and consume it.

YouTube arrived in 2005, and got big by 2006 when Google acquired it. Facebook also opened to the general public that year, as did Twitter.

By that point, we’d pretty much given up trying to find stuff for ourselves. It was a 24/7 firehose of as many flavours of content as you wanted, and it quickly became impossible to keep track of everything good.

Fortunately, there was no need to do so. Social media brought a flood of conversations and recommendations from the broadest range of people and organizations. Of course, you only saw what came from those you chose to “follow”. But it was still too much and we needed more tech help.

PostRank, a startup I worked at, was founded in 2007 to do precisely that, using people’s engagement (likes, retweets, diggs, etc.) with content on social channels to determine what was “good”. Or at least popular with lots of people. It worked fairly well; Google bought the company, too, in 2011, and rolled parts of it into Google Analytics.

The later 2000s were the first big era of the “influencer,” via the primary medium of blogs, with social media providing additional platforms for amplification and “joining the conversation.”

These folks parlayed their content and influence into lucrative careers. They were treated like rock stars at conferences, wrote books, partnered with brands… Some of them behaved like rock stars, too, but it didn’t seem to tarnish them much.

We chose them to tell us what we should pay attention to. What conferences to attend, books to read, products to buy, etc. Of course, it was a bit of a circle jerk since everyone just recommended their influencer friends and connections.

As social media evolved, some people became influential using just those channels, no blog required. There were Twitter celebs (the blue checkmark!) and Vine celebs. What they liked and recommended also blew up. Influential people went “viral” and “broke the internet.” (Yeah, once upon a time that actually meant something.)

Of course, the glare of a few huge celebs meant no spotlight for lots of smart, talented people, who at best managed to develop niche followings to support their careers. They told us what to read, watch, and buy, too, but in a slightly more… artisanal way.

That era of influence waned as we progressed into this decade. Instagram arrived on the scene in 2010, and YouTube became a place where you could get famous. Whether you were applying makeup or talking about your pets, there are definitely rock stars on those channels now. (I dunno who they are, but any 12-year-old should be able to tell you…)

Brands approach those people to partner and get them to tell us what to read/watch/buy. We eat it up – shortly we will have made a 21-year-old Kardashian a billionaire. Rock star behaviour has accompanied some of these influencers, too. Sometimes with career-damaging consequences, often not.

Which brings us to the current era. I’ve written before about the conundrum of not only how to filter the firehose of content and shilling out there these days, but also how to determine what we even believe is real, given that it’s becoming increasingly difficult for the average person to tell.

In that aforementioned column, I put the impetus on each of us to get educated and critical about what we consume online. Silly, misguided me.

Per the tradition of the last 25 years, we’re continuing to find ways to outsource our critical thinking. The latest buzzword: content validation.

In fact, per that Fast Company article about the growing trend of content validation, we’re outsourcing the outsourcing. Companies are developing algorithms and whatnot to analyze content, primarily to detect fakes (fabricated or significantly altered media) and then tell us what’s “safe.”

Realistically, the firehose of internet content got too voluminous for mere humans to reasonably filter at least a decade ago. And sophisticated faked news and content wasn’t even a concern then. So it makes sense to fight fire with fire, or at least technology with technology.

Interestingly, though, it seems we still need humans to detect fake or at least seriously skewed and biased “news,” so human fact checkers are on the rise. AI isn’t quite capable of effectively tackling that yet.

The downward trajectory since the internet took over the world has been how we, its users and citizens, keep abdicating more and more responsibility and choice over what we consume (read, watch, believe, and share).

Tell me what’s entertaining. Tell me what’s interesting. Tell me what’s trendy. Tell me what to learn. Tell me what’s true. Tell me what’s real. If we weren’t so habituated to it by now it would be terrifying.

At what point do automated analysis and validation become groupthink? How do we know that “validated” content isn’t just another veiled ad? Would we even recognize when the powers that be dispense with “recommendations” and just start dictating what we are to think and want?

I don’t think there are many battles left to be fought in this war. I think we ceded the kingdom too long ago. And people don’t want to fight. It’s too much work and it isn’t any fun.

Tom Robbins published Even Cowgirls Get the Blues back in 1976, but the film adaptation came out in 1993, a year before those first early search engines launched. Pretty sure at the time he wasn’t thinking of the state of the internet 25 years later, but this line from the book has gotten stuck in my head while pondering it all:

“I believe in everything; nothing is sacred. I believe in nothing; everything is sacred. Ha Ha Ho Ho Hee Hee.”

M-Theory is an opinion column by Melanie Baker. Opinions expressed are those of the author and do not necessarily reflect the views of Communitech. Melle can be reached @melle or me@melle.ca.