33 charts

Artificial Intelligence and the New Dark Age

When ChatGPT becomes Divine revelation

Bryan Vartabedian
Feb 13
Some ideas about enchanted technology. I’m thinking of putting some of this into a bigger project, and I could use your help. I’d love your comments or follow-up questions. Have a great week, and thanks for being a subscriber.


Science fiction author Arthur C. Clarke famously observed that any sufficiently advanced technology is indistinguishable from magic. OpenAI’s GPT has that magical look. When I first flirted with GPT in December, it gave me the same chill as first seeing the World Wide Web in 1994.

I think of ChatGPT and other generative technologies as black box technologies. We put in a question or a prompt, and poof, out pops an answer. Like some kind of oracle that we summon for understanding.

While we’ve seen some of the shortcomings of ChatGPT, we generally tend to accept what the black box gives us. No wonder author James Bridle has dubbed the era of cloud computing the New Dark Age, describing a regression to a time when knowledge could come only through divine revelation.

Yuval Noah Harari in Homo Deus draws an analogy between our acceptance of faith and algorithms: “Just as according to Christianity we humans cannot understand God and His plan, so Dataism declares that the human brain cannot fathom the new master algorithms.” This idea of a master algorithm is a nod to the work of Pedro Domingos, who has argued that algorithms will inevitably meld into a system of ‘perfect understanding’ of the world and all its workings.

When we start to see the master algorithm as the ground truth, what will happen to our trust in faith and religion? Or will AI replace God, as Shel Israel asked ChatGPT this morning in his Substack, It Seems to Me (ISTM)?

Kate Crawford in Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence calls this magical inevitability of technology enchanted determinism:

As the social anthropologist F.G. Bailey observed, the technique of "obscuring by mystification" is often employed in public settings to argue for a phenomenon's inevitability. We are told to focus on the innovative nature of the method rather than on what is primary: the purpose of the thing itself. Above all, enchanted determinism obscures power and closes off informed public discussion, critical scrutiny, or outright rejection.

Of course, the 300-pound black box in the room is search. We’ve been sold that Google is a machine that gives us the answers — an empty search box serving up a kind of algorithmic revelation. And we’ve all come to trust it almost unconditionally. Blindly, in fact.

Few of us recognize that Google is an advertising platform built to sell things based on what it knows about us. Everyone’s search is substrate. Everyone’s result is based on our behavioral surplus individually scraped and leveraged for Google’s bottom line.

Now the idea isn’t that we should avoid technology like GPT or search. You can’t, really. Rather, we need to understand something about a tool’s origins and its relation to power. Who created it, and what is their intent? Who does it serve? What are the incentives? Who stands to gain from what we put in and get out?

And how does all of this reflect what’s in the sausage?

Easier said than done, of course. But we have to give this some thought. Or, to channel Toto when faced with the Almighty Oz, we probably need to pay attention to the man behind the curtain.

Obscuring by mystification is part of the tech game. But it’s now part of our world. And our responsibility. We can’t let ourselves get hypnotized by the oracle.

Thank you for reading/subscribing. This post is public so feel free to pass it along.

6 Comments
Kim Lucas
Feb 13

Agree 1000%.

Nick Genes
Feb 13

I’m reminded of early “Greek oracle” clinical decision support like Shortliffe’s MYCIN expert system: if you input hundreds of data points, the machine spits out a preferred antibiotic. The rules were all easy to understand in isolation, but they were combined and weighted to an extent that the machine seemed wise and powerful (and tended to be slightly more guideline-adherent than clinicians).

Those systems didn’t catch on because data entry was cumbersome, but everyone trusted the underlying rules engine.

© 2023 Bryan Vartabedian