The Speed of Change, or How History Learned to Sprint

If you dropped an ancient Egyptian from 3000 BC into the world of 1 AD, he’d be confused, sure—but not existentially shaken. Different rulers, new gods layered onto old ones, unfamiliar scripts, maybe some new military hardware. But the basic operating system of reality would still boot. The cosmos would remain ordered, hierarchical, meaningful. Tradition would still outrank novelty. The past would still be the instruction manual for the future.

That kind of worldview had legs. It could walk across millennia.

Fast-forward a few thousand years, and that continuity snaps—not all at once, but decisively. By the time you reach Kepler’s generation, the crack is visible. The universe is no longer a story; it’s a system. Nature obeys laws, not purposes. Observation beats authority. Math starts calling the shots. The metaphysical furniture is still in the room, but it’s already being quietly repossessed.

Even so, a Kepler dropped a century or two downstream would still have recognized the world he landed in. The shock threshold was high, but the velocity of change was still survivable.

That’s no longer true.

Today, the rupture fits inside a single human lifetime.

And I know this not because I’ve read the theory—but because I lived the beta test.


I Was 30 in 1989. That Used to Mean Something.

In 1989, the world felt… settled. Not perfect, not just, not peaceful—but coherent. The Cold War was winding down. Institutions were flawed but legible. Technology advanced, but it mostly stayed in its lane. You learned a tool, mastered it, and expected it to stick around for a while.

You could plausibly believe that adulthood meant arriving somewhere.

That belief did not age well.


Exhibit A: Vinyl, CDs, and the First Wrong Hill I Chose to Die On

When CDs appeared, I rejected them outright. Vinyl was “authentic.” Warmer. Real. CDs felt sterile, overproduced, soulless. This was not a technical argument; it was a moral one. Music wasn’t supposed to be perfect. The pops and crackles were part of the experience. They proved something physical had happened.

History’s verdict: vinyl survived, CDs peaked, streaming ate everything, and authenticity became a marketing term.

Score one for me, but for entirely the wrong reasons.


Computers: Fancy Typewriters with Delusions of Grandeur

I was an early adopter of computers—but only as glorified typewriters. Word processing made sense. Spellcheck was handy. Files beat filing cabinets. That was progress I could justify.

The internet, however, struck me as optional. Interesting, maybe useful, but hardly central. I didn’t seriously get online until around 2000. In retrospect, that’s like saying you didn’t bother learning electricity because candles were working just fine.


Cell Phones: Status Symbols First, Tools Later

I adopted cell phones early, too—but not because they were useful. They were status markers. You carried one to signal that you were important enough to be interruptible. Actual phone calls were secondary.

When smartphones appeared, I dismissed them as stupid. I already owned a perfectly good camera. I didn’t need to carry a thousand MP3s in my pocket. Who was this for, exactly—roadies and DJs?

History’s verdict: everyone.


The Hubris Phase: HTML, Dreamweaver, and Peak Confidence

In the mid-to-late 2000s, I went all in on the internet. I built websites from scratch. Hand-coded HTML. Used Dreamweaver. I knew what a table layout was and why you shouldn’t use one. I considered myself competent—borderline expert.

This was the sweet spot: enough mastery to feel powerful, not enough perspective to feel humble.

Today, an average 12-year-old could outclass that version of me before lunch. Not because they’re smarter—but because the tools, abstractions, and defaults have leapfrogged entire skillsets. What once required craft now requires taste.

And taste is harder to teach.


Netflix, DVDs, and the Comfort of Being Wrong Again

When Netflix launched streaming, I scoffed. DVDs in the mail were fine. Predictable. Reliable. Why trade quality and ownership for buffering and ephemerality?

Now I don’t even remember the last physical disc I touched. I do, however, remember spending far too long sitting on the toilet watching funny cat videos on YouTube—a sentence that would have sounded like a stroke symptom in 1989.


The Pattern That Only Appears in the Rearview Mirror

At no point did any of this feel shocking in real time. That’s the trick. Change didn’t arrive with trumpets. It arrived with updates, convenience, and mild annoyance. Each step was defensible. Each objection reasonable. Each surrender temporary.

Until suddenly, the cumulative effect was undeniable.

The real shock isn’t that technology advanced. It’s that the ground rules kept changing faster than anyone could metabolize. Tools stopped being tools and became environments. Media stopped being consumed and started consuming us back. Expertise shortened its half-life. Cultural norms stopped evolving and started reconfiguring.

And none of us voted on it.


From Egyptians to Algorithms

The ancient Egyptian lived in a world where meaning was thick and time was slow. I grew up in a world where meaning was contested but stable, and time felt linear. Today, we live in a world where time is compressed, meaning is optional, and reality is filtered through systems no one fully understands.

That doesn’t make us stupid. It makes us transitional.

We’re the generation that still remembers when things arrived—and is now living in a world where things simply become.

If I’d been dropped straight from 1989 into 2025, I don’t doubt I’d have gone briefly mute. Not from fear or disgust—but from the sheer effort of reloading the assumptions I didn’t realize I was carrying.

History didn’t just speed up.

It changed gears.

And we’re still learning how to drive it without stalling—or crashing into the future while arguing about vinyl.

When “Nothing” Beats a Star: How Bad Logic, Taboo Words, and Moral Panic Broke Our Language

It usually starts as a joke.

“Nothing is brighter than the brightest star.
My phone flashlight is brighter than nothing.
Therefore my flashlight is brighter than the brightest star.”

QED—if you squint hard enough and stop thinking.

Or the old Norwegian chestnut:

Mor Nille kan ikke fly.
En sten kan ikke fly.
Ergo Mor Nille er en sten.

(“Mother Nille cannot fly. A stone cannot fly. Ergo, Mother Nille is a stone.”)

The logical flaw is obvious, which is precisely the point. In Erasmus Montanus (1723), Ludvig Holberg used this kind of syllogistic abuse to mock academic pedantry: syllogistic form, undistributed middle, zero sense. Erasmus proves his own mother is a stone, and everyone else can see he’s an idiot.

These examples are funny because they expose how natural language breaks when you pretend it’s formal logic—or when you swap meanings mid-argument and hope nobody notices.
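
For anyone who wants the cheat spelled out, here is a rough first-order sketch of the flashlight joke (notation mine, not Holberg’s):

  On the quantifier reading, premise 1 says ¬∃x Brighter(x, s): no thing outshines the brightest star s.
  The joke instead treats “nothing” as an object n, reads premise 1 as Brighter(n, s) and premise 2 as Brighter(f, n) for my flashlight f, then applies transitivity to conclude Brighter(f, s).
  The slide from ¬∃x Brighter(x, s) to Brighter(n, s) is the whole trick: a quantifier smuggled in as a name.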

Unfortunately, that same failure mode now shows up in places where the stakes are much higher.

From logic jokes to language landmines

Consider an example I saw in a YouTube clip from Linus Tech Tips. The phrase “hard R” came up. Confusion followed.

One person heard “hard R” as a reference to a racial slur.
Another understood it as an older, now-offensive slang term for intellectual disability.
I understood it phonetically.

In linguistics, “hard R” can reasonably be taken to mean a fully articulated rhotic, as opposed to a weakened, vocalized, or dropped R. In Norwegian terms, that distinction is immediately intuitive:

  • A soft R like the R in Lars, lightly articulated and non-rolling
  • Versus the hard, trilled or tapped R common across Scandinavia and parts of the German-speaking world — the “machine-gun” R of brrrr, det er kaldt (“brrrr, it’s cold”)
  • Or a word like rømmegrøt (sour-cream porridge), where the rolling R is unmistakable to any Nordic ear

English largely lacks this contrast, which is why the confusion arises. Outside of Scottish English (and to some extent Irish and Welsh accents), English Rs are typically approximants ([ɹ]) rather than trills ([r]). The only truly universal English example of a rolled R is the onomatopoeic brrrr — precisely because it is imitating a sound, not using it phonemically.

From a phonetic standpoint, then, my interpretation was neither exotic nor unreasonable. It was simply outnumbered by newer, taboo-driven meanings that have crowded out the technical one.

All three interpretations exist. None are invented. Yet only one is socially survivable in 2025—and that meaning has effectively vaporized the others.

This is how language breaks: not through malice, but through taboo gravity. Once a term becomes associated with a high-voltage offense, all other meanings get sucked into the blast radius.

Intent stops mattering. Context stops mattering. Precision becomes collateral damage.

The euphemism treadmill at full speed

Which brings us to the big one.

Yes, the fully pronounced n-word is offensive regardless of speaker. No serious person disputes that.
Yes, the softened variant (“nigga”) functions as an in-group term among many black speakers—sometimes affectionate, sometimes not. That’s about social license, not semantics.
And yes, saying “the n-word” is an avoidance strategy: a way to reference the concept without performing the speech act.

But let’s not pretend this is philosophically clean.

Saying “the n-word” still puts the word in the listener’s head. The speaker avoids uttering it, but the listener does the reconstruction internally. That matters socially—but it doesn’t magically erase meaning. It just relocates responsibility.

This is not a moral failing. It’s how euphemism works. But we should at least be honest about it.


“Words = violence” and other category errors

When people object to the idea that “words are violence,” they are not denying that language can harm. They are objecting to category collapse.

Originally, this framing came from speech-act theory: the idea that some speech participates in coercive systems. Fair enough.

What we have now is inflation:

Structural harm → emotional harm → offense → violence

Once everything is violence, the word loses analytical value. Worse, it encourages intellectual laziness. Referential use, descriptive use, quotation, insult, and incitement are treated as morally identical acts.

They are not.

Calling all of it “violence” is the moral equivalent of proving Mor Nille is a stone. The syllogism may feel righteous, but it doesn’t describe reality.

Semantic drift, imported taboos, and Norwegian absurdities

This gets especially messy outside the Anglosphere.

When I grew up, the Norwegian word “neger” was a neutral descriptor: someone of sub-Saharan African descent. It was not an insult. There was nothing remotely pejorative about it.

Today it is treated as almost as bad as the English slur—not because of Norwegian usage, but because English-language racial history has been retroactively imposed on cognates.

That’s not linguistic inevitability. It’s cultural hegemony.

The same contradiction appears with “svart”. “Black” is acceptable in English. Svart—its literal translation—is often treated as suspect. The problem is not semantics; it’s association.

So we reach for “mørkhudet”—“dark-skinned”—which is so broad it can refer to half the planet. Precision is sacrificed on the altar of safety, and everyone pretends this is progress.

It isn’t. It’s euphemism treadmill failure.

Cleaning up the past by breaking it

The final insult is retroactive sanitization.

Editing Mark Twain (the n-word swapped for “slave” in Huckleberry Finn).
Rewriting Astrid Lindgren (Pippi’s dad is no longer a “Negerkonge”).
Removing words instead of explaining them.

This treats readers as incapable of understanding historical context and replaces education with erasure. Contextualization is responsible. Alteration is not.

Preserving the record is not endorsement. It’s honesty.

What actually went wrong

The problem is not empathy.
The problem is not language evolving.

The problem is conceptual sloppiness masquerading as moral clarity.

Ethics has swallowed semantics whole, and meaning is paying the price. When referential speech is treated as violence, confusion is inevitable. When taboo replaces explanation, misunderstanding is guaranteed.

Holberg understood this three hundred years ago. Bad logic doesn’t just lead to wrong conclusions—it makes smart people say ridiculous things with total confidence.

And once that happens, it’s not Mor Nille who turns into a stone.
It’s the language itself.

The Old Geezer Rises (Again): A Brief Note from the Undead

Yes, it’s true. Reports of my digital demise were not greatly exaggerated—they were spot-on. OGNDY.com—an uninterrupted online presence for seventeen years!—flatlined sometime in 2023 or 2024 after my sainted wife decided to retire her hosting plan without warning me. One moment my blog was ambling along like an aging rock star with tinnitus; the next it was gone, scrubbed from the servers like it had never existed.

We had a marital performance review that evening.

So here we are: the resurrection. Call it Version 2.0, The Reboot, or—if you’re feeling theological—the Second Coming of the Old Geezer. I’m not promising miracles, but I can at least get the lights back on and the amps warmed up.

What will you find here? The same thing you found before, only older, meaner, and with slightly better post-production values. Fitness rants from a man who has now learned (the hard way) what joints are for. Guitar musings from someone who long ago accepted that tone is 80 percent fingers and 20 percent swearing. Commentary on current events, politics, and culture—because if the world insists on being ridiculous, the least I can do is point at it. A little literature. A little philosophy. A dash of whatever else drops into my skull at inconvenient times.

This blog is not a brand. It’s not a “journey.” It’s not content strategy. It’s me, the Original Lars, doing what I’ve always done: thinking out loud where people can read it and judge me accordingly.

So welcome back. Or welcome for the first time. The Old Geezer’s not dead yet. In fact, he’s barely getting started.