[Dev Log] The Invisible Giant: Why We Must Summon Dennis Ritchie in the AI Era

Author: Logan Kim

Subtitle: The Truth of the Machine Room Hidden Behind Python's Flashy Presentations

In October 2011, the IT industry lost two giants within a week of each other. One was Apple’s Steve Jobs, and the other was Dennis Ritchie, the creator of the C programming language and Unix.

The world enthusiastically mourned Jobs, who was armed with innovative UIs and spectacular presentations, while Ritchie’s passing went relatively unnoticed. But from a cold, system-architect perspective, the core of the iPhone (iOS, based on Unix) and the countless apps (Objective-C/C) that Jobs proudly held up on stage were nothing more than "applications" running on the bedrock of hardware mastery paved by Dennis Ritchie.

Looking at the ecosystem of languages dominating the AI era today, I am once again struck by the intense contrast between these two giants.

1. Python is Steve Jobs

In the AI revolution sparked by machine learning, the most familiar language is undoubtedly Python. With the influx of intuitive and brilliant frameworks like PyTorch and Keras, Python has reigned as the undisputed #1 programming language for the past few years. It has become the "full-stack magic wand" that junior developers first reach for when entering the field.

But is this Python's true performance?

Python is a fantastic language, but in itself, it is merely "Steve Jobs' presentation" (the interface) designed to manipulate heavy systems with ease.

In reality, the true "backend muscle" calculating billions of parameters and executing tensor operations is written in native languages like C, C++, and CUDA. These native languages, much like Dennis Ritchie, silently descend into the hardware's machine room to squeeze 100% of the potential out of CPUs and GPUs. However, because they are brutal to handle directly and lack reusability, we simply wrap them in an elegant Python shell via FFI (Foreign Function Interface) to call them.
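The mechanism is easy to demonstrate in miniature. Below is a minimal sketch of the FFI idea using Python's standard `ctypes` module to call `sqrt` straight from the C math library (libm). Real frameworks like PyTorch use far heavier C++ binding layers, but the principle is the same: the elegant Python name on the surface, the native machine room underneath.

```python
import ctypes
import ctypes.util

# Locate and load the C math library (libm); the lookup is platform-dependent.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature of sqrt so ctypes marshals doubles correctly.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

root = libm.sqrt(2.0)
print(root)
```

Without the `argtypes`/`restype` declarations, ctypes would happily pass the wrong byte layout across the boundary, which is exactly the kind of brutality the Python shell normally hides from you.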

Ultimately, the more the world obsesses over AI, the exponentially higher the value of native languages that directly control hardware at the bottom becomes. The rapid rise of Rust—armed with memory safety as a powerful rival to C—aligns perfectly with this trajectory.

2. Game Engines Wasting Resources Under the Guise of Glamour

This is exactly why I obsessively dig into native languages and FFI in game development. When you look deep into the C#-based Unity ecosystem, it is all too common to find structures that ruthlessly waste hardware resources, intoxicated by the high-level "glamour" the language provides.

One of the core target platforms for my game is the Steam Deck. From the perspective of an indie developer and entrepreneur, the explosively growing hand-held UMPC market is a strategic stronghold that must be captured.

However, the Steam Deck is a device with severe resource constraints, entirely different from a desktop environment. The absolute specs of the CPU or RAM aren't the real issue; the true enemies are battery life and thermal throttling. Holding a steady 720p at 60 frames per second on limited battery power, without a charging cable, is a mission you can never accomplish by "vibe coding" and leaning on the engine's abstractions.
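To make the constraint concrete: a 60 fps target leaves roughly 16.7 ms per frame, total, for everything. The back-of-the-envelope budget below uses purely illustrative per-system costs (the numbers are my own assumptions, not measurements), but it shows how little headroom survives:

```python
TARGET_FPS = 60
frame_budget_ms = 1000 / TARGET_FPS  # ~16.67 ms per frame at 60 fps

# Hypothetical per-frame costs in milliseconds; illustrative numbers only.
costs_ms = {"game logic": 6.0, "render prep": 5.0, "network": 3.0}

headroom_ms = frame_budget_ms - sum(costs_ms.values())
print(f"budget {frame_budget_ms:.2f} ms, headroom {headroom_ms:.2f} ms")
```

A couple of milliseconds of headroom is all that stands between you and a dropped frame, and on a thermally throttled APU that budget shrinks further mid-session.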

3. Coroutines: The Single-Threaded Tragedy Devouring Batteries

Take Unity’s coroutines, which developers habitually spam, as an example. This syntax masquerades as asynchronous logic, but it is actually pseudo-asynchronous: every coroutine is resumed on the main thread, one after another, dragging it along by the collar. If you cram game logic, rendering preparation, and network packet processing all into that single main thread, a bottleneck is inevitable.
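Unity's coroutines are C#, but the same single-threaded cooperative pattern can be sketched in Python with generators. In the toy scheduler below (my own illustration, not Unity's internals), four "coroutines" each do 5 ms of blocking work per frame; because they all share one thread, the frame time is the sum of their costs, not the maximum:

```python
import time

def fake_coroutine(work_ms):
    # Mimics a yield-per-frame coroutine: it yields control each "frame",
    # but its work still runs on the single scheduler thread.
    while True:
        time.sleep(work_ms / 1000)  # simulated blocking per-frame work
        yield

def run_frames(coroutines, frames):
    # Cooperative round-robin on ONE thread: total time per frame is the
    # SUM of every coroutine's work, never the max.
    for _ in range(frames):
        for co in coroutines:
            next(co)

start = time.perf_counter()
run_frames([fake_coroutine(5) for _ in range(4)], frames=10)
elapsed = time.perf_counter() - start
print(f"{elapsed:.3f} s")  # ~4 coroutines x 5 ms x 10 frames = 0.2 s minimum
```

Four coroutines at 5 ms each already eat 20 ms per frame, blowing the 60 fps budget before rendering has done anything at all.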

On the Steam Deck’s multi-core APU, if the main thread is pinned at 100%, the remaining cores sit idle, waiting for work. This lopsided utilization makes the Steam Deck’s cooling fans roar like jet engines and drains the battery at an absurd rate. Frame drops are just a bonus.

4. Ultimately, the Battle is Decided by 'First Principles'

We live in an era where modern PCs boast dozens of gigabytes of RAM, and even VRAM is overflowing. There is no need to perform insane bit-masking and pointer acrobatics to cram a world into a 40KB ROM cartridge like the developers of Super Mario Bros. 1.

But just because resources are abundant doesn't mean we should tolerate structural waste. Doing less computation, using less battery, and pushing out higher frame rates on the exact same hardware: we casually call this "optimization," but from an architect's perspective it is not mere tuning; it is a matter of fundamentals.

If Python or C# are flashy "skills" that solve problems quickly, then C, Rust, and a deep understanding of memory architecture are the "fundamentals" that will never collapse under any environment. Why do I deliberately step outside the comfortable object-oriented bounds of Unity and descend to the bare metal via FFI? To follow the First Principles of the "invisible giants" who truly ruled the system from behind the flashy presentations.