
January 2015

The repeated deaths of OOP

Since its inception in the late sixties, Object Oriented Programming has pervaded our industry. Current mainstream programming languages all support OOP, except maybe C, and it is no longer that mainstream. Most programmers I have met in person tend to divide the world thus:

- the procedural past: C, Pascal…
- the object oriented present: C++, Java, C#, Python…
- the functional fringe: Lisp, Haskell…

From the look of it, OOP is here to stay.

On the other hand, despite having gone through serious mutations, OOP is still strongly criticised by knowledgeable people (including yours truly). Worse, the game industry seems to be moving away from it. Video games are simulations, a core niche of OOP. That’s what Simula was named after. If it’s not the best tool for even that job…

What’s OOP, again?

Oh. Right. I’m saying “OOP” all over the place, and I don’t even know what it means. Thankfully, Deborah J. Armstrong made a survey of how people use the term (The Quarks of Object Oriented Development), and came up with 8 fundamental building blocks.

Five are structural:

- Object: an individual, identifiable item, real or abstract, which holds data and the operations on that data.
- Class: a description of the organisation and behaviour shared by one or more similar objects.
- Abstraction: keeping only the distinctions that matter for the problem at hand.
- Encapsulation: restricting access to an object's data and behaviour to a limited set of messages.
- Inheritance: using the data and behaviour of one class as the basis for another.

Three are behavioural:

- Message passing: an object sending data to another object, or asking it to invoke a method.
- Method: a way to access, set, or manipulate an object's information.
- Polymorphism: different classes may respond to the same message, each in its own appropriate way.

(These descriptions are adapted from Armstrong's taxonomy.)

Now I understand why nobody agrees on what OO means: few object oriented languages support all of the above, and the other languages all support some of the above. Consider:

- Java rejects multiple inheritance, and its primitive types are not objects.
- C++ methods are statically dispatched unless declared virtual; there is no real message passing.
- JavaScript has no classes: objects are built from prototypes.
- Haskell is not object oriented at all, yet provides abstraction, encapsulation, and polymorphism.

Looking at the big picture, I can see that object orientation is not a paradigm, but a set of mechanics we can cherry pick. Yet we continue to treat it as if it were a unified concept.

I think this is why "OOP" survived this long. Without a precise definition everyone can agree on, its meaning keeps mutating beyond recognition as we change our programming practices.

And so, OOP keeps dying.

(Note: the following “deaths” are only in rough chronological order, and often describe a slow decline in popularity, not a brutal cessation of activity. Think of COBOL. The language is long dead, yet still maintained to this day.)

The first death

I wasn’t there, so I don’t know much. But it is easily summed up:

Actually I made up the term “object-oriented”, and I can tell you I did not have C++ in mind.

Alan Kay

For a time, “Object Oriented” mostly meant Smalltalk. We speak of Simula as the first object oriented language, but Alan Kay invented the term, and Smalltalk, though inspired by Simula, was quite different. It had some Lisp roots, and everything was an object, including classes and messages.

Then C++ effectively ported Simula to a C environment. In the process, it emphasised short term efficiency, and stopped many things (primitive types, classes, methods…) from being objects. Classes were mostly abstract data types with inheritance and polymorphism. Many objects were directly manipulable from the outside, through public data members or getters and setters.

Then Java came. It was presented as an “easier C++”. Same syntax, similar abstractions, garbage collection… At that point Smalltalk was hardly used any more, and “OOP” basically meant Java and C++.

This was the first death.

The second death

When I was first taught OOP, around 2004, Java was already an established language. Despite having been published 10 years before I was taught Java, the Design Patterns book had yet to take hold among the teachers. The older ones seemed stuck in the inheritance hierarchy mindset. I had my share of animals, shapes, and everyday possessions. The vision I got out of it was simple: classes model real-world entities, inheritance hierarchies organise them, and each class is responsible for everything that happens to whatever it models.

There wasn’t much time for subtlety back then. We had a language to learn, a project to complete, and of course the rest of the curriculum. One teacher stood out however:

Wait a minute. Inheritance is OOP’s flagship. And we should stop using it? Fragile base class? Okay, you win. Inheritance is Evil, and the other teacher is Stupid —no time for subtlety, remember? The argument was compelling, though.

The change was drastic. Before, we had big inheritance hierarchies, classes that mostly modelled real-world entities, and a separation of concerns that amounted to "the Dog class is responsible for everything that happens to dogs". Now we favour composition, and separate the concerns, not just the entities. The Model View Controller pattern, despite being older than Java, is a good example: it separates the display and the interaction of even a single entity. We don't just have a dog; we have a model of the dog, an image of the dog, a clickable area of the dog…
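
To make the separation concrete, here is a minimal sketch in Java (the Dog classes are made up for illustration):

    // Hypothetical MVC split of a single "dog" entity.
    // The model knows nothing about pixels; the view knows nothing about clicks.
    class DogModel {                          // state and rules only
        private double x, y;
        void moveTo(double newX, double newY) { x = newX; y = newY; }
        double x() { return x; }
        double y() { return y; }
    }

    class DogView {                           // display only
        void draw(DogModel dog) {
            System.out.printf("drawing a dog at (%.1f, %.1f)%n", dog.x(), dog.y());
        }
    }

    class DogController {                     // interaction only
        private final DogModel model;
        DogController(DogModel model) { this.model = model; }
        void onClick(double x, double y) { model.moveTo(x, y); }
    }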

This was the second death. Java's generics soon nailed the coffin shut: with typed collections, we hardly needed Object any more.

The third death

Even with its second death, “OOP” still mostly meant Java and C++. (And C#.) The practices changed a lot; the languages, not so much.

Meanwhile, scripting languages were on the rise. Python, Ruby, JavaScript… None are statically typed, and JavaScript doesn’t even have classes. “OOP” just got a lot more inclusive. And the code and practices you would find in those languages were quite different too.

Test Driven Development, for instance, was invented (Update: maybe rediscovered) by users of these dynamic languages. Most probably to compensate for the weaknesses of dynamic type systems, by the way: without the host of proofs a static type system gives you for free, the need to check your assumptions is dire. Tests can provide those checks.

Now “OOP” is too diluted to mean anything any more. Being “object oriented” tells me so little about a language that I might as well not bother.

This was the third death.

The last death

As meaningless as it has become, “OOP” is still around. The practices have changed beyond recognition, but the name somehow survived.

This time it might not.

There are several reasons for this. Bjarne Stroustrup himself says C++ is not object oriented. Lambdas are everywhere, including in C++ and Java. Parallelism calls for a better way to manage state, which OOP doesn't currently provide. Some core niches of OOP (GUI, simulation) are starting to get attractive alternative solutions. And again, the game industry is slowly moving away from OOP.

Lambdas and closures

Two fancy words for a simple concept. A lambda is just the literal description of a function, the same way 42 is the literal description of a number. With lambdas, you are no longer required to name every single function. A closure is an implementation strategy: basically, an object that bundles a piece of code with a piece of data (the variables it captures). Together, lambdas and closures can make functions "first class": they can be passed around and manipulated just like integers.
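
A Java 8 sketch of the idea (made up, but complete): the lambda below is a function literal, and the anonymous object next to it is, give or take compiler details, what it stands for. Both capture the local variable offset.

    import java.util.function.IntUnaryOperator;

    public class Closures {
        public static void main(String[] args) {
            int offset = 42;  // captured: this makes the functions below closures

            // A function literal, the way 42 is an integer literal:
            IntUnaryOperator shift = x -> x + offset;

            // The same thing, spelled as an object holding code and data:
            IntUnaryOperator shiftObject = new IntUnaryOperator() {
                @Override public int applyAsInt(int x) { return x + offset; }
            };

            System.out.println(shift.applyAsInt(1));        // 43
            System.out.println(shiftObject.applyAsInt(1));  // 43
        }
    }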

At a fundamental level, closures and objects are very close. Yet from a popular perspective, first class functions are not OO at all: they come from functional programming. Still, they managed to invade every single mainstream OOP language.

This is a severe blow to OOP. If the closure/object duality didn’t prevent closures from invading even Java, we can be sure influential people are acknowledging that OOP is not always appropriate. For real. It’s easy to pay lip service to the “best tool for the job”, yet decide that OOP is the best tool for whatever we are doing. This time, they went further.

This may not be enough to kill "OOP", though. Lambdas and closures could be co-opted, spoken of as if they had been object oriented all along. I've seen it happen with parametric polymorphism: it suddenly got a new name, "generics", and even became an important characteristic of OOP, depending on who you asked.

The multicore revolution

Typical OOP programs are also imperative programs: lots of mutable state, though it tends to be hidden under the encapsulation carpet. In a multithreaded context, this quickly becomes a nightmare. Your object can now be interrupted at any time, and asked to run concurrent writes on itself. Hence the importance of thread safety, and the proper management of locks…

There is an easier way: don’t mutate state. Always construct new stuff, never throw away anything. Then let your smart pointers or your garbage collector take care of unreachable references.

That gets rid of most of our concurrency related problems. Sure, you do need side effects at some point, but they can always be isolated to small portions of your program.
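
A minimal Java sketch of the style, with a made-up Account class: every "update" builds a new object, so threads can share accounts without a single lock.

    // An immutable value: nothing is ever overwritten.
    final class Account {
        private final String owner;
        private final long cents;

        Account(String owner, long cents) {
            this.owner = owner;
            this.cents = cents;
        }

        // "Updates" return fresh objects; old versions stay valid,
        // become unreachable, and the garbage collector reclaims them.
        Account deposit(long amount) {
            return new Account(owner, cents + amount);
        }

        long cents() { return cents; }
    }

No thread can ever observe a half-written Account, so the locks, and the bugs that come with them, simply disappear.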

On the other hand, forbidding mutable state and other such side effects without changing anything else is crippling. You need a whole new bag of tricks. Persistent data structures, so you don't have to copy the state of your entire application for every little update. Standard higher order functions such as map, fold, and filter, so you have an alternative to most of your for and while loops without resorting to the recursive nuclear option. Algebraic data types, so you can have functions that fail without resorting to harder-to-tame exceptions. Monads, so you can chain such computations without writing annoying cascades of if-else expressions…
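
Java 8 already ships a corner of that bag: streams provide map and filter, and Optional plays the part of a small algebraic data type for computations that may fail. A made-up example:

    import java.util.Arrays;
    import java.util.List;
    import java.util.Optional;

    public class Toolkit {
        public static void main(String[] args) {
            List<String> words = Arrays.asList("object", "oriented", "programming");

            // map + filter instead of a for loop with an accumulator:
            long longWords = words.stream()
                                  .map(String::length)
                                  .filter(n -> n > 6)
                                  .count();
            System.out.println(longWords);  // 2

            // Optional instead of a cascade of null checks:
            Optional<String> first = words.stream()
                                          .filter(w -> w.startsWith("o"))
                                          .findFirst();
            System.out.println(first.map(String::toUpperCase).orElse("none"));  // OBJECT
        }
    }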

You can see where this is going: without mutable state, the pull towards functional programming is hard to resist. It will take time, but I think OOP cannot survive this. Not even actor models will help. While actors are very similar to objects, they don’t brand themselves as such. OOP may live on, but “OOP” will die.

Entity systems

Up until now, we could have said that OOP wasn’t so bad. Sure, it gained new features, and faced new obstacles, but that doesn’t invalidate the core idea. To do that, you would need to beat OOP at its own game. Guess what, entity systems are doing just that. They enjoy mainstream awareness in the game industry, and I expect they will eventually invade GUI programming in some form.

Being so utterly outclassed wouldn’t just mean the conditions have changed. It would mark OOP as a mistake. Maybe not one we could have avoided: we had to try. But a mistake nonetheless.

Back when inheritance hierarchies were all the rage, most games were designed around a big taxonomy of game objects. Like this:

             Object
              /  \               Small, incomplete,
             /    \              and totally made up
        Static    Moving         inheritance hierarchy
         /  \      /  \
        /    \    /    \
  Clutter   Loot Platform Enemy

The main problem with this design is its inflexibility. What about loot that moves? Invisible enemies? And so on. We could of course modify the hierarchy as we tinker with the design of the game, but the changes involved are heavy. Game development can't afford that: deadlines are tight, yet experiments and backtracking are needed to make a game better.

We need something more flexible. One solution is to treat game objects as the accumulation of components. That way, you can easily have static loot and dynamic loot. They are both rendered and pickable, but only one of them is animated.

The question is, how do you add those components together? One solution is to use mixins. This gives us some flexibility back, but we can go even further. See, mixins still tie the data of a component to the code that processes it. But when you think about it, some data, like the position of an object, is useful to several systems (rendering, collision detection, AI…).
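
In Java, interfaces with default methods can play the part of mixins. A made-up sketch; notice how each mixin welds its processing code to the data accessors it needs, while both lean on the same position:

    interface HasPosition {
        float x();
        float y();
    }

    // Rendering code, welded to the data it processes:
    interface Renderable extends HasPosition {
        String sprite();
        default void render() {
            System.out.printf("draw %s at (%.0f, %.0f)%n", sprite(), x(), y());
        }
    }

    // Collision code, welded to its own data, yet also needing the position:
    interface Collidable extends HasPosition {
        float radius();
        default boolean collides(Collidable other) {
            float dx = x() - other.x(), dy = y() - other.y();
            float r  = radius() + other.radius();
            return dx * dx + dy * dy < r * r;
        }
    }

    // A game object accumulates components by implementing mixins:
    class Loot implements Renderable, Collidable {
        public float x()       { return 3; }
        public float y()       { return 4; }
        public float radius()  { return 1; }
        public String sprite() { return "coin"; }
    }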

The obvious solution is to separate code and data. Let the components be mere blobs of data, and have separate processors map over them. This gives you a database-like system: your game data is basically a giant table, with a column per type of component, and a row per game object. The processors then query the relevant columns.
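
A toy sketch of that table in Java (everything made up): each map is a column, entity IDs are the rows, and every processor touches only the columns it queries.

    import java.util.HashMap;
    import java.util.Map;

    public class EntitySystem {
        // Components are mere blobs of data:
        static class Position { float x, y;   Position(float x, float y)   { this.x = x;  this.y = y;  } }
        static class Velocity { float dx, dy; Velocity(float dx, float dy) { this.dx = dx; this.dy = dy; } }
        static class Sprite   { String name;  Sprite(String name)          { this.name = name; } }

        // One column per component type, one row (entity ID) per game object:
        static Map<Integer, Position> positions  = new HashMap<>();
        static Map<Integer, Velocity> velocities = new HashMap<>();
        static Map<Integer, Sprite>   sprites    = new HashMap<>();

        // A processor; assumes every entity with a Velocity also has a Position:
        static void physics(float dt) {
            velocities.forEach((id, v) -> {
                Position p = positions.get(id);
                p.x += v.dx * dt;
                p.y += v.dy * dt;
            });
        }

        static void render() {
            sprites.forEach((id, s) -> {
                Position p = positions.get(id);
                System.out.printf("draw %s at (%.1f, %.1f)%n", s.name, p.x, p.y);
            });
        }

        public static void main(String[] args) {
            positions.put(1, new Position(0, 0));   // entity 1: static loot
            sprites.put(1, new Sprite("coin"));

            positions.put(2, new Position(5, 5));   // entity 2: a moving enemy
            velocities.put(2, new Velocity(1, 0));
            sprites.put(2, new Sprite("goblin"));

            physics(0.1f);
            render();
        }
    }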

This separation of data and code has a number of advantages. For instance, it makes it easy to separate performance sensitive processors from the rest. That lets you write much of your game logic in a scripting language, and treat it as ordinary data —like textures and meshes. Your game is now easier to modify before you release it, and easier to mod after.

Enough with the ECS advocacy however. My point is, this architecture cannot be mistaken for an OOP flavour. It separates code and data, and doesn’t mesh well with objects. Its growing popularity can only mean OOP is being driven out of one of its two core niches. If OOP is not the best tool for the jobs it was designed for, is it the best tool for anything at all?

You may think OOP can still be saved by Graphical User Interfaces, but I wouldn’t count on it: entity systems can be adapted to that context, and there are other promising solutions.

And so, OOP may die. For good, this time.