Flaws of Object Oriented Modeling

By Asaf Shelly (32 posts) on August 22, 2008 at 12:50 pm

At the beginning of the computer era, system designers came from the world of hardware, and it shows. In hardware there are many working elements that can operate in parallel, often at different rates of operation. This requires a high degree of accuracy in system timing. Chip designers count the number of transistors between two elements to make sure that the Operation Flow is maintained.

The Assembly language defines a set of primitive / native operations. Programming and software design in Assembly correspond directly to the execution flow. If you want the flow to break, you use the Jump operation explicitly. This is because Assembly was originally designed by hardware developers and was written to accommodate the hardware.

The C language is a procedural language. It was originally created by Assembly programmers (as 'B'). We can still see Assembly-style thinking built into the language, for example:
++ is INC
-- is DEC
[A] ? [B] : [C] roughly expands to:
    [A]       ;// evaluate condition [A]
    JZ else   ;// if false (zero), jump to [C]
    [B]       ;// do operation [B]
    JMP end   ;// skip over [C]
else:
    [C]       ;// else do [C]
end:
When you are used to working with Assembly, you get used to thinking in "test", "do this if so", "do this if not". C programmers hardly ever think this way.
As a procedural language, C helps us group sets of operations together and also frees us from the need to use Jumps or Go-Tos. These can make execution flow very simple to track when used correctly, but they are easily abused into what was coined "Spaghetti Code", after all the lines you need to draw when you try to trace the execution flow of a badly designed application.

Next in the evolution came C++, which is Object Oriented by design. It allows code modules to be separated into discrete software units called classes. Object Oriented programming lets multiple teams of developers work on the same project very easily. Object Oriented languages can really help the developer manage the code.

The problem that came with Object Oriented programming is that these languages are really designed to help the developer manage the code…
Now it is almost impossible to follow the execution flow. When you want to know what happened in your application when an error message popped up, you have to go over all the objects and all the classes that the execution flow passed through, and many times the flow travels between several member functions before leaving for another object.

Many times I have seen 'Pure' Object Oriented design produce code that is a collection of many three-line functions calling each other. Every button click in the application travels through ten or more small functions. You can no longer follow the execution flow by reading the code. This brings two major problems that we face today.

The first problem is that it is no longer possible to detect execution-flow bugs with a simple code review. Going over Assembly code, it is very easy to detect simple bugs such as down-casting, potential overflows, etc. Reading object-oriented code, you can't see the big picture, and it is often impossible to review all the small functions that call the one function you modified.

The second problem with this model is the "Not my fault" syndrome. "I only called a member of another object and it returned FALSE. Don't ask me why." This is how you get an error that says "Problem saving the document. Reason: '0x8000745 – unknown'. What would you like to do?" What do YOU think I should do?! The programmer got this return value from some object he is not familiar with, has no idea what the value means, and so just pushes it up to the higher level. The last level you can propagate to is the user, and so my mother keeps facing these interesting decisions when she is trying to save a picture.

Object Oriented Modeling was invented to help developers manage the code but had no regard for execution flow. Until now we used step-by-step debugging to see the execution flow. This is no longer relevant when we plan on having multiple threads going over our code: you single-step one thread while another completes 5,000 loops in the background. If multiple threads go over the same function, they might all stop on the same breakpoint, and you have no effective way of telling which is which.

Following execution flow today is a terrible problem.

This is the first in a collection of articles that will introduce a new model, called the Operation View Model, along with the motivation for using it. This model can define any element in the computer world, and it is the next step in software evolution.
The next article will demonstrate how operating systems follow an evolution pattern similar to the one described here.

Categories: Parallel Programming

Comments (27)

August 22, 2008 2:02 PM PDT

Clay Breshears (Intel)
As I was reading this I was saying to myself "He's right, you know." I can barely stand to read OO programs. I invariably end up jumping back and forth from file to file in order to trace the logic. If I didn't have Visual Studio or some other IDE to point out where things are defined, I'd give up completely. I can see how a more operation- or task-centric model would make things easier to parallelize.

(Of course, I only masquerade as a C++ programmer in seedy wharf-side bars, so take my opinions with a teaspoonful of salt.)
August 23, 2008 11:28 PM PDT


Steven Harari
It's time for the OO evangelists to wake up & smell the coffee.

Not everything is an object.

Sure objects are great for more than just managing code - it's the perfect approach for merging algorithms with data; not to mention the benefits of polymorphism. However, if you take a good look at C++, you'll see that it's more than just an OO language. A true multi-paradigm C++ program combines preprocessor macros, functional decomposition, task oriented programming, generic programming, etc and of course OO.

In order to develop truly robust software, you need to use all the tools at hand. If all you use is an OO-hammer, everything is going to look like a nail-object.
August 25, 2008 8:33 AM PDT


Emmanuel Stapf
I think that you used the wrong O-O language and tools. Eiffel, which was created more than 20 years ago, does not have the flaws mentioned above. Why? Simply because Eiffel is a language and a method for building large, reliable and reusable software. The key aspect of Eiffel is `Design By Contract': a software system is viewed as a set of communicating components whose interaction is based on precisely defined specifications of the mutual obligations -- contracts (as business people do with business contracts.) More details at http://www.eiffel.com/developers/design_by_contract.html

And if you have the right tool then following code execution is really not a problem. An Eiffel programmer can usually manage between 500K and 1M lines of code by himself.

In any case I'm curious to read more about your Operation View Model and how we could apply it.
August 25, 2008 11:58 PM PDT


Yanic
The whole idea of OO is to compartmentalize behaviour into coherent units (i.e. concept oriented), and as a side effect the behaviour for a single 'use case' is spread over many different classes. A different approach would be to group behaviour according to use cases (i.e. task oriented), but that has other problems, since tasks are less stable than concepts in a problem domain and behaviour needs to be reused across use cases.

I believe however that the problem lies not with OO approaches but dismal tool support.

And btw, Eiffel does nothing to solve the problem of control flow passing through many objects to achieve a single task.
August 27, 2008 9:39 PM PDT


Emmanuel Stapf
Yanic, in my post I was saying that "And if you have the right tool then following code execution is really not a problem." So indeed Eiffel as a language does not bring you this, but EiffelStudio code browsing facilities let you grasp code path execution very quickly. No other tool on the market for C++, Java or C# is able to do better than EiffelStudio.
August 27, 2008 10:36 PM PDT

Asaf Shelly
Hi Emmanuel, Can you refer to specific abilities of the IDE? (email is also an option). Thanks, Asaf
September 18, 2008 8:05 AM PDT


Emmanuel Stapf
Basically all elements of the code can be browsed to their actual definition and for each element of code (routines or classes) you can also figure out who is using it as well as what it is using. With this kind of information on hand it is very easy to grasp a large system. For more info, check http://www.eiffel.com and http://docs.eiffel.com
November 3, 2008 6:10 AM PST


Mark Lee Smith
You're not wrong... for the overly simplistic understanding of object-oriented programming you present here. In reality object-oriented programming is a very broad subject with many interpretations: there are boundless approaches towards every aspect of object-oriented programming!

The problems you touched on here also exist in procedural programming with functional decomposition and are not inherently problems with object-oriented programming.

Some approaches to object-oriented programming have very good concurrency semantics; some use powerful dispatch mechanisms and don't force the programmer to use conditional expressions in place of polymorphism; some don't force programmers to use a flawed inheritance mechanism! The fact that mainstream object-oriented programming languages do it wrong doesn't mean that object-oriented programming itself is flawed.
November 3, 2008 10:09 AM PST


Ralf
This is one of the most stupid blogs I have read so far.

What is your solution? Functional programming - for example - will not solve the "problem" of non-linear execution flow!
Remove all function calls and inline everything in one big function? In a program consisting of more than a million lines?

One of the best possible solutions to manage complexity is to divide it: build components and modules.

How do you think a car is engineered? A plane? A spaceship? Your computer?

Clay Breshears (Intel):
What's wrong with using IDEs? If you think Notepad is the right tool to program with - do you think Paint is the right tool for professional design?

IMO programming productivity will increase with even more sophisticated IDEs and source code management tools, just as it is increasing with more powerful languages. At some point, using notepad/vim/emacs for programming will be like carving stones instead of using a word processor.

I don't understand people who insist on using plain editors because they think that's l33t - instead of looking for the most powerful tools that help them achieve a goal.

Do you really work at Intel?
November 3, 2008 10:50 AM PST


Rory
You can (and probably should) use unit tests to isolate and test execution in a single class.

Crap user error messages are not the tell-tale sign of object-oriented software; they're the tell-tale sign of crap software.
November 3, 2008 11:37 AM PST


Tad
Ralf, I take issue with your Emacs/Vim hating, if only for the fact that you lump them in with Notepad, the god-awful demon-child of Redmond. Obviously, you've never taken the time to master (and customize!) either one; otherwise, you'd realize that point-and-click IDEs don't stand a chance compared to an interactive, self-documenting and self-extensible runtime.

And calling Emacs an 'editor'? There's a reason people choose to check their email, browse the web, step through their source code, and launch nuclear missile strikes without leaving the safety of Emacs, and that's because it's an operating system, not an editor.

You owe me 10 Hail Richards for your blasphemy.
November 3, 2008 12:01 PM PST


Ralf
* Ralf transfers 10 Hail Richards to Tad *

I felt a little bit trollish and knew some people would get a little bit upset if I mentioned Notepad together with vim and Emacs. ;-)

Well, I have used all 3 of them. OK, OK, Emacs is slightly more powerful than Notepad...

Anyway, it's always funny to watch some hardcore Linux/Unix coworkers navigating the source with their stone-age tools.

It's not about point and click. It's about automatically showing the source of the function under the cursor in another window (Source Insight). It's about automatically showing all the places where the function under the cursor is referenced - in another window (Source Insight). It's about an integrated debugger (Visual Studio). It's about refactoring...

Well, I think the current IDEs are not the end of the story. I predict a time when source code is no longer stored in text files. Instead, all functions / classes / etc. are stored in a database as an AST. There will be the possibility to show and edit the source in your preferred style: for example curly braces, Python indent, Pascal begin-end, etc.
There will be an integrated revision control system in the database. You will "assemble" your "source views" on the fly to have all the functions for a use case or an execution flow in one editor window. There will be the possibility to expand a function call under the cursor: the IDE will show the source code of the function inlined in the editor with a different background color. Imagine starting from the main function and then opening/expanding the relevant functions in the same editor window. You can edit these expanded functions. Debugging will be similar: there is no need to jump between different source files if the debugger can display the functions you step into in one source window and automatically close them when you step out. There are many more possibilities if you get rid of flat text files.

But we are not there yet. Meanwhile I try to use the best of what I can get.
November 3, 2008 12:12 PM PST


gil
I agree with this to a point. Sometimes flow control is very important, sometimes it is not.

I write videogames. Often we have many objects that move about in a physics modeled world. These are best handled as objects.

But you also have ui control with dialog boxes, pause the game states, switching from level to level, showing reward moments in sequence: these things can be difficult to time right if done as objects.

Thus I like to have a scripting language that can sleep and wait for things to happen to manage the overall game logic, where the top level of this script code looks like this (and you can extrapolate this down a few levels into the subfunctions):

do
    answer = MainMenu()
    if answer == newGame then
        PlayGame(nil)
    elseif answer == loadSave then
        game = AskWhichSave()
        if game != nil then
            PlayGame(game)
        end
    elseif answer == quit then
        break
    end
while (true)
November 3, 2008 12:13 PM PST


gil
Damn! I lost the tabs in that post.....
November 3, 2008 1:55 PM PST


Mark Lee Smith
Tad: have you used Lisp, Smalltalk or Self? They included many (all?) of the editing features you mentioned... decades ago.
November 6, 2008 4:39 AM PST

Asaf Shelly
See a discussion about this also here:
http://www.reddit.com/r/programming/comments/7b09p/flaws_of_....._modeling/

November 6, 2008 11:06 PM PST

Asaf Shelly
See more comments here:
http://www.reddit.com/r/programming/comments/7b09p/flaws_of_.....d_modeling
May 28, 2009 9:35 PM PDT


Mark
I too work at Intel and am absolutely amused when a fellow hardware engineer / electrical engineer bags on object-oriented programming or C++. To be fair, arming a hardware engineer with C++ turns out to be a really bad, bad idea: amazing when you consider the parallels of silicon design with object-oriented decomposition. In any case, like most non-software engineers, Asaf completely misses the point of OO. Simply put, OO design exists to manage complexity. And yes, when you're working on 100K+ line programs, it's OK to defer ownership to other folks on the team...

As many mentioned above, you evade the issues mentioned by (1) architecting up front, (2) writing decent code, and (3) using ULTs to validate classes, modules, and layers. And specifically addressing the jumps / multithreading: knowing how to use the debugger often goes a long way in flushing out issues. Of course, considering multithreading in the design phase is probably the only way to get the code 100% right. That's kind of a trick one learns when suffering through an Ada class in college instead of electromagnetic theory. :)

As far as the tools comment and Intel... folks in the real world have no idea :) Getting a software design/modeling tool into Intel requires an act of Congress. (In the Windows world) if you can't get by with Visual Studio, PowerPoint, and maybe Visio, you'll quickly find yourself on the validation side of product development :). And don't get me started on the home-grown tools :).
September 18, 2009 2:31 PM PDT


Mauricio Noda
"You can no longer follow the execution flow by reading the code."

This problem is not related to the programming paradigm, but to the size of the system. The decomposition provided by procedural programming and OO made bigger systems viable. Because procedures and OO came along together with bigger systems, one might think it's the newer paradigms that are making code harder to read. Comparing a small assembler program to a large OO one is a mistake. Code a million lines of assembler and you will have a lot more trouble reviewing it than the equivalent in any high-level language, procedural or OO.


"The second problem with this model is the "Not my fault" syndrome".

It is again a problem related to the size of a system. You divide work between people if you want to finish a large project in a reasonable time. You reuse someone else's code if you want to finish it earlier. You blame someone else when the project starts going wrong. It happens in any programming language with any paradigm. It DOES happen with assembler when you use CALL/RET statements.


"If you have multiple threads going over the same function they might all stop on the same breakpoint and you have no way of telling which is which effectively."

Good parallel architectures allow you to increase or decrease the number of threads without changing the behaviour of the system, changing only the performance. So if you need to use breakpoints, just decrease the number of threads to 1, debug, then increase it again.

Actually, OO helps manage parallelism better than any paradigm before it. The main headache in parallel programming is avoiding those hard-to-debug race conditions. Race conditions occur when two or more pieces of code (instructions) try to manipulate the same shared data at the same time. Since good OO design groups data together with the instructions that manipulate it, all the instructions that can interfere with each other stay together and can be handled as one.


If there is a problem with OO, it is that it requires quite an amount of experience to be used effectively and can be disastrous in the hands of the inexperienced. Combine a lot of gotos with polymorphism, or a very tall inheritance structure, and you will have polymorphic spaghetti code that can be worse than any anti-pattern in older paradigms.
November 30, 2009 11:19 AM PST

Asaf Shelly
Hi Mark and Mauricio,

Thank you for the interesting comments and apologies for the delay.
I will answer these last to first (cache reasons :)

First of all, let me start by declaring that I am not at all a hardware developer. Here are a few lines of OO code that I have written in the past ten years:
http://www.asyncop.com/MTnPDirEnum.aspx?treeviewPath=%5bm%5d.....WinModules

I agree that OO helps manage huge amounts of code. With that, I have a big problem with Windows Media Player telling me that the video cannot be played because the file is corrupt, or the website is down, or there is no Internet connection. This is because someone was such a good OO programmer that they completely ignored return values. This is so common that exceptions are used to make programmers handle errors... another bad, bad thing that comes from OO paradigms...

My problem is not with OOP; it is with the paradigms surrounding it. The Windows NT kernel is fully OO - every driver is an object - and it is fully parallel. No one considered writing 2-to-4-line functions in a device driver.

Sometimes you cannot reduce the number of threads to 1, because then you won't find the data races. What if you have a race between function 23 in one thread's call stack and function 15 in another thread's call stack?

You don't find flow-control bugs by running the application. You find these bugs by going over the flow diagram. OOD does not contradict flow diagrams, but the methodologies used with OO today do not even mention them.

Only OOP could produce the term "random bug", meaning "a bug that happens every now and then, not sure why". These bugs are "random" because they are related to flow control, and only a few OO experts have ever mentioned flow control as part of the system design.

Regards,
Asaf
January 7, 2010 2:48 AM PST

levtraru
Maybe it is not necessary to control or care about the whole execution flow.

Some of the keys of OOP are to identify responsibilities and to delegate.

When you send a message to an object and you don't get the desired result, then either you are delegating to the wrong object or that object is not fulfilling its responsibility. In the first case you should correct the calling object to call another one; in the second case you can forget about the calling object and concentrate on correcting the called one.
Defining unit tests is very useful for achieving this.

In OOP it is really important to care about design. Most programmers learned to program in a procedural way and then try to think the same way when programming object-oriented code.

OOP is neither better nor worse than procedural programming; it is just different.

Kind regards.
January 15, 2010 10:28 AM PST


Christian Posta
Asaf,

Thanks for your post! It gives an excellent path to some vibrant discussions.

I would like to focus on the title of your post, and on why I believe the "flaw of object oriented modeling" is not a flaw at all, but rather a matter of two different approaches to programming, each best suited to different contexts.

You say, "Object Oriented Modeling was invented to help developers manage the code", but this is not entirely the case.

Modeling real-life systems has been around for hundreds of years, probably longer. Modeling can be found throughout many disciplines, including engineering, architecture, mathematics, et al. As mentioned in previous comments, with which I fully agree, modeling is an approach to problem solving that helps manage complexity in a given domain. Trying to implement a model in software can become difficult using languages that are not well suited for modeling. Object-oriented languages are a better fit for models because of their state+behavior (object) approach. Procedural languages are not as good because they're focused more on completing certain predefined tasks and not on capturing the concepts in the model. Object-oriented modeling is the implementation of a model with an object-oriented language and can reap the great benefits of modeling. To say it was invented to "help developers manage the code" is entirely simplistic and misses the point of modeling in the first place.

On the other hand, you're somewhat correct when you say modeling in software reduces the ability to follow an execution path. The reason is simply because modeling is focused on concepts and behavior, and not a predefined, task-oriented, series of steps. Therefore, the reason you put forth for a "flaw in object modeling" is no flaw at all. It's a completely separate approach for solving a problem. If the problem has to do with mathematics and physics, no doubt you would model it. If the problem has to do with a complex business domain, a model will help manage all of the complexity found in such a domain. If your problem is writing device drivers or embedded systems software, a procedural approach probably would be more appropriate.

Can you write device drivers using a modeling approach with an object-oriented language? Probably. But it might be overkill, muddy the tasks taken to perform the driver's functionality, and obscure the execution path.

Can you write complicated business logic in a purely procedural manner? Probably. But the software would end up looking like a monstrous tangle of functions without key active elements in the domain clearly conceptualized, and maintenance would be a nightmare as a better understanding of the domain emerges.

Your post must assume that the people who argue 'for' object-oriented modeling argue for it as a solution to every problem. That is most certainly not the case. With the understanding that object-oriented modeling, or rather modeling in general, is appropriate for certain cases, your post demonstrates no "flaw" in object-oriented modeling at all. Either focusing on concepts or focusing on tasks is an appropriate approach given the context.


Thanks again for your post.

Christian Posta
May 13, 2010 1:31 AM PDT


kenneth dakin
Asaf Shelly is really describing problems (i.e., following the control flow) that are mainly caused by asynchronous processes (and that are also evident in any event-driven programming scenario).

He uses this problem as a stick to criticize OOP.

OOP can be severely criticized without recourse to following the control flow - although it certainly doesn't help. Hidden or obscure processes help no-one.

In my opinion, OOP creates many more problems than it supposedly solves, causes unnecessary overheads and certainly bloats code.

I personally use "control tables" to structure programs (and have done so successfully for 40+ years). The logic of even a complex program can be easily built into a decision table that is designed to be "executed" quite efficiently. At a glance, complex relationships can be viewed effortlessly and changed easily (without the need to change the interpreter in most cases).

It is my experience that most proponents of object-oriented programming don't really understand their subject deeply enough to even explain how its nuts and bolts work. It is as if they worked on some other plane of existence than the real world, leaving it to compiler writers to sort out the mess.

Assembler programmers (or high-level-language programmers who are aware of how the machine functions) are much more savvy than OOP programmers, who strangely think in terms of creating "methods" to make Lassie (who is a dog) save Timmie (who is a child). To use the same metaphor: I think OOP programmers have been "sold a pup" that is perhaps deaf and blind and has hip dysplasia.

Let us inform the Kennel club of the dubious breeding practices and - please can we have our money back?

Ken
July 30, 2010 7:14 AM PDT


Allan Maher
Object Oriented Programming: it's a bit like communism - a nice idea, but it doesn't work?
May 14, 2011 10:16 AM PDT


asim khwaja
I agree with Ralf and Christian Posta. And yes, he really works at Intel! I am a software developer who has worked at Intel in the past. Most of the people doing programming there are hardware people, and that's what you would expect from them. While writing code to test a chip, one needs to know the exact execution path and all the minor details in order to track defects in the chip design. But that only means you are using the wrong language for your task. In developing software (as opposed to code at the hardware level), if you still find the need to track the exact execution flow in order to find issues with the code, you are a naive programmer using the wrong approach. And obscure error messages in Windows or its applications do not reflect upon OOP. They are just bad programming habits programmers have acquired over time. Take the best quality tool on the market and hand it to two people: one can create a masterpiece and the other a mess! And these obscure error messages are not a result of complications in tracking control flow in OOP; they exist in C code as well.
August 25, 2011 7:34 AM PDT


Rick

The real problem that I have with OOP is the ARROGANCE that it has created in the software field. The smartest person in the world is of no use if they can't communicate and be reasonable. There is no reasoning with many OO programmers - they are like creationists (Adam and Eve living with the dinosaurs). OO is good for managing complex implementations, yes. But the problem is that a lot of OO programmers don't TRY to simplify. OO is NOT simplifying - it's just hiding. There is a big difference between simplification and hiding.
October 18, 2011 1:26 AM PDT


Orwin O'Dowd
Orienting to an object in reality is harder than you might think: to represent merely the shift from one vector of orientation to another in 3D space, mathematicians depart from normal numbers for quaternions. It's game programmers who now wrestle with this complexity, and their frustrations work up the line to designers. And math programming shakes up the shop: watch Wolfram Alpha, and do have a look at Cabri and Cinderella.

Am I just being theoretical and a math snob? My concern is that even if you don't have quaternion-level flexibility to back out of jams, the electrons you are trying to order around the chip do, and *will use it* if it gets them to a lower-energy state, which typically leaves you at the console with less initiative to intervene.

I think you need default objects to grip the chip before taking options to steer it. This is max hard with (a) disks that rotate while the chip doesn't (the database devil); (b) USB devices that stack like supercomputer components; and (c) the DSP chip, which does that Fourier stuff underlying the uncertainty principle! Here one is advised to distinguish kinematic (monitoring) and dynamic (intervening) aspects of the model. It would help to have generic escape to the monitoring frame for jam resolution.
