theGSman wrote:
You can even extend a C-style "object" without recompiling the original code:
Code:
struct extended_struct {
    struct base_struct b;   // embedded "base class"
    ... ... ...             // additional variables
};

void extended_func(struct extended_struct *e) {
    ... ... ...
    base_func(&(e->b));     // delegate to the "base class" function
    ... ... ...
}
This pattern is used extensively in the Linux kernel, the previously mentioned example of an object-oriented software architecture implemented in a non-OO language. I don't think there's much serious debate on whether or not OO has value for projects of significant size and complexity. "Object Oriented" is a point of view - an abstraction model - a way of breaking down a system into pieces and then understanding the relationships between those pieces. It's what software engineers do every day, and it's not scary. Mostly it's a handful of terms of art, such as "class" and "inheritance", that all people familiar with OO will know. And once you know the terms, you can discuss OO architectures with others.
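The kernel also goes in the other direction: a callback may receive only a pointer to the embedded base struct, and the callee recovers the containing struct with the container_of macro. Here is a minimal sketch, given theGSman's structs above; the simplified macro matches the classic kernel form, but extended_callback is just a name I made up:
Code:
#include <stddef.h>  // offsetof

// Simplified form of the Linux kernel's container_of: recover a
// pointer to the containing struct from a pointer to one of its members.
#define container_of(ptr, type, member) \
    ((type *)((char *)(ptr) - offsetof(type, member)))

// Hypothetical callback that receives only the base pointer but
// needs its extended_struct back.
void extended_callback(struct base_struct *b) {
    struct extended_struct *e = container_of(b, struct extended_struct, b);
    // ... use the additional variables in e ...
}
That subtraction-by-offsetof is the whole trick, and it's how the kernel can layer these "subclasses" several levels deep with no language support at all.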
OO programming is often discussed along with OO languages, and that's often where people start to passionately disagree. I find that people often cling to the language they learned in college. I've seen C++ programmers argue with C programmers: the C programmers claiming that everything C++ can do, they can do too; the C++ proponents claiming that their language has the syntax to describe the relationships and operations more expressively, and with a level of type safety. Then colleges started to teach Java, and the Java guys were arguing with the C++ guys about type safety, interfaces vs. multiple inheritance, etc. The C# guys argued with everyone, although the C++ people started to come around, acknowledging that C# has some nice syntax. With the rise of Apple tech, Objective-C became relevant and... well... some people think it is both the purest and yet still the worst language of the bunch.
I did embedded systems programming in C++ back in the mid-90s, at 3Com, on Motorola 68360 CPUs in Ethernet switches. We avoided operator overloading and exceptions: operator overloading hides the complexity of a simple line of code ("a = b + c" can become a function call), and exceptions were not ready for prime time. By limiting ourselves to what was thought to be a "safe" subset of C++, using it for object modeling and class hierarchies, we got the syntax and type-checking benefits without the perceived "bad stuff". It was very successful... and that was on systems with limited RAM, with many software components and protocols that needed to run with certain performance characteristics.
I've not used C++ in embedded systems since then. The embedded systems guys I come across generally fall into two camps: the "never used C++, but instinctively hate it" camp, and the "learned Java in college, C++ is gross and unsafe" camp. I'm glad I used C++ back then, as I think I might otherwise have ended up in that first camp. People tend to fear what they don't know. I was the same with Objective-C until I studied it a little. The syntax still looks strange, but I appreciate what it's doing.
Object Oriented facilities are present in many languages now. I work almost exclusively in 'C', but I do appreciate OO design and I still like C++. It has all the power of 'C' for the low-level stuff, yet still lets you think at a high level, if you want to.
This thread was about Object Oriented dispatch, a language-neutral topic, so this car has veered way, way off the road. The core question from the poster was as relevant, I think, as the implementation of Forth's NEXT primitive: it is the sequence of operations needed to call a class-specific method, and it sits at the very core of an OO implementation. So, what is the best way to implement a class-specific vtable or similar object-type-specific jump table on the 65816? If anyone is still interested in that question, I would suggest reading the first message of the thread.
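To restate what that question is actually asking, here is the pattern spelled out in C. The names (shape, area, and so on) are mine, not from the original post; it's only a sketch of the data layout and the dispatch sequence that the 65816 code has to reproduce:
Code:
struct shape;  // forward declaration

// One vtable per class, shared by every instance of that class.
struct shape_vtable {
    int  (*area)(struct shape *self);
    void (*draw)(struct shape *self);
};

// Every object begins with a pointer to its class's vtable.
struct shape {
    const struct shape_vtable *vt;
    int x, y;
};

// The dispatch itself: load the vtable pointer from the object,
// index to the method's slot, and call indirectly through that slot.
int shape_area(struct shape *s) {
    return s->vt->area(s);
}
Each of those steps maps to a short instruction sequence on the '816 (it does have absolute indexed indirect JMP and JSR forms), and choosing the cheapest one is exactly the kind of trade-off the first message asks about.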