Prefixing names using some notation

I agree with YogurtEmperor. I should have mentioned that I'm mostly concerned not with my own code, but with code written collaboratively by a group of people. And as he mentioned, you can't expect everyone (even yourself) to always write perfect code. Each person has a different view on programming and on naming variables. I wanted to choose a convention that maximizes the help it gives when reading and expanding someone else's code. It turned out that the majority of programmers see prefixes as not being useful at all.
As an open source developer, I am appalled that anyone would even SUGGEST not using naming conventions. Not following naming conventions, or not having any at all, is considered arrogant and disrespectful. Naming conventions provide consistency that lets others easily understand and locate code. They minimize effort and visually improve code readability while MAINTAINING consistency. A good naming convention allows a developer to make reliable inferences about the code, which is very useful for large code bases. The arguments presented about IDEs are laughable. Let's come to a conclusion between emacs and vi before we get started on IDEs.
But it is not meaningless.

Discussions like these help shape the conventions that are normally used. While there is no single "best practice", somewhere between no prefixes and full Hungarian there is a happy medium that most programmers can live with. The simple fact of the matter is that most programmers tend to write code that is harder to read than it should be. Plain C is frequently criticised for its cryptic nature, and yet it is in fact poor programming practice that is at fault, not the language itself.
Quote:Original post by Giedrius
It turned out that the majority of programmers see prefixes as not being useful at all.


Here is something to consider.

Very few people still choose C or C++. Many of today's managed languages are dynamically typed, offer little static analysis, and even construct classes and types at run time. That means types are very lax, and an error might only surface after several hours of running, under some exceptional corner case.

Yet very few web servers or server farms are written in C or C++. Many are written in Java, but also in PHP, Ruby, Python or similar.

Web development is notorious for having the "worst" developers there are. Most are not even programmers, nor do they have any formal training in software development.

Yet they churn out perfectly reliable, incredibly useful web sites that serve tens or even hundreds of thousands of users, or more. Facebook is written in PHP - when was the last time they had a problem with floats vs. ints? Remember that a one-in-a-billion chance of error happens about once every 14 seconds at their scale. And it is not Google's C++ coding convention that gives them their edge either. Read about their build infrastructure, testing and code reviews.


Anyone doing engineering in any shape or form needs to take a very long and very hard look at this. It cannot be ignored. How is it possible? It's not hype, corporate marketing or brainwashing. It is simple reality.


In the past two decades, only two things have had a measurable effect on how software is developed, and they improved it by orders of magnitude:
- Managed memory
- Automated testing

Everything else is fluff and variations on a theme, but those two things expanded programming from an obscure dark art that takes an incomprehensible amount of work into something accessible to everyone. There is no corporate hype around those two, nobody is paying big bucks for consultants to shove them down everyone's throats, there is no big corporation pushing them - they just work, they just happened, because they are superior.

And it produces results. Actual results that matter to the user. It drastically reduces defects.

Want your code to check types to make sure ints are not used as floats? Write a unit test.
Want to make sure your interface doesn't break when incorrect parameters are passed? Write an integration test.
Found a bug due to rounding under an obscure edge case? Write a regression test (see the sketch after this list).
Worried about types, scopes, accessibility, etc.? If using C++, minimize sharing, global state and dependencies. Use inversion of control and keep all business logic local to functions.
Dealing with legacy code? Write a test harness around it as you integrate it, isolating old from new. Refactor or absorb the old codebase into the new one.
Worried about pointer sizes and the number of bits varying by platform? Have a continuous integration server that builds for all platforms and runs all tests after each change.
Have a developer who doesn't know how the system works? Have them look at the tests.
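
For example, here is a minimal sketch of such a regression test, using only the standard library and plain assert; moneyToCents() is a made-up stand-in for whatever function the rounding bug was found in:

```cpp
// Regression test sketch: lock in the fix for a rounding edge case.
// moneyToCents() is hypothetical; truncation instead of rounding was the "bug".
#include <cassert>
#include <cmath>

long moneyToCents(double dollars)
{
    // Round half-up; a plain cast would truncate 28.999... down to 28.
    return static_cast<long>(std::floor(dollars * 100.0 + 0.5));
}

int main()
{
    assert(moneyToCents(0.29)  == 29);   // 0.29 * 100 is 28.999... in binary floating point
    assert(moneyToCents(19.99) == 1999); // same story for 19.99
    assert(moneyToCents(0.0)   == 0);
    return 0;
}
```

Once a test like that is in the build, the edge case can never silently regress, no matter what the variables holding the values are named.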

Hungarian serves no purpose anymore. It is a concept made obsolete by superior methodologies. Even if you are dealing with a legacy codebase, investing time in the two things above will have an exponentially higher ROI. Those are the things that really work in the real world.
I'm not a big fan of Hungarian notation. I used it heavily when first learning to code, but I find myself relying on it less and less. However, what I do find kind of unsettling is the notion of relying on a pop-up tool tip for that sort of information about a variable. It feels to me like having a television with no buttons that can only be controlled with the remote. Sure, the remote should always be readily available, but there are circumstances where it may not be (dead batteries, dropped it in the tub, etc.).

You won't often catch me coding in a text editor without tool tips and IntelliSense, but it does happen. When it does, I'm not necessarily driven to use Hungarian notation, but I also don't view the presence of those features as a reason to avoid it.

Personally, I think we make too big a deal of it all in the first place. ;)
Quote:Original post by CowboyNeal
I am appalled that anyone would even SUGGEST not using naming conventions.
Yeah, even if you say "don't use prefixes" or "make the code look like prose", those guidelines are still the basis of an ad-hoc naming convention (and you damn well better be consistent)!
Quote:Very few people still choose C or C++.


http://langpop.com/ disagrees with you. They indexed Google Code, Freshmeat and many other sources, and C/C++ consistently rank at the top in popularity. Last time I checked there were many new popular programs coming out in C/C++ - Google Chrome comes to mind. Many developers also choose C/C++ because of patent restrictions and ECMA/ISO non-compliance concerns (as found with the .NET framework).

Quote:Facebook is written in PHP


Yes, but they had to rewrite the PHP runtime because PHP, as it was, didn't cut it for them. http://developers.slashdot.org/story/10/01/31/0252201/Facebook-Rewrites-PHP-Runtime-For-Speed?from=rss

Quote:Hungarian serves no purpose anymore. It is a concept made obsolete by superior methodologies. Even if you are dealing with a legacy codebase, investing time in the two things above will have an exponentially higher ROI. Those are the things that really work in the real world.


There really isn't a single naming convention that fits all; many companies and projects adopt their own for that very reason. I don't advocate Hungarian, but to say it serves "no purpose anymore" is not true. For example, sometimes you want to know whether a string is an unsafe string or a safe string. They are both of type String, the compiler will not help you out, and IntelliSense won't either, but that information can be very useful to know. Another example is automated UI code generation in APIs such as Qt: the moc compiler handles all of the code generation for the UI, so it is very useful to know the "non-primitive" type in code, since the generated types will be in their most primitive forms. Proper planning and adherence to a convention have as much value as the testing examples you give, and that value grows exponentially as your project scales.
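
To illustrate the safe/unsafe string point with a contrived C++ sketch (the names and the htmlEscape() helper are made up): both values are plain std::string, so only the prefix tells the reader which is which.

```cpp
#include <iostream>
#include <string>

// Hypothetical sanitizer: escapes the characters that matter for HTML output.
std::string htmlEscape(const std::string& usText)
{
    std::string sOut;
    for (char c : usText) {
        if      (c == '<') sOut += "&lt;";
        else if (c == '>') sOut += "&gt;";
        else if (c == '&') sOut += "&amp;";
        else               sOut += c;
    }
    return sOut;
}

int main()
{
    std::string usComment = "<script>alert(1)</script>"; // "us" = unsafe, straight from the user
    std::string sComment  = htmlEscape(usComment);       // "s"  = safe, already escaped
    std::cout << sComment << '\n';
    // std::cout << usComment << '\n';  // the mismatched prefix makes this bug easy to spot
    return 0;
}
```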
Quote:Original post by YogurtEmperor
it is frustrating seeing a value being used and not immediately knowing whether it is a local, a parameter, a member, or, God forbid, a global.


The thing is... why does it matter?

someVariable (hm, I wonder what that is)

gSomeVariable (oh, it's a global! ... ok great, now what)

I don't understand how that information is at all useful.

If you actually need to know more information about a variable (sounds like your variable name isn't very good [wink]), right click -> go to declaration/definition.
Quote:It seems the main difference between our views is that mine assume the author of the code is inconsistent, not necessarily using the best coding practices, etc., whereas yours assume the author is adding lots of comments, using decent coding structure, etc.
No, comments have very little to do with it. In fact, I find that when the code is clear and readable, not many comments are needed at all.
Quote:If you are lucky enough to be surrounded by reliable coders, maybe naming conventions are not as necessary.
But in my experience, most programmers are not reliable, and at least some confusion can be cleared up through naming conventions.
I guess I don't understand this part. If programmers are expected to be unreliable, how is using Hungarian (or a variant) going to help? An 'unreliable' programmer could easily change a variable's type without changing the name, or maybe even use the wrong prefix to begin with, which completely defeats the purpose of using the naming convention in the first place.
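
To make that concrete, here is a contrived sketch of exactly that failure mode (the names are invented): the declaration is changed from int to float, the "i" prefix stays behind, and the convention now actively misleads.

```cpp
#include <iostream>

float iPlayerHealth = 100.0f;   // originally: int iPlayerHealth = 100;

int main()
{
    for (int tick = 0; tick < 300; ++tick)
        iPlayerHealth -= 0.1f;              // damage over time; rounding error accumulates

    // A reader who trusts the "i" prefix might reach for an exact integer comparison,
    // which may quietly fail now that the value is really a float.
    if (iPlayerHealth == 70.0f)
        std::cout << "exactly 70\n";
    else
        std::cout << "not exactly 70: " << iPlayerHealth << '\n';
    return 0;
}
```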
Quote:If naming prefixes are hard to read it just means you haven't read much of it.
I disagree with that. I've read a fair amount of code with prefixes, and a fair amount of code without, and I prefer without.
Quote:Anyway, as I said, you have to decide if naming conventions are for you or not. But if you pick one, stick with it, reliably.
Agreed :)

Anyway, DevFred summed it up nicely, IMO:
Quote:Original post by DevFred
Because clean code should read like prose. That's what humans understand best. You should choose good names that speak for themselves. That's way more important than distracting prefixes.
This pretty much reflects my own personal views exactly.
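
As a small made-up illustration of that point in C++ (both functions do the same thing; only the naming differs):

```cpp
#include <vector>

// Prefix-heavy style: the reader decodes the f/v markers before getting to the meaning.
float fAvg(const std::vector<float>& vfVals)
{
    float fSum = 0.0f;
    for (float fV : vfVals) fSum += fV;
    return vfVals.empty() ? 0.0f : fSum / static_cast<float>(vfVals.size());
}

// Prose-like style: the same function, but it can almost be read aloud as a sentence.
float averageScore(const std::vector<float>& scores)
{
    float total = 0.0f;
    for (float score : scores) total += score;
    return scores.empty() ? 0.0f : total / static_cast<float>(scores.size());
}
```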
Quote:I also want to reiterate the purpose of naming conventions in the first place. Naming conventions are first and foremost for the benefit of other people who have to read your code.
It is assumed that you yourself can read any style you use, and you can look at code and recall the idea you were expressing at that time, with or without name prefixes.
I'll reiterate that I, personally, have a more pleasant time reading other people's code when prefixes are not used. That's just anecdotal, of course, but it at least shows that not *everyone* prefers the style you're advocating for.

Don't get me wrong, I absolutely respect your opinion, and think everyone should use whatever conventions work best for them and/or their team, be that Hungarian or something else. It sounds like you're experienced, and for whatever reasons have found Hungarian-like naming conventions to be beneficial in your own work, and I can't argue with that. It just kind of seems though that you're arguing that Hungarian (or something like it) is 'the' right way to do things and is objectively better than other conventions, which I would question.
Quote:Original post by CowboyNeal
As an open source developer, I am appalled that anyone would even SUGGEST not using naming conventions.
Who suggested not using naming conventions? Or am I misunderstanding?
Quote:Original post by szecs
I mean this shit always goes this far. At least once every month. Aren't you bored of this already? ('You' means everybody; even the 'veteran' users participate in these meaningless flamewars. All the time...)
You're right of course, but I'm going to post this anyway! :)
Quote:Original post by CowboyNeal
Quote:Very few people still choose C or C++.

http://langpop.com/ disagrees with you.
I wasn't aware langpop measured choice. Sure, a lot of people still use C or C++, but for most of them it isn't much of a choice. With legacy code, 'choosing' a different language can be a very stupid business decision. If someone only knows C or C++ (because the internet said to learn it), they won't 'choose' to program something in Python or C#, because they don't know those languages.

I guess this depends on how you define 'choose'.


