This is more of a rant on development than useful code, but hopefully it helps provoke some thought. And while I may be targeting the English language in this post, I believe other languages run into this issue just as much.
The very first post I made on this blog was a quote from the book "1984" about a pared-down language, because (a) it surprised me that everybody focuses on the Big Brother aspect of the story rather than the redefined communication, (b) an explicit language where each word can only mean one thing makes a lot of sense, and (c) having words that are meant to express a scale (good, better, best) actually share the same root word (good, plusgood, doubleplusgood) is more in line with the Latin roots of the English language. Think back to when you were first starting to understand all the oddities that make up the English language. I before E, except after C. To pluralize a word, add an S; except if it ends in these letters, then do this instead. There are a lot of rules to the language that don’t make much sense. Although I’m pretty sure Ph.D.-bearing learned people can give me the reasoning, the general answer to why it must be done this way is “Because that’s how it’s always been done”. Outstanding. Way to think outside the box.
When you're gathering requirements for a problem, how often do the developers interpret a different meaning than what the end users actually meant? Like referring to the "home page" of a site as the first page a logged-in user sees versus the first page that every user sees. When the manager is talking about the project, are they talking about the entirety of building the software, the Microsoft Project file, the Visual Studio project file, or something else entirely? It gets even worse when a word is used that has a completely different meaning in another language (which at least one project has run into). Assuming you're programming only in the English language, you have somewhere in the range of 475,000 to 600,000 words to work with. Not only that, but more words are added to the collegiate dictionaries every year. And then there are words in common use that don't even exist in a standard dictionary. So why must we overload the same words over and over again? Stop being lazy and calling every application that serves out data or hosts another application a "service". Give it a unique name.
Microsoft’s been taking flak about overly descriptive (but entirely accurate) developer product names. Sure, it’s easier to simply say “Astoria”, “Geneva”, or “Longhorn”, but unless you’ve heard of them before you have no clue what they’re actually for. Now hearing “ADO.NET Data Services”, “Claims-based Identity Management”, or “Windows Vista”, you actually have some idea of what’s being talked about without having to spend a lot of time digging into what the product actually is (albeit not a much better idea in the case of Geneva…). Sure, we need to account for being able to talk about things abstractly in some cases, but we should be able to categorize whatever we’re talking about much the way biologists categorize plants and animals. IIS is a type of web server, which is a type of server, which is a type of computer, etc. StructureMap is a type of IoC container, which is a piece of .NET software, which is a development tool. Although there’s a lot of overlap when describing software, it seems like there could be an easier way to describe and categorize specific software.
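To make the taxonomy idea concrete, here’s a minimal sketch of that kind of kingdom/phylum/class-style classification as a nested tree, with a lookup that walks back the lineage of a given piece of software. The specific categories and the `lineage` helper are my own illustrative assumptions, not any standard classification scheme:

```python
# A toy software taxonomy, nested the way biologists nest categories.
# The category names here are illustrative assumptions, not a standard.
taxonomy = {
    "computer": {
        "server": {
            "web server": ["IIS"],
        },
    },
    "development tool": {
        ".NET software": {
            "IoC container": ["StructureMap"],
        },
    },
}

def lineage(tree, name, path=()):
    """Return the chain of categories leading down to a named piece of software."""
    for category, children in tree.items():
        if isinstance(children, list):
            # Leaf level: children is a list of concrete software names.
            if name in children:
                return path + (category,)
        else:
            # Interior level: recurse into the sub-taxonomy.
            found = lineage(children, name, path + (category,))
            if found:
                return found
    return None

print(" -> ".join(lineage(taxonomy, "StructureMap")))
# prints: development tool -> .NET software -> IoC container
```

A flat tuple of categories like this is exactly the disambiguation the overloaded word "service" lacks: two programs can share a leaf name yet have completely different lineages.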
If you really think about it, each and every word only exists because a group of people have agreed on a general meaning for it. Words like "blog" and "podcast" were created to sum up new trends in technology that had not been defined at the time. All it takes is for somebody to come up with a word and others to start using it for it to catch on. In much the same way that Scott Hanselman wants a word that says “I’m a technical person and know what I’m talking about”, I think I’ll start using Newspeak terminology to better describe parts of the software I write.