I've stumbled across a few blog posts lately that talk about why everyone should use one technology over another, or why someone is leaving a particular language for another. Obviously there's no shortage of evangelical blog posts pushing the merits of one technology and lamenting the poor state of whatever-you-plebs-use. But this latest spate got me thinking.
Most (good) developers talk about using the most appropriate technology for the job. At its most basic level that means choosing Objective-C for a resource-hungry iPhone app, or writing your latest Facebook-killer application for the web rather than the desktop. That stuff's obvious. The more idealistic polyglot programmers will take it further and push Ruby on Rails for web apps with a small budget, or they'll suggest using RavenDB and deploying to AWS because all you need to do is store and retrieve documents across the web. If you're in a Windows environment with a team running Scrum, choose TFS, C#, and SQL Server.
So "Horses for Courses" right?
The aim is valid and noble, and it's certainly one I strive for. But one thing frequently gets overlooked, and that's the people on the team (or, to stretch the metaphor, the jockeys).
If you have a team of programmers who are very used to writing software using certain technologies, think very carefully about advising them to move to something else. I'm not saying don't do it (in some cases you really should), but there comes a point where the benefits to be gained by using language X on platform Y with source control Z just aren't worth the trouble.
Unfortunately, most programmers write code in one way. They use one language, they know one data storage mechanism, and they've only ever written applications for one environment. Maybe in a past life they tried out some other language, and maybe they dabble in HTML occasionally, but they're only experts at one thing.
You, on the other hand, might look at a set of requirements and decide a NoSQL data store running behind RoR is the "best" solution for this project. Similarly, you recommend Git as the "best" source control system. Great. Unless you're the only one who knows this stuff - then you're dreaming. If you have a team of C# developers, you'd want to have a pretty good reason for suggesting they program in a different language. If every other project they're working on uses TFS, learning Git is going to introduce a lot of overhead (initially). Sometimes, the current way of doing things is the "best" way, even if the idealist in you disagrees.
Now, that's not to say it's never a good idea to force a shift within a team. Consider a team of VB 6 developers who, for the last 15 years, have been dutifully writing VB Windows applications with an Access back-end. At what point do you tell them it's time to move on? (Ideally it would have been at least five years ago, but that's clearly no longer an option.) Assuming you don't outsource or "refresh" the team, you should strongly suggest they change, but acknowledge that the learning curve will add to the workload. Also be aware that you're unlikely to get a quality solution from them if they don't yet know what they're doing.
My point is, when choosing the right technology for the job, consider everything, and that includes the skillset of the developers.
With that in mind, blog posts encouraging everybody to stop using .NET because it sucks, or telling them they should never use pure HTML and JS for business apps, are just ridiculous. Yes, you might have had an overnight change of heart and now realise language X is the worst thing in the world, but you're thinking about the specific situations you've been in, and about developers with specific skills (usually just the individual author). If your whole team can just up and move to Ruby, then fantastic! Say hi to the rainbow-coloured unicorns for me!
It's always good to encourage teams to learn new technologies. It's occasionally good to force a team to move on, but sometimes the "best" way isn't the "ideal" way.