I just watched Dan Leuck's interview with Civil Beat (http://tinyurl.com/3tluvyw) and found it really informative, but he touched on one subject that has been a sore point with me for years. He said that when they are looking for expertise, job postings in Hawaii sometimes go 6 months before being filled.
Now I know it's often necessary to find someone with a particular expertise, but for decades now I've seen companies insist on specific expertise that could be developed in-house. Often these sets of requirements are so specific that it's no wonder they take a long time to fill. You see things like this:
"Looking for motivated person with 6+ years using PHP and MySql in a Subversion/Hudson build environment."
Usually they are even more specific, with maybe 10 specific technologies in the MUST category and another half dozen in the WANT category. It's frustrating seeing someone who has been using Perl with PostgreSQL and CVS or git turned away from the example job above without an interview. If you are willing to wait 6+ months for the right person and maybe get them from the mainland, wouldn't it be better to take a local person with a similar background and give them 6 months to come up to speed in your specific technologies?
I've been working as a software engineer, mostly in Silicon Valley, for nearly 40 years. For my first job, I was hired by a startup to write assembly language and drivers on what today would be called a custom symmetric multiprocessing embedded system. Pretty heady stuff after only a few weeks of assembly language, and for a different CPU at that! Within a year I was training new engineers on this system.
I'd be interested in what other people think about hire-and-train vs waiting for a person with the exact right background.
Yes John, many hiring managers have no clue what it means to gain expertise in some branded technology. They throw out a list of mixed labels where one, such as Java EE, is several orders of magnitude more complex than, for example, CSS and HTML. Why are these in the same list? It's like saying "must be a master cabinetmaker" and also "must have proven experience screwing in brass handles."
My best example on this topic: an engineering manager once liked me enough to let me learn everything from scratch, on my own - a specialized medical device, PID controllers, embedded systems, a new language, a new operating system, a new graphics language, and more. In nine months I had a working prototype that led to 40 patent claims for new data visualizations, and other companies are now copying some of these ideas.
A good programmer is a good programmer. Generalists should get more respect. The need for extremely specialized world-class expertise does exist, but such need is so rare compared to what most job postings indicate.
Aloha John & Don - I agree with you regarding job descriptions often being too specific and that the ability to train a smart engineer on any technology stack is frequently underestimated by hiring managers and recruiters. That being said, there are plenty of situations where you need a particular skill that can't be picked up overnight, even by an intelligent engineer. If you are a small company, you might not be able to afford a 6-12 month investment to bring a new guy up to speed on a complex tech stack. Sometimes the company's customers have an expectation that people working on a system start with domain and/or technical knowledge. Our company has brought people up through our intern program and developed their skills (technical and domain) through mentorship with senior team members but we can't afford to do this for everyone in every situation. Most Hawaii tech companies aren't big enough to absorb that much upfront cost.
re: my comment about our schools not producing enough top notch developers and engineers
Let me first clarify that A) they do produce some great guys (we've hired four), just not enough and B) I know many excellent profs at UH and HPU - my comments shouldn't be taken as a dig on the professors.
We find that developers coming out of school have radically different spin-up times. The top guys already know how to work in a collaborative environment, are familiar with common tools such as source control, and have built non-trivial open source software that demonstrates their skill. We aren't producing enough of this type of developer.
Aloha Dan (or do you prefer Daniel?). Thanks for chiming in. No question that there are times when you need specific talent. If you need a net security guru, a generally bright guy and a couple months training won't do.
But there is a place between the new college hire (or intern) and a person with the specific background you need. I've been on projects where we desperately needed more people. I would recommend an excellent person I'd worked with before, only to have them turned down by the hiring manager for lack of some specific background that wasn't necessary at all. In one case the boss was looking specifically for experience in wireless networking. The person I recommended had extensive background in network protocols, but not specifically wireless. So we stayed understaffed for months.
I'm pretty firmly in the 'train 'em' camp, unless the need is urgent and the skill is deep and not easy to learn. If you wait six months or more, will your product still be worthwhile when you finally find someone? A good programmer can learn a new environment/language pretty quickly, although there are LOTS of details to an API that unfortunately are only learned over time. That's a failure of the documentation/training/design of the API stack. A good programmer can network with people and find help on the interwebs for issues. They may need more time than the guru with 12 years' experience, but by the time the guru becomes available, the noob will have gotten through several iterations and be close to something you can ship. Then hire the guru to help polish it.
As for schools - I don't know about UH, but my feeling is: teach them several languages (scripting, compiled, etc.) and a whole bunch of algorithms/data structures so they have a software toolkit and the ability to pick up new tools as needed. Give 'em a bunch of projects that force them to learn new stuff/techniques. Participate in programming challenges like the recent Node Knockout (and other upcoming ones) the guys at HiCapacity.org are doing --- that's the Honolulu makerspace, akin to our Maui Makers.com. Internships are absolutely terrific ways to learn. I was an intern once and have hosted a half dozen or so in my career.
Two stories for both sides:
1 - Back in 1998/99, I was looking for work. One local company was staffing up big but trying to hire Java programmers with 2-5 years' experience. Java was released in 1996. They had a really hard time finding "qualified" people. After many months of not filling jobs and falling behind their business plan (i.e., losing $$), they changed their 'qualifications' requirements.
2 - Much more recently, my employers were implementing an IP network on a UAV with fairly powerful microcontrollers. The decision was made NOT to hire an experienced network stack engineer, even as a consultant. Serious network issues developed later due to improper stack and network design. Redundancy coding was also deemed not to require an experienced expert. The project ran many, many months late (for programming and aerospace reasons). RF, networking, and intermittent redundancy failure issues plagued the project right up until the one working aircraft fell out of the air. Should experts have been hired rather than trained? Yes - training requires time to gain experience, and experience often comes from failures.
I'm pretty non-traditional by today's standards since I don't have a degree in CS. I learned just about everything I know about software and computers from working or reading on my own. There weren't all that many colleges with CS departments when I first started programming. (I wrote my first Fortran programs in 1966). But I think it's true of any experienced engineer that they learned almost everything important after leaving school. So some company gave them a chance to get the experience that the next company demands. At least that's the way it has always worked for me.
A good example is the way I learned SNMP. I was working at NGC, the company that made the Sniffer network analyzer, when someone referred a guy who was working on a book on SNMP to me. He wanted some traces of SNMP traffic accessing an enterprise MIB for his book. I got him the traces and then he wanted explanations of what the traces showed. I learned from those traces and some reading of RFCs and wrote it up. He used a couple pages of my explanations in the book and thanked me in the acknowledgements. A few years later I was asked if I knew anything about SNMP during a job interview. I mentioned that book and that I had been thanked in it, though I tried to make clear that I was not an SNMP expert by any means. Apparently they weren't listening. They hired me and my first task was to design the MIB for their next-gen product and implement the SNMP agent. I had to learn quickly and finished in just under 3 months. A few years later I found myself enlisted to write the SNMP part of a book my boss wanted me to write with him. That book came out in 2007. I never set out to become an SNMP expert, nor did I ever take a class in it.
This has been the pattern that much of my expertise-gathering has taken. That's how I learned about drivers, symmetric multiprocessing, SNA, ... But most of these opportunities to learn new stuff happened after I'd already been hired to do something else. So, if it works when I'm already an employee, why wouldn't it work if they hired me to do it? Companies are having a great deal of their software written by people who are learning on the job every day. They just don't seem to realize it.
Like so many tough decisions, "it depends".
Personally, I prefer the "train 'em" approach, but that requires both knowledgeable staff and available time to acculturate and train someone. In my experience, very detailed job descriptions often indicate that more senior managers didn't understand what was going on, so the job becomes a list of technologies. In such environments, there's nobody left to mentor the new hire, and many managers are (rightly) worried about too much unguided, on-the-job exploration in hopes of learning on the company's dime. So these managers naively ask for the world, expecting that knowing technologies means knowing systems, or that writing code implies the ability to create stable applications. If only it were so ...
But there is undoubtedly value, from a hiring manager's standpoint (although my current employer doesn't have "hiring managers" per se), in setting up a filter to ensure that there's some level of technical competence in the applicant pool. Looking for a duration of practice with particularly related technologies is one way to do that.
So, my point of view is to find a mix -- use the job description to find someone with reasonable technical chops, and the demonstrated ability to perform technical tasks, and then mentor them to the unique case at hand.
I apply to jobs willy-nilly. My general rule of thumb is to confidently hit 50% of the MUST items and be able to take a stab at a further 25%, but I'm not necessarily zealous about those percentages. If I've got 50% of the MUST items, I'll definitely apply.
There was an interesting (and, for Fox, relatively unbiased) article about the lack of skilled workers in the US - even basic engineering graduates - as parents encourage kids to avoid manufacturing and similar pursuits. That's ironic, given that engineering was the lifeblood of America and one of the main things that powered its economic growth:
Found it particularly interesting to see this on the back of Eric Schmidt's comments about my home country: