Too Fast to Fathom?

Nov 25, 2003

Are jobs changing in nature so rapidly that it is impossible to determine the specific competencies each of them will require to ensure success? How does a recruiter, an HR generalist, or a hiring manager deal with discontinuous change, vaguely defined jobs, and constantly evolving strategies?

I once had a recruiter working for me who thought that the time we spend interviewing and screening candidates for specific jobs was a waste. He insisted that any candidate who met a very small number of minimum criteria for a job would be able to perform that job equally well. The only remaining thing to determine was how well the candidate fit with the hiring manager and, to a lesser degree, the organization. He felt it would be more cost effective to make a lot of hires quickly and then let their on-the-job performance determine who should be kept and who should not. In the simplest terms, he felt it was better to have a loose, easy-in, easy-out hiring practice than a much tighter and more thorough upfront screening process. At the time I was appalled at the thought and said so. I felt recruiters had a responsibility to ensure quality and to make sure that only the very best were being presented to hiring managers. Now I am not so sure.

When I think back to the middle of the 20th century, most jobs were filled fairly quickly. Few employees had the specific title of recruiter, and those who did were often just clerks who made sure paperwork was properly completed. Most jobs were filled after a brief interview with a hiring manager, who made his decision based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics. Most jobs could be learned quickly, and it was quite easy to see whether a job was being done well or not. It was also easy to get rid of poor performers, and plenty got fired right away. However, a lot didn't.

There were many things wrong with this approach. The most obvious was that it blatantly discriminated against anyone who did not fit the stereotype of the hiring manager. Greater awareness of discrimination and new legislation drove the growth of the recruiting profession and removed much of the injustice this system perpetuated. But the system did have one virtue: it was simple, and it was built on a belief that attitude and performance were what really counted. Many engineers, doctors, and lawyers were trained in what amounted to an apprentice system right up until World War II. Formal skills training only gradually gained acceptance after the war, when thousands of GIs went back to school on the GI Bill.

As we moved into the 1950s and 1960s, these more casual hiring practices were replaced by the development of job requirements: things like minimum levels of education or years of experience before a person would be considered for a position. This was seen as fairer, and it served as a screen against hundreds of people potentially applying for the same job. The problem with this approach is that the defined requirements were almost never connected to actual performance. They only seemed fairer because they eliminated or reduced screening out based on race or sex. However, we have learned over the past 40 years that people who qualify for jobs based on their education or experience alone are not necessarily good performers.
We now know that simply selecting people by generic measures like education and experience doesn't work very well, and that it discriminates against those who have the real skills but lack the required credentials. Job requirements today are changing so fast that we can't keep up. During the dot-com boom we saw how quickly new skills became needed and how weak our selection systems were. We just didn't know what competencies or skills we should look for, and we didn't have time to find out. Managers were, and still are, confused about what they want in a candidate, and there is a tendency to fall back on selection criteria that smack a bit of the past. Referral programs are a bit like family connections, and attitude is now more important than ever in selection.

The need for HTML programmers grew exponentially for months in 2000, as did the need for network administrators and other kinds of programmers. Most recruiters didn't even begin to understand what they were recruiting for, and clearly no meaningful generic educational guidelines could be established because few schools offered the education. On the other side, there were almost no experienced people available either. Managers were frustrated with the recruiters, and vice versa.

Unfortunately, this situation will be a characteristic of the emerging century, as new technologies replace old ones and entirely new skills are needed. It will be very difficult to use traditional techniques or measures, or even to figure out the precise competencies and skills a job requires. So, what will we do? Three rules seem to be forming around defining new positions and redefining the more traditional ones:

1. Use technology to profile jobs quickly. Several vendors now offer software that allows you or a hiring manager to describe a job, pick out key competencies and skills, and draw a profile of what is needed within a matter of a few hours. Older methods might take months to produce profiles, although those profiles were probably more accurate and complete than these. The issue is how permanent the jobs and functions you are hiring for really are. For example, the duties of programmers change constantly: programmers have had to evolve from C to C++ to C# in the course of just a few years. How much do you want to invest in perfection? What becomes important are things like general programming speed and ability, perhaps, rather than the specific language they know.

2. Be competency flexible and teach hiring managers that development is part of recruiting. The 80/20 rule applies more than ever as new jobs and duties emerge and recruiters are forced to find ways to define them and select candidates against them. Managers will be forced to accept that they will not be able to find candidates with 100% of what they want. Managers and HR will learn that development is a core function of the firm in the 21st century. IBM put a development-centered approach in place in the 1960s, when it began hiring and developing new college grads because there were no people with the skills it needed. Remember, there were no programmers when the first mainframes were produced, so IBM had to develop them. Many companies have used development as a strategic edge: when you have people with skills that others don't, you tend to win.

3. Have robust performance management systems in place. By hiring people using broad competency descriptions, as I am advocating, you may hire some poor performers. And that's okay.
What is not okay is ignoring that and allowing them to stay in your organization. A good performance management system, based on whether people achieve realistic goals and meet the requirements of their position, is essential to success.

The hallmark of the best 21st-century organizations will be their approach to defining the people they need. Traditional measures of education, experience, attitude, and cultural fit may play a small part, but what will be significantly different is a quick, flexible approach to defining competencies combined with efficient performance management systems. This will result in more fluid and less well-defined jobs, but broader and more multi-skilled employees.