
Annual Online Screening and Assessment User Survey Results: Part 1

Jan 28, 2004
This article is part of a series called News & Trends.

Towards the end of 2003, I invited ERE readers to participate in my second annual “20 Questions About Online Screening and Assessment” survey. The goal of this year’s survey was to pick up where last year’s left off: continuing to identify important trends in the usage of online screening and assessment tools and providing up-to-date data on usage rates for these tools. To help track the development of major trends, this year’s survey asked many of the same questions as last year’s did. One major difference, however, is that last year’s survey did not provide much in-depth information about the usage of applicant tracking systems (ATSs) and screening tools relative to that of assessment tools. To help collect this type of information, this year’s survey added a few questions specifically about ATSs, separated questions about screening tools from those about assessment tools, and included questions designed to drill down more deeply into the characteristics of usage for both screening and assessment tools.

First of all, a big hand to all of the ERE readers out there who were kind enough to spend some of their valuable time completing my survey. Thanks to you, I was able to collect a wealth of information. In fact, the survey provided so much good information that I will be presenting the results in two installments. So, without further ado, on to the results.

Sampling Limitations

Before I begin discussing my findings, it is important to note that the conclusions reported in this article may have been influenced by several characteristics of the sample. These include:

  1. Those persons who are presently using screening may have been much more likely to take the time to complete the survey because they have made an investment in screening technology.
  2. ERE readers represent a population that is likely to be much more technology-minded than those in similar positions who do not read ERE. This means that responding ERE readers may be more likely to be using technologically advanced methods such as online screening.
  3. ERE readers are busy folks so I had to keep my survey short. Because I limited myself to 20 questions, it was not possible for me to drill too deeply into any one particular issue.
  4. Only nine participants (11%) indicated that they had also participated in last year’s survey, so this is not a true longitudinal study. This means that caution must be taken when interpreting information that compares this year’s data to that collected last year.

Despite these possible sampling limitations, the results of this survey offer a lot of meaningful information and have allowed me to identify some interesting trends in the use of online screening technology.

Sample Characteristics

A total of 78 readers completed the survey (as compared to 65 last year). Of these, 62 were from the U.S., 2 were from New Zealand, and 1 was from Australia. The information in Figure 1 summarizes the relative percentages of the job titles held by survey respondents.

The data in Figure 1 indicate that the majority of respondents were either recruiters (36%), HR executives (19%), or managerial-level staffing personnel such as hiring, HR, or staffing managers (27%). These results are very similar to those obtained last year. As it did last year, this suggests that respondents represent a highly relevant sample population and should provide a very good source of information on the topics of interest.

The information in Figure 2 summarizes the number of persons employed by survey respondents’ companies. The data in Figure 2 indicate that 37% of respondents were employed by large companies, while the remaining respondents were fairly evenly split between small (24%), medium (23%), and medium/large (15%) size companies. These results were very consistent with those of last year’s survey.

The information in Figure 3 summarizes the approximate number of hires per year made by respondents’ organizations. The data in Figure 3 indicate a wide range of responses in terms of the number of hires made per year (from a low of 0 to a high of 16,000). Despite this range, the data indicate that the majority of respondents (61%) made between 0 and 150 hires per year. However, the percentage of respondents who indicated making between 500 and 20,000 hires per year was also rather high (30%). Surprisingly, very few respondents (9%) fell into the middle range of 151-500 hires per year.

Additional analyses comparing company size and the number of hires made per year confirm the expectation that the smallest companies in the sample make the fewest hires while the largest companies make the most. The hiring patterns of medium-sized companies (500-5,000 employees), however, are less clear: results indicate a significant drop-off in hiring amongst these mid-sized companies. I feel this can be partially explained by the slow economy during the time the survey was completed, most notably the huge slowdown in hiring experienced during 2003. The fact that such a high percentage of respondents (61%) indicated making fewer than 150 hires per year helps to confirm that overall hiring volumes were low in 2003.

The information in Figure 4 summarizes the length of time respondents’ companies have used the web to collect information from candidates for the purpose of making staffing and hiring decisions. The data summarized in Figure 4 indicate that the majority of respondents’ organizations (56%) have been using the web as part of their hiring process for one to four years, while another 31% indicate having used the web for over five years. In the minority were organizations that either do not use the web for hiring or have done so for less than one year (13%). Additional analyses revealed no relationship between the size of a company and the length of time it has been using the web for hiring-related purposes.
Overall, the data indicate that the sample was relatively balanced in terms of respondents’ job titles and the size of their companies. Survey respondents represented companies of all sizes, and the distribution of respondents amongst various-sized organizations was relatively even. The fact that more than a third of respondents are recruiters is complemented by the fact that about half of the sample hold managerial or executive-level positions. This suggests that the sample provides a good balance between respondents who are actually using online hiring technology on a daily basis and those who are making decisions about implementing such technology. Sample data also indicate that the majority of respondents’ organizations have significant experience using the web for making hiring decisions. Overall, the balance reflected in this sample means that the data represent an excellent cross-section of people who are likely to be using online screening, and suggests that the results summarized here offer excellent insight into high-level trends in the use of online screening and assessment technology. The fact that the sample characteristics are similar to last year’s should help to highlight ongoing trends in the usage of these tools.

ATS Usage

Survey results indicate that the majority of respondents’ companies (64%) currently use an ATS, while another 8% indicate they are in the process of installing one. Additional analyses examining the usage of ATSs relative to company size revealed the following:

  • 97% of companies with over 5,000 employees have or are currently installing an ATS.
  • 75% of companies with 500-5,000 employees have or are currently installing an ATS.
  • 67% of companies with 50-499 employees have or are currently installing an ATS.
  • 37% of companies with fewer than 50 employees have or are currently installing an ATS.

These analyses confirm the notion that larger companies are more likely to have an ATS or be in the process of installing one. The relatively high percentage of usage amongst small and mid-sized companies also indicates that companies of all sizes are using ATSs. Additional analyses provide the following information regarding the relationship between a company’s ATS usage and the number of hires it makes per year:

  • 95% of companies making between 501 and 20,000 hiring decisions per year have or are installing an ATS.
  • 100% of companies making between 151 and 500 hiring decisions per year have or are installing an ATS.
  • 54% of companies making fewer than 150 hiring decisions per year have or are installing an ATS.

Again, these data confirm the idea that companies making more hiring decisions are likely to have an ATS. They also reinforce the notion that even companies with small and medium hiring volumes (i.e., smaller or medium-sized companies) are still likely to be users of ATS systems. Survey data also reveal a very interesting pattern when ATS usage is examined against the percentage of successful hires made by respondents’ organizations over the past two years. For example:

  • Amongst respondents indicating a low percentage of successful hires (below 30%), 80% have an ATS.
  • Amongst respondents indicating a medium percentage of successful hires (50-70%), 63% have an ATS.
  • Amongst respondents indicating a high percentage of successful hires (70% and above), 54% have an ATS.
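To make the arithmetic behind these cross-tabulations concrete, here is a minimal sketch of how percentages like these can be computed from raw survey responses. The data structure and field names are hypothetical, invented for illustration; they are not the actual survey schema.

```python
from collections import defaultdict

# Hypothetical survey rows: each respondent reports whether their company uses
# an ATS and roughly what percentage of recent hires they consider successful.
responses = [
    {"uses_ats": True,  "pct_successful_hires": 25},
    {"uses_ats": True,  "pct_successful_hires": 20},
    {"uses_ats": False, "pct_successful_hires": 85},
    {"uses_ats": True,  "pct_successful_hires": 60},
    # ... remaining respondents
]

def success_bucket(pct):
    """Map a reported success percentage to buckets like those in the article.
    (The survey's own ranges were below 30%, 50-70%, and 70%+; the boundaries
    here are simplified so every response falls into exactly one bucket.)"""
    if pct < 30:
        return "low (below 30%)"
    if pct < 70:
        return "medium (30-70%)"
    return "high (70% and above)"

# For each success bucket, count total respondents and those with an ATS.
totals = defaultdict(int)
with_ats = defaultdict(int)
for row in responses:
    bucket = success_bucket(row["pct_successful_hires"])
    totals[bucket] += 1
    if row["uses_ats"]:
        with_ats[bucket] += 1

for bucket, total in totals.items():
    share = 100 * with_ats[bucket] / total
    print(f"{bucket}: {share:.0f}% of respondents have an ATS")
```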

These results are very interesting because they clearly indicate the absence of a relationship between the use of an ATS and perceptions of the success of hires made using the system. They reinforce my strong opinion that, while ATSs are extremely useful, an ATS alone is not sufficient for increasing the overall quality of hiring decisions.

Overall, these results confirm that the adoption curve for ATSs has reached maturity and that almost all medium- to large-sized organizations are using some form of ATS. Another interesting point is that even small organizations are using ATS systems. This widespread usage of ATSs by companies of all sizes suggests that there are ATSs available to suit a wide variety of hiring needs. Of additional interest is the fact that there seems to be little relationship between the usage of an ATS and the actual quality of hires made using the system. This confirms my opinion that, while ATSs are extremely useful for managing candidate data, they do very little to help organizations predict which candidates will succeed.

Current Usage of Online Screening Tools

As with last year’s survey, one of the major goals of this survey was to collect data about usage rates and characteristics for online screening and assessment tools. Unlike last year, I split screening and assessment into two separate sections in order to drill down a bit more deeply into these two somewhat distinct types of online staffing tools. This section deals with the usage of online screening tools (online assessment tools will be addressed next month in Part 2 of the results). In the survey we defined these tools as: “Tools that gather information about, or ask candidates to respond to, questions about their experience, skills, and qualifications in order to identify if they meet minimum job requirements. These tools are typically used early on in the staffing process.” This section of the survey was designed to gather the information needed to answer the following general questions about online screening:

  • What is the present usage rate for online screening?
  • What are the characteristics of this usage (i.e., what type of screening is most popular?)
  • Do those who use online screening feel it is effective?

The data summarized in this section provide some interesting answers to these questions. For instance, 59% of respondents indicated that they use some form of automated screening tool. Table 1 summarizes the usage rates for four specific types of tools.

Table 1: Type of Screening Tools Used

Type of Screening                        % Currently Using
Automated qualifications screening       50%
Resume scanning tools                    28%
Biodata, personality questions           14%
Index of “Job Fit”                       26%
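Because automated qualifications screening is the most widely used tool in Table 1, a short sketch may help show what this kind of screening actually does. The specific rules, thresholds, and field names below are invented for illustration; real tools let the employer define their own knockout questions.

```python
# A minimal knockout-style qualifications screen: each rule checks one minimum
# job requirement, and a candidate who fails (or skips) any rule is screened out.
MINIMUM_REQUIREMENTS = {
    "years_experience": lambda answer: answer >= 3,
    "has_work_authorization": lambda answer: answer is True,
    "willing_to_relocate": lambda answer: answer is True,
}

def screen_candidate(answers):
    """Return (passes_screen, list_of_failed_requirements) for one candidate."""
    failed = []
    for requirement, meets_requirement in MINIMUM_REQUIREMENTS.items():
        answer = answers.get(requirement)
        if answer is None or not meets_requirement(answer):
            failed.append(requirement)
    return len(failed) == 0, failed

# Example candidate responses gathered early in the application process.
candidate = {
    "years_experience": 5,
    "has_work_authorization": True,
    "willing_to_relocate": False,
}

passes, failed = screen_candidate(candidate)
print("Meets minimum requirements" if passes else f"Screened out on: {failed}")
```

Even a simple rule set like this illustrates why setting up screening takes effort: the value of the screen depends entirely on choosing requirements that genuinely predict success on the job.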

The data in Table 1 suggest that qualifications screening is the most popular form of screening currently in use. The usage rates for each of these tools are very consistent with last year’s results, indicating stability in their usage.

Effectiveness of Screening Tools

The survey also sought to gather information regarding the perceived effectiveness of screening tools and the use of metrics to evaluate that effectiveness. These survey questions also provided some interesting results. For instance:

  • 60% of respondents indicated that they do not collect any metrics regarding the effectiveness of the screening tools currently in use in their organizations.
  • Only 50% of respondents currently using screening tools indicated that they feel these tools are effective; 30% indicated they do not believe these tools are effective, and 20% were unsure about the effectiveness of their screening tools.

Additional analyses examining perceptions of screening effectiveness amongst respondents indicating that their organizations DO collect metrics provided some very interesting information. These analyses revealed that:

  • Amongst respondents whose organizations do collect metrics, 70% felt their screening solutions are effective, 19% felt they were not effective, and 11% were unsure.
  • Amongst respondents whose organizations do not collect metrics, 39% felt their solutions were effective, 39% did not feel their tools were effective, and 23% were unsure.
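For organizations that want to start collecting metrics, even a very simple comparison can be informative. The sketch below, using a hypothetical record format, compares the success rates of hires who passed an online screen with those hired without one; it is meant only as a starting point, not a substitute for a proper validation study.

```python
# Hypothetical hire records: whether the hire went through online screening and
# whether the hire was ultimately judged successful (e.g., retained past a year).
hires = [
    {"screened_online": True,  "successful": True},
    {"screened_online": True,  "successful": True},
    {"screened_online": True,  "successful": False},
    {"screened_online": False, "successful": True},
    {"screened_online": False, "successful": False},
    # ... remaining hires
]

def success_rate(records):
    """Percentage of hires in `records` judged successful."""
    if not records:
        return 0.0
    return 100 * sum(r["successful"] for r in records) / len(records)

screened = [h for h in hires if h["screened_online"]]
unscreened = [h for h in hires if not h["screened_online"]]

print(f"Success rate, screened hires:   {success_rate(screened):.0f}%")
print(f"Success rate, unscreened hires: {success_rate(unscreened):.0f}%")
```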

These results suggest that organizations that take the time to collect metrics have a better understanding of the effectiveness of their screening tools and seem more likely to use those tools effectively. This may seem like a no-brainer, but it clearly reinforces the importance of collecting metrics related to the usage of screening tools. Without hard data, it is much harder to know how effective your screening tools are. In my experience, it is also much more difficult to convince the powers-that-be to allocate more money for screening tools when you cannot demonstrate that these tools provide ROI.

The information in Table 2 summarizes respondents’ opinions regarding the effectiveness of the screening tools currently used by their organizations.

Table 2: Perceived Effectiveness of Various Types of Screening Tools

Type of Screening                        % Indicating This Form of Screening Is Effective for Their Organization
Resume scanning tools                    55%
Automated qualifications screening       65%
Index of “Job Fit”                       75%
Biodata, personality questions           82%

The information in Table 2 suggests that tools that collect more in-depth information from applicants are more likely to be seen as effective. I do not think this is a coincidence. Tools that examine applicant characteristics such as personality and biographical history should do a much better job of predicting applicant success than tools that collect only cursory, surface-level information (e.g., resumes). It will be interesting to compare perceptions of the effectiveness of these screening tools with those of assessment tools (this information will be provided in Part 2).

Finally, respondents were asked to indicate the single biggest problem they have experienced related to the use of screening tools. The most common responses to this question included:

  • Screening tools are too time consuming to set up.
  • It is hard to know what questions to ask.
  • Fear that candidates will not want to invest the time needed to complete screening questions.
  • An overall belief amongst staffing personnel that screening is ineffective.

Again, these responses are not surprising to me; I have been hearing these same reasons for several years now. While I agree that screening tools are time-consuming and can be hard to set up, I also understand the many benefits of using these tools. As with anything else in life, it is very hard to get something for nothing, and screening is no exception. An investment in overcoming the difficulties of setting up screening questions correctly and in positioning the value of screening to applicants will pay off handsomely in the end.

Conclusions

I feel that the survey results presented here provide a very good reflection of the major hiring-related trends we have all lived with over the past few years. These include:

  • Almost all organizations are using the web as a part of their staffing process.
  • Fewer hires are being made across the board, but especially amongst mid-sized companies.
  • ATS usage has become widespread amongst companies of all sizes and almost all medium and large companies are using some form of ATS.
  • There is little to no relationship between ATS usage and success of hires.
  • Metrics are critical for understanding the ROI associated with your hiring process, but few companies are using them. Those that do seem to feel more positive about the effectiveness of their hiring processes.
  • Screening tools are used by many organizations, but their usage is still not as widespread as that of other technology based staffing tools such as ATSs and employment portals.
  • Many organizations report that the screening tools they are using are not effective. Screening tools that provide a more in-depth look at a candidate relative to job requirements are seen as more effective than “on the surface” tools such as resume reviews.
  • People are still passing over dollars to pick up pennies by refusing to expend the effort needed to properly configure their screening tools.

Stay tuned: Part 2 will discuss results related to the current usage of assessment tools and future trends in the usage of both screening and assessment tools.
