Get with the program

Date: 2018-09-26 | Word count: 7,240





WANT a job with a successful multinational? You will face lots of competition. Two years ago Goldman Sachs received a quarter of a million applications from students and graduates. Those are not just daunting odds for jobhunters; they are a practical problem for companies. If a team of five Goldman human-resources staff, working 12 hours every day, including weekends, spent five minutes on each application, they would take nearly a year to complete the task of sifting through the pile.


Little wonder that most large firms use a computer program, or algorithm, when it comes to screening candidates seeking junior jobs. And that means applicants would benefit from knowing exactly what the algorithms are looking for.


Victoria McLean is a former banking headhunter and recruitment manager who set up a business called City CV, which helps job candidates with applications. She says applicant-tracking systems (ATS) reject up to 75% of CVs, or résumés, before a human sees them. Such systems hunt for keywords that meet the employer’s criteria. One tip is to study the language used in the job advertisement; if the initials PM are used for project management, then make sure PM appears in your CV.
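The keyword-matching Ms McLean describes can be sketched in a few lines. This is an illustrative toy, not any real ATS vendor's logic: the scoring rule (the share of the advert's vocabulary that also appears in the CV) and the sample texts are assumptions made up for the example.

```python
# A minimal sketch of keyword-based CV screening. The scoring rule and
# the sample texts are illustrative assumptions, not a real ATS's logic.
import re

def extract_keywords(text):
    """Lower-cased word tokens, so initialisms like 'PM' match 'pm'."""
    return set(re.findall(r"[A-Za-z]+", text.lower()))

def cv_score(cv_text, advert_text):
    """Fraction of the advert's vocabulary that also appears in the CV."""
    advert_words = extract_keywords(advert_text)
    return len(advert_words & extract_keywords(cv_text)) / len(advert_words)

advert = "Seeking a PM with project management and procurement experience"
cv_with_pm = "Led procurement as PM; project management across three teams"
cv_generic = "Oversaw delivery of large programmes and supplier contracts"

# The CV that mirrors the advert's wording scores higher than the
# generic one, even though both describe similar experience.
print(cv_score(cv_with_pm, advert) > cv_score(cv_generic, advert))  # True
```

The point of the toy is the one in the article: the second CV may describe the same experience, but because it never uses the advert's exact words it scores lower and risks rejection before a human ever reads it.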


This means that a generic CV may fall at the first hurdle. Ms McLean had a client who had been a senior member of the armed forces. His experience pointed to potential jobs in training and education, procurement or defence sales. The best strategy was to create three different CVs using different sets of keywords. And jobhunters also need to make sure that their LinkedIn profile and their CV reinforce each other; the vast majority of recruiters will use the website to check the qualifications of candidates, she says.


Passing the ATS stage may not be the jobhunter’s only technological barrier. Many companies, including Vodafone and Intel, use a video-interview service called HireVue. Candidates are quizzed while an artificial-intelligence (AI) program analyses their facial expressions (maintaining eye contact with the camera is advisable) and language patterns (sounding confident is the trick). People who wave their arms about or slouch in their seat are likely to fail. Only if they pass that test will the applicants meet some humans.


You might expect AI programs to be able to avoid some of the biases of conventional recruitment methods—particularly the tendency for interviewers to favour candidates who resemble the interviewer. Yet discrimination can show up in unexpected ways. Anja Lambrecht and Catherine Tucker, two economists, placed adverts promoting jobs in science, technology, engineering and maths on Facebook. They found that the ads were less likely to be shown to women than to men.


This was not due to a conscious bias on the part of the Facebook algorithm. Rather, young women are a more valuable demographic group on Facebook (because they control a high share of household spending) and thus ads targeting them are more expensive. The algorithms naturally targeted pages where the return on investment is highest: for men, not women.
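The mechanism is simple arithmetic, and a cost-neutral optimiser reproduces it. In this sketch the prices per impression are invented for illustration; the point is only that an algorithm maximising impressions per pound spends nothing on the group whose attention costs more.

```python
# Illustrative arithmetic only: the prices below are made up. A budget
# allocator that maximises impressions per pound buys nothing at all
# for the costlier demographic group.
budget = 100.0
cost_per_impression = {"women": 0.05, "men": 0.02}  # women cost more to reach

# Spend the whole budget where each unit of money buys most impressions.
cheapest = min(cost_per_impression, key=cost_per_impression.get)
impressions = {group: 0 for group in cost_per_impression}
impressions[cheapest] = int(budget / cost_per_impression[cheapest])

print(impressions)  # {'women': 0, 'men': 5000}
```

No step in the calculation mentions gender, yet the outcome is skewed: the bias enters through the prices, not through any explicit rule.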


In their book* on artificial intelligence, Ajay Agrawal, Joshua Gans and Avi Goldfarb of Toronto’s Rotman School of Management say that companies cannot simply dismiss such results as an unfortunate side-effect of the “black box” nature of algorithms. If they discover that the output of an AI system is discriminatory, they need to work out why, and then adjust the algorithm until the effect disappears.
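The audit loop the authors describe can be made concrete. A hedged sketch, with invented scores, groups, thresholds and tolerance: measure whether the system's pass rates differ across groups, then adjust a decision parameter until the gap disappears.

```python
# A toy version of "adjust the algorithm until the effect disappears".
# The scores and thresholds are invented; real audits would use richer
# fairness metrics than a simple pass-rate comparison.
def pass_rate(scores, threshold):
    """Share of candidates whose score clears the threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

group_a = [0.9, 0.8, 0.75, 0.6, 0.55]  # e.g. scores the model gives men
group_b = [0.7, 0.65, 0.6, 0.5, 0.45]  # e.g. scores it gives women

threshold_a = threshold_b = 0.7
# Lower group B's threshold step by step until its pass rate catches up.
while pass_rate(group_b, threshold_b) < pass_rate(group_a, threshold_a):
    threshold_b = round(threshold_b - 0.05, 2)

print(pass_rate(group_a, threshold_a), pass_rate(group_b, threshold_b))
```

Whether threshold-tweaking is the right remedy is a separate question; the sketch only shows that the discriminatory output is measurable and that the adjustment can be automated rather than waved away as a black-box side-effect.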


Worries about potential bias in AI systems have emerged in a wide range of areas, from criminal justice to insurance. In recruitment, too, companies will face a legal and reputational risk if their hiring methods turn out to be unfair. But they also need to consider whether the programs do more than just simplify the process. For instance, do successful candidates have long and productive careers? Staff churn, after all, is one of the biggest recruitment costs that firms face.


There may also be an arms race as candidates learn how to adjust their CVs to pass the initial AI test, and algorithms adapt to screen out more candidates. This creates scope for another potential bias: candidates from better-off households (and from particular groups) may be quicker to update their CVs. In turn, this may require companies to adjust their algorithms again to avoid discrimination. The price of artificial intelligence seems likely to be eternal vigilance.