Unbiased Recruiter

The very idea of a robot conducting a job interview would unnerve most people. It is becoming commonplace, of course, for robots like Sophia and Pepper to be interviewed by humans, but one rarely finds a robot interviewer firing off job-related questions with no big bosses in sight. All that may change soon.
TNG, one of Sweden's largest recruitment firms, is trialling Tengai, a robot that will ask an interviewee all the right questions. The goal is not to save senior leaders' time or cut costs but to eliminate bias from the job interview process as far as possible. No matter how objective humans try to be, they cannot control their subconscious biases. A hiring manager can react to a candidate's appearance, accent, gender, religion, ethnicity and numerous other factors. Microexpressions and more obvious body language can give away such prejudices even when the interviewer is trying their best to suppress them. This, in turn, could make the interviewee feel extremely uncomfortable and derail the interview.
Tengai is different, though. The human-like computer interface (a disembodied head with a built-in computer and artificial intelligence, or AI, capabilities) sits on a tabletop at eye level, close to candidates rather than across a desk from them as human recruiters sit. The robo recruiter has been developed by Furhat Robotics, an AI and social robotics company and TNG's partner in the project.
The robot has been built to carry out thousands of tasks, from teaching a language to guiding passengers at an airport. It can change its facial expressions, voice and even its identity, as a new face can replace the current one. Although the face looks slightly stiff, Tengai is nevertheless very pleasant, putting one at ease as it tilts its glowing head, blinks and smiles before it starts asking questions. But unlike its human counterparts, it does not make small talk before or after the interaction or form first impressions - traps that lead to inadvertent bias.
There are more safeguards against such pitfalls. For example, Tengai has been exposed to thousands of recruiters, candidates and interviews, which helps eliminate bias through the sheer quantity and variety of data. It can also deliver bad news and rejections quite diplomatically.
If robo recruiters are set to become the new normal, career consultants are gearing up as well, beginning to advise candidates on how to prepare for an interview with a robot or a chatbot. They ask candidates to do away with chit-chat, for instance, which is hardly required when dealing with an AI system and could even confuse the likes of Tengai. Instead, it is wise to focus on keywords and repeat them often - after all, one is dealing with algorithms here. And since social robots built on the Tengai concept will be able to see and respond to body language, it is also essential to get that in order and not look too panicked or upset.

Heart of the Matter
Now and then, one sees news reports of the Apple Watch saving a life by alerting the user to an irregular heartbeat, prompting the person to get help in time. But sceptics have expressed doubts about whether the wearable can reliably pick up such problems, especially those involving atrial fibrillation, or AFib, which can lead to serious cardiac issues. The danger of flagging false positives, or of missing changes in heart rhythm once users begin to rely on the device, cannot be ignored.
In response, Apple funded a massive study conducted at Stanford University, which involved 419,000 volunteer Apple Watch users. Despite the large sample size, only 0.5 per cent of users received warnings about irregular heartbeats. The implication: the watch was not flagging abnormalities at random or generating a flood of false positives.
Users who got those health warnings were asked to consult a doctor involved in the study so that the issue could be investigated for at least a week. Of those who followed the procedure, a third were found to have AFib. Although this does not conclusively prove that the Apple Watch accurately detects the condition, doctors not involved in the study feel that the outcome is very encouraging and calls for further research.