Effective Techniques for Assessing Senior Software Engineers in Interviews

Challenges in Assessing Candidates in Interviews

  • A lack of structure on how to assess the candidate leads to subjective judgments and inconsistencies.

  • Lack of consistency when multiple candidates and multiple interviewers are involved.

  • Bias can creep in if interviewers are not trained to recognize it or are not careful.

  • Limited time to evaluate candidates.

All interviewers are different.

  • Some focus on thought processes, while others prioritize code quality.

  • Different interviewers may value different aspects, such as problem-solving speed or the overall approach.

  • Some emphasize deep knowledge, while others appreciate adaptability and the ability to ask insightful questions.

Clarity

Let's avoid judging candidates solely on what they don't demonstrate. Candidates often don't know what interviewers are looking for, so let's not expect them to read the interviewer's mind. The interviewer should lead when defining the problem statement and the criteria they will judge against. The candidate should lead when developing the solution.

What can be done to bring in clarity?

  • At any point, guide the candidate on what you are trying to evaluate.

  • If you expect them to write unit tests, ask them how they would verify the correctness of the program.

  • Let them know when a problem statement is intentionally open-ended. If you want to assess their ability to ask clarifying questions, judge how relevant and insightful those questions are.

Encouragement

  • Be encouraging when a candidate is struggling. Examples:

    • You're doing great so far; keep up the good work!

    • Your thought process is clear and logical; stay confident in your approach.

    • You've got a solid grasp of this concept; let's explore it further.

    • You're on the right track; just a few more steps to go.

    • You're almost there.

  • Avoid microaggressions.

    • Do not make assumptions or stereotypes based on a candidate's gender, ethnicity, or other personal attributes.

    • Avoid intimidating body language.

  • Treat the interview like a discussion; long pauses from the interviewer's side are bad. Also, do not interrupt a candidate while they are talking unless it is really required.

  • Don't check your phone or laptop for non-interview purposes during an interview; this signals disinterest and discourages candidates.

Hints

  • Do not give hints or guidance that take a candidate in the wrong direction just to see if they can call the bluff.

  • The hint should be directional just to unblock the candidate. It shouldn't reveal a lot of the solution.

  • Don't give too many hints, as doing so makes it difficult to compare the candidate against other candidates.

  • Hints should be standardized for each question if possible. For example, Leetcode provides 1 or 2 hints that are incremental and standardized for all candidates.

  • The hint should only be given when a candidate is stuck and unable to make meaningful progress or when the candidate asks for hints. Offering hints prematurely can adversely impact the assessment.

Structured Feedback

Traditional interview feedback is highly unstructured: typically a rating from 1 to 5, followed by a free-text review. This invites bias, as all interviewers are different and think differently.

A structured interview uses a scoring rubric, borrowed largely from grading systems in universities, where hundreds of professors must evaluate thousands of students across a variety of subjects, each consisting of multiple topics.

A scoring rubric:

  • Helps interviewers evaluate all candidates fairly.

  • Ensures different interviewers evaluate a candidate using the same criteria and goals.

  • Gives candidates more specific feedback on what they did well and what could be improved.

  • Consists mostly of checkboxes and radio buttons rather than free text, reducing the chances of inconsistency.

  • Can be used as feedback for interviewers if one interviewer's scoring is an outlier.

Example of a scoring rubric in an algorithms interview:

Algorithms

  • Score 1: Struggled despite substantial help, showing limited algorithmic understanding.

  • Score 2: Proposed an algorithm but couldn't optimize it, revealing knowledge gaps.

  • Score 3: Devised an optimal algorithm with the interviewer's assistance, displaying a strong grasp.

  • Score 4: Independently crafted optimal solutions, demonstrating mastery.

Coding

  • Score 1: Couldn't code, struggled with basics, wrote non-idiomatic code.

  • Score 2: Partially coded, struggled with logic, used non-idiomatic syntax.

  • Score 3: Coded the algorithm with minor struggles and non-ideal practices.

  • Score 4: Coded effortlessly with efficient practices.

Communication

  • Score 1: Communication was very poor, often silent and lacking explanations.

  • Score 2: Communication was sub-par; struggled to explain their thought process.

  • Score 3: Communicated well, occasionally needing prompts for clarity.

  • Score 4: Communicated clearly throughout, explaining thought processes and trade-offs effectively.

Problem-Solving

  • Score 1: Lacked problem-solving skills, displaying disorganized and random approaches.

  • Score 2: Showed partial problem-solving skills but struggled to stay on track.

  • Score 3: Approached the problem sensibly, with occasional reminders to stay on course.

  • Score 4: Tackled the problem methodically, dividing it into logical subproblems and navigating it effortlessly.

Testing

  • Score 1: Rarely writes tests and often misses important scenarios.

  • Score 2: Wrote basic unit tests but missed edge cases.

  • Score 3: Proficient unit testing with comprehensive coverage and effective bug identification.

  • Score 4: Excels at unit testing, covering edge cases and maintaining well-structured, maintainable tests.

This scoring rubric can be weighted, for example:

Algorithms 30%, Problem Solving 25%, Coding 20%, Communication 15%, Testing 10%

So, if a candidate has a scoring rubric of Algorithms 4, Problem Solving 3, Coding 3, Communication 3, and Testing 2, the final score from that interviewer would be (0.3 × 4) + (0.25 × 3) + (0.2 × 3) + (0.15 × 3) + (0.1 × 2) = 3.2 out of 4.
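The weighted aggregation above can be sketched in a few lines. This is a minimal illustration, not a prescribed schema: the metric names and weights simply mirror the example, and `final_score` is a hypothetical helper.

```python
# Rubric weights from the example above; adjust to your own rubric.
# The weights must sum to 1 so the final score stays on the 1-4 scale.
WEIGHTS = {
    "Algorithms": 0.30,
    "Problem Solving": 0.25,
    "Coding": 0.20,
    "Communication": 0.15,
    "Testing": 0.10,
}


def final_score(scores: dict) -> float:
    """Combine per-metric rubric scores (1-4) into one weighted score."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    # Round to avoid floating-point noise in the reported score.
    return round(sum(WEIGHTS[metric] * score for metric, score in scores.items()), 2)


example = {
    "Algorithms": 4,
    "Problem Solving": 3,
    "Coding": 3,
    "Communication": 3,
    "Testing": 2,
}
print(final_score(example))  # 3.2 out of 4
```

Keeping the arithmetic in one shared function (or a spreadsheet formula) means every interviewer's rubric is aggregated the same way, which supports the consistency goal discussed below.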

Consistency

If a candidate is interviewed by three different interviewers for the same round, the evaluation should be very consistent across all three interviewers.

The state of mind of interviewers could differ for several reasons.

Example: their kid is sick and they are stressed; they are sad because their dog died; they are unwell with a seasonal cold; or they are elated because their favorite football team won the World Cup the day before.

Emotions shouldn't result in inconsistent assessment.

Relevance

Evaluate the skills required for the job rather than arbitrary hypothetical scenarios that are not relevant to the profile.

Focus on fundamentals, not programming languages and frameworks that can be learned in a week.

Candidates should be set up for success rather than failure; they should be encouraged to perform their best.

Post Interview Feedback

  • Providing feedback demonstrates respect and consideration for the candidate's time and effort.

  • Candidates receiving helpful feedback, whether hired or not, are more likely to reapply or refer other candidates.

  • Great software engineers are often not great at interviews. Giving them feedback will help them in future interviews.

  • Provide actionable feedback based on the scoring rubric.

  • Provide the feedback without undue delay.

  • Feedback should be balanced, highlighting both strengths and areas for improvement.

Feedback Loop

Nowadays, many tech companies follow a six-month appraisal cycle.

When this cycle approaches, for all new employees who joined within the last six months, hiring managers should review performance feedback and compare it with the interview feedback provided by the panel to identify any necessary adjustments to the interview process.

DEI in Hiring Panel

  1. Diversity: ensuring a mix of panel members from various backgrounds and varying years of experience.

  2. Equity: fair distribution of responsibilities and opportunities among panel members.

  3. Inclusion: creating an environment where all panel members' voices are valued.

This approach fosters a more inclusive and balanced evaluation of candidates.

Final Thoughts

Crafting a structured and consistent interview helps in hiring the candidate best suited for the job and results in a good candidate experience. The employer brand is shaped by how well interviews are conducted.

If a candidate has a good interview experience, they are likely to tell their friends how the interview felt. A bad candidate experience can result in bad reviews on social media and harm the employer's brand.