6 hires in 5 countries and 4 continents in 4 weeks
We recently scaled out our engineering team at Alooba, hiring 6 new team members across 5 countries and 4 continents. This case study sets out what we believe an ideal skills-focused hiring process looks like and how we stacked up against it.
We break down our hiring process into each of the stages and explain in detail what we did, how and why. We gave ourselves a rating for each stage, and thought about what worked well and what didn’t.
Transparency is one of our core values at Alooba, and I hope this unusually transparent review of our inner workings is valuable, especially for other startups, those looking to remote hire and anyone struggling to improve the diversity of their teams.
The single most important takeaway from this case study is that CV screening is now a redundant and actually completely counterproductive part of the hiring process. We have replaced it with skills assessments, which makes the hiring process more scalable, fair, fast, efficient and accurate.
- We received 571 applications; that's 571 CVs we didn't have to read, saving at least 10 hours.
- A skills quiz was roughly 4x more predictive of someone’s skills than just a CV screen. The quality of our funnel improved so much that the number of interviews we needed per hire reduced by about 50%.
- 1 person we hired nailed our skills quiz, placing 1st out of ~500 candidates. Under our old CV screening process, they would never have made the first cut because their CV was very bare-bones and didn’t stand out at all.
- Because we have automated the initial screening step, we could widen our funnel significantly to promote diversity and try different hiring sources — we were able to hire people from 5 different countries.
- We reduced our average time to hire from 18 calendar days to 11.
Would you like to know more about our skills assessments for data analytics and data science? Feel free to reach out to us to discuss how to break free from the shackles of legacy hiring.
Let’s Break It Down
Our process effectively consisted of these 9 stages. 9 sounds like a lot of stages, but once we explain them, you'll see it's not really that many.
- Define
- Plan
- Source
- Screen
- Introduce Alooba
- Assess
- Technical Interview
- Founder Meet & Greet
- Offer
Stage 1 — Define
Before anything else, we first had to have some sense of what an ideal hiring process would look like. What exactly are we optimising for? Of course, this is something we think about every second of the day as we build out our own product and service offerings.
We believe an ideal hiring process is:
- Accurate — the best quality candidates move through each stage, and end up getting hired.
- Cost-effective — the explicit recruiting costs are minimised. For us this was mainly job ads and assessment services. Hiring fees also come in here if you use a recruitment agency.
- Efficient — the amount of time we internally had to spend on this process is minimised.
- Diverse — the candidates come from a variety of different backgrounds.
- Fast — from application to offer, the window is as short as possible.
- Fair — every candidate is afforded an equal opportunity and provided some relevant feedback at each stage. Unconscious bias is explicitly removed from the process where possible.
- Relevant — each stage is necessary, and candidates aren’t subjected to arbitrary steps that add no value.
- Transparent — candidates know what they’re getting into, what/when the relevant steps of the process are.
Quite obviously, there are trade-offs in each of these. E.g. the fastest process would be to hire the first person who applies. That’s not going to be very accurate though! So it’s a careful balancing act.
Roughly speaking, we want to maximise 'accuracy', subject to all the other constraints.
Stage 1 Learnings
- Clearly articulated sense of what an ideal process is — this is our bread and butter, so we should know it.
- We have defined what we believe to be a good process, but we don’t necessarily have a clear and simple way to measure each of those things. What gets measured, gets managed, as they say, so this is super important.
Stage 2 — Plan
Having established what a good hiring process would look like, we then turned our attention to who we were looking to hire and why. At Alooba our current situation is that we have a very clear vision of what we’re building and a well-defined product backlog. We therefore need top quality engineers to build out our product.
Our CTO, Adric Schreuders, as the hiring manager for this role, was responsible for defining the role requirements in terms of skills and experiences in more detail. Although the specific skills were broad, we set our bar quite high. The skills we focused on as non-negotiables were:
- Good problem solving skills
- Solid understanding of Object-Oriented Programming principles and patterns
- Solid understanding of Relational Databases and SQL
- Grasp of critical internet security concepts
- Strong communication skills
In the past I have seen these requirements defined far too narrowly. I remember a manager at another company refusing to even interview anyone who didn't have specifically T-SQL (MS SQL Server) skills. Even if the candidate was amazing with another dialect of SQL (MySQL, Oracle etc.), which is not that much different, they weren't even considered. This massively narrows the audience and reduces your chance of finding a good quality candidate, blowing out time to hire and hiring costs.
Stage 2 Learnings
- We have quite a clear sense of what we needed in a candidate and why.
- Keeping the skills broad allowed us to find the best people regardless of their specific past experience.
- We didn't have a set number of roles to hire when we started; had we known, our strategy of full-stack generalists vs front-end specialists might have changed a little.
- Given the size of our team now, perhaps we should have included a QA engineer; time will tell.
Stage 3 — Source
Having figured out the candidate profile we were after, we then turned our attention to where to find these candidates. The last round of hiring we’d done in June/July 2020 was a lot harder than we were anticipating. We really struggled to find candidates that met our minimum skills level.
We found this surprising because a) we were sourcing from LinkedIn, which has nearly 700M users, b) we were willing to hire from anywhere in the world, and c) we are in the throes of a massive global recession, so there should be an ample supply of talent.
This time we decided to broaden our search from the get-go, placing job ads on LinkedIn, Indeed, UpWork, Freelancer, AngelList and one newer specialist platform.
I once got rejected from a job in Frenchs Forest while living in Ultimo, which are suburbs of Sydney about 20km apart. The recruiter didn’t think I’d commit to the travel time. How the world has changed!
We started with LinkedIn again, but expanded our ads to target many more geographies. Each of our ads was also a 'remote' role, a feature that LinkedIn added during the pandemic. It ostensibly allows the ad to appear in front of candidates anywhere but, based on the applicants we received, it still seems biased towards the location of the ad.
We used the standard LinkedIn job posting, although we did a demo for LinkedIn Recruiter and a few other products. Honestly, even after that demo the different models they have are still not clear to me.
We targeted various locations, including Ukraine, The Philippines, Indonesia, Thailand, Chile, Brazil, Mexico, El Salvador, Spain, Italy, Nigeria, Kenya and South Africa.
Why these locations?
- We are a remote-first team, and we hire technology people from anywhere.
- This was a nice spread which helped us achieve the diversity that we were looking for.
- Finally, the salaries in those markets are fair value for money and within our budget.
Pros
- Wide reach with high volumes.
- Recently introduced 1 free job posting at a time, presumably to compete with Indeed.
Cons
- Very expensive.
- It's a deliberately watered-down version of the product, as LinkedIn would prefer to upsell you to one of their proper recruiter products. Simple things like exporting a list of applicants to conduct further analysis were impossible. There's not even a way to search for candidates, which is pretty annoying if you have 100s of candidates across several ads.
- Some bugginess in the 'duplicate' job feature: the skills aren't copied across between job posts; luckily we noticed. Instead, LinkedIn tries to infer them from the job description text 'automagically', sadly lacking the 'magic' bit.
- Opaque pricing and highly suspicious reporting data. E.g. we once saw an ad that, after an hour, had 1 impression yet had exhausted the entire daily budget.
- There's no concept of a shared company account, so Adric, for example, couldn't access the candidates that had applied for the roles that I had posted, and vice versa.
- Despite there being a daily budget that is typically exhausted, applications taper off quickly after a few days. If you manually re-post the ad, more candidates are attracted.
- Because of trade sanctions between the United States and Iran, Cuba, Syria and North Korea, we could not target candidates in those countries. It's deeply disappointing that global politics would enter into something like this.
- No way to prevent different versions of effectively the same ad being shown to the same candidate, which led to some candidates applying multiple times.
We posted on Indeed, in various cities in Mexico and some cities in the US. Indeed’s audience is massive in the United States, and postings are free. You can also ‘sponsor’ them with a daily budget, very much like LinkedIn’s model. It is definitely not necessary in the United States to sponsor ads because you will get enormous volumes anyway, especially in this economy.
We saw reasonably small volumes of candidates in Mexico.
Pros
- Very easy to use, especially for things like duplicating roles across different cities.
- Easily exportable data for additional analysis.
Cons
- Hard to link exported candidate data to your own records, because Indeed transforms the candidate's email address to route communications through their platform (e.g. from firstname.lastname@example.org → email@example.com).
- Low volumes for our target market, Mexico.
UpWork is an outsourcing platform, the result of a merger a few years back between oDesk and Elance. It's a competitor to Australia's local equivalent, Freelancer.com. We've hired one person from there before, so we gave it another shot.
On UpWork you post an ad, and then their existing freelancer user base can apply for it. They normally apply with a profile and a quick cover note. You can then manage the end-to-end job on the platform, including comms, payment (with milestones/escrow) and review.
These platforms are excellent for project-based work, but don’t really suit an ongoing employment model that well.
Pros
- Large audience.
- Very easy to use.
Cons
- The audience here is more interested in part-time, project-based work than full-time employment, which, to be fair, is the focus of the platform.
- A lot of the bids you get are from development agencies, which we don't want to deal with and which add no value to our situation.
- UpWork and other similar platforms have failed to solve the skills-validation problem. The review system is fundamentally flawed, which leads to a lot of low-quality candidates with 20+ five-star reviews. The star system does not act as a valid signal.
- UpWork charges the freelancer a substantial fee on an ongoing basis.
Freelancer is an outsourcing platform, and a direct competitor to UpWork. Features-wise, it’s more or less identical.
Pros
- Similar to UpWork: easy to use, free to post, large audience.
Cons
- Very similar to UpWork: not quite the right audience, generally low quality, and a large fee is charged to the freelancers.
- 0 candidates from here managed to meet our minimum benchmark on our screening quiz.
After slating LinkedIn for its outrageous cost per impression and other annoyances, a few people suggested we try other sources, such as AngelList, so we added ads there. We saw very limited activity and only a handful of candidates applied. We would need to try 'sponsoring' the job posting next time to see any real volumes. Nobody from there managed to pass our screening assessment.
- Very low volume, so not much to say
This site came onto our radar quite late in the piece. After a demo, we gave it a shot too. They have an interesting model: US$240 per job post, with a guarantee that 6 validated candidates will apply within 48 hours. They source them from a combination of their own talent pool and other sources. Some come pre-validated for at least some skills, having already taken a coding skills assessment.
This site has a lot of promise, and the general way they're approaching the problem of skills assessment/hiring is quite similar to ours, so that resonated well. We did struggle to get the full volume of candidates that was promised.
In future hiring rounds next year, I expect this startup will have progressed and perhaps we’d lean more heavily on this platform. It aligns nicely with our core beliefs at Alooba.
Pros
- Interesting platform that, we think, is attacking this problem in the right way.
- Hired 1 candidate from 1 attempt there, so a great conversion rate.
- Focused purely on remote, full-time software engineering roles, which was a perfect fit for us.
Cons
- Not quite able to deliver the volumes needed.
- Some of the candidates didn't have the pre-validation of skills promised.
- Fairly small audience at the moment.
Stage 3 Learnings
- Geographically, socially and ethnically highly diverse candidate pool.
- Attracted more high quality candidates than we were able to hire — 9 candidates were ‘hireable’, having reached our standard.
- The lack of an ATS limited our ability to reach many job sites, because we were managing job ads manually.
- Highly imbalanced gender ratio (90% male); we're looking into augmented writing tools to better word our job ads next time, hopefully so as not to deter some segments from applying.
- Our LinkedIn costs blew out quite a bit.
Stage 4 — Screen
So, we'd gotten ~500 applicants…what do we do now? This is the screening stage. Enter Alooba: this is our core value proposition.
In our last round of hiring, back in June, Adric was still manually reviewing CVs. However, screening CVs comes with several fundamental flaws:
- It’s manual and so very time consuming and costly.
- A CV is not very predictive of someone’s skills & knowledge, so it’s not very accurate.
- It opens up a can of unconscious bias worms because of all the irrelevant personal information on a CV.
- Each CV is unique, so it’s impossible to compare candidates in an apples-for-apples way.
- Candidates can put literally anything they want on their CV, so it doesn't necessarily represent reality.
- And many other reasons.
For this round of hiring, we decided to give every candidate the chance to shine, by giving them a fair, lightweight Alooba skills assessment as the first step. Having recently developed a public link feature, we simply added this to each of our job ads. Candidates go to the link and add their name and email address, receive the assessment invitation and then complete the assessment on Alooba. This meant there was no candidate admin needed on our part.
The assessment was only 20 questions and 25 minutes long. By keeping it short, we ensured a very high attempt rate. Having a really long (>2 hours) assessment as the first stage may deter some candidates, limiting your funnel.
We set our benchmark score to 70%. With strong historical data on how candidates typically performed for the questions in the assessment, we knew this would be about the 90–95th percentile.
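As an illustration of how a benchmark score maps to a percentile, here's a minimal sketch; the historical scores below are made up for the example, not our real data:

```python
# Hypothetical sketch: estimating where a benchmark score sits relative to
# historical candidate scores. These scores are illustrative only.
historical_scores = [31, 38, 42, 45, 48, 50, 52, 55, 57, 58,
                     60, 61, 63, 65, 66, 67, 68, 69, 78, 85]
benchmark = 70

# Percentile rank: the share of historical candidates scoring below the benchmark.
below = sum(score < benchmark for score in historical_scores)
percentile = 100 * below / len(historical_scores)
print(f"A {benchmark}% benchmark sits at roughly the {percentile:.0f}th percentile")
# → A 70% benchmark sits at roughly the 90th percentile
```

With enough historical attempts per question, the same calculation tells you in advance roughly what fraction of applicants a given cut-off will let through.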
Stage 4 Learnings
- Saved at least 10 hours manually screening 571 CVs.
- Drastically increased pipeline quality — when manually screening candidates with a CV, our technical assessment pass rate was 15%, compared to 55% when screening with our Alooba assessment. This led to a reduction of about 50% in the number of interviews needed to hire a role.
- 1 person we hired nailed our skills quiz, placing 1st out of ~500 candidates. Under our old CV screening process, they would never have made the first cut because their CV didn’t stand out at all. Our process was changed to select the best candidates, not the best CV writers.
- It was free for us to use our own product.
- We robustly tested our product outside of the analytics hiring use case and it was very successful.
- There were a few instances of false positives (candidates who did well on the screening quiz but failed subsequent stages), which was expected. We’re optimising the screening quiz to see where the gaps might have been.
- We didn’t closely track the exact source of the candidate (tying them to a specific job ad), which then made some of this analysis quite manual.
- We hadn’t finished our feature yet to expose test results to candidates. We have now — so every candidate that applies in the future can receive actionable feedback.
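The "4x more predictive" figure quoted earlier can be sanity-checked with a back-of-envelope calculation, using only the two pass rates reported above:

```python
# Back-of-envelope check: how much more predictive was the skills quiz
# than CV screening, given the downstream tech-assessment pass rates?
cv_screen_pass_rate = 0.15    # pass rate of candidates screened by CV
skills_quiz_pass_rate = 0.55  # pass rate of candidates screened by the quiz

lift = skills_quiz_pass_rate / cv_screen_pass_rate
print(f"The skills quiz was ~{lift:.1f}x more predictive")
# → The skills quiz was ~3.7x more predictive, i.e. roughly 4x
```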
Stage 5 — Introduce Alooba
Having automatically identified the candidates as having some of the critical skills that we needed, it was then on to first round interviews. Adric contacted all candidates that met the screening benchmark and conducted video call interviews with Google Meet.
This was the first stage at which we looked at the candidate's CV, and not to scrutinise their experience but to add context to skills that may not have been covered by the screening assessment.
The interview aimed to:
- Explain Alooba: our mission, what we're doing, and our development stack. Adric had a 10-minute explanation down pat, covering the usual questions most candidates would want answered about the position.
- Ask a few technical questions to validate their screening results and other skills highlighted in their CV.
- Align on salary expectations and other working arrangements.
- Allow the candidate to ask any questions about the company or the role.
- Explain the rest of the hiring process.
Stage 5 Learnings
- Candidates were able to understand the role and our expectations.
- We were able to filter out candidates with unrealistic salary expectations and those unsuitable for remote work.
- We were able to get an understanding of the candidates’ relevant skills not directly covered in the screening assessment.
- We were able to get the candidates excited about the prospect of joining our team, which helps convince them to take the additional time to complete the next stage, our in depth skills practical assessment.
- Some candidates didn't respond to emails inviting them to schedule a call with us. We found that the way to get a response was to contact them through the job ad they applied to; however, given some of the limitations of the sourcing platforms we used, it could be quite difficult to find exactly which ad that was.
- Scheduling these live video call interviews with candidates all over the world in different time zones was a bit difficult at times.
Stage 6 — Assess
Having pitched the candidate Alooba, validated their high level skills, their interest in the role, matched salary expectations, and suitability for remote work, now it was time to get an in depth understanding of the candidate’s skills.
For this stage, we used HackerRank, which we've been using for a few months and have found to be an effective platform for assessing coding skills. You can think of HackerRank as like Alooba, but for programming skills. It's a mature, high-quality platform, similar to Codility. There are a tonne of tools in this crowded space, with these 2 being the leaders.
The assessment mainly focused on writing optimised code to solve real problems, database schema creation, querying databases, and object-oriented application design, and it lasted around 3 hours.
This assessment allowed the candidates to select the specific programming languages and database system themselves, based on whatever they're most familiar with, as we're more interested in assessing their general skills than skills tied to our exact stack.
There's a tricky balance between including enough questions to assess candidates across a breadth of different skills and the amount of time candidates will be willing to set aside for completing a technical assessment.
It’s important to place the in-depth assessment at the correct stage of the process. If you place it too early, you’ll deter too many candidates from progressing, and it’ll end up proving costly if you’re paying per candidate. If you place it too late in the process, you’ll waste your own time interviewing candidates who don’t have the right skills for the role.
Stage 6 Learnings
- Once the assessment is set up, inviting candidates is very easy and requires very little time on our part, other than marking the few subjective questions.
- Provides a more practical way of assessing a candidate’s skills, getting them to actually write code and do other tasks.
- Candidates can do the assessment in their own time rather than trying to schedule a live assessment with one of our engineers.
- Very clear scoring making it easy to compare the performance of candidates that may have taken the assessment weeks apart.
- Filter out candidates that have a grasp of the general concepts but lack the ability to apply them to practical situations.
- Expensive per candidate.
- High drop off rate. A quarter of candidates invited to do the technical assessment never ended up completing it. Most likely due to the time commitment required.
- Very limited bank of questions. We ended up adding many of our own questions rather than just using available questions.
- Some candidates found the in browser IDE and diagram tool a bit difficult to use.
- Perhaps there is an alternative way of assessing the same skills — such as a live coding interview — that would attract some other candidates and reduce the drop-off rate. Canva did some interesting experiments here a few years back.
- It is a fair chunk of time for the candidate to dedicate to this, so we need to be mindful of their time. I’ve heard some companies propose compensating candidates for the assignment, which is interesting.
Stage 7 — Technical Interview
If candidates have performed well in the in-depth skills assessment, Adric then does a technical interview with the candidate. He runs through the candidate’s in depth assessment, and provides feedback. Candidates crave feedback during the process and it’s easier to elaborate while talking.
Adric will then drill down into certain questions that the candidate didn't do so well in, to see if he can get them across the line. Perhaps they misinterpreted the question or didn't have enough time to complete it. If they can talk through how they'd solve it, then great: they know what they're doing.
At this point the candidate has demonstrated that they have the general skills required for the role, but we also use this opportunity to drill into the candidate's technical experience, having them explain how they solved real problems in past roles. The technical interview also allows us to assess the candidate's technical communication skills and general thought processes.
Stage 7 Learnings
- Get a good understanding of candidates’ technical communication skills.
- This stage allows us to identify the strongest candidates that not only have the general skills required for the role, but are also able to communicate their problem solving thought processes.
- Most of the key questions that would normally be covered in a technical interview had already been covered in either the screening or assessment process, so sometimes this felt quite short, compared to technical interviews that normally cover the full technical assessment too.
Stage 8 — Founder Meet & Greet
Once Adric is happy with them, I will organise a quick chat for the same day or the next day. I’ve been asking candidates some problem solving questions which relate to features that we will build and are composed of problems that I could — at least logically — solve myself.
This gives me a decent baseline to understand their logical problem-solving skills in a discipline I'm no expert in. I also want to give them a flavour of Alooba: what we're doing and why. I start by asking them if they have any questions. This is really just a final check.
At this point, either we’ll make an offer or not. If we don’t of course we try to provide some meaningful feedback as to why. Typically candidates who’ve reached this stage will be made an offer, or at the least will be hireable. We typically made an offer within a day so as to expedite the process and minimise our time to hire.
Stage 8 Learnings
- Asking all candidates the same thing (a structured interview) allowed me to easily compare their responses.
- Clearly it’s nice to meet everyone at least once before hiring them into the business.
- Hopefully it was useful for the new hires to meet me to understand the history and vision a bit. We should measure the effectiveness of this stage in the future with structured interview feedback from candidates.
- It at times felt like a slightly redundant process from my side, because the candidates were typically really good, and we hired almost everyone who reached this stage.
- An extra step is an extra cost of time to both the candidate and us.
Stage 9 — Offer
If I was also happy with the candidate, then all that was left was to offer them the role. We'd generally do that quickly, via email, pinging across the contract with the offer email. No faffing about.
Stage 9 Learnings
- This is the easy part.
- We delayed once for one candidate while we were waiting on a batch of others to get through the process; that was unnecessary.
So, there you have it. That's how we hire at Alooba. As you can see, the key takeaway from this case study is that CV screening is now a redundant and actually completely counterproductive part of the hiring process. We have replaced it with skills assessments, making the hiring process more scalable, fair, fast, efficient and accurate.
For more information on how skills assessments can be used to improve your hiring process, please reach out here.
This was originally posted on alooba.com on 17th October 2020.