Having interviewed with TripleByte, I think this is a bold and poor decision.
Upfront: I didn't pass the TripleByte interview I had (one of the few companies whose interview I haven't passed).
My interviewer showed up late initially, then took a break and came back 10 minutes late from it. Further, the interviewer nitpicked super-irrelevant details and acted exceedingly smug and condescending. Some of the stuff he told me I was wrong about related to my own research. Even after I attempted to explain it several times, he just said, "No, you're wrong, you don't know what you are talking about."
I then literally brought up the paper and sent it to him, before he said something along the lines of, "Oh, well, I guess that is right."
Overall, it was one of the worst interview experiences I have had, and I don't believe they are a good way to recruit. Hell, I even passed all their coding questions with flying colors. It was the silly video-conferencing interview with a smug engineer that really made the interview fall apart.
I had an interesting experience with Triplebyte which wasn't as objectively bad as yours, but it also makes me skeptical of the company.
The first round was multiple-choice questions, relatively straightforward. The second round was a Skype call and just felt incredibly subjective. I was asked questions about building out memcached to support arbitrarily-sized values, and I got the same "smug" vibe you sensed.
The interview style went very much like this:
"Him: How would you do X?"
"me: Well that's not a simple problem, there are a lot of solutions each with tradeoffs."
"Him: Okay so name one"
"Me: So you could do X"
"Him: BUT THEN Y [GOTCHA!]"
"Me: Yes, that's one of the tradeoffs of X"
It wasn't clear to me what the heck he was even looking for. Was he hoping I'd list race-condition problems? Had he not even considered race-condition problems? Was he looking for a theoretical solution or a real-world solution? Also he kept going on random tangents ("That brings me to an interesting question, how would you shift a gigabyte of memory 1 bit?"). He seemed very concerned with efficiently bit-packing the header in this problem, which seems silly to me when we're talking about storing gigabytes.
My understanding was that Triplebyte was seeking to be the SAT of engineering; however, the SAT undergoes heavy validation (test-retest reliability and such). I had no particular reason to suspect Triplebyte's interview was any more objective than any other company's.
We actually put a bunch of effort into consistency/repeatability checks. Every interview is recorded (video), and we re-watch and re-grade a percentage of them to measure the consistency. A long-term experiment we're running is comparing qualitative scores (code quality, good process, how good did the interviewer feel the candidate was) with quantitative features (which tests passed, how long did it take, what design--picked from a decision tree--did the candidate take). We calibrate the qualitative scores with the recorded interviews. So far, quantitative scoring is winning (when judged against predicting interview results at companies). We're waiting, however, until we can see which better predicts job success.
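To make the consistency check concrete: if you have original grades and re-grades for the same set of recorded interviews, the natural measure is an agreement statistic corrected for chance. A toy sketch (the pass/fail labels and the use of Cohen's kappa here are a simplification for illustration, not our actual tooling):

    from collections import Counter

    def cohens_kappa(grades_a, grades_b):
        # Agreement between two graders on the same interviews,
        # corrected for the agreement you'd expect by chance.
        n = len(grades_a)
        observed = sum(a == b for a, b in zip(grades_a, grades_b)) / n
        freq_a, freq_b = Counter(grades_a), Counter(grades_b)
        expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
        return (observed - expected) / (1 - expected)

    # Original grades vs. re-grades of the same five recorded interviews:
    original = ["pass", "fail", "pass", "pass", "fail"]
    regrade  = ["pass", "fail", "fail", "pass", "fail"]
    print(cohens_kappa(original, regrade))  # ~0.62; 1.0 would be perfect consistency

Low agreement on the re-watched sample tells you the grading rubric (or the graders) need recalibrating before the scores mean anything.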
It sounds like your ability as an interviewer is pretty poor. There are several examples of the same smug behavior. What sort of training have you gone through to ensure that you're actually an appropriate and qualified person to be interviewing?
Nah, you can't use forum threads as proof. You need to do the analysis scientifically, which, based on the feedback here, it sounds like you haven't. You don't bother quantifying the ability of you and your interviewers; you just assume you're great.
I interviewed with Ammon. He did not come across as smug. I went through the process purely out of curiosity, because I was fed up with the typical interview process. I haven't followed through with anything else related to Triplebyte in terms of an actual job, in case people think my favorable view comes from having gotten a job through them.
Wasn't there a study by Google a while back where they found, as a trend, that their most successful people had only marginally passed their job interviews?
The finding was that people who had received a "no hire" recommendation yet still received an offer tended to do well. The reason being: to compensate for the poor feedback, someone else on the hiring loop must have believed in the candidate deeply, seen something exceptional, and been willing to go to bat for them.
That sounds like the Pareto-optimal solution to a job interview. It's like how you're not doing grad school properly if your grades are better than C.
Also, candidates who do too well could be too good for the job, since Google supposedly likes having incredibly overqualified people maintain do-nothing internal apps.
Notably absent from that list is "Are we verifying we're doing a good job as interviewers?"
It doesn't matter how good the interviewer feels the candidate is, or whether a design was picked from a decision tree. All that matters is whether the candidate can do the work at actual companies.
I think people here are reacting to irrelevancies during the interview process -- questions which cannot possibly be reflective of a candidate's real-world competency. (When was the last time you shifted a gigabyte of memory? And even if you did, that's not what companies are going to employ people to do. So why ask the question? Are you sure it isn't trivia?)
Interviews absolutely should be grounded in trying to predict how a candidate would do on a job. That's the whole ballgame. The question is how best to do that. First, you need to run a repeatable process (my previous comment). Second, you need to look at the right skills. The approach we take is to track a lot, and figure out what works best over time. What we've found to be most predictive (so far) is a base level of coding competency, plus max skill (how good an engineer is at what they are best at). So (beyond the coding portion) we don't actually care very much about what a candidate is bad at. We care about how good they are at what they are good at. To give as many candidates as possible the opportunity to show strength, we cover a number of areas. This includes back-end web development, distributed systems, debugging a large codebase, algorithms, and -- yes -- low-level systems (concurrency, memory, bits and bytes). We do not expect any one engineer to be strong in all of these areas (I'm weak in some of them). But they are all perfectly valid areas to show strength (and we work with companies that value each of the areas).
We've recently moved to a new interview process organized around this idea of max skill. It's working great in terms of company matching and predictive ability. However, it seems we may have underestimated the cost to candidates of being asked about areas where they are weak. There's more negative feedback here than we've seen in previous HN discussions, and I think the interview change may be behind that. I'm taking that to heart. I think we can probably articulate it better (that we measure in a bunch of areas and look for max strength). We're also running an experiment now where we ask engineers at the start of the interview which sections they think they'll do best on. I'm excited about this. If engineers can self-identify their strongest areas, we'll be able to make the process shorter and much more pleasant!
So, the bit-shift question: that came up down one branch of a system design question that we used for a while (we've since moved to a more targeted version that is more repeatable). The (sub)issue involved adding a binary flag to a large data blob (this came up as part of a solution to a real-world caching problem). Adding a single bit flag to the front of a 1GB blob has a problem: to really add just one bit, you'd have to bit-shift the entire 1GB. This is clearly not worth it to save 7 bits of storage (ignoring that those bits would not actually be saved in any case). You can just use a byte (or word), or add the flag at the end. When candidates suggested adding a bit flag at the front, we would follow up asking how they'd do it (to unearth whether they were using 'bit' as shorthand for a reasonable solution, or whether they really were a little weak in binary data manipulation). This was one small part of our interview. By itself it in no way determined the outcome of the interview, or even of the low-level systems section. Plenty of great engineers might get it wrong. But I don't think it was unfair.
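To make the trade-off concrete, here's a rough sketch of the two approaches (illustrative only, not our actual interview material):

    def prepend_flag_bit(blob: bytes, flag: bool) -> bytes:
        # Truly prepending one bit forces a shift of every byte in the blob:
        # each output byte borrows a bit from its neighbor. For a 1GB value,
        # that's a full pass over a billion bytes (plus a complete copy).
        out = bytearray(len(blob) + 1)
        carry = 1 if flag else 0
        for i, b in enumerate(blob):
            out[i] = (carry << 7) | (b >> 1)
            carry = b & 1
        out[len(blob)] = carry << 7  # last 7 bits are padding anyway
        return bytes(out)

    def prepend_flag_byte(blob: bytes, flag: bool) -> bytes:
        # Spending a whole byte keeps everything byte-aligned: no per-byte
        # bit surgery, and in a real store the 1-byte header could be
        # written separately so the blob itself is never rewritten.
        return (b"\x01" if flag else b"\x00") + blob

Note that the padding in the first version means the single-bit approach doesn't even save the 7 bits it appears to.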
> Interviews absolutely should be grounded in trying to predict how a candidate would do on a job. That's the whole ballgame.
This is directly in conflict with this comment from Harj:
> The metric we optimize for is our onsite success rate i.e. how often does a Triplebyte candidate onsite interview result in an offer.
If you want passing your interview to mean that a candidate will pass their onsite interview because your interview is tailored to deliver people who look good in an onsite, you're acting as a recruiter, trying to give companies whatever they say they want. This model provides a lot of value to companies but zero value to candidates, since anyone who passed your interview would have gotten the job anyway.
If you want passing your interview to mean that a candidate should pass their onsite interview because they would perform well in the job, you're acting as a credential, telling companies what they want. This model provides value to companies and to candidates (assuming you can tell who will perform well). These aren't the same model.
Your candidate-directed advertising leans heavily towards the second model ("just show us you can code!"). That makes sense, since that's the model that provides value to candidates. It's disappointing to hear Harj say that what you really believe in is the first model, and disconcerting to see you openly disagree with your cofounder about what your company is trying to do.
It seems a bit disingenuous to accuse them of openly disagreeing - if you assume that the client company's interview is optimized for finding folks who will perform well on the job and Triplebyte's process matches you to the employers who are most likely to hire you, the statements are perfectly consistent. Of those two assumptions the latter seems demonstrably true, while the former is obviously a bit suspect. I don't think Triplebyte is in a good position to change it right now, but maybe some day they will get enough actual performance data to start influencing it.
Given their existing positioning of "the interview process is broken and we're here to fix it", I see no reason to credit them with the opinion that client company interviews are optimized for finding people who will perform well. There is no way to reconcile "the current interview process is broken" with "our goal is to find people who do well under the current interview process".
"The current interview process is broken because strong candidates are excluded due to weak resumes and companies and candidates do a terrible amount of duplicative work just to confirm a candidate is baseline competent."
Assuming that, would you accept the position to be consistent?
Why? Sure, you will get some things that initially look significant but aren't, but it's not that hard to retest those things from scratch to minimize that.
In general if you're going to provide a critical comment I think it would be better for the community here to expound on it a bit so everyone can understand your argument.
> So, the bit-shift question: that came up down one branch of a system design question that we used for a while [...] When candidates suggested adding a bit flag at the front, we would follow up asking how they'd do it (to unearth whether they were using 'bit' as shorthand for a reasonable solution, or whether they really were a little weak in binary data manipulation). This was one small part of our interview. By itself it in no way determined the outcome of the interview, or even of the low-level systems section. Plenty of great engineers might get it wrong. But I don't think it was unfair.
Of course it's unfair. The candidate isn't actually programming a solution when they're talking to you. They're on a tight time crunch, under a microscope, in front of an interviewer. The answers to your questions will literally make or break their future with you. Did you specify to them in your original question that the entries in the cache are 1GB large? If you assigned them the task of implementing a solution to your caching question, they would immediately notice using a 1-bit flag is a poor design decision.
The point is:
> Plenty of great engineers might get it wrong.
That says quite a lot more about Triplebyte's question than the engineers. A wrong answer doesn't mean they're weak in bit manipulation or that they decide to implement poor solutions. It says they're suffering from interview jitters. They're weak in the artificial environment you've constructed for the purposes of the interview, which may or may not correlate with their actual ability.
This may sound like useless theorizing, but unfortunately a massive number of excellent engineers are awful in an interview setting. But if you give them a problem to actually solve, they pass with flying colors.
Triplebyte does give problems to candidates to solve, but it sounds like you also care about whether they can pass your interview (by demonstrating sufficient max skill when prompted) instead of whether they can implement solutions to the problems you assign to them. This rules out candidates who would otherwise do very well, which is the type of candidate you're trying to find.
I know that you're saying the verbal section of the interview isn't the whole process, but are you sure it's an effective one?
It might be positively misleading. A candidate who is very strong in the area you're looking for is also likely to be someone who will get your questions completely wrong, because they're not programming. They're talking. So it sounds like you're selecting for people who can talk well: those who can show strength during your interview when prompted verbally. Is that the right metric to find talented candidates?
If you were to put together a pipeline where, e.g., you give candidates an Xcode codebase and say "There are bugs in this codebase, and <specific missing features>. Implement as many fixes or improvements as you wish or have time for, then send us the code," you would have a mechanism which selects for candidates who are ~100% competent, since that's exactly the type of work they'll be doing on a day-to-day basis.
Some candidates wouldn't want to do that, so perhaps there should be an alternative for them. But it'd be vastly more effective than quiz-style questions during a timeboxed interview.
It's possible to come up with endless reasons why it might be a bad idea to set up a pipeline like that. But all the companies that have set one up have been shocked at how well it works when they rely solely on that test. Instead of an opportunity to show strength during an interview, the candidate is able to directly answer the question "Can they do the work?"
> I applied through their project track. It was described as a low-pressure way to write your code ahead of time and talk about it in the interview. The interview was, instead, about making changes to my project while Ammon watched. (Also, there was a request to derive a formal proof while Ammon watched. I didn't get it.) After which I got a rejection saying that my project was great but my interview performance was so poor that they wouldn't move forward.
It sounds like Triplebyte almost has the pipeline described above, but it won't work if you watch the candidate or ask them to do more work. The project alone has to be set up to be a sufficient demonstration of skill.
Well, my objection was much more fundamental. I never suggested packing a bit at the beginning; I suggested using a header of about 20 bytes or so (trivial against a GB). As for the shifting-by-1-bit question, my complaint was that I was in the middle of answering one question when another was brought up as a non sequitur. The fact that I was asked it made me assume there must be some answer superior to iterating over all the data, which I don't believe there is.
The other reason I dislike this type of interview question is that if the interviewer never proposes a superior alternative at the end, you get no opportunity to challenge them. How do we know my solution didn't solve problems the other party didn't see?
For context, it seems the "preferred" solution was to build a wrapper around existing memcached. However, as a real-world engineer, the problems I was solving for (90% of your clients at a company over 100 people won't use this wrapper, so we want to avoid key collisions with people who aren't using this driver) were not the theoretical ones (how could we make the header only 4 bytes!) that the interviewer was evaluating on.
Plus, on top of all this, I have no idea if the person interviewing me is unaware that memcached supports atomic increment, knows but doesn't care, or is deeply concerned about preserving this functionality. There are dozens of facets that an individual could consider "important" in this type of problem, and there's no objective basis for most of these concerns without the context of the user (because that's what all engineering comes down to, after all).
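For anyone who hasn't seen the problem, the usual shape of such a wrapper is to split big values across several keys and keep a small index record under the base key. A rough sketch, where the chunk size, key scheme, and header layout are all my own assumptions, and a dict stands in for the real memcached client:

    CHUNK = 1024 * 1024   # approximates memcached's default 1MB per-item cap
    PREFIX = "bigval:"    # namespaced keys, to dodge collisions with other cache users
    store = {}            # stand-in for the real client's set/get

    def set_large(key: str, value: bytes) -> None:
        n_chunks = max(1, (len(value) + CHUNK - 1) // CHUNK)
        for i in range(n_chunks):
            store[f"{PREFIX}{key}:{i}"] = value[i * CHUNK:(i + 1) * CHUNK]
        # The small index record under the base key is the "header": chunk
        # count plus total length. It's tens of bytes against a huge payload,
        # so squeezing it down to 4 bytes buys essentially nothing.
        store[f"{PREFIX}{key}"] = f"{n_chunks}:{len(value)}"

    def get_large(key: str) -> bytes:
        n_chunks, total = map(int, store[f"{PREFIX}{key}"].split(":"))
        data = b"".join(store[f"{PREFIX}{key}:{i}"] for i in range(n_chunks))
        return data[:total]

And the interesting failure modes are exactly the races mentioned upthread: a writer replacing chunks while a reader reassembles them, which is why versioning the chunk keys tends to come up.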
My guidance to companies in this type of situation is: A great engineer can do 80 hours of work in 15 hours, but not necessarily do 30 minutes of work in 29.
This is all just really hard (and counterintuitive). We've tried take-home projects, with and without asking the engineer to make changes live. This is actually very controversial (a lot of engineers are, reasonably, against the time commitment). And (we found) to get an equivalent level of consistency (the first step toward accuracy), the project needs to be pretty big. The work really needs to be on the level of what engineers do in a full day or more on the job. Shorter projects than this introduce the noise of how much time the candidate spent on the project. You can impose a hard time limit, but then you're back in interview-stress territory.
Large take-home projects (and trial employment) totally are better ways to evaluate engineers than interviews. Unfortunately, they require major time commitments from the candidate that many engineers are not able to give. Most (about 80%) engineers select a regular interview if given the choice. I do think they are a good option to offer, but they can't (unfortunately) replace interviews in the majority of cases.
We've also tried debug sections (where we give the candidate a program with bugs in it and ask them to fix it so the test cases pass). This works great as a portion of the interview (but misses some people with other skills, so it can't be the entire interview).
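To give a flavor of the format, here's a debug section in miniature (a toy illustration I'm making up here, not an actual question from our interview):

    # The candidate gets a function like this plus a test that fails, and the
    # task is to make the test pass by finding the bug (here, the range stops
    # one short: it should be range(2, int(n ** 0.5) + 1)).
    def is_prime(n: int) -> bool:
        if n < 2:
            return False
        for d in range(2, int(n ** 0.5)):  # bug: never tries d == sqrt(n)
            if n % d == 0:
                return False
        return True

    assert is_prime(7)
    assert not is_prime(9)  # fails until the bug is fixed: 9 = 3 * 3

The appeal is that it looks like real work: read unfamiliar code, reproduce a failure, localize the cause.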
We have found that well-crafted small problems are enormously successful predictors of good hires. Sizing them to require an evening, possibly two, of work is what to aim for.
Studying for in-person interviews is a major time commitment. One that's reusable across employers, sure, but a Triplebyte take home project should be just as reusable as studying for a Triplebyte interview.
Have you guys considered that seeking to quantify engineering skill may be fundamentally at odds with the goal of hiring good engineers?
I'd be interested in seeing the argument for why this is a good thing, beyond the fact that it enables a company like Triplebyte to exist.
To me it seems like these concerns always boil down to the same thing: tests that try to quantify something that may be unquantifiable. I suspect this is because it's not cost effective to facilitate a process that digs well beyond engineering trivia.
There are two things to measure: ability to get an offer (easy) and effectiveness on the job (hard). Triplebyte has clearly provided an improvement on the former. The latter seems very difficult to measure, but I don't know why anyone would assume quantifying would be negatively correlated with performance; at worst one might assume it is not correlated.
Have you guys thought about paid trial employment? That was an idea suggested a couple of months ago. I think the upfront cost would be better than having a false positive.
Yeah. Trial employment had better be paid! :) Even with pay, unfortunately, folks with an existing job often can't take the time off. And when people have the option between one company making an offer and another offering a trial period, they often go for the offer (which makes sense). The other issue is that a company can't do a trial period with every person who applies (too much cost for the team), so there has to be a screening step in front, which sort of just moves the problem to an earlier level.
I do think trial employment can be a great thing. But it's not a universal replacement.
Trial employment immediately screens out candidates who do not have the time or patience to spend an entire day working very hard with a prospect. It also introduces an NDA, a need for a sandbox, etc.
I'm sorry that the interview went poorly! I'm especially sorry about the lateness. I just pulled up our notes from your interview, and I was your interviewer! So that makes it especially bad :(
I'm not an expert in all areas. At the end of the interview we have a section where we let the discussion go into whatever technical area the engineer wants to talk about. It sounds like you're an expert in an area where I am not. In those cases I try to ask questions and push deep, but (depending on the topic) that can be hard.
edit: removed discussion of the specific topic. Sorry, folks below are right
Expertise is something our model handles less well (it's much harder to standardize). This certainly results in us failing some great people (and it sounds like that may have happened here). I'm happy to talk about this more. Give me an email at ammon@triplebyte.com.
I have to say, I give you props for responding so cordially after I was overtly harsh. I'm sure you could have bashed me, but you didn't, so thank you.
That being said, I think the best way to improve the process is to standardize it by area of expertise. I'm sure that is done to some degree, but making it more pronounced may be better. I think roles within companies are defined too statically and don't actually represent the actual role(s).
Regardless, best of luck. I didn't mean to be a dick, but I felt obligated to share my opinion in this case.
Based on your other replies, it sounds like the computer vision part of the interview was fine, then. Technically speaking, did you struggle with other parts of the interview?
I'm sure I did, although I don't recall any hard fail. Could have been social skills too I suppose. Either way, I feel my original comment was the real gripe I had. It left a bad taste in my mouth and that damages reputations.
It's reasonable to want to defend yourself and set things right, but I don't think you should be sharing specific notes from an interview where the interviewee had a pretty reasonable expectation of privacy. (unless you already got permission?)
I agree with you; he could have been exceedingly harsh. He shared what he did, and overall I'm not displeased; it was honest and didn't make me look too bad (which, again, I'm sure he could have).
On the other hand, sharing my info publicly is not professional. I shared what I viewed as unprofessional, without specifics regarding the interviewer or even the questions. He shared my technical expertise and much more specific details regarding my question(s) and interview.
That kind of goes along with my original point, it seems abrasive and leaves a sour taste in my mouth.
Quite honestly, it probably damages their reputation more to respond. Why would someone want to interview somewhere knowing that, if they criticize the company, their interview could be made public?
Sharing anything like this on a public forum is exceedingly unprofessional, regardless of whether the content is positive or negative. If candidates don't have an expectation of privacy, they won't provide honest answers, which drastically undermines Triplebyte's claims about more effective evaluation.
I took the original comment about the interview with a grain of salt (these things are inherently subjective). But it's fairly obvious now that they're as unprofessional as claimed. Hope no one from Apple or Facebook sees this.
Had a bad experience myself (with the obvious caveat that it will seem like sour grapes if you don't pass the interview; posting anon, but I will follow up privately with anyone posting contact info).
The challenge was to code up a regex parser in three hours then discuss in an interview.
During the interview I was asked to add (IIRC) a Kleene operator. I repeated back his explanation of what a Kleene operator is. I explained how that definition would impact my choice of how to implement it. During the implementation, I made repeated references to that same spec. I got it working.
Then, he told me that it didn't work, because a Kleene operator means something completely different than what I understood. He apparently wasn't listening the whole time because I repeated back his spec several times when implementing it and he never corrected it!
(Perhaps this was some subtle test of "see how they react to impoliteness"?)
More importantly, though, it was rejected for not being an elegant state-machine implementation of a parser, which made it hard to extend. Which is fair, in a way. I knew, abstractly, that that was a better way to do it, and I would have gladly read up on the concept and written my implementation that way. But with the overhead of setting up the codebase, docs, and tests, I would have exceeded the 3-hour limit that they trust applicants to hold themselves to.
Apparently, the right way to proceed here would have been to learn state machines, severely exceed the 3-hour limit, and then lie and say it took me 3 hours. Is that what they're selecting for? Or perhaps for people who already know state-machine implementations?
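(For reference, the Kleene star is just the '*', zero-or-more-repetitions, operator. In a toy backtracking matcher it looks something like the sketch below; this is a generic illustration, not my actual submission.)

    def match_here(pattern: str, text: str) -> bool:
        # Anchored match of a toy regex (literals, '.', and '*')
        # at the start of text.
        if not pattern:
            return True
        if len(pattern) > 1 and pattern[1] == "*":
            return match_star(pattern[0], pattern[2:], text)
        if text and pattern[0] in (".", text[0]):
            return match_here(pattern[1:], text[1:])
        return False

    def match_star(atom: str, rest: str, text: str) -> bool:
        # Kleene star: zero or more copies of atom, then the rest. Greedily
        # consume as many copies as possible, then backtrack one at a time.
        i = 0
        while i < len(text) and atom in (".", text[i]):
            i += 1
        while i >= 0:
            if match_here(rest, text[i:]):
                return True
            i -= 1
        return False

    print(match_here("ab*c", "abbbc"))  # True
    print(match_here("ab*c", "ac"))     # True: zero repetitions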
I also had a failed interview with Triplebyte that left a bad taste in my mouth. I experienced similarly condescending tones and disrespect from the interviewer. I was told to prepare and share/explain code I'd already written, but when the call started I was instead given three problem options for a live-coding session.
To be fair, they were super friendly and responsive to my critique, and I found their interview notes/follow-up helpful and accurate. Still, if you're going to provide interviews as a service, you should work hard to make the interview a positive experience for the candidate. I'm loath to apply to any company using Triplebyte for now, but I'd probably do it if I really wanted the job.
As far as I know, no company uses Triplebyte exclusively. It just provides a backdoor to onsite interviews with several companies (which was really useful for me, because, coming from a non-traditional background, I wasn't getting any callbacks from my direct applications to selective companies). There isn't really any downside to giving Triplebyte a go, aside from a few hours of your time.
This was an interviewer working for Triplebyte? I thought their interviews were standardized. Can you tell us more about the experience, and what they asked? I'm super interested in this.
I don't know if you can standardize people's personalities... The real issue here (in my opinion) was that the guy was not a good interviewer.
> Can you tell us more about the experience, and what they asked?
I honestly don't remember. They asked about Active Record (as I do some web dev), but also things like describing some SQL queries on the spot and why one would be faster than the other. All of that seemed fine; the real issue was our debate about computer vision (where my research was done). Sorry, I can't really dive in more; it was probably a year ago.
> the real issue was our debate related to computer vision
Oof. I feel like computer vision is a particularly misunderstood field. In a way, it reminds me of statistics - I feel like people purposefully obfuscate things to make their results sound more impressive. The great thing about computer vision is you can almost immediately see through the smoke and mirrors - ask for a live demo. I'm amazed how many people refuse to show me a demo of something that is inherently meant to be viewed.
Did you feel like there was a point to the questions they asked you? If you were being as rigorous as you could about designing the interview they delivered (I know your memory on this is hazy), what do you think they'd be trying to learn about you from the interview?
Did they give you homework challenges? Did you clear those but then bomb the interview?
I had the opposite experience with Triplebyte. I found every part of it enjoyable and really liked the interviewer.
What I didn't like was the chats with the companies that followed. It felt like the whole Triplebyte interview process never happened and we were starting from scratch.
My sentiments exactly, although I've heard even worse (not through Triplebyte) - people put through 6! onsite interviews, so I can't help but feel like I circumvented something. I think though that this is to be expected, because Triplebyte is selling an alternative narrative to the hiring and recruiting folkways of the Valley and many companies will be trying to fit a square peg into a round hole around what Triplebyte is doing.
I don't understand this. Do you expect to get hired at companies without ever getting interviewed by them? Triplebyte culls out as much of the process as you can realistically expect to be culled i.e. everything before the final onsite interviews.
For me personally though the main value was in getting contacted by companies that wouldn't have otherwise contacted me. I would have been willing to go through technical phone screens for those companies if needed, but not having to do so was an extra bonus.
> Triplebyte culls out as much of the process as you can realistically expect to be culled i.e. everything before the final onsite interviews.
Which is what? A short chat with the recruiter and a 1-hour phone interview.
With TripleByte, instead, you get something like an online questionnaire, one two-week project, and one 4-hour video interview. (Please correct me if I don't have it exactly right.)
How is this an advantage if you still have to do a call with the recruiter/company, and then on-site interviews?
> the main value was in getting contacted by companies that wouldn't have otherwise contacted me
I do remember doing a two- (or maybe one-) week project, and I could swear that my final interview was more than 2 hours. Maybe the process changed since I last interviewed?
> Do you expect to get hired at companies without ever getting interviewed by them? Triplebyte culls out as much of the process as you can realistically expect to be culled
Almost all other fields are capable of trusting third parties to certify competence. They interview for fit, but with a basic understanding that the candidate is qualified.
Employers waste less time giving interviews, and benefit from a more accurate filter than they could develop in-house.
Employees waste less time interviewing, and will find out early (e.g. by failing college classes, prompting them to switch majors) if the work isn't for them. Good practitioners are unlikely to be denied their dream jobs by the unreliability of the interview process, and truly incompetent practitioners aren't allowed to keep trying until something sticks, because (below some threshold) they'll never get in the door. They're given the benefit of the doubt for making it through the main gate (undergrad/bar exam/whatever) and, from there, judged on the basis of their performance in their actual role, not how well they study for a song-and-dance routine that approximates the role.
Triplebyte's value proposition is to be such a trusted third party for software engineering, since universities aren't. If companies don't trust it, and view people who passed Triplebyte with the same "incompetent fraud until proven otherwise" lens they apply to everyone else, then Triplebyte isn't adding value for anyone in the ecosystem.
I think you can realistically expect the skills test to be culled from interview processes: in other fields, it never even appeared.
>Almost all other fields are capable of trusting third parties to certify competence.
Employers in pretty much every other field complain that credentials are close to meaningless. As a result, they heavily emphasize work experience and demonstrable work accomplishments in hiring. This creates a problem for newcomers who have difficulty bootstrapping experience. Software engineering is unique in that you can at least get some idea of a person's ability through a day of interviewing (whiteboard interviews may not be a perfect measure of competence/ability, but they're way better than what is available in most fields), and don't need to use experience or credentials as a gauge of ability (though they use them as screening criteria to sift through the many applications they get, which is where Triplebyte comes in handy). The fact that employers directly assess the skills of job candidates is a benefit, not a drawback, for the field of software engineering.
>Triplebyte's value proposition is to be such a trusted third party for software engineering, since universities aren't. If companies don't trust it, and view people who passed Triplebyte with the same "incompetent fraud until proven otherwise" lens they apply to everyone else, then Triplebyte isn't adding value for anyone in the ecosystem.
Triplebyte is adding clear value to the ecosystem by finding candidates who would have otherwise fallen through the cracks and never gotten an interview in the first place, and matching them with potential employers. In my case, for example, Triplebyte has provided value to both me and Asana by matching us up together. Because of my nontraditional background, had I applied directly to Asana, my application would likely have never made it past their resume screen, and they would have missed out on a great candidate and I would have missed out on a great opportunity.
lol. It's funny how TripleByte claims no whiteboarding, no interviews, yet at the end you still have to be vetted the same way. Why would anyone use TripleByte then? Do you enjoy being tortured via whiteboarding?
Almost exact same situation here. I was rejected by TripleByte and ended up getting offers at many of the companies I applied to. I had the same feelings about the interviewer as the parent comment.
I wouldn't say it was particularly bad, but it was worse than average compared to my other interviews.
For what it's worth, I had the video-conference part go poorly too. For one, I was really nervous with the interviewer after they exhibited smugness, and I was sure the thing had gone south from that instant. Funnily enough, the person claimed they weren't fully awake and needed coffee and whatnot; not sure if it was because of an early morning call, but 10am PST isn't really early, especially not when you set up an interview in advance. I passed interviews at a couple of other places, so while this incident is attributable to interview-loop variance, I'm skeptical of spending time on their process if that's the case.
Don't worry, not passing is probably for the best: if you interview with companies through them, good luck trying to get reimbursed for your interview expenses. They promise they will handle everything and it will be super "convenient", but it's only convenient if you are fine with getting nothing in the end. I am somewhat convinced part of their business model relies on saving money by not paying people.
Interesting, I interviewed with one of the cofounders and felt like it was one of the best interviews I had. Lengthy, moderately difficult, but fair. I passed, but remember feeling that way even immediately after the interview.
As someone who went through the Triplebyte interview process not long ago, I actually had a positive video screening experience with Guillaume as my interviewer. He was definitely focused on getting as much signal from the interview as possible in the time we had (which is a plus imo, since I don't have tons of spare time to waste). IIRC I didn't complete 100% of the coding part, but he seemed genuinely interested in how I was approaching things and the underlying algorithmic concepts more than in the nitty-gritty of the code itself.
That session was certainly energy-intensive overall, but no more so than a good interview session with someone who knows what they're doing, interview-wise. At a more general level, though, only Triplebyte would know about interviewer variance and whatever secret sauce they have to maximize SNR during the screens, so I can't speak to that.