Notes on the Pivotal Interview Process
Hey there! I'm Nat Bennett, former Pivot, current DevOps & XP consultant. You're reading Simpler Machines, a newsletter about how to survive making software.
I'm continuing my tear through the things I've been meaning to write up about Pivotal (today, it's interviews), but this week I've also got a couple of pages up.
I wrote some notes on running Galera. These are super rough, but if y'all are interested I have a bunch more to say on this topic.
I've also been updating my consulting pitch, and as part of that I've been collecting examples of other consultants' public pitches into a shareable and referenceable page. If you know of good examples of this sort of thing, or have them yourself, I'd like to add them here.
Anyway: to interviewing.
What was special about interviewing at Pivotal?
The Pivotal interview process was often a reason that people joined Pivotal. It was an unusually structured process and was unusually enjoyable as a candidate, at least for candidates who were good fits for Pivotal.
It was also unusual in that it barely looked for "intelligence" or "ability to solve hard technical problems" at all. It was much more interested in evaluating people for the ability to solve "normal" problems quickly and in a group.
What I mean when I say "the Pivotal interview process"
One thing I should note is that the process I'm talking about did not apply to all R&D positions at Pivotal, or all engineering positions at Pivotal. There were parts of Pivotal that developed software in pretty different ways than the ones I'm most familiar with: Data, the Spring org, various parts of the field.
When I talk about "the interview process" I'm talking about the process for Pivotal Labs, and the process for Cloud Foundry, which was the R&D organization most heavily influenced by Labs. There were some differences between what these two organizations hired for, but the basic process was pretty similar and they shared a lot of personnel.
The structure of the Pivotal Engineering Interview
Again, there were some variations on this across times and places, but the basic structure of the interview process had four steps.
- Pre-screen/recruiting/sourcing
- Resume screening
- The technical screen
- The all-day pairing interview
Pre-screen/recruiting/sourcing
This is the part that I know the least about and I would very much appreciate corrections/additions if there's stuff I'm wrong about or am missing.
The main reason I mention it is that there were some things Pivotal did differently from other companies I have experience with, which had downstream consequences on the rest of the process.
First, recruiters and interview coordinators were assigned to work with specific offices and, by extension, specific Engineering Directors. (Engineering Directors were the hiring managers; you would be hired by an ED and then assigned to a manager.) I think sometimes they were assigned on a regional basis but I know that in ~2019 in LA there was a specific recruiter the EDs worked with to source candidates. This meant that engineering management could give specific feedback to recruiters and they would respond to that feedback, vs. accepting candidates from a black box sourcing process managed by a central recruiting agency.
I emphasize this because I'm going to be talking a lot about how standardized this process was, but the actual operation of that process was hyper-localized. This was a pattern that I saw a lot at Pivotal, but haven't seen much elsewhere. The more typical thing appears to be non-standardized processes, or standardized processes operated by a distant and hard-to-influence external entity.
Second, Pivotal didn't recruit from Stanford, or many of the other "name brand" CS programs. Pivotal did recruit pretty heavily from Berkeley, as I understand it because there was one specific professor who taught TDD and other XP practices and would refer folks to Pivotal. But in general Pivotal recruited more from specific, small liberal arts colleges β I know multiple Harvey Mudd and Oberlin grads whose first jobs were at Pivotal. I worked with exactly one Stanford undergrad during my time at Pivotal, and he was grandfathered in from the original acquisition of Cloud Foundry from VMware.
Pivotal also basically didn't recruit from Google or the other big tech companies. We had a handful of ex-Googlers and ex-etc.s hanging out but they were often folks who had worked at Labs "back in the day," left to go to Google/etc., and then come back.
Pivotal also hired some from bootcamps but had mixed results, and had mostly stopped hiring fresh-out-of-bootcamp candidates by the time I worked there. My understanding is that certain bootcamps explicitly taught people how to pass the Pivotal technical screen, so it was believed to be difficult to accurately assess bootcamp grads' capabilities.
Resume Screening
Once a candidate got sourced, their resume would get dumped into a queue and reviewed by an Engineering Director, who decided whether to pass them on to the next step, the technical screen.
One thing that could automatically get you promoted from this step was a referral from an existing Pivot. This was partly, "Well if they're referred they're probably good" and partly a favor to the referring Pivot.
My understanding is that what screening Directors looked for varied from CF to Labs and from office to office, so I can only speak to what the Cloud Foundry Director I worked with looked for on a resume, which was, in approximately this order:
- The candidate had shipped code to production on a team. Side projects and similar were interesting but didn't count for nearly as much as evidence that the candidate had experience working in a group.
- The candidate had experience with web development. This was explained to me as, "At Cloud Foundry we're making a system for running web applications. It's much easier to ramp up from already familiar with the thing that we're running, and it's just doing it automatically and at scale that's new, than if you're also learning about HTTP and whatnot for the first time."
- The candidate had some interesting or unusual experience. Things like "wrote code that runs municipal sewage systems" or "ran a bakery before attending a bootcamp." I think the Director I worked with especially valued these things, but in general Pivotal really valued evidence of ability to get things done on teams regardless of what that team was doing, and varied life experience. I worked with a lot of self-taught engineers and folks who had previously worked in things like Astrophysics or Bioengineering.
That last one was also a big factor in who got the "not right now, but apply again in a year" e-mail which we sent pretty regularly. (And which I got the first time I applied to Pivotal.)
Oh, and we also looked at the cover letter, but for basically one thing: evidence that the candidate understood what job they were applying to. If we saw something about pairing or testing or cloud engineering or Golang we were very likely to pass that person on to the RPI (the Remote Pairing Interview, our name for the technical screen), whereas something very generic, or that mentioned something we didn't do, would generally get marginal candidates rejected.
The Remote Pairing Interview
I'm not going to describe the technical screen in detail but I will describe its general outline:
- solve a small problem
- that we expected the candidate to have seen before
- by taking small, test-driven steps
- with a pair
The exercise was always in Java, but the candidate wasn't expected to know Java; the interviewer did all the typing, because the point wasn't to test how well folks knew their editors.
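To give a flavor of what "small, test-driven steps" means in practice, here's a minimal sketch in JUnit. This is not the actual Pivotal exercise (which I'm deliberately not describing); the `Greeter` class is a made-up stand-in, and the point is the rhythm: write one small failing test, write just enough production code to pass it, refactor, repeat.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Made-up production code, not the real interview exercise.
class Greeter {
    // Step 2 ("green"): the smallest production change that passes the test.
    String greet(String name) {
        return "Hello, " + name + "!";
    }
}

class GreeterTest {
    // Step 1 ("red"): one small test, written first, and watched failing
    // for the right reason before any production code exists.
    @Test
    void greetsAPersonByName() {
        assertEquals("Hello, Nat!", new Greeter().greet("Nat"));
    }
}
```

Since the interviewer did the typing, the candidate's job was to drive steps like these verbally.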
The problem was standardized so that the exercise could be scored. The scoring was on a scale of 100, though the highest score actually given was 99 (because "Rob says no one is perfect"). Organizations using this system usually had a cutoff score required to pass on to the next interview stage; IIRC it was usually 92 or 93, but it could go a little higher, into the mid-to-high 90s, depending on how aggressively the org was hiring and how full the pipeline was. It was possible for an individual Director to promote someone who had a lower score than the cutoff, but VPs kept an eye on the scores of every candidate who got passed through the pipeline and would ask about it if you passed someone who was "too low."
At the time I was most involved with hiring, there was a pool of interviewers who could conduct the scored interview, and a formal training process. The pool was shared between Cloud Foundry and Labs. I think this changed over time; my sense was that early in Cloud Foundry's history, Directors did all the RPIs themselves, but switched to the pool system as both Cloud Foundry and Labs grew.
The technical screen was not a puzzle. It did look for some specific technical skills (mostly, the ability to handle encapsulation in some way) but it was intended to be a problem that candidates had handled before.
There was basically one way to pass the RPI and a bunch of standard ways to fail. Typical ways included:
- The candidate didn't complete the exercise
- The candidate didn't understand the difference between test code and production code
- The candidate didn't understand the difference between a correctly and an incorrectly failing test (see the sketch after this list)
- The candidate tried to write the whole solution out at once instead of taking small test-driven steps
- The candidate didn't refactor
- The candidate didn't take correction or suggestions from the interviewer or was otherwise a jerk in some way
There are definitely more ways, these are just the ones that happened enough that I'd hear about them.
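To unpack that "correctly vs. incorrectly failing test" item: a correctly failing test fails because the behavior hasn't been implemented yet, while an incorrectly failing test fails for some unrelated reason, like a wrong expectation or broken setup. A hypothetical JUnit illustration, again not from the actual exercise:

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class FailingTestExamples {
    // Stub standing in for production code that hasn't been written yet.
    int add(int a, int b) {
        throw new UnsupportedOperationException("not implemented yet");
    }

    // Correctly failing: this fails only because add() isn't implemented.
    // Writing the production code will make it pass.
    @Test
    void correctlyFailing() {
        assertEquals(5, add(2, 3));
    }

    // Incorrectly failing: the expectation itself is wrong, so even a
    // correct add() implementation could never make this test pass.
    @Test
    void incorrectlyFailing() {
        assertEquals(6, add(2, 3)); // wrong expectation: 2 + 3 is 5
    }
}
```

A candidate taking small test-driven steps would treat the first failure as a signal to write production code, and the second as a signal to fix the test.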
I have heard that the RPI's scoring system was so good and so standardized that candidates would always get the same score, even if they retook the RPI years later. I'm not sure I believe this, since it doesn't fit with my understanding of how any other similar assessment works, and it's something that folks incorrectly claim about educational assessments a lot.
I have also heard that the RPI worked best at taking folks who were experienced software engineers but who didn't necessarily know how to do TDD and filtering for the folks who had strong TDD aptitude.
This is basically the most structured technical screen that I have ever heard of. I would love to hear about places that had similarly structured screening stages, but AFAIK they don't exist.
Pivotal was very unusual in that it was a large software R&D organization that grew out of a scaled consultancy, and this is one of the places where that history most obviously shows. Scaled consultancies have to have standardized processes for hiring; new engineers are one of the basic inputs in the machine that makes them money, even more so than at SaaS or similar shops. Growing out of that meant that even at ~35 people, Pivotal Cloud Foundry didn't need to develop a totally bespoke hiring process; it could take an already unusually standardized one and adapt it to its needs.
The All-Day Pairing Interview
If you passed the RPI then you got invited for an all-day pairing interview. When I was interviewed in 2014 this was always in-person, but by 2019 it was also sometimes conducted remotely.
It was actually two pairing interviews: the morning interview and the afternoon interview, each with a different engineer. There was a subset of engineers in the office who were part of the "interview pool," and interviewers were usually hand-selected per candidate. The process for becoming this kind of interviewer wasn't especially formal: I started interviewing one day when the Director who handled hiring for our office was short one "ops-y" interviewer and I said, sure, I'll do my first interview. I had seen them done before by teammates so I knew the general gist.
The all-day pairing interview was generally just regular work. The interviewer would pick a story out of their backlog and work on it together with the candidate. Then they'd fill out a set of questions about how the interview had gone, and meet the next day with the hiring manager, the other interviewer, and an HR rep to give a "hire or no hire" recommendation.
Sometimes teams would have a "canned" story that they'd done before and knew made a "good interview story," which they'd pull out for the occasion, but it was also common to literally pull a story that needed doing out of the backlog. I made my first commit to a Pivotal project before I got hired there. In retrospect I'm not sure this was legal, but it was very compelling to me as a candidate: I got an unusually good idea of what the job would actually be like.
When I was interviewing for Cloud Foundry I mostly interviewed very early career people, and I was usually on an ops-ier team, while my interviewer "pair" was on a more code-heavy team. I basically looked for three things:
- Do they ask questions and "manage their own learning?" Do I have to put a bunch of energy into making sure they understand what we're doing, or are they able to communicate when they're confused?
- Are they familiar with at least the basics of web applications? Do they seem to basically understand what e.g. an HTTP request is, or a router, even if they don't have command of the details?
- Can they take their programming skills and apply them to things that don't immediately look like coding problems but very much are, like writing YAML configuration?
(I also told people at the beginning of the interview that this is what I was looking for, which I think is a good interviewing practice generally. If your interview is at all valid, telling people the criteria won't let them game it, and it will help you get a better read on people who are shy or have a tendency to psych themselves out in an interview situation.)
I know, incidentally, that the process was a little different for more experienced hires. If someone out there knows about how we evaluated people's architectural skills I'd love to hear about it. E-mail me at nat@simplermachines, or leave a comment.
A lot of folks tried to pick a story specifically for TDD skills, which I never really understood myself: everyone had passed the RPI, so we knew they could do the basic write-tests-make-pass thing. I actually thought the more important thing was to make sure that candidates understood that the role they were interviewing for had a lot of ops work and infrastructure work (we were Enterprise YAML Engineers before Kubernetes made that a mainstream thing). I'm not sure I'd consider this nearly as important today, but again, at the time I thought it was a serious problem how many people complained about "not getting to write code," and that our interview process was part of what was setting the wrong expectations for folks.
A lot of this interview stage was actually about selling the candidate on coming to work at Cloud Foundry. Usually within an hour or two I could tell whether I wanted to work with them more and then would switch my focus from assessing them to making sure that we had an enjoyable and productive working session. The kind of person who did well at Pivotal found "wow, we really got some stuff done, and I learned a bunch in that interview" to be an extremely compelling pitch.
Directors also often tried to schedule the interviews such that at least one interviewer was a woman. This was tough on the women interviewers, because there were relatively few of them, so they ended up interviewing a lot. But it both communicated to women candidates that they would not be the "only" or the "first," and filtered out people who, e.g., couldn't make eye contact with women.
Okay, so?
I had intended to include some notes on how I think this impacted Cloud Foundry's R&D culture and more notes on how it compared to other processes I've seen, and then move on to the Ask system and how Cloud Foundry's integration testing worked, but, frankly, this is already over 2,000 words, so: more on interviewing next week?
In the meantime send me an e-mail, leave a comment, or send me a message on the Pivotal Alumni Slack (now open to anyone who ever worked at Pivotal, even if you're still at VMware) if you've got a question, comment, or correction.
I think more companies ought to at least consider hiring this way, or in a way that's based on it, so I'm especially interested in hearing from you if you're actively hiring and interested in borrowing pieces of this. (And I can put you in touch with folks who have a lot more experience running this process than I did, if that's the case.)
- Nat