Most interview prep advice tells you to ask “What does a typical day look like?” or “What are the biggest challenges in this role?” Those questions are fine for learning about the job. They’re terrible for learning about the culture.
Culture questions are different. They need to be specific enough that a polished PR answer is impossible. They need to reveal the gap between what a company says it values and what it actually does. And they need to be organized around your priorities—because asking someone at a 3,000-person company about psychological safety is a very different question than asking it at a 40-person startup.
This list is organized by the culture values we track at JobsByCulture. Pick the 2–3 values that matter most to you, bring your top 5 questions into your next interview, and listen carefully to how they answer—not just what they say.
How to Use This List
- Pick the values that matter most to you and bring 1–2 questions per value per interview round.
- Ask teammates, not just hiring managers. Culture answers diverge most when you put the same question to a future teammate and to the manager above them.
- Evasiveness is data. A good interviewer who believes in their culture will answer these eagerly. Hesitation tells you something.
- Cross-reference with research. Look up the company’s culture profile before your interview. Glassdoor scores and verified reviews should inform which questions you prioritize.
“Remote” is the most abused word in job listings. A company can call itself remote-first while mandating Tuesday–Thursday in-office and scheduling a 9am standup in one time zone. These questions force specificity.
Question 1 of 50
“What percentage of meetings held last month could have been a Loom, doc, or Slack thread instead?”
Why ask: Async culture isn’t about removing meetings—it’s about defaulting to written communication and only meeting when synchronous discussion adds unique value. Someone in a genuinely async culture will answer with a percentage and a real example. Someone in a meeting-heavy culture will be confused by the question.
Good answer: “Honestly, probably 60%+. We default to writing things in Notion and Loom walkthroughs. Meetings are mostly for real-time decisions or retros.”
Red flag: “We have a great culture of collaboration. People really enjoy connecting as a team.” (No number, pivots to culture speak.)
Question 2 of 50
“If I’m in a different time zone from my manager, how does that work day-to-day?”
Why ask: Forces them to describe the actual operating model, not the stated one. Does the team have overlap hours? Is there an expectation to be “always available”? Do async handoffs work in practice?
Good answer: “We have a 2-hour overlap window we protect. Outside that, everything is async by default. I have a teammate in Lisbon and we’ve never had a problem.”
Red flag: “We’d want you to overlap with core hours in [one time zone].” If they haven’t said it in the job description, this is a hidden constraint.
Question 3 of 50
“How responsive are people to Slack messages after 6pm local time?”
Why ask: The honest answer tells you whether “flexible” is real or whether there’s an implicit always-on culture hiding under it. Compare the answer to what actual Glassdoor reviews say about responsiveness expectations.
Good answer: “It varies. I personally don’t respond after hours unless it’s a P0 incident. Nobody has ever commented on it negatively.”
Red flag: “It’s not required but most people stay pretty connected—we move fast and things come up.”
Question 4 of 50
“Where does your team’s institutional knowledge live—in people’s heads or in written docs?”
Why ask: Documentation discipline is the backbone of async work. Teams that rely on tribal knowledge are de facto office-dependent, regardless of stated remote policy. The answer also predicts your onboarding experience.
Good answer: “We’re pretty obsessive about Notion. Every decision has a doc trail. Onboarding is mostly self-service because of how much we’ve written down.”
Red flag: “It’s a mix—we’re working on it. You’d learn a lot from the team.”
Question 5 of 50
“Has your remote policy changed in the last two years? What drove that?”
Why ask: Many companies retracted remote policies after 2022–2024 without fully updating their public messaging. This question surfaces any backslide and reveals whether the current policy is genuinely stable or quietly under pressure.
Good answer: “We went from 3 days/week in-office to fully distributed in 2023 and haven’t looked back. Our revenue actually grew faster with the talent pool expansion.”
Red flag: “We brought people back a bit more—leadership felt collaboration was suffering.” Ask a follow-up: what specifically are the current in-office expectations?
Unlimited PTO sounds great until you realize the team averages 8 days taken per year. These questions find the gap between stated policy and lived reality.
Question 6 of 50
“How many days of PTO did you personally take last year?”
Why ask: The single most revealing WLB question. Asking about the interviewer specifically (not “the team”) makes it impossible to give a vague answer. You’re not looking for a specific number—you’re looking for whether they can answer comfortably and whether the number matches the stated culture.
Good answer: “Around 20 days. My manager genuinely pushes us to take time and we get coverage when someone’s out.”
Red flag: “Hmm, probably... 8? 10? It was a busy year.” Or deflecting: “The team culture is really supportive.”
Question 7 of 50
“When was the last time someone on this team took a full week off completely disconnected?”
Why ask: Being reachable “just in case” is the enemy of real rest. This question checks whether disconnecting is actually possible, not just theoretically allowed.
Good answer: “My colleague just got back from two weeks in Japan. She didn’t open her laptop once—we planned coverage ahead of time and it was totally fine.”
Red flag: “People are pretty reachable—even on vacation, they stay looped in on big things.”
Question 8 of 50
“What time does your team’s latest recurring meeting happen? Who is it with?”
Why ask: Recurring meeting schedules are a proxy for work-life boundary norms. If the latest recurring meeting is a 7pm sync with a client or leadership, that tells you something the job description won’t.
Good answer: “Friday all-hands at 4pm. Everything else is before 5. We have a norm against scheduling after 5.”
Red flag: “We have a standing EOD sync with the US team at 6:30pm [for someone in a European time zone].”
Question 9 of 50
“How does the team handle on-call? What did your last incident look like?”
Why ask: On-call obligations are the most common WLB surprise in engineering roles. This question reveals rotation frequency, severity expectations, and whether the team has actually invested in reducing alert noise.
Good answer: “We rotate weekly among 8 engineers. Last month we had one P1—resolved in 40 minutes at 11pm. We do postmortems and actively reduce alert noise each quarter.”
Red flag: Vagueness about frequency, no mention of postmortems, or “it’s not that bad” without specifics.
Question 10 of 50
“What’s the most recent thing that caused genuine crunch on your team, and how long did it last?”
Why ask: Every team has crunches. The question isn’t whether they happen—it’s whether they’re bounded, acknowledged, and followed by recovery time. “We never crunch” is a worse answer than an honest account of a two-week push.
Good answer: “Launch sprint in January, about two weeks of 50-hour days. Leadership gave the team two weeks of flexible time afterward to recover. People appreciated the honesty about it.”
Red flag: “We’re always kind of in crunch mode—we move really fast” or a claim that crunch never happens (implausible for most product teams).
In an engineering-driven culture, engineers influence product direction, can push back on bad ideas, and have real ownership over technical decisions. In a non-engineering-driven culture, they implement specs from product or sales. Here’s how to tell the difference.
Question 11 of 50
“When was the last time an engineer killed or significantly changed a product feature based on technical concerns?”
Why ask: If engineers have real influence, this story should come quickly. If the interviewer struggles to think of an example—or the example is from years ago—that’s a signal about who actually calls the shots.
Good answer: “Last quarter, actually. We were going to ship a feature that would have required rebuilding our auth layer under the hood. The engineer leading it flagged it and we scoped it down to a 2-week version. Product was frustrated but agreed it was the right call.”
Red flag: Long pause, then: “I’m sure it’s happened, we really value engineering input.”
Question 12 of 50
“How are tech stack and infrastructure decisions made? Who has final say?”
Why ask: In engineering-led companies, tech decisions are made by engineers with input from leadership. In sales-led or product-led companies, tech choices often follow customer requests or business timelines without engineering input.
Good answer: “Engineers write RFCs and the team debates them. CTO weighs in on big bets but generally trusts the people doing the work. We recently switched infra providers based on an IC engineer’s proposal.”
Red flag: “Leadership decides, but they take our input”—especially if followed by a story about a decision that overrode engineering consensus.
Question 13 of 50
“What’s the ratio of time your team spends on new features vs. technical debt and reliability?”
Why ask: Companies that say they value engineering will let engineers invest in the work that makes the codebase better. A company that never allocates time for reliability work is revealing that it treats engineering as a feature-factory.
Good answer: “Roughly 70/30 right now—we just came off a product push. We dedicate every 4th sprint to reliability and debt exclusively.”
Red flag: “We try to chip away at tech debt when we can”—meaning it never actually happens on a schedule.
Question 14 of 50
“Does engineering have a seat at the table when quarterly priorities are set?”
Why ask: Quarterly planning reveals power dynamics. Do engineers help shape the roadmap, or do they receive it? This question is better answered by a team lead or senior IC than by a hiring manager who may not know the real answer.
Good answer: “Engineering proposes technical initiatives directly. We do a joint planning session with product where both sides defend their asks equally. Engineering wins about half the contested items.”
Red flag: “Product drives the roadmap and engineering executes, but we’re pretty aligned.”
Question 15 of 50
“Can you point me to your engineering blog or any public technical writing from the team?”
Why ask: Engineering-driven companies publish. They talk publicly about their technical problems and solutions because they’re proud of how they solve them. This isn’t always true of smaller companies—but if the blog exists, read it before your interview. It tells you more than any answer will.
Good answer: They name it immediately and it has recent posts from individual engineers, not just marketing content about the product.
Red flag: “We’ve been meaning to start one” or a blog with the last post from 2022.
True flat organizations are rare and usually constrained to companies under ~300 people. Most companies that call themselves flat have simply renamed their managers as “leads.” These questions find the real org chart.
Question 16 of 50
“How many approval layers does a new feature go through before it ships?”
Why ask: Approval layers are a proxy for organizational trust and velocity. The more layers, the less flat the org is in practice. Flat orgs typically have IC engineers who can ship to production without a manager signing off on every commit.
Good answer: “For most features: code review from one engineer, QA sign-off, then deploy. For major architectural changes we have an RFC process that can take a few days, but that’s it.”
Red flag: “Tech lead, then product manager, then VP sign-off, then a weekly release window.”
Question 17 of 50
“Have you ever disagreed with a decision made by someone senior? What happened?”
Why ask: This question tests whether dissent is actually safe—not just theoretically permitted. A flat culture means junior people can push back on senior people without consequences. Listen for whether the outcome was a genuine dialogue or a polite capitulation.
Good answer: “Yes, about the data model for our new product line. I wrote up my concerns, presented them in a team meeting, and we actually changed the approach based on my feedback. The VP was the one who said ‘you’re right, let’s do it differently.’”
Red flag: “I shared my opinion in a 1:1, but ultimately decisions come from leadership and I respect that.”
Question 18 of 50
“How often do ICs interact directly with the founders or C-suite?”
Why ask: In genuinely flat orgs, individual contributors have real access to founders and leadership—not in a performative all-hands way, but in day-to-day work. Answers that describe only structured, top-down communication suggest more hierarchy than the “flat” label implies.
Good answer: “Regularly. The CEO does a weekly open Q&A on Slack and responds to DMs from anyone. She sometimes comments directly on PRs. It’s very informal.”
Red flag: “We have quarterly all-hands where you can submit questions. Leadership is very approachable.”
Question 19 of 50
“What was the last major company decision that was made bottom-up rather than top-down?”
Why ask: Real flat orgs have concrete examples of decisions that originated with ICs, not leadership. If the interviewer can’t name one quickly, the culture is probably more traditional than the marketing suggests.
Good answer: “Our engineering on-call process was designed entirely by the on-call engineers who were frustrated with the old system. They ran a working group, proposed the new system, and leadership adopted it without changes.”
Red flag: “Leadership is very good at soliciting feedback before making decisions.”
Question 20 of 50
“How does the company handle someone who is a great IC but doesn’t want to become a manager?”
Why ask: In hierarchical orgs, the only path to senior comp is management. In flat orgs, strong IC tracks exist that are genuinely respected—not dead-end positions. The existence of a real staff/principal IC track is a flat-org indicator.
Good answer: “We have a Staff Engineer track that tops out at the same comp as Director of Engineering. Some of our most respected people are Staff ICs with no direct reports and they’re not expected to become managers.”
Red flag: “We support individual contributors but most senior people do end up managing eventually.”
Ship-fast culture isn’t about cutting corners—it’s about reducing time between idea and user feedback. These questions find teams that have genuinely built systems to move quickly vs. teams that are simply understaffed and overwhelmed.
Question 21 of 50
“How often does this team deploy to production?”
Why ask: Deployment frequency is the most objective proxy for ship-fast culture. Teams that deploy multiple times per day have genuinely invested in CI/CD, feature flags, and automated testing. Teams that deploy monthly or quarterly have organizational friction that no amount of “we move fast” marketing can overcome.
Good answer: “Multiple times per day for most services. We have feature flags so shipping and releasing are decoupled—code goes to production constantly but features are gated.”
Red flag: “We do a release every two weeks” combined with “we move really fast” in the same breath.
Question 22 of 50
“What does your feature flag and rollout process look like?”
Why ask: Feature flags are the infrastructure of ship-fast culture. Without them, every deployment is a potential all-or-nothing risk, which slows shipping. Teams that have invested in launch infrastructure ship faster because the risk of each deployment is lower.
Good answer: “We use LaunchDarkly for feature flags. Everything ships behind a flag, we can roll out to 1% of users, watch metrics for 24 hours, then ramp up or kill it.”
Red flag: No feature flag system, or shipping everything to all users simultaneously with manual monitoring.
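To make the rollout mechanics described above concrete, here’s a minimal sketch of a percentage gate, assuming a home-grown implementation rather than any particular vendor SDK (LaunchDarkly and similar tools wrap the same idea). The function name, flag name, and user ID are made up for illustration.

```python
# Hypothetical sketch of a percentage rollout gate (Question 22).
import hashlib

def in_rollout(user_id: str, flag_name: str, rollout_percent: float) -> bool:
    """Deterministically bucket a user into [0, 100) and gate on the ramp."""
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10_000 / 100  # stable value in [0, 100)
    return bucket < rollout_percent

# Ship the code dark, then ramp the flag from 1% upward without another deploy:
if in_rollout("user-42", "new-dashboard", rollout_percent=1.0):
    print("render new dashboard")   # gated path
else:
    print("render old dashboard")   # default path
```

Because the bucket is derived from a hash of the user ID, ramping from 1% to 10% only adds users to the rollout; nobody who already has the feature loses it mid-experiment.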
Question 23 of 50
“Tell me about a feature that flopped. How quickly did the team recognize it and cut it?”
Why ask: Ship-fast culture requires equally fast killing of things that don’t work. Teams that hold onto failing features for political or sunk-cost reasons are not actually ship-fast—they’re accumulating product debt as fast as they accumulate technical debt.
Good answer: “We shipped a dashboard redesign last year that users hated based on support tickets and usage metrics. We reverted within 3 weeks. No drama—we just moved on.”
Red flag: Can’t name a failed feature, or the story is about a feature that took 6+ months to sunset after everyone knew it wasn’t working.
Question 24 of 50
“How long does it typically take from ‘idea approved’ to ‘in front of users’?”
Why ask: This is the simplest measure of organizational velocity. Time from concept to user is affected by everything: planning overhead, review cycles, QA processes, deployment windows, and access to data. A short cycle time means the org has minimized all of these friction points.
Good answer: “For a small feature, maybe a week end-to-end including design, engineering, and QA. For a bigger initiative, 4–6 weeks. We time-box ruthlessly to avoid scope creep.”
Red flag: “Depends on the feature. It can be a few months sometimes for anything significant.”
Question 25 of 50
“What’s the biggest thing slowing down shipping right now?”
Why ask: Every team has bottlenecks. A self-aware ship-fast team knows what theirs are and is actively working on them. A team that says “nothing really” either doesn’t measure it or isn’t honest with you. The answer also tells you what you’d be dealing with in the role.
Good answer: “Honestly, our test suite is too slow. We’re at 25-minute CI runs and it’s killing iteration speed. One engineer is working on parallelization this quarter.”
Red flag: “Nothing major, we’re pretty well-oiled”—or a vague answer about “coordination.”
“We invest in our people” is one of the most generic lines on any company careers page. These questions find the specifics.
Question 26 of 50
“What did you personally learn in the last 6 months that you didn’t know before?”
Why ask: In learning cultures, people can answer this instantly—and they’re excited about it. In stagnant orgs, people have to think hard or pivot to talking about the company. The specificity of the answer (a technology, a domain, a skill) tells you whether learning is real or aspirational.
Good answer: “I got really deep into distributed tracing with OpenTelemetry. The company gave me a week to do a proof-of-concept and then we adopted it. I didn’t know anything about observability 6 months ago.”
Red flag: “We have a strong learning culture here”—without a personal example.
Question 27 of 50
“What’s the learning budget, and what did someone on your team use it for last year?”
Why ask: Many companies have learning budgets on paper that nobody actually uses due to approval friction, embarrassment, or busyness. Asking for a concrete example from the team verifies that the budget is real and accessible, not just a benefits page bullet point.
Good answer: “$2,500 per person per year. My teammate used hers for KubeCon attendance and another engineer used his for a machine learning certification. No approval needed under $500.”
Red flag: “We have a budget, you’d need to check with HR on the details.”
Question 28 of 50
“How does the promotion process work? Give me a concrete example of someone who was promoted recently and what drove it.”
Why ask: Promotion processes reveal whether growth is meritocratic and transparent or political and opaque. A concrete example—a real person, a real timeline, real criteria—tells you whether the process actually works. Abstract descriptions of criteria without examples are a warning sign.
Good answer: “We promote based on documented impact. We have a promotion doc template—you write up your achievements against the level criteria and two peers vouch for you. Someone on my team went from mid to senior last quarter based on leading a platform migration.”
Red flag: “It’s based on performance reviews and manager recommendation”—without any specifics on timelines, criteria, or examples.
Question 29 of 50
“Does the team do internal talks, reading groups, or knowledge-sharing sessions?”
Why ask: Companies that invest in learning create structures for it: tech talks, reading groups, lunch-and-learns, demo days. Ad-hoc knowledge transfer is not the same as a learning culture. The existence of regular structured learning signals intentionality.
Good answer: “We have a weekly eng talk on Fridays—any engineer can propose a topic. Last week was about LLM evals. We also have a paper club that meets bi-weekly.”
Red flag: “We try to share things when relevant, it’s pretty organic.”
Question 30 of 50
“Who’s the best engineer you’ve worked with here, and what makes them exceptional?”
Why ask: The answer reveals what the organization actually values. Do they describe technical excellence? Mentorship? Cross-team impact? Communication? The attributes they list for “exceptional” are the attributes the culture rewards—and that you’ll be expected to develop.
Good answer: A specific person, specific attributes, including something about how they make others better (not just their own output).
Red flag: Pure technical output without any mention of mentorship, communication, or team impact—suggests the culture rewards lone-wolf contributors.
Every company has a diversity page. These questions find the ones where it’s real.
Question 31 of 50
“What does your most senior non-male engineer’s career path at this company look like?”
Why ask: This question forces specifics about real career outcomes for underrepresented engineers, not diversity programs or hiring initiatives. A concrete example of a senior woman’s or non-binary person’s growth is more informative than any DEI statement.
Good answer: “Our CTO is a woman who joined as a senior engineer five years ago. Three of our six engineering team leads are women. I can connect you with any of them.”
Red flag: “We’re working on increasing diversity at senior levels”—meaning the pipeline hasn’t translated to senior representation yet.
Question 32 of 50
“Does the company publish a diversity report with specific numbers?”
Why ask: Publishing diversity data is the most basic form of accountability. Companies that measure and publish are more likely to actually improve. Those that cite “ongoing efforts” without data have no external accountability.
Good answer: “Yes, we publish annually. I can send you the link. We also include a breakdown by level, not just company-wide, which matters more.”
Red flag: “We haven’t published one yet, but DEI is a priority and we’re building a program.”
Question 33 of 50
“Have you had any retention issues specifically with underrepresented employees in the last year? How did leadership respond?”
Why ask: Inclusion isn’t just about hiring. Companies with real D&I commitment track and respond to disproportionate attrition among underrepresented groups. Companies without it are surprised by the question or deny the issue exists.
Good answer: “We did have some retention issues in 2024 that our ERG surfaced. Leadership did listening sessions, changed our review process to reduce bias, and hired a Head of DEI. We’ve seen improvement in the last two quarters.”
Red flag: “I think attrition has been pretty even across the board”—suggesting they don’t track it by demographic.
Question 34 of 50
“What ERGs exist at the company, and how are they resourced?”
Why ask: ERGs (Employee Resource Groups) are a real inclusion infrastructure investment. But the difference between a performative ERG and a meaningful one is resources: dedicated budget, executive sponsorship, and time to participate without career penalty.
Good answer: “We have 5 ERGs—each gets an annual budget and a dedicated executive sponsor. ERG leads get 10% of their time allocated officially. It’s in their job descriptions.”
Red flag: “We have a few groups—they’re volunteer-driven, very organic.”
Question 35 of 50
“How does the interview process itself account for bias?”
Why ask: Companies with genuine inclusion commitments have invested in structured interviews, blind resume reviews, and calibration processes. This question also signals that you care about the interview process being fair—which is a useful signal for the company too.
Good answer: “We use structured interviews with rubrics. All interviewers submit independent scorecards before any debrief. We use the same questions for all candidates for a given role.”
Red flag: “We have a strong hiring culture, everyone is very professional.”
Open-source culture means engineers write public code, engage with the community, and have a meaningful contribution beyond using OSS tools. These questions separate real OSS companies from those that merely use open-source software.
Question 36 of 50
“What is the most significant open-source project the company maintains or contributes to?”
Why ask: Real OSS companies can name their projects immediately and with pride. The project should have external contributors, active maintenance, and real users outside the company. A GitHub org with a handful of lightly-starred repos from 4 years ago doesn’t qualify.
Good answer: They name a specific project with GitHub stars, external contributors, and recent activity. They can describe the community around it.
Red flag: “We open-source some things when we can” or pointing to a repo with no external contributors and no commits in 18 months.
Question 37 of 50
“Are engineers permitted and encouraged to contribute to external OSS projects during work hours?”
Why ask: Companies that genuinely value open source let engineers contribute upstream on company time. This benefits both the company (fixes and improvements flow back) and the engineer (community reputation, public portfolio). The answer reveals whether OSS is a value or just a marketing bullet point.
Good answer: “Yes, explicitly. We have a policy that says you can contribute to relevant OSS projects during work time as long as it’s disclosed and related to our tech stack. Several engineers are maintainers of upstream dependencies.”
Red flag: “On your own time, yes” or “it would need to go through legal review.”
Question 38 of 50
“Does the company sponsor or support any OSS foundations, conferences, or maintainers?”
Why ask: Financial support for the OSS ecosystem is one of the clearest signals of genuine open-source values. Sponsoring CNCF, Apache, NumFOCUS, or individual maintainers via GitHub Sponsors takes money and deliberate intent. It’s hard to fake.
Good answer: “We’re a CNCF silver member, we sponsor the maintainer of [relevant tool we use heavily], and we sent 4 engineers to KubeCon to speak last year.”
Red flag: “We try to give back when we can.”
Question 39 of 50
“Has any engineer here become a named maintainer of a significant external project? Who?”
Why ask: Maintainers have outsized influence in the OSS ecosystem. Companies that produce maintainers have deep OSS DNA—the time, recognition, and support to do serious community work is built into the culture. If no one can name a maintainer, the OSS commitment is shallow.
Good answer: Specific names and projects. Bonus points if they’re well-known in the relevant ecosystem.
Red flag: “I’m not sure of specific names off the top of my head.”
Question 40 of 50
“If I want to open-source a tool I build here, what’s the process?”
Why ask: Real OSS-friendly companies have a clear, lightweight process for releasing new open-source projects. Companies that haven’t thought about it—or where the process is a 6-month legal review—are not genuinely OSS-oriented regardless of their GitHub activity.
Good answer: “There’s a short form and a legal review that typically takes a week. We actually encourage engineers to consider it from the start of projects rather than retrofitting it later.”
Red flag: “I’m not sure, it would probably go through legal and take a while.”
Compensation conversations are uncomfortable—which is exactly why interviewers give vague answers. These questions get specific.
Question 41 of 50
“Does the company publish salary bands, and can I see the band for this role?”
Why ask: Pay transparency is correlated with pay equity and trust. Companies that publish bands are less likely to underpay based on negotiating style or demographic. In states with pay transparency laws (CA, NY, CO, WA), this is now required in job postings—but many companies still resist discussing it directly.
Good answer: “Yes, we publish all bands internally and I can share the range for this level: $[X]–$[Y] base. Equity and bonus add [Z]%.”
Red flag: “Compensation is competitive and we’ll make a strong offer based on your experience.”
Question 42 of 50
“How are equity refreshers handled? What does a typical refresh grant look like after year 2?”
Why ask: Most offer letters describe the initial equity grant. The question is what happens after. Companies that offer meaningful annual refreshers keep total comp at market; companies that skip them create a cliff where compensation drops sharply once the initial four-year grant finishes vesting.
Good answer: “We do annual performance reviews with equity refreshers. A typical mid-level performer gets roughly 25–30% of their initial grant as a refresher each year. It’s structured to keep you at or above your original equity level.”
Red flag: “Refreshers are based on performance and are at the company’s discretion”—with no specifics on typical amounts or frequency.
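To see why the refresher question matters, here’s a small sketch of how newly vesting equity behaves with and without refreshers. Every number in it (grant size, 25% refresh rate, start year) is an assumption chosen for the example, not data from any company.

```python
# Hypothetical illustration of the refresher "cliff" (Question 42).
INITIAL_GRANT = 400_000          # assumed $ value of the initial grant
VEST_YEARS = 4                   # standard 4-year vesting
REFRESH_FRACTION = 0.25          # assumed annual refresher: 25% of initial grant
REFRESH_START_YEAR = 2           # assumed first refresher after year-one review

def vested_per_year(years: int, refreshers: bool) -> list[float]:
    """Dollar value newly vesting in each year, with or without refreshers."""
    vesting = [0.0] * (years + 1)
    grants = [(1, INITIAL_GRANT)]                     # (grant year, grant size)
    if refreshers:
        grants += [(y, INITIAL_GRANT * REFRESH_FRACTION)
                   for y in range(REFRESH_START_YEAR, years + 1)]
    for start, size in grants:
        for y in range(start, min(start + VEST_YEARS, years + 1)):
            vesting[y] += size / VEST_YEARS
    return vesting[1:]

print("no refreshers:  ", vested_per_year(6, refreshers=False))
# [100000.0, 100000.0, 100000.0, 100000.0, 0.0, 0.0]  <- the year-5 cliff
print("with refreshers:", vested_per_year(6, refreshers=True))
# [100000.0, 125000.0, 150000.0, 175000.0, 100000.0, 100000.0]
```

Without refreshers, newly vesting value falls to zero in year 5; with a roughly 25% annual refresher, it stays at or above the year-one level, which is exactly what the good answer above describes.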
Question 43 of 50
“What’s the strike price of current options, and when was the last 409A valuation?”
Why ask: For private companies offering stock options (not RSUs), the strike price and last valuation determine how much upside actually exists. A $10 strike price on options when the company last valued at $8 per share means your options are underwater. The company has both numbers on hand, and you should know them before signing.
Good answer: They answer directly with a specific price and date. They can also discuss the implied valuation and recent fundraising if relevant.
Red flag: Evasiveness, or the strike price is extremely close to the most recent preferred share price (thin margin on potential returns).
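If you want to sanity-check whatever numbers you’re given, the arithmetic is simple. The sketch below reuses the hypothetical $10-strike, $8-per-share example from above; “fmv” stands in for whatever per-share reference price the company shares, and all figures are illustrative.

```python
# Back-of-the-envelope check on option upside (Question 43). Numbers are made up.
def option_spread(shares: int, strike: float, fmv: float) -> float:
    """Paper gain (or loss, if negative) between the current price and the strike."""
    return shares * (fmv - strike)

# Options granted at a $10 strike while the latest valuation implies $8/share:
print(option_spread(10_000, strike=10.0, fmv=8.0))   # -20000.0 -> underwater
# The same grant if the valuation recovers to $15/share:
print(option_spread(10_000, strike=10.0, fmv=15.0))  #  50000.0 -> real upside
```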
Question 44 of 50
“What is the post-termination exercise window for options?”
Why ask: The industry standard is 90 days to exercise options after leaving the company; the best companies have extended this to as long as 10 years. With a short window, leaving often means forfeiting all your options, walking away from potentially tens of thousands of dollars of value, unless you can afford to exercise on short notice. This is a negotiable term that signals how employee-friendly the equity structure is.
Good answer: “We extended to 5 years. We think the 90-day window is unfair—it effectively punishes people for leaving.”
Red flag: “Standard 90 days.”
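A quick back-of-the-envelope calculation shows why the window length matters. The numbers below are assumptions, and the real cost of exercising also includes taxes (which vary by option type and jurisdiction), so treat this as a lower bound.

```python
# Hypothetical illustration of the 90-day exercise squeeze (Question 44).
def cost_to_exercise(vested_options: int, strike: float) -> float:
    """Cash needed up front to buy all vested shares at the strike price."""
    return vested_options * strike

# Leaving with 20,000 vested options at a $3.50 strike means finding
# $70,000 in cash within the window, or forfeiting the options entirely.
print(cost_to_exercise(20_000, strike=3.50))  # 70000.0
```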
Question 45 of 50
“How does the company think about keeping compensation at market? What happens if the market moves?”
Why ask: Comp that was competitive in 2021 might be 30% below market today in AI engineering. Companies that regularly benchmark and adjust proactively retain people. Companies that wait for employees to bring competing offers create a culture where the best people leave to get market corrections externally.
Good answer: “We run a comp benchmarking study every 6 months and proactively adjust. We don’t wait for people to bring outside offers—it’s embarrassing if it gets to that point.”
Red flag: “We do comp reviews at annual performance reviews”—annual cycles in fast-moving markets mean guaranteed drift below market.
Psychological safety—the belief that you won’t be punished for speaking up, making mistakes, or disagreeing—is one of the strongest predictors of team performance. It’s also one of the easiest things to fake in an interview. These questions find the real answer.
Question 46 of 50
“Tell me about the last significant mistake you made on this team and what happened afterward.”
Why ask: In psychologically safe cultures, people talk about their mistakes openly because there’s no blame attached to them. The willingness to share a real mistake in an interview context tells you something. The story itself—what the team response was, whether there was blame or learning—tells you the rest.
Good answer: “I accidentally took down a service last year by pushing a bad config change. We did a blameless postmortem, added a staging check, and I wrote up the learnings for the whole team. Nobody made me feel bad about it.”
Red flag: Refusal to give a personal example, or a story where the resolution involved punishment, blame, or the person having to “prove themselves” afterward.
Question 47 of 50
“Has anyone been recognized or rewarded recently for raising a concern or flagging a problem early?”
Why ask: Positive reinforcement for speaking up is the clearest signal of psychological safety. If raising concerns is rewarded, the culture is actively building safety. If it’s merely not punished, the safety is passive. If it’s punished, the culture is hostile regardless of what the careers page says.
Good answer: “Yes—a junior engineer flagged a security issue in a third-party dependency six months before it became a published CVE. She got a public shout-out in all-hands and a spot bonus.”
Red flag: “We have an open-door policy and people feel comfortable raising things”—without a concrete example of it being rewarded.
Question 48 of 50
“How does the team handle postmortems? Walk me through a recent one.”
Why ask: Postmortem culture is the most reliable institutional measure of psychological safety. Blameless postmortems—where the process is examined rather than the person—require genuine psychological safety to work. If postmortems are just blame sessions with extra steps, the underlying safety is absent.
Good answer: “We follow a blameless postmortem template: timeline, contributing factors, action items. The doc is published internally. Nobody’s name goes next to a failure as the cause—only system factors.”
Red flag: “We do postmortems after incidents” with no specifics on blamelessness, or a story where a specific person’s mistake was the “root cause.”
Question 49 of 50
“If I thought a strategic direction the company was taking was a mistake, what would be the right way to raise that?”
Why ask: This tests whether dissent has a legitimate channel at the organizational level, not just the team level. Companies with strong psychological safety have clear mechanisms for IC-level strategic disagreement: open Q&A, anonymous feedback, skip-level meetings. Companies without it expect alignment and compliance.
Good answer: “Write it up and share it. We have an internal RFC process that anyone can use. If it’s urgent, you could DM the relevant VP or bring it up in the weekly open Q&A. People do this and get real responses.”
Red flag: “I’d probably raise it with my manager first and see what they think”—suggesting there’s no legitimate channel for direct strategic input from ICs.
Question 50 of 50
“What’s something about the culture here that you wish were different?”
Why ask: This is the purest test of an interviewer’s own psychological safety. Someone who feels genuinely safe in their culture will answer this honestly—because they know the organization is capable of hearing it. Someone who doesn’t feel safe will give a polished non-answer. The meta-signal is as important as the content.
Good answer: Any honest, specific answer. “Roadmap visibility is still too opaque” or “we could do better at welcoming new hires”—something real and improvable, not an attack but not a dodge either.
Red flag: “Honestly I can’t think of anything, I’m really happy here” or an immediate pivot to a strength. Nobody works anywhere with zero criticisms.
The Meta-Skill: Listen to How They Answer, Not Just What They Say
The best interviewers lean forward when you ask these questions. They laugh when they remember a good example. They pause to think of a real person’s name. They sometimes say “that’s a good question” not as a stall but because they genuinely haven’t been asked it before. The worst answers are polished non-answers delivered in under 3 seconds. If someone can answer “when was the last time someone killed a feature?” in 2 seconds with a generic culture statement, they’re not thinking about reality—they’re managing your perception of it. Trust your gut on the difference.
Frequently Asked Questions
What are the best questions to ask about company culture in an interview?
The best culture interview questions are specific, behavioral, and hard to spin. Instead of asking “What’s the culture like here?” (which invites marketing speak), ask questions that force concrete examples: “When was the last time an engineer killed a product feature based on technical concerns?”, “How responsive are people to Slack after 6pm?”, or “What did you personally learn in the last 6 months that you didn’t know before?” The more specific the question, the less room there is for a polished non-answer.
How many questions should you ask an interviewer about culture?
Pick 3–5 culture questions per interview and prioritize them by the values that matter most to you. Asking all 50 questions on this list would be impractical and off-putting. Instead, identify your top 2–3 culture values (e.g., remote-first, engineering-driven, flat hierarchy), and ask 1–2 targeted questions per value. Save the rest for the offer stage, when you can ask hiring managers or future teammates directly.
What are red flags in interviewer answers about culture?
Key red flags include: vague, marketing-speak answers without concrete examples (“We’re very collaborative and supportive”); hesitation or inconsistency between interviewers; pivoting from your question to company achievements; and answers that describe the ideal rather than the reality (“We aspire to be remote-first”). A good interviewer who believes in their culture will answer culture questions eagerly and with specifics. Evasiveness is itself data.
When in the interview process should you ask culture questions?
Early-round culture questions (recruiter screen, hiring manager) should focus on basics: remote policy, team structure, pace. Mid-process interviews with potential teammates are the best time for depth: ask about day-to-day reality, psychological safety, and learning culture. At the offer stage, you have maximum leverage—ask the hard questions about comp transparency, equity structure, and anything unresolved. Never save all your culture questions for a single conversation.
How do you research company culture before an interview?
Start with the company’s culture profile on JobsByCulture, which aggregates Glassdoor ratings, culture values, and employee reviews into a single view. Then check Glassdoor directly for recent reviews (last 6 months), filtering for your role type. Read the engineering blog if one exists. Look at the careers page for signals like specific remote policies, stated values, or documented processes. Check LinkedIn to see how long employees stay—high turnover is a culture signal. Finally, search “[Company name] culture” on Hacker News and Reddit for unfiltered takes.