Together AI has quietly become one of the most important infrastructure companies in the AI ecosystem. While Anthropic and OpenAI build frontier models and dominate the headlines, Together AI is building the platform that lets everyone else run open-source models at scale. Co-founded by Vipul Ved Prakash and Ce Zhang, the company has grown from a research project into a full-stack AI cloud with $300M in annualized revenue and a $3.3B valuation.
But what is it actually like to work there? We pulled Glassdoor data, employee feedback, compensation signals, and culture indicators to give you the most complete picture of working at Together AI in 2026. Whether you’re weighing an offer, prepping for an interview, or just curious about what an AI infrastructure startup looks like from the inside, here’s what we found.
The Numbers at a Glance
Before we go deeper, here are the numbers that matter.
| Metric | Detail |
|---|---|
| Founded | 2022 |
| Headquarters | San Francisco, CA |
| Company Size | ~300 employees |
| Glassdoor Rating | 4.1 / 5.0 (4 reviews) |
| Work-Life Balance | 3.8 / 5.0 |
| Valuation | $3.3B (Feb 2025) |
| Total Funding | $554M across 5 rounds |
| Revenue | $300M annualized (Sep 2025) |
| Revenue Growth | ~130% YoY |
| Recommend to Friend | ~82% (estimated) |
| CEO | Vipul Ved Prakash |
A 4.1 Glassdoor rating is solid but comes with a major caveat: Together AI has only 4 Glassdoor reviews. That’s an extremely small sample size, which means any single review — positive or negative — can swing the numbers significantly. For comparison, Anthropic has 222 reviews behind its 4.4 rating, which gives much more statistical confidence. Take Together AI’s ratings as directional indicators, not gospel.
The 4.2 culture and values score is the highest of the sub-scores, which aligns with what we know about the company’s open-source DNA and technical culture. The 3.8 across work-life balance, career opportunities, and senior management is consistent — decent but not exceptional. The 4.0 compensation score suggests competitive pay, though not at the extreme top of the AI market.
What Makes Together AI Different
Together AI occupies an unusual and increasingly important position in the AI landscape. While companies like Anthropic and OpenAI are building proprietary frontier models, Together AI is building the infrastructure that makes open-source models accessible and economically viable. They call it the “AI Acceleration Cloud” — a full-stack platform for training, fine-tuning, and deploying open-source AI models.
The company was co-founded by Vipul Ved Prakash and Ce Zhang, bringing together deep expertise in both systems engineering and AI research. This dual DNA matters: Together AI isn’t just renting out GPUs. They’re building sophisticated infrastructure that optimizes how models run, from kernel-level CUDA code up to the API layer.
The business metrics tell a compelling story. Together AI hit $300M in annualized revenue by September 2025, up from $130M at the end of 2024 — that’s roughly 130% year-over-year growth. They’ve raised $554M across five funding rounds, reaching a $3.3B valuation in February 2025. With ~300 employees, that’s roughly $1M in annualized revenue per employee — an impressive efficiency metric for an infrastructure company.
Their key open-source contributions have been genuinely impactful:
- FlashAttention — A breakthrough that speeds up LLM training by up to 9x, achieving 72% model FLOPs utilization on A100 GPUs. This has been adopted across the entire AI industry.
- RedPajama — An open-source dataset project aimed at making high-quality training data accessible to everyone, not just companies that can afford to build their own data pipelines.
- 200+ models available via their unified API — from Llama to Mistral to specialized fine-tuned variants, all served through Together’s optimized inference infrastructure.
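The core idea behind FlashAttention is worth a moment: attention is computed in tiles, carrying running max and sum statistics, so the full attention matrix is never materialized in memory. Here is a toy, single-row sketch of that online-softmax trick in pure Python — an illustration of the numerical technique only, not Together’s CUDA kernels.

```python
import math

def streaming_attention_row(scores, values):
    """softmax(scores) . values in a single pass, never storing the
    full softmax vector: the online-softmax trick behind FlashAttention's
    tiling (a toy scalar version, not the real fused kernel)."""
    m = float("-inf")  # running max, for numerical stability
    s = 0.0            # running sum of exp(score - m)
    acc = 0.0          # running softmax-weighted sum of values
    for x, v in zip(scores, values):
        m_new = max(m, x)
        scale = math.exp(m - m_new)  # rescale old statistics to new max
        w = math.exp(x - m_new)      # weight of the incoming score
        s = s * scale + w
        acc = acc * scale + w * v
        m = m_new
    return acc / s

def naive_attention_row(scores, values):
    """Reference two-pass version that materializes the full softmax."""
    m = max(scores)
    weights = [math.exp(x - m) for x in scores]
    z = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / z
```

The streaming version produces the same result as the two-pass reference, which is exactly why the real kernel can process attention tile by tile without ever holding the quadratic score matrix in GPU memory.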
The core value proposition is straightforward: companies that want to use open-source models but don’t want to build and manage their own GPU clusters can use Together AI instead. It’s the AWS of open-source AI — and in a world where more and more organizations are choosing open models over proprietary APIs, that’s a large and growing market.
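To make the unified-API value proposition concrete, here is a sketch of what a chat-completion request looks like from the customer side. Together’s API follows the common OpenAI-compatible chat-completions shape; the endpoint URL and model slug below are illustrative assumptions, so check the official API docs for the real details.

```python
import json

# Assumption for illustration: an OpenAI-compatible chat-completions
# endpoint. The URL and model slug are placeholders, not verified docs.
API_URL = "https://api.together.xyz/v1/chat/completions"

def build_chat_request(model, prompt, api_key):
    """Build the HTTP pieces for a chat-completion call. Swapping
    `model` is all it takes to target a different hosted model --
    the request shape stays identical across all of them."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return API_URL, headers, body
```

That uniformity is the point: one request format, one bill, one SDK, regardless of which open-source model sits behind it.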
The Culture: Research Lab Meets Cloud Infrastructure
Together AI’s culture sits at an unusual intersection: it’s a GPU infrastructure company with a genuine research lab. Most cloud infrastructure companies are pure engineering operations — they optimize for reliability, latency, and cost. Together AI does all of that, but also publishes research papers and contributes to open-source projects that advance the field.
This research-meets-infrastructure dynamic shapes the culture in several ways. Based on employee feedback and our analysis of Together AI’s culture profile, a few recurring themes stand out.
At ~300 employees, Together AI still has genuine startup energy. The team is small enough that individual contributions are visible, and the flat hierarchy means engineers have real ownership over their work. The many-hats mentality is unavoidable at this size — you won’t just be working on one narrow problem. Engineers routinely cross boundaries between kernel optimization, distributed systems, and API design.
The open-source DNA is a genuine differentiator. FlashAttention and RedPajama aren’t marketing projects — they’re real contributions that thousands of teams use in production. For engineers who care about building things that the broader community benefits from, this is meaningful. It’s one thing to work on proprietary infrastructure; it’s another to work on infrastructure that becomes part of the open-source ecosystem.
The engineering-driven culture means technical decisions come from the people doing the work, not from product managers or executives. Combined with ship-fast velocity, the result is a company that moves quickly on hard technical problems — a cadence that attracts strong engineers but can feel chaotic to people who prefer more structure.
Engineering Culture & Tech Stack
Together AI’s engineering challenges are among the hardest in the infrastructure space. The company operates at the intersection of distributed systems, GPU programming, and machine learning — a combination that requires deep technical breadth.
Tech Stack
The stack reflects the full-stack nature of the work. CUDA and C++ for GPU kernel optimization — this is where FlashAttention lives. Python and PyTorch for model fine-tuning and the research layer. Kubernetes and Ray for orchestrating inference across thousands of GPUs. This isn’t a typical web application stack; it’s systems programming meets ML infrastructure.
Core Engineering Challenges
The technical work at Together AI spans several areas:
- Inference optimization — The ATLAS system optimizes how models run across GPU clusters, squeezing maximum throughput from expensive hardware. This involves custom CUDA kernels, memory management, and scheduling algorithms.
- GPU cluster management — Coordinating thousands of GPUs for training and inference, handling failures gracefully, and maximizing utilization. This is distributed systems engineering at serious scale.
- Model fine-tuning pipelines — Building tools that let customers fine-tune open-source models on their own data efficiently and cost-effectively.
- Serverless inference — Providing API endpoints that scale automatically, with cold start optimization and intelligent model caching.
- Research — FlashAttention achieving 72% model FLOPs utilization on A100s is a genuine research contribution, not just engineering. The team publishes regularly.
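The “intelligent model caching” piece of serverless inference can be sketched as a simple LRU policy: keep hot models resident, evict the least-recently-used one when capacity fills. This is a toy illustration of the general idea, not Together’s actual scheduler; `load_fn` stands in for the expensive weight-loading step that makes cold starts slow.

```python
from collections import OrderedDict

class ModelCache:
    """Toy LRU cache for loaded models. Hits are cheap; misses
    trigger a simulated cold start via load_fn, and the
    least-recently-used model is evicted once capacity is exceeded."""
    def __init__(self, capacity, load_fn):
        self.capacity = capacity
        self.load_fn = load_fn
        self._cache = OrderedDict()
        self.loads = 0  # number of cold starts so far

    def get(self, model_name):
        if model_name in self._cache:
            self._cache.move_to_end(model_name)  # mark as most recent
            return self._cache[model_name]
        model = self.load_fn(model_name)         # cold start
        self.loads += 1
        self._cache[model_name] = model
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)      # evict least recent
        return model
```

Production systems layer much more on top (memory-aware eviction, predictive prefetch, multi-tenant fairness), but the cache-hit-versus-cold-start tradeoff this sketch captures is the core economics of serverless inference.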
The technical bar is very high. The company serves 200+ models through a unified API, which means the inference engine must be flexible enough to handle different model architectures while remaining optimized for each. For engineers who want to work on genuinely hard problems at the intersection of systems and ML, this is one of the strongest environments available.
For engineers who value this type of engineering-driven culture, Together AI ranks among the most technically challenging workplaces in our Culture Directory.
Compensation & Benefits
Compensation is rated 4.0/5.0 on Glassdoor, which puts Together AI in solid territory — competitive but not at the very top of the AI market. For context, Anthropic scores 4.8 on compensation and pays $300k–$490k TC for engineers. Together AI, as a smaller and earlier-stage company, likely sits below that range but compensates with meaningful equity in a high-growth business.
A few things worth considering about Together AI’s compensation structure:
- Equity is the key differentiator. At $3.3B valuation with $300M in revenue and 130% year-over-year growth, Together AI equity carries significant upside potential. If the company continues on this trajectory, early and mid-stage equity grants could be very valuable.
- San Francisco-based comp levels. As an SF-headquartered company, base salaries are calibrated to the Bay Area market. This is competitive for AI infrastructure roles.
- Startup-stage benefits. Benefits are typical of a well-funded SF startup — health insurance, PTO, and standard perks. Don’t expect the Googleplex, but the $554M in funding means the company isn’t cutting corners.
- Revenue efficiency. $300M in revenue with ~300 employees means the company is generating real money, not just burning venture capital. This is a meaningful signal for equity value.
The bottom line on comp: if you’re optimizing purely for cash compensation today, larger companies like Anthropic or big tech labs will pay more. If you’re willing to take some cash-comp risk in exchange for equity in a high-growth infrastructure company, Together AI’s package becomes more attractive. The $300M revenue and 130% growth make the equity story more credible than most startups at this stage.
Work-Life Balance
Work-life balance is rated 3.8/5.0, which lands Together AI in the middle of the pack for AI companies. It’s better than Anthropic (3.7), OpenAI (3.6), and significantly better than Scale AI (2.7). But it’s not in the relaxed territory of companies like Notion or HubSpot.
The 3.8 score makes intuitive sense given Together AI’s position. The company is racing to capture the open-source AI infrastructure market while competitors like AWS, Google Cloud, and Azure pour resources into their own AI platform offerings. That creates urgency. But unlike frontier model labs that are racing to build AGI, Together AI’s work is infrastructure — which tends to have more predictable timelines and fewer existential-deadline-driven crunch periods.
At ~300 employees managing rapid revenue growth (from $130M to $300M in a year), every person is doing more than their job title suggests. That many-hats reality means the work can bleed into evenings and weekends, especially during product launches or when managing infrastructure at scale. But the absence of “we need to ship this model before the competitor does” pressure likely keeps the intensity below what you’d experience at a frontier lab.
If work-life balance is your top priority, explore our WLB rankings for companies that score 4.0+. If you’re comfortable with moderate intensity at a high-growth company, Together AI is in a reasonable range.
Career Growth & Concerns
Career opportunities are rated 3.8/5.0 — decent but not a standout. This is where Together AI’s youth as a company becomes most visible. With ~300 employees and rapid growth, career paths, promotion criteria, and management layers are still being defined.
There are a few specific concerns worth flagging, drawn from the limited Glassdoor data and broader signals:
- “Closed circle leadership” is a phrase that appears in Glassdoor reviews. This suggests potential inner-circle dynamics where influence and decision-making access may not be evenly distributed. In a company founded by researchers, it’s not uncommon for the founding technical team to retain outsized influence — but it can create frustration for people outside that circle.
- Interview process issues. Some candidates have reported being ghosted during the hiring process. This is a common problem at fast-growing startups where recruiting infrastructure hasn’t caught up to hiring volume, but it’s worth noting as a signal about organizational maturity.
- Limited transparency. With only 4 Glassdoor reviews, it’s hard to get a clear picture of the internal culture. Most 300-person companies have significantly more reviews, which raises the question of whether employees are discouraged from posting external feedback or simply haven’t gotten around to it.
- People processes still forming. At this stage, expect that performance reviews, career ladders, and promotion processes are either very new or don’t exist yet in some teams. This is normal for the company’s age but means your career growth depends more on your manager and your own initiative than on institutional frameworks.
The upside of working at a young, fast-growing company is that career trajectories can be nonlinear. People who join early and perform well often end up in leadership roles that wouldn’t be available at larger companies. The downside is that there’s less structure to support your growth if you need it.
What Employees Love
The recurring theme is technical quality. Together AI attracts engineers and researchers who care about building sophisticated infrastructure, not just wrapping APIs. The combination of open-source contribution, revenue growth, and genuinely hard technical problems creates an environment where strong engineers can do their best work while building something with real business value.
What Employees Warn About
The concern about “closed circle leadership” deserves attention. In a company where the founders are respected researchers, it’s easy for the founding team’s technical preferences and personal relationships to create an implicit hierarchy that exists alongside the formal one. If you’re considering Together AI, it’s worth asking specific questions during the interview process about how decisions are made, who has influence, and how the company handles disagreement.
The competitive landscape concern is real but cuts both ways. AWS, Google Cloud, and Azure have essentially unlimited resources, but they’re also slow-moving and focused on many products at once. Together AI’s advantage is focus and speed — they do one thing (open-source AI infrastructure) and do it well. Whether that focus is enough to win long-term is the existential question for the business.
How Together AI Compares
Here’s how Together AI stacks up against other AI infrastructure and frontier companies in our Culture Directory.
| Company | Glassdoor | WLB | Employees | Valuation | Open Roles |
|---|---|---|---|---|---|
| Together AI | 4.1 | 3.8 | ~300 | $3.3B | 46 |
| CoreWeave | 3.6 | 3.2 | ~1,200 | $50B+ | 275 |
| Modal | 4.0 | 4.0 | ~50 | — | 29 |
| Anthropic | 4.4 | 3.7 | ~1,500 | $61.5B | 443 |
| Scale AI | 3.5 | 2.7 | ~1,000 | $14B | 166 |
Together AI compares favorably on Glassdoor rating and work-life balance against CoreWeave (the GPU cloud giant) and Scale AI. It’s slightly behind Anthropic on overall rating but ahead on WLB. Modal, another developer infrastructure company, scores similarly but is much smaller (~50 employees). For a detailed side-by-side, use the interactive comparison tool.
The key differentiator in this comparison is Together AI’s position as an open-source-first infrastructure company. CoreWeave is primarily a GPU cloud provider without the research component. Modal is developer infrastructure but focused on serverless compute rather than AI specifically. Together AI occupies a unique niche: infrastructure built by researchers, optimized for open-source models, with real revenue to back it up.
Open Roles at Together AI
Together AI currently has 46 open positions spanning engineering, research, sales, and operations. For a ~300-person company, that’s a significant hiring push — roughly 15% headcount growth from current openings alone.
Key hiring areas include:
- Infrastructure Engineers — GPU cluster management, distributed systems, and Kubernetes orchestration
- ML/AI Engineers — Inference optimization, model fine-tuning pipelines, and CUDA kernel development
- Research Scientists — Working on projects like FlashAttention, training efficiency, and model architecture
- Sales & Go-to-Market — Enterprise sales for the AI Acceleration Cloud platform
- Operations — Scaling the company’s internal systems as headcount and revenue grow
For the full list of live openings with filters, visit the Together AI jobs page or explore the Together AI culture profile.
The Bottom Line
Choose Together AI if you want to work on open-source AI infrastructure with brilliant researchers, strong revenue growth, and genuinely hard technical challenges. The $300M revenue and 130% growth rate put the company in a strong financial position, and the equity upside at $3.3B valuation is real. But expect a young company still figuring out its leadership dynamics and people processes — and do your due diligence on team culture before joining. With only 4 Glassdoor reviews, the internal culture is less transparent than most companies at this scale.
Together AI is not for everyone. It’s not the highest-paying option, it’s not the most structured, and it doesn’t have the name recognition of an Anthropic or OpenAI. What it offers is a rare combination: the chance to work on hard infrastructure problems with talented researchers, contribute to open-source projects that the entire AI community uses, and do it at a company with real revenue and growth. For the right engineer, that’s a compelling package.
Explore Together AI opportunities
See all 46 open roles at Together AI or browse jobs from 35+ AI & tech companies in our culture directory.
Explore all 46 Together AI jobs → Read the full culture profile →