DataOps Engineer
Job description
CoreWeave is The Essential Cloud for AI™. Built for pioneers by pioneers, CoreWeave delivers a platform of technology, tools, and teams that enables innovators to build and scale AI with confidence. Trusted by leading AI labs, startups, and global enterprises, CoreWeave combines superior infrastructure performance with deep technical expertise to accelerate breakthroughs and turn compute into capability.
Founded in 2017, CoreWeave became a publicly traded company (Nasdaq: CRWV) in March 2025. Learn more at www.coreweave.com.
We're proud to be a Living Wage accredited Employer.
What You’ll Do
The Monolith AI Platform Engineering Team at CoreWeave is responsible for building and scaling the data and workflow backbone that powers the world’s most advanced engineering simulation and AI workflows — our ambition is to become the super‑intelligent AI test lab for the engineering industry, helping customers ship science, faster. From high‑throughput data ingestion and feature pipelines to model training and real‑time inference, our platform delivers the performant, reliable, and trustworthy data foundation trusted by the world’s largest engineering companies.
The Senior DataOps Engineer II will own and drive all things data observability and operations across our client estate — building the practices, tooling, and culture that make Monolith’s data flows debuggable, auditable, and safe to evolve. You’ll sit at the intersection of platform engineering, data engineering, and reliability, implementing end‑to‑end lineage and DataOps practices while mentoring data producers and consumers on how to manage data as a first‑class product.
You’ll partner closely with Monolith’s Product, Engineering and forward‑deployed teams, as well as with CoreWeave’s infrastructure and AI platform groups, to turn fragmented, real‑world engineering data into well‑governed, observable, and operationally robust pipelines powering our SaaS platform and client‑specific deployments.
About The Role
We’re seeking a Senior DataOps Engineer II who can act as the hands‑on owner for Monolith’s data observability and operational surface: from batch and streaming pipelines running on our platform, through to the lineage, quality, and runbooks that keep customer environments healthy.
You’ll define and roll out DataOps practices (CI/CD, infra‑as‑code, data SLOs, incident response) across the Monolith estate, implement end‑to‑end data lineage and observability, and serve as the go‑to mentor for engineering teams and client‑facing colleagues on best‑practice data management.
In This Role, You Will
- Operations Surface
- Governance
- Establish CI/CD patterns for data pipelines and related infrastructure, including testing strategies, promotion workflows, and change‑management guardrails.
- Drive adoption of infra‑as‑code for data infrastructure (e.g., pipeline orchestration, storage, observability components), reducing manual drift across environments.
- Define and continuously improve DataOps processes — incident response, post‑incident review, change review, on‑call rotations — with a focus on learning rather than blame.
- Evaluate and integrate best‑of‑breed DataOps and observability tooling where it accelerates our teams, balancing build vs. buy pragmatically.
- Partner Across Monolith, CoreWeave & Clients
- Work with Monolith platform, data, agent, and reliability teams to expose observability and lineage as shared services and patterns other engineers can build on.
- Collaborate with CoreWeave infrastructure and AI platform teams to leverage underlying storage, compute, networking, and observability in service of robust data flows.
- Serve as a technical escalation point for forward‑deployed and customer‑facing engineers when data issues cross service boundaries or require deeper architectural insight.
- Mentor data producers (product teams, integrations, forward‑deployed engineers) and data consumers (data scientists, analysts, client engineers) on resilient schemas, contracts, and operational practices.
Who You Are
- Level
- Tooling
- Data Quality
- Automation
- Comfortable working in containerised, cloud‑native environments (Kubernetes plus at least one major cloud provider); experience with GPU‑ or compute‑intensive workloads is a bonus.
- Strong automation mindset: infra‑as‑code, CI/CD, and configuration management for data infrastructure and observability components.
- Proficient in Python for building tooling, pipeline glue, and platform integrations; additional languages are a plus.
- Collaboration, Mentorship & Communication
- Clear communicator who can explain complex data flows and failure modes to both deeply technical and non‑specialist audiences.
- Experience mentoring engineers and data practitioners on better data management, observability, and operational hygiene — through documentation, examples, reviews, and office hours.
- Comfortable working in a fast‑moving, high‑ambiguity environment where we balance rapid iteration with the safety and reliability demanded by enterprise engineering clients.
Preferred
- Experience in ML/AI platforms or MLOps environments where data pipelines power experimentation, training, and inference at scale.
- Background with test, simulation, or time‑series data (e.g., physical test benches, battery labs, automotive/aerospace R&D).
- Familiarity with feature stores, experiment tracking, or model registries and their interaction with upstream data pipelines.
- Prior work in multi‑tenant SaaS platforms, especially those with strong compliance, observability, and uptime requirements.
- Experience supporting or partnering closely with forward‑deployed / professional services teams in complex customer environments.
We believe in investing in our people and value candidates who bring diverse experiences, even if you don’t tick every single box. Here are a few qualities we’ve found compatible with our team.
If Some of This Sounds Like You, We’d Love to Talk
- Data‑obsessed operator – You care deeply about making data systems observable, predictable, and easy to reason about, not just “working most of the time.”
- Systems thinker – You enjoy mapping complex data flows across services, understanding failure modes, and designing for graceful degradation and rapid recovery.
- Pragmatic – You know when to build the ideal abstraction and when to ship the smallest change that meaningfully reduces risk or toil.
- Collaborative mentor – You get energy from helping other teams level up their data practices, and you can influence without heavy process or authority.
- Owner’s mindset – You feel responsible for the outcomes of the platform as a whole, not just the code you write, and you follow issues across boundaries until they’re truly resolved.
About
At CoreWeave, we work hard, have fun, and move fast! We’re in an exciting stage of hyper‑growth that you will not want to miss out on. We’re not afraid of a little chaos, and we’re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:
- Be Curious at Your Core
- Act Like an Owner
- Empower Employees
- Deliver Best‑in‑Class Client Experiences
- Achieve More Together
To fulfil our obligation to protect client data, successful applicants offered employment with CoreWeave will be required to complete a basic criminal record check, conducted in compliance with GDPR. Employment offers are conditional upon receiving satisfactory check results.
What We Offer
In addition to a competitive salary, we offer a variety of benefits to support your needs, including:
- Family-level Medical Insurance
- Family-level Dental Insurance
- Generous Pension Contribution
- Life Assurance at 4x Salary
- Critical Illness Cover
- Employee Assistance Programme
- Tuition Reimbursement
- Work culture focused on innovative disruption
CoreWeave is an equal opportunity employer, committed to fostering an inclusive and supportive workplace. All qualified applicants and candidates will receive consideration for employment without regard to race, color, religion, sex, disability, age, sexual orientation, gender identity, national origin, veteran status, or genetic information.
CoreWeave does not accept speculative CVs. Any unsolicited CVs received will be treated as the property of CoreWeave, and your Terms & Conditions associated with the use of CVs will be considered null and void.
Any unsolicited CVs sent by your company to us – that is to say, in any situation where we have not directly engaged your company in writing to supply candidates for a specific vacancy – will be considered by us to be a “free gift”, leaving us liable for no fees whatsoever should we choose to contact the candidate directly and engage the candidate’s services, and will in no way establish any prior claim by your company to representation of that candidate should the candidate’s details also be submitted by any other party.
Export Control Compliance
This position requires access to export controlled information. To conform to U.S. Government export regulations applicable to that information, applicants must either be (A) a U.S. person, defined as (i) a U.S. citizen or national, (ii) a U.S. lawful permanent resident (green card holder), (iii) a refugee under 8 U.S.C. § 1157, or (iv) an asylee under 8 U.S.C. § 1158; (B) eligible to access the export controlled information without a required export authorization; or (C) eligible and reasonably likely to obtain the required export authorization from the applicable U.S. government agency. CoreWeave may, for legitimate business reasons, decline to pursue any export licensing process.
Such processing is legally permissible under Art. 6(1)(f) of (i) Regulation (EU) 2016/679 (General Data Protection Regulation (“GDPR”) and (ii) the GDPR as it forms part of the laws of the UK (“UK GDPR”), as necessary for the purposes of the legitimate interests pursued by the Controller, which are the solicitation, evaluation, and selection of applicants for employment. Your personal data will be shared with Greenhouse Software, Inc., a cloud services provider located in the United States of America and engaged by Controller to help manage its recruitment and hiring process on Controller’s behalf. With respect to transfers originating from the UK or the European Economic Area ("EEA") to a country outside the UK or the EEA, we implement the appropriate transfer mechanism(s) and other appropriate solutions to address cross-border transfers as required by applicable law.
You may request a copy of the suitable mechanisms we have in place by contacting us at privacy@coreweave.com. Your personal data will be retained by Controller as long as Controller determines it is necessary to evaluate your application for employment. Where permitted by applicable law, we may also retain your personal data for a limited period after the recruitment process ends in order to consider you for future job opportunities, respond to legal claims, or comply with record-keeping obligations. Under the GDPR and the UK GDPR, you have the right to request access to your personal data, to request that your personal data be rectified or erased, and to request that processing of your personal data be restricted. You also have the right to data portability. In addition, you may lodge a complaint with the relevant supervisory authority: (i) a list of Europe’s data protection authorities can be found
here; and (ii) for the UK, this is the Information Commissioner's Office. For additional information, please see our Privacy Policy.