Sr. Data Scientist - Capacity Data
CoreWeave
Compensation
$143,000 - $210,000/year
Job Description
Job Summary
CoreWeave is seeking a Sr. Data Scientist to architect and scale the data infrastructure that serves as the backbone of our Integrated Planning and Capacity teams. For an AI hyperscaler, the ability to manage GPU, CPU, and storage assets in real time is the difference between leading the market and falling behind.
Currently, our systems are in a nascent, "bare minimum" stage. You will be responsible for leading the transition to a centralized, production-grade database system. This is a heavily software-engineering-focused role in which you will build the pipelines and transformations necessary to unify disparate metrics—ranging from power utilization and node health to customer capacity allocation data—into a single, accurate source of truth. Your work will directly enable the dashboards used by operational planners and executives, and provide the high-fidelity data required to develop autonomous, data-driven agents for quantitative decision-making.
Key Responsibilities
- Architect Scalable Pipelines: Design, build, and maintain robust end-to-end data pipelines and transformations that aggregate data from across the organization, including data center operations, power metrics, node provisioning, and finance.
- Enable Advanced Analytics: Build and optimize the data structures that power real-time and near-real-time dashboards for operational capacity and demand managers.
- Support AI Agent Development: Ensure that data pipelines are architected to support the ongoing development of quantitative AI agents designed to synthesize complex data and inform high-stakes infrastructure decisions.
- Cross-Functional Synthesis: Partner with Data Center Operations, Supply Chain, Product, and Finance to ensure technical pipelines accurately reflect the physical realities of data center bring-up and node health.
- System Robustness: Implement rigorous testing, monitoring, and quality-control frameworks to ensure the accuracy of metrics like power utilization, storage consumption, and quality fall-out.
Qualifications & Skills
Required:
- Education: Master’s or PhD in Computer Science, Computer Engineering, Operations Research, or a related quantitative field.
- Experience: 6+ years of professional experience in data engineering or data science, with a heavy emphasis on building production-grade planning systems (ERP, MRP, or similar operations tools).
- Software Engineering Expertise: Expert-level proficiency in Python and SQL is mandatory. Experience with modern data stack tools (e.g., dbt, Airflow, Snowflake/BigQuery) and software development best practices (version control, CI/CD).
- Systems Thinking: Proven track record of taking "weak" or unstructured data environments and scaling them into robust, automated architectures.
- Attention to Detail: A fanatical focus on data accuracy and the ability to audit complex transformations for logical consistency.