Evnek - Data Platform Engineer

Technology
Evnek
Pune, India · Posted 3 weeks ago · Open until 6/6/2026
Full time

Job description

Job Title: Data Platform Engineer

Experience: 5 to 8 Years

Location: Pune

Notice Period: Immediate Joiners Only

About the Role:

We are seeking a skilled and driven Data Platform Engineer to design, build, and optimize scalable data platforms and pipelines. The ideal candidate will have strong expertise in Terraform, CI/CD, and DevOps practices, along with hands-on experience in AWS and Snowflake environments.

In this role, you will be responsible for building robust data infrastructure while ensuring high performance, reliability, and cost efficiency. You will also play a key role in troubleshooting pipelines and collaborating with cross-functional teams to enhance the overall data ecosystem.

Key Responsibilities:

Platform Design & Development:

  • Collaborate with the Data Tech Lead to architect and build next-generation data platforms using AWS and Snowflake.
  • Design scalable, secure, and high-performance data infrastructure.

Infrastructure as Code (IaC):

  • Develop and maintain infrastructure using Terraform.
  • Ensure automation, scalability, and reliability of data systems.

CI/CD & DevOps:

  • Build and manage CI/CD pipelines for data infrastructure and workflows.
  • Apply DevOps best practices across the data platform lifecycle.

Data Pipeline Engineering:

  • Design, develop, and maintain scalable ETL/ELT pipelines using Python and SQL.
  • Work with orchestration and transformation tools such as Apache Airflow, dbt, or SQLMesh.

Monitoring & Troubleshooting:

  • Proactively monitor data pipelines and infrastructure.
  • Troubleshoot issues to ensure high availability and adherence to SLAs.

Performance & Cost Optimization:

  • Optimize Snowflake queries, storage, and compute usage.
  • Improve efficiency and reduce costs within the AWS ecosystem.

Collaboration & Mentorship:

  • Work closely with engineering teams and stakeholders.
  • Provide technical guidance and contribute to best practices.

Required Skills & Qualifications:

  • Strong experience with Snowflake, including its architecture and data platform components
  • High proficiency in Terraform for Infrastructure as Code
  • Hands-on experience with CI/CD pipelines and GitHub (including GitHub Actions)
  • Strong programming skills in Python
  • Advanced knowledge of SQL
  • Experience with Apache Airflow or Amazon Managed Workflows for Apache Airflow (MWAA)
  • Good understanding of AWS services such as S3 and Lambda
  • Experience in building and maintaining scalable data pipelines
  • Strong problem-solving skills and ability to work independently in a fast-paced environment
Keywords
Data Engineering, AWS, Snowflake, Data Infrastructure, Data Pipeline, ETL, Python, SQL, dbt (Data Build Tool), Apache Airflow, Orchestration, Scalability, DevOps, CI/CD, Big Data
