Senior Data DevOps Engineer (GCP)
Job Description
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture.
Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are looking for an experienced Senior Data DevOps Engineer with expertise in Google Cloud Platform to strengthen our growing team. You will focus on managing cloud data infrastructure, automating workflows, and enhancing data operations. Join us and contribute to building efficient and scalable data solutions.
Responsibilities
- Design and implement cloud data infrastructure using GCP services such as Dataflow, BigQuery, and Cloud Composer
- Deploy and manage Infrastructure as Code with Terraform to automate provisioning and monitoring
- Collaborate with data engineers to develop automated data pipelines using Python
- Establish CI/CD pipelines using Jenkins, GitLab CI, or GitHub Actions for smooth deployments
- Optimize data platform performance and reliability in collaboration with cross-functional teams
- Configure cloud data tools such as Apache Spark and Apache Airflow
- Troubleshoot and resolve issues related to scalability and reliability of cloud data systems
Requirements
- Minimum of 3 years' experience working with GCP services, including BigQuery, Cloud Composer, and Dataproc
- Proficiency in Python programming and strong SQL skills for pipeline management
- Experience with Infrastructure as Code tools like Terraform or CloudFormation
- Knowledge of CI/CD pipeline tools such as Jenkins, GitHub Actions, or GitLab CI
- Familiarity with Linux operating systems and shell scripting
- Understanding of networking protocols including TCP/IP, DNS, and NAT
- Competence in using data processing tools like Apache Spark, Apache Airflow, or ELK Stack
Nice to have
- Experience with AWS or Azure platforms, including ECS, S3, Data Lake, or Synapse
- Ability to work with additional automation and configuration management tools such as Ansible
- Experience with alternative data workflow automation technologies
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000 courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
Interested in this position?