Senior Data DevOps (Azure)
Job Description
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture.
Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are seeking an experienced Senior Data DevOps Engineer (Azure) to join our team and play a crucial role in building and optimizing data infrastructure and workflows.
In this role, you will design, implement, and manage scalable data solutions in the Azure cloud environment. You will work closely with data engineering and cross-functional teams to ensure efficient data pipelines, system reliability, and performance optimization.
Responsibilities
- Design, deploy, and manage data infrastructure using Azure services, including Azure Data Lake Storage Gen2 (ADLS Gen2), Databricks, Synapse, and Data Factory
- Collaborate with the data engineering team to build and maintain efficient data workflows and pipelines
- Automate data processes using Python to improve efficiency and reliability
- Set up and manage CI/CD pipelines using tools such as Jenkins, GitHub Actions, or similar platforms
- Work with cross-functional teams to enhance the performance, scalability, and reliability of data systems
- Install, configure, and maintain data tools such as Apache Spark and Apache Kafka in both cloud and on-premises environments
- Monitor data systems to proactively identify and resolve performance and scalability issues
- Troubleshoot and resolve complex issues across data platforms and pipelines
Requirements
- At least 3 years of experience in Data Engineering or related roles
- Strong expertise in Python programming and batch processing workflows
- Proficiency in SQL for managing and querying large datasets
- Advanced experience working with Azure cloud services for data infrastructure management
- Hands-on experience with Infrastructure as Code tools such as Ansible, Terraform, or CloudFormation
- Skilled in setting up and managing CI/CD pipelines using tools like Jenkins or GitHub Actions
- Practical experience with data tools such as Spark, Airflow, or R for data processing and workflow management
- Advanced knowledge of Linux operating systems, including scripting and system management
- Strong understanding of network protocols and mechanisms, including TCP, UDP, ICMP, DHCP, DNS, and NAT
- Fluent English communication skills, both written and spoken, at a B2 level or higher
Nice to have
- Familiarity with additional cloud platforms such as AWS or GCP
- Experience with container orchestration tools like Kubernetes for managing data workflows
- Knowledge of monitoring and observability tools such as Prometheus, Grafana, or Azure Monitor
- Exposure to Big Data technologies and advanced data analytics workflows
- Hands-on experience with data governance and security best practices in the cloud
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000 courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
Interested in this position?