
Lead Data Engineer

Technology
EPAM Systems
Bucharest, Romania · Posted 3 weeks ago · Apply by 01.05.2026

On-site

Job description

EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture.

Here you will collaborate with multi-national teams, contribute to a wide range of innovative projects that deliver creative, cutting-edge solutions, and have the opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

We are seeking an experienced and innovative Lead Data Engineer to play a critical role in our Data Transformation program.

You will be at the forefront of designing and implementing scalable, cutting-edge data solutions, leveraging advanced technologies to optimize our cloud-based analytics platform. This role offers a unique opportunity to contribute to the transformation of our data platform and develop expertise in emerging technologies.

Responsibilities

Design and develop automated data pipelines and data structures for modern data solutions

Deliver business tenancies as part of the data platform strategy

Build and optimize cloud data platforms leveraging AWS and Snowflake

Collaborate with product teams to ensure alignment with business goals and objectives

Migrate data from legacy platforms to cloud-based solutions

Design and operate event-driven or streaming systems on Kafka, with a focus on delivery semantics and throughput tuning

Implement robust CI/CD pipelines using GitLab to support development processes

Create and maintain automated testing frameworks for data pipelines

Adapt ITIL processes to a 2nd/3rd-line support environment to ensure system reliability

Drive innovation by staying updated with technology standards and emerging trends in software and data engineering

Contribute to Agile delivery teams utilizing methodologies such as Scrum and Kanban

Requirements

5+ years of Python engineering experience with a focus on performance optimization, object-oriented design, and production-grade reliability

2+ years of hands-on expertise in Snowflake, SQL, and data engineering tools like dbt and Airflow

2+ years designing and managing Kafka systems with an understanding of message delivery guarantees and DLQs

1+ years implementing CI/CD pipelines using GitLab in production environments

Familiarity with Agile methodologies, including Scrum or Kanban, and JIRA for workflow management

Knowledge of automated testing frameworks for validating data pipelines

Expertise in modern cloud computing technologies, with a focus on AWS and Snowflake

Demonstrated proficiency in ITIL processes within a support setting

Strong written and verbal English communication skills (B2+)

We offer

Full access to cutting-edge tools and technologies

Competitive compensation depending on experience and skills

All-around social package: professional and soft-skills training, medical and family care programs, sports

Free English classes

Unlimited access to LinkedIn learning solutions

Continuous experience exchange with experts and professionals worldwide

Friendly team and comfortable working environment

Engineering, corporate, and social events within and outside the Company

Flexible working schedule

Opportunities for self-realization

Keywords
Python, Snowflake, SQL, Data Engineering Tools, Kafka, CI/CD Pipelines, GitLab, Agile Methodologies, Automated Testing Frameworks, Cloud Computing Technologies, AWS, ITIL Processes, Communication Skills, Data Engineer, Data Transformation, Data Solutions, Cloud-Based Analytics, Data Pipelines, Data Structures, Business Tenancies, Cloud Data Platforms, CI/CD, Agile, Scrum, Kanban, ITIL, Automated Testing, Performance Optimization, Object-Oriented Design, Production-Grade Reliability, Event-Driven Systems, Streaming Systems, Data Migration
