
Senior Data Software Engineer

Technology
EPAM Systems
€43,200 – €66,000 /year · Posted 2 months ago · Apply by 2026-05-04

Job description

We are looking for an experienced Senior Data Software Engineer who thrives on solving complex data transformation challenges. In this role, you will play a critical part in designing and implementing sophisticated data pipelines and analytics solutions that empower key decision-making processes in the client’s business unit. This is a unique opportunity to work on a high-impact Data Transformation project that leverages cutting-edge technologies, including Big Data, Palantir, and Machine Learning.

This role is part of the Property & Casualty Data Integration and Analytics project under the Data & Foundations division. The team is working on a global initiative to harness Property & Casualty data to drive enhanced decision-making across the reinsurance lifecycle. You will collaborate with industry experts, data scientists, and engineers to develop transformative data and analytics capabilities for the client’s strategic priorities.

If you are a passionate engineer with hands-on experience in large-scale data solutions, we invite you to join our global, multicultural, and dynamic team driving transformative outcomes. You can work remotely from anywhere across Lithuania or connect with colleagues at our Vilnius and Kaunas offices.

Responsibilities

Design, develop, and optimize robust data pipelines using PySpark and Python to process large datasets

Collaborate with cross-functional teams to understand complex business requirements and transform them into scalable technical solutions

Utilize Palantir Foundry to build and manage analytics applications that enable strategic and operational insights

Manage data integration workflows across distributed computing systems, building high-quality ETL/ELT processes

Develop advanced SQL queries for data querying, transformation, and warehousing

Stay engaged with Agile methodologies and participate in Scrum ceremonies to align work with the broader project goals

Document technical solutions and workflows to ensure knowledge sharing and long-term maintainability

Troubleshoot and resolve data processing or platform issues in a fast-paced, production-grade environment

Stay up-to-date with the latest advancements in cloud technologies, Big Data processing, and Machine Learning

Participate in code reviews and promote best practices in software engineering
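The ETL/ELT responsibilities above can be sketched, at toy scale, as three pure functions. This is a purely illustrative example in plain Python (no Spark dependency); the record fields and helper names are hypothetical, and a production pipeline in this role would operate on PySpark DataFrames rather than lists.

```python
# Illustrative sketch only: field names and helpers are hypothetical,
# and a real pipeline would use PySpark DataFrames instead of lists.

def extract(rows):
    """Extract: parse raw CSV-like records into dicts."""
    return [dict(zip(("policy_id", "premium"), r.split(","))) for r in rows]

def transform(records):
    """Transform: cast types and drop records with invalid premiums."""
    out = []
    for rec in records:
        premium = float(rec["premium"])
        if premium > 0:
            out.append({"policy_id": rec["policy_id"], "premium": premium})
    return out

def load(records, target):
    """Load: append cleaned records to the target store."""
    target.extend(records)
    return len(records)

raw = ["P-001,1200.50", "P-002,-30.00", "P-003,880.00"]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 valid records survive the transform step
```

The same extract/transform/load separation carries over directly when each stage becomes a PySpark job reading from and writing to distributed storage.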

Requirements

Palantir Foundry expertise: prior hands-on experience is essential for this role

5+ years of experience, with a focus on large-scale distributed computing or analytics systems

Strong proficiency in Python and PySpark for building scalable data workflows

In-depth knowledge and experience in SQL (preferably Spark SQL), enabling efficient data querying and warehousing

Experience designing and implementing ETL/ELT processes for large datasets

Solid understanding of Scrum and Agile development principles

Bachelor’s degree in Computer Science, Data Science, or a related technical field

Strong analytical and problem-solving skills, with a strategic mindset for tackling complex challenges

Highly self-driven and capable of managing workload independently while delivering on commitments

A collaborative mindset, paired with clear and effective communication skills, including experience working in global, multicultural settings

Eagerness to learn and stay current with emerging technologies and best practices in data engineering and analytics

Nice to have

Familiarity with insurance, financial industries, or finance-related data workflows

Knowledge of front-end technologies like HTML, CSS, JavaScript, and build tools like Gradle

Experience with Microsoft Power BI for building data dashboards and reports

Hands-on experience with Machine Learning or implementing Generative AI models

Understanding of statistical models and their applications in data pipelines

Exposure to Azure, AWS, or GCP cloud platforms, enabling high-quality engineering solutions

We offer

Engineering Heritage: Best-in-class experts sharing a culture of engineering excellence and tackling complex engineering challenges for over 30 years.

Advanced Tech Stack: Innovative projects where you can apply or enhance your expertise in Cloud, Data, AI, and other emerging technologies

World-Class Clients: Work closely with 295+ of the Forbes Global 2000 on creating disruptive solutions that make a global impact

Professional Growth: Exceptional support for career development with comprehensive resources for upskilling or reskilling in pioneering practices

GenAI Community: Strong AI competencies with 600+ experts across 55+ locations driving GenAI-enabled transformation journeys

Entrepreneurial Culture: If you're passionate and dedicated to improving business transformation, we provide the support you need to bring your ideas to life

Hybrid Setup: The flexibility to work from any location in Lithuania, whether it's your home or our dynamic offices in Vilnius and Kaunas

Other Benefits: Additional vacation and trust days, private health insurance, Employee Stock Purchase Plan and more

Salary range: €3,600–€5,500 gross per month, based on your experience and interview results.

About EPAM

EPAM is a leading global provider of digital platform engineering and development services. For over 30 years, our team has helped leading brands navigate the waves of digital transformation, building solutions that help them stay competitive through constant market disruption.

With offices in 55+ countries, EPAM has grown in Lithuania to over 1,200 talented innovators in just 4 years. We foster creativity and unconventional ways of doing things, welcoming like-minded professionals to join us.

Keywords
PySpark, Python, Palantir Foundry, SQL, Spark SQL, ETL/ELT, Big Data, Machine Learning, Agile, Scrum, Data Pipelines, Distributed Computing, Data Transformation, Code Reviews, Analytical Skills, Problem-Solving, Data Software Engineer, Analytics Solutions, Palantir, Property & Casualty Data, Data Integration, Cloud Technologies, Reinsurance, Data Warehousing, Software Engineering, Data Science, Azure, AWS, GCP, Generative AI
