Data Engineer

Technology
Xcede
Budapest, Hungary · Posted 1 month ago · Apply by 25 May 2026
Contract · Hybrid

Job description

Data Engineer

Location: Budapest, Hungary (Hybrid – 3 days per week in office)

About the Company

Our client is a global technology and digital transformation consultancy, supporting enterprise organisations in delivering large-scale data, cloud, and AI initiatives. They partner with businesses across multiple industries to modernise legacy platforms, unlock data value, and deliver measurable outcomes.

Operating at enterprise scale, their teams work across international locations, solving complex data and platform challenges within regulated environments, while maintaining a strong engineering culture and collaborative approach.

Role Overview

We are seeking an experienced Data Engineer to build, optimise, and support enterprise-scale data platforms and pipelines. This is a hands-on role focused on developing robust, production-ready data solutions and supporting the migration of legacy data platforms to modern cloud-based architectures.

The role is based in Budapest, with a hybrid working model requiring three days per week on site, working closely with local stakeholders and global teams.

Key Responsibilities

• Build, maintain, and optimise enterprise-grade data pipelines using modern ELT/ETL patterns
• Develop and support data solutions on Snowflake, ensuring performance, reliability, and scalability
• Contribute to data migrations from Hadoop, Spark, and other large-scale big data platforms to cloud-native architectures
• Implement data ingestion, transformation, and orchestration processes for high-volume, business-critical datasets
• Collaborate with Data Architects, Analytics teams, and business stakeholders to deliver production-ready data solutions
• Ensure pipelines meet enterprise standards for data quality, monitoring, security, and resilience
• Work effectively within distributed global teams across regions and time zones
• Troubleshoot and resolve data pipeline and platform issues in production environments

Required Skills & Experience

• Strong hands-on experience with Snowflake in enterprise production environments
• Proven experience supporting or delivering migrations from Hadoop, Spark, or other large-scale big data platforms
• Experience working in large, enterprise-scale data environments (rather than small or start-up platforms)
• Solid track record of building and maintaining production-grade ELT/ETL pipelines
• Good understanding of modern data engineering practices, distributed systems, and cloud data platforms
• Comfortable collaborating with globally distributed teams
• Fluent Hungarian (spoken and written)
• Willing and able to work three days per week from the Budapest office

Nice to Have

• Experience in consulting or multi-client delivery environments
• Exposure to AWS, Azure, or GCP
• Familiarity with data governance, security, and enterprise data management frameworks

Keywords

Orchestration, Apache Hadoop, Apache Spark, Scalability, Hadoop, Big data, Data management, AWS
