
Data Engineer

Technology
Dedale Intelligence
Posted 1 month ago · Open until 11/05/2026
Full-time · Hybrid

Job Description

About Dedale

Dedale is a fast-growing intelligence and strategy firm operating at the intersection of data, AI, and decision-making. We help leading organizations and investors navigate complexity through sharp insights and a cutting-edge product. As we scale, data is at the heart of everything we build — and we need someone to own it end-to-end.

Your Role

As Data Engineer, you will be the primary owner of Dedale's data stack. You will define how we govern, model, maintain, migrate, and enrich our data (both proprietary and public), setting the foundation for everything from analytics to AI-powered products.

Joining our Data and Tech teams of 20+ people, you will collaborate closely with our Product and Research teams. This is a high-impact, high-autonomy role, with a clear path toward Data Architect.

  • Data Governance

Define and enforce access policies: who can see what, who can change what – aligned with GDPR standards

Implement data quality checks and completeness monitoring across all sources

Build and maintain a data catalogue — making data discoverable and trustworthy

Own data observability: detect anomalies, lineage issues, and freshness gaps proactively

  • Data Architecture – in close coordination with our product team

Design and maintain Dedale's canonical data model — the single source of truth

Own the full data pipeline: ingestion, transformation, storage, and serving layers

Select and evaluate the right tools for each layer of the stack (ETL, warehouse, orchestration)

Ensure coherence, consistency, and scalability as we grow

  • Data Pipelines Automation and Enrichment

Automate data workflows — reducing manual steps, improving reliability, and enabling scale

Design and build new pipelines that enrich raw data with external sources and computed signals

Partner with analysts and product teams to deliver clean, ready-to-use proprietary datasets

  • Data Modernisation & Migration

Audit and progressively migrate legacy data infrastructure toward a modern, scalable architecture

Identify technical debt in the data stack and drive its resolution methodically

Ensure zero data loss and high availability during migration phases

Your Profile

Must-haves:

3 to 5 years of experience in a data engineering role

Proficiency in Python and SQL

Solid experience with data modelling, pipeline design, and warehousing (e.g. Airtable, dbt, Airflow, BigQuery, Snowflake, Spark)

Interest and exposure to AI/ML pipelines and feature engineering

Hands-on knowledge of data governance principles and tooling

Track record of working in a start-up or scale-up environment — you thrive with ambiguity

Entrepreneurial mindset: you see the big picture, propose solutions, and take ownership

Nice-to-haves:

Experience with data migration projects

Familiarity with data cataloguing and observability tools (e.g. DataHub, Monte Carlo, Great Expectations)

Interest in growing into a Data Architect role — we will invest in your development

Who you are

You take pride in clean, reliable data and treat it as a product

You communicate clearly with technical and non-technical stakeholders alike

You balance speed with rigour — you can ship fast without cutting dangerous corners

You are proactive, curious, and always looking for ways to improve the stack

What We Offer

High autonomy and direct impact from day one

A clear career path toward Data Architect

A collaborative, intellectually stimulating environment

Competitive compensation package

Flexible working arrangements

Keywords
Python, SQL, Data Modelling, Pipeline Design, Warehousing, ETL, Data Governance, Data Migration, Data Architecture, Data Automation, Data Enrichment, Data Observability, AI/ML, Analytical Skills, Problem Solving, Collaboration, Data Engineer, Data Pipelines, BigQuery, Snowflake, Spark, Data Warehousing, AI, Machine Learning, Data Quality
