Experience & Qualifications
Overall Experience: 10+ years of progressive experience in Data Engineering or Data Architecture, including at least 4 years in a senior or architect-level role focused on cloud-based data solutions, preferably leveraging Databricks.
Deep, hands-on experience across the Azure data ecosystem, including:
Azure Data Lake Storage (ADLS Gen2)
Azure SQL and related Azure data services
Databricks Proficiency:
Demonstrated expertise in architecting and implementing large-scale data processing and analytics solutions using Azure Databricks, including ETL/ELT development, performance optimization, and distributed data processing.
Strong practical experience with dbt for data transformation, modeling, testing, and deployment within modern cloud data warehouse environments.
Data Warehousing & Modeling:
Solid understanding of enterprise data warehousing principles, including:
Star and Snowflake schema design
Data lifecycle and governance best practices
Infrastructure as Code (IaC): Hands-on experience with tools such as Terraform or ARM templates to provision and manage Azure cloud resources.
Roles & Responsibilities
Define scalable, secure, and high-performing architecture patterns aligned with business objectives.
Establish coding standards, CI/CD best practices, and reusable frameworks for data engineering initiatives.
Define and enforce enterprise-wide data modeling standards and governance frameworks.
Ensure adherence to data privacy, access control, and monitoring best practices.
Minimize downtime, ensure data accuracy, and implement validation and reconciliation processes.
Mentor and guide junior team members on cloud data architecture principles, performance optimization, and industry best practices.
Interested in this role?