Lead Data Engineer – Databricks
Job description
Job Title: Team Lead, Data Engineering – Databricks
Job ID: 86747
Location: Toronto, Ontario
Overview:
Our client provides specialist financial services to alternative investment funds, investors, multinationals, and private clients worldwide. Their data engineering initiatives represent one of their most critical technical focus areas, driving innovation across global operations. As a Team Lead, you will lead a technical team of 3-5 data engineers, ensuring best practices in performance, security, and scalability while working on an enterprise-wide Centralized Data Platform (CDP) built on Databricks.
This role requires a deep, hands-on understanding of Databricks internals and a track record of delivering large-scale data platforms in a cloud environment.
What you will be doing:
- Lead and mentor a team of data engineers, conducting code reviews, design reviews, and knowledge-sharing sessions across multiple locations
- Drive the Agile/Scrum SDLC process and collaborate with team members
- Design and develop Databricks solutions leveraging Lakehouse architecture for enterprise data processing and analytics
- Develop and optimize ETL/ELT pipelines
- Create and manage structured streaming pipelines for real-time data processing
- Configure and optimize Databricks clusters and Spark jobs for optimal performance
- Utilize Delta Live Tables for data ingestion and transformations
- Apply Unity Catalog features and IAM best practices for security governance and access control
- Support infrastructure and resource management using Terraform
- Implement monitoring solutions for pipeline performance and data quality
What you must have:
- 8+ years of experience in data engineering
- 3+ years of hands-on experience with Databricks platform
- Proven experience leading a team
- Strong expertise in Python and Spark programming
- Demonstrable experience in using AI in development
- Proven experience with AWS or other similar cloud services
- Deep understanding of data modeling and SQL
- Experience with Delta Lake and Lakehouse architecture
- Strong knowledge of ETL/ELT principles and patterns
- Experience with version control systems (Git)
- Demonstrated ability to optimize data pipelines
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
Nice to have:
- Financial services industry experience
- Experience with multiple cloud providers
- Knowledge of AI/ML implementation patterns
- API development experience
- Experience with real-time data processing
- Data governance framework experience
- Active interest in emerging technologies and industry trends
- Experience implementing complex data solutions
- Track record of continuous learning and skill development
- Strong technical documentation abilities
- Experience using AI tools to enhance productivity
- Ability to mentor junior team members
Tech stack:
- Primary Platform: Databricks
- Cloud Platform: AWS (S3, Glue, Lambda)
- Languages: Python, SQL
- Tools: Delta Lake, Unity Catalog, Git
- Additional: Real-time processing, API integrations
- Hybrid – 3 days/week onsite downtown Toronto
- 40 hour work week
TEEMA
Interested in this position?