Job Title: Data Engineer (Snowflake, Big Data & ETL)
Company: Cogency Inc.
Location: Greater Toronto Area (Hybrid - 4 days in office)

Job Summary
We are seeking a skilled and detail-oriented Data Engineer with strong expertise in Snowflake, Big Data technologies, and ETL processes. The ideal candidate will design, build, and optimize scalable data pipelines and ensure high data quality, performance, and security across the platform. You will work in an Agile environment, collaborating with cross-functional teams to deliver robust data solutions that support analytics, reporting, and business intelligence initiatives.

Key Responsibilities
Design, develop, and maintain data ingestion pipelines and ETL workflows
Build and optimize data pipelines, frameworks, and transformation processes
Ensure data quality, integrity, security, and performance across systems
Develop and optimize SQL queries, stored procedures, and data models
Integrate Snowflake with various data sources and BI/reporting tools
Monitor, troubleshoot, and resolve data pipeline and platform issues
Collaborate with cross-functional teams in an Agile delivery environment
Maintain comprehensive documentation for pipelines, transformations, and data models
Implement DevSecOps, CI/CD, and data engineering best practices

Required Skills & Experience
5+ years of experience in Data Engineering / ETL development
Strong hands-on experience with Big Data technologies
Solid experience working with the Snowflake data platform
Strong expertise in SQL and data modeling concepts
Programming experience in Scala or Java, including API development
Experience with ETL tools
Familiarity with cloud platforms (preferably AWS)
Understanding of DevSecOps practices, CI/CD pipelines, and Infrastructure-as-Code
Knowledge of GenAI tools for code generation and developer productivity enhancement
Excellent problem-solving, analytical, and communication skills

Preferred / Nice-to-Have Skills
Experience with AWS services (e.g., Glue, S3, Lambda)
Hands-on experience with Apache Airflow or similar workflow orchestration tools
Experience with CI/CD tooling (GitHub Actions), Git, and automated testing tools
Exposure to containerization technologies (Docker, Kubernetes, OpenShift)
Knowledge of Python or other scripting languages
Experience with shell scripting

Education
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field

Key Competencies
Strong analytical and problem-solving mindset
Ability to work independently and in team environments
Excellent communication and stakeholder collaboration skills
Attention to detail and commitment to data quality
Interested in this position?