
Big Data Developer

Technology
Quantum World Technologies Inc.
Toronto, Canada · Posted 4 weeks ago · Open until 2026-05-26
Contract

Job description

Skills: Snowflake / Hadoop / Scala / Spark / Hive / AWS

Detailed Job Description

Work within an Agile, cross-functional team to design, develop, and maintain data ingestion flows, and evolve the platform that orchestrates them.

Design and implement data pipelines, frameworks, and ETL processes.

Ensure data quality, security, and performance. Develop and optimize SQL queries, stored procedures, and views.

Integrate Snowflake with other data sources and BI tools.

Monitor and troubleshoot data jobs and platform issues.

Develop and maintain comprehensive documentation for data pipelines, transformations, and data models.

What do you need to succeed

5+ years of experience with Big Data technologies used for ETL (Hadoop, Spark, Hive).

Well versed in the Snowflake platform.

Strong SQL skills and knowledge of data modeling.

Programming experience with Scala or Java and API development.

Experience with ETL tools (e.g., Informatica, Talend, Apache Airflow).

Familiarity with any of the cloud platforms (Preferably AWS).

Knowledge of generative AI tools for code generation and improving developer productivity.

Knowledge of Python or other scripting languages is a plus.

Excellent problem-solving and communication skills.

Knowledge of SCM, Infrastructure-as-Code, and CI/CD pipelines.

Nice to Have

Experience with workflow management tools like Apache Airflow

Experience with continuous integration tools (GitHub Actions, Git, GitHub), automated testing tools, or similar.

Experience with Docker, Kubernetes, containers, or OCP4.

Shell scripting.

Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.

Keywords
Months of experience: 60. Scala, Apache Hadoop, Apache Spark, Apache Airflow, Python, SQL, Hive, Java, CI/CD, Big Data, shell script, AWS, Git, GitHub.

Interested in this position?