Data Engineer

Technology
4 Corner Resources
Orlando, United States · Posted 1 month ago · Open until 5/16/2026
Full time

Job description

Data Engineer

Location: Hybrid - 3 days onsite, 2 days remote

Pay Rate: Up to $52/hr (based on experience)

Position Type: 1-year contract

Introduction

4 Corner Resources is seeking a Data Engineer for one of our clients to support the design, development, and maintenance of enterprise data platforms and analytics infrastructure. This role focuses on building scalable data pipelines, integrating new data sources, and ensuring reliable movement of data across applications and systems.

The ideal candidate will have strong experience working with modern data warehouse technologies, cloud platforms, and scripting languages to support data engineering and analytics initiatives. This role works closely with cross-functional teams including project managers, business analysts, and data scientists to translate business requirements into scalable technical solutions.

Required Qualifications

  • Bachelor’s degree in Computer and Information Science or a related field

  • Minimum 3 years of experience with Snowflake and Python

  • At least 3 years of related data engineering or IT experience

  • Strong SQL skills and experience working with relational databases

  • Experience building and optimizing data pipelines and large-scale datasets

  • Experience designing and managing data warehouses and data models

  • Ability to perform root cause analysis on internal and external data processes

  • Experience working with structured and unstructured datasets

  • Strong analytical, problem-solving, and communication skills

  • Ability to work both independently and collaboratively within cross-functional teams

  • Strong attention to detail and organizational skills

  • Ability to translate business requirements into technical solutions

Preferred Qualifications
  • Master’s degree in Computer Science or related field

  • Experience with Apache Spark, Hadoop, Java/Scala, Python, and AWS architecture

  • Experience with Microsoft .NET technologies (C#, VB.NET) and development of Windows or web applications

  • Experience with PL/SQL, SQL Server 2016 or later, and Snowflake

  • Experience building ETL pipelines using Snowpipe, Informatica, Airflow, Kafka, or similar tools

  • Experience building and consuming APIs for data integration

  • Experience working with cloud and on-premise data infrastructure

Day-to-Day Responsibilities
  • Maintain and monitor analytics data warehouses and data platforms

  • Design, implement, test, deploy, and maintain scalable data engineering pipelines

  • Integrate new data sources into the central data warehouse and distribute data to applications and partners

  • Develop scalable code and automation to streamline repetitive data management tasks

  • Build processes supporting data extraction, transformation, and loading (ETL)

  • Collaborate with project managers, business analysts, and data scientists to translate requirements into technical specifications

  • Develop and support cloud and on-premise data infrastructure solutions

  • Build and maintain APIs to move data across systems and platforms

  • Analyze large and disconnected datasets to extract meaningful insights

  • Monitor technical strategy, identify infrastructure gaps, and propose scalable solutions


Keywords

3+ years of experience · Python · SQL · PL/SQL · Scala · Java · Apache Spark · Apache Hadoop · Apache Kafka · Airflow · AWS · Data management
