
Data Engineer

Technology
Interactive Resources - iR
Tempe, United States · $100,000 - $110,000/year · Posted 2 weeks ago · Open until 6/4/2026
Full time · Hybrid

Job description

πŸš€ Junior Databricks Data Engineer (Azure)

πŸ“ Location: Tempe Arizona- Onsite with the team

πŸ’Ό Employment Type: Full-Time

We are partnering with an innovative organization to find a Junior Databricks Data Engineer who is passionate about building modern data solutions in the cloud. This is a great opportunity for someone with 3–4 years of experience looking to grow their expertise in Azure and Databricks Lakehouse technologies while working on impactful, enterprise-scale data initiatives.

What You’ll Do:

  • Design, build, and maintain scalable data pipelines using the Databricks Lakehouse platform
  • Support the modernization of a cloud-based data ecosystem in Azure
  • Develop ETL/ELT pipelines using Python, PySpark, and SQL
  • Work with Delta Lake to ensure data reliability, performance, and scalability
  • Assist in implementing data governance, security, and cataloging practices (Unity Catalog exposure is a plus)
  • Use tools like Apache Airflow (or similar) for workflow orchestration
  • Build and support data ingestion pipelines from APIs, databases, and file-based sources
  • Collaborate with cross-functional teams to enable analytics, BI, and data science use cases
  • Write and optimize SQL queries and help create curated datasets for reporting
  • Contribute to CI/CD pipelines and automation efforts for data workflows
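To make the ETL/ELT responsibilities above concrete, here is a minimal, stdlib-only sketch of the extract-transform-load pattern: ingest a file-based source, curate the records, and load them into a queryable store. In this role the same pattern would run in PySpark on Databricks and land in Delta Lake tables; the `daily_sales` table name and the sample records are purely illustrative.

```python
import csv
import io
import sqlite3

# Extract: parse a file-based source (an in-memory CSV stands in for a real file/API).
raw = io.StringIO("order_id,amount\n1,19.50\n2,5.00\n3,42.25\n")
rows = list(csv.DictReader(raw))

# Transform: cast types and filter out low-value orders (illustrative business rule).
curated = [
    (int(r["order_id"]), float(r["amount"]))
    for r in rows
    if float(r["amount"]) >= 10.0
]

# Load: write the curated dataset into a queryable store for reporting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO daily_sales VALUES (?, ?)", curated)

total = conn.execute("SELECT SUM(amount) FROM daily_sales").fetchone()[0]
print(total)  # 61.75
```

The same three stages map one-to-one onto a PySpark job: `spark.read` for extract, DataFrame transformations for the curation step, and `df.write.format("delta")` for the load.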

What You Bring:
  • 3–4 years of experience in data engineering, data warehousing, or related roles
  • Hands-on experience with Python, SQL, and PySpark (or Spark)
  • Exposure to Databricks and/or Azure cloud services
  • Understanding of ETL/ELT processes and data pipeline development
  • Familiarity with Delta Lake or modern data lake architectures is a plus
  • Experience with orchestration tools like Airflow is preferred
  • Basic knowledge of data governance, security, and best practices
  • Strong problem-solving skills and eagerness to learn in a fast-paced environment

Nice to Have:
  • Exposure to Unity Catalog, Databricks Workflows, or Delta Live Tables
  • Experience working with regulated or financial datasets
  • Familiarity with DevOps/CI/CD concepts in a data environment