Data Engineer - Senior (Python, SQL, Azure)
Technology
Arch Global Services Philippines
Pasig, Philippines · Posted 1 month ago · Apply until 4/4/2026
Job description
Schedule: Mid Shift
This position develops, implements, and maintains software solutions that enable business operations to realise company goals and objectives. The incumbent performs analysis, design, coding, debugging, testing, and support of software application systems, and may be assigned to develop new applications, enhance existing applications, and/or provide production support.
The incumbent works independently on projects of moderate scope or complexity and receives detailed instructions on new and/or more complex assignments.
Job Responsibilities:
- Design and develop data pipelines using Apache Airflow to orchestrate complex workflows and ensure reliable data delivery
- Build and maintain transformation logic using dbt Core, supporting the required infrastructure and implementing best practices for modular, tested, and documented analytics code
- Develop and optimize data models in Snowflake, leveraging cloud data warehouse capabilities for performance and cost efficiency
- Write complex SQL for data transformation, quality validation, and business logic implementation
- Collaborate closely with the infrastructure team to ensure the data platform remains modern, well‑monitored, and fully optimized, with industry best practices consistently applied
- Collaborate with analytics and business teams to understand requirements and translate them into scalable data solutions
- Implement data quality checks, monitoring, and alerting to ensure data reliability
- Document data pipelines, models, and processes for knowledge sharing
- Optimize query performance and manage Snowflake resource utilization
- Participate in code reviews and contribute to data engineering best practices
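For illustration only, the data quality checks referenced above might look like the minimal sketch below. All function, table, and column names are hypothetical and are not drawn from any actual codebase.

```python
# Minimal sketch of row-level data quality checks (null and uniqueness
# validation); names are hypothetical, for illustration only.
from collections import Counter


def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]


def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    counts = Counter(row.get(column) for row in rows)
    return [value for value, n in counts.items() if n > 1]


rows = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": None},
    {"id": 2, "amount": 50.0},
]

print(check_not_null(rows, "amount"))  # -> [1]  (row 1 has a missing amount)
print(check_unique(rows, "id"))        # -> [2]  (id 2 is duplicated)
```

In practice such checks would typically be expressed as dbt tests or Airflow task-level validations rather than standalone functions, with failures routed to monitoring and alerting.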
Required Skills:
- 3+ years of experience in data engineering or related role
- Strong proficiency in SQL with experience writing complex queries, CTEs, and window functions
- Proficiency in Python for data engineering tasks, scripting, and automation
- Understanding of Azure or Astronomer (managed Airflow)
- Hands-on experience with dbt (Core or Cloud) for data transformation and modeling
- Experience orchestrating workflows with Apache Airflow or similar tools
- Working knowledge of Snowflake or similar cloud data warehouses (Redshift, BigQuery)
- Understanding of infrastructure requirements for data engineering, including deployment strategies, environment configuration, and resource management
- Understanding of dimensional modeling and data warehouse design patterns
- Experience with version control (Git) and CI/CD practices
- Strong problem-solving skills and attention to data quality
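As a sketch of the SQL proficiency named above (a CTE combined with a window function), the query below runs against an in-memory SQLite database; the table and column names are hypothetical.

```python
# Demonstrates a CTE plus a window function (RANK) using Python's
# stdlib sqlite3 module; schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('a', 10.0), ('a', 30.0), ('b', 20.0);
""")

query = """
WITH customer_totals AS (          -- CTE: aggregate spend per customer
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer,
       total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank  -- window function
FROM customer_totals;
"""

for row in conn.execute(query):
    print(row)  # ('a', 40.0, 1) then ('b', 20.0, 2)
```

The same pattern carries over directly to Snowflake, Redshift, or BigQuery, where CTEs and window functions are routine building blocks of transformation logic.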
Desired Skills:
- Experience with data replication tools such as Qlik Replicate, Fivetran, AWS DMS, or similar CDC solutions
- Experience setting up and managing infrastructure for dbt Core, including deployment automation, testing frameworks, and orchestration integration
- Knowledge of real-time data streaming and event-driven architectures
- Knowledge of containerization (Docker) and infrastructure as code (Terraform, CloudFormation)
- Experience with cloud platforms (AWS, Azure, GCP)
- Knowledge of data governance and security best practices
- Familiarity with DataOps practices and testing frameworks
- Understanding of software engineering principles and agile methodologies
Education:
- Required knowledge and skills would typically be acquired through a Bachelor's degree in computer science, business, or a related field
Interested in this position?