Data Engineer
Infinite Computer Solutions
Job Description
Data Engineer (Snowflake)
Join an amazing company where you can work with cutting-edge technologies and platforms. Give your career an Infinite edge, with a stimulating environment and a global work culture. Be part of an organization that celebrates integrity, innovation, collaboration, teamwork, and passion, and where every employee is a leader delivering ideas that make a difference to the world we live in.
Responsibilities of the Database Analysis - Sr Professional II role include, but are not limited to:
- Architect and build enterprise-grade data solutions leveraging Snowflake as the core data platform.
- Lead the implementation of scalable data models, robust data pipelines, and governed Lakehouse architectures supporting analytics and machine learning.
- Design and optimize Snowflake schemas using best practices in secure data sharing, micro-partitioning, storage strategies, and compute tuning.
- Build and manage real-time ingestion and event-driven architectures using Kafka integrated with Snowflake.
- Implement and enforce enterprise data governance, privacy, data quality, lineage, and strong access controls (RBAC) in Snowflake environments.
- Establish standards for Snowflake performance tuning, cost optimization, workload management, and query efficiency using platform metrics.
- Integrate Snowflake with APIs, cloud services, microservices, and ingestion/orchestration tools such as Databricks, ADF, Spark, or Kafka Connect.
- Solve complex issues related to ingestion failures, pipeline orchestration, schema evolution, and transformation logic across platforms.
- Mentor developers and data engineers on Snowflake usage patterns, modeling strategies, and best practices for building reusable data products.
- A strong analytical mindset and the ability to optimize systems at scale.
- A passion for platform maturity, best practices, and mentoring others.
- The ability to troubleshoot complex data challenges across distributed systems.
- Clear communication when collaborating with engineering, architecture, and business teams.
- An ownership-driven, proactive approach to cost efficiency, data quality, and platform security.
Preferred qualifications are in addition to the minimum requirements and are considered a plus in identifying top candidates.
Minimum Qualifications
- Bachelor’s degree in Computer Science, Data Engineering, Information Systems, Industrial Engineering, or a related field.
- 7+ years in data engineering, including hands-on experience designing and optimizing Snowflake environments.
- Expert-level SQL skills, with strong knowledge of micro-partitions, warehouses, clustering, caching, and query tuning in Snowflake.
- Proven experience building ETL/ELT pipelines specifically optimized for Snowflake workloads.
- Experience using Kafka for real-time or event-driven ingestion into Snowflake or dependent systems.
- Proficiency in Python, Scala, C#, or Java for transformations, orchestration logic, and automation.
- Solid knowledge of data warehousing, dimensional modeling, and/or Lakehouse architectures.
- Strong Snowflake security expertise (RBAC, encryption, masking, access policies) and governance frameworks.
- Ability to independently lead design decisions and drive Snowflake platform adoption.
- Strong English verbal and written communication skills, including executive presentation capabilities.
Preferred Qualifications
- Experience with advanced Snowflake capabilities: Snowpipe, Streams, Tasks, Replication, Time Travel, Search Optimization Service.
- Background with complementary tools such as Spark, Kafka Connect, Databricks, Azure Data Factory, Airflow, or Prefect.
- Experience with Snowflake cost governance, workload isolation, marketplace integration, or cross-cloud data sharing.
- Familiarity with Agile/Scrum methodologies and ability to collaborate in cross-functional engineering teams.
Years of Experience: 8 to 10
Interested in this role?