Data Engineer (AWS | PySpark | Airflow) Location & Mode: Pan India and UK (Remote Model) 3+ years of experience. Immediate Joiners. Data Engineering, PySpark (hands-on projects), AWS (S3 + Glue/EMR + Athena), Airflow, Data modeling, SQL, ETL
3,658 job opportunities for airflow in India updated today
Job Description Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights
Job Description Role Summary We are looking for a Principal AWS Solution Architect – Data & Analytics to serve as the technical authority and strategic leader for enterprise data platforms on AWS. This role is responsible for defining the long-term data & analytics architecture vision, estab
Job Description For the GCP requirement, hands-on GCP experience is mandatory, including Dataflow, Data Fusion and BigQuery. • Overall 8 to 12 years of experience. • Strong experience in Python and SQL for data engineering and data processing. • Hands-on experience with Apache Spark / PySpark for large-scale data
Job Description Roles & Responsibilities: Design, build, and deploy AI/ML applications, pipelines, and production-grade ML systems. Develop scalable microservices-based architectures for model serving using REST and gRPC. Containerize and deploy ML workloads on Kubernetes (EKS) using Docker and Helm
Job Description Design, develop, and maintain ETL/ELT data pipelines to ingest data from various sources, including databases and flat files, into our data warehouse. Utilize AWS services such as AWS Lambda, AWS DMS, and AWS Glue for data ingestion and transformation processes. Develop and optimize
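Several of these listings describe ETL/ELT pipelines that ingest flat files into a data warehouse. As a minimal, hedged sketch of that pattern (not any company's actual pipeline), the following uses only the standard library, with `csv` for the flat-file extract and SQLite standing in for the warehouse; all names (`staging_orders`, `amount`) are illustrative:

```python
# Minimal ETL sketch: extract rows from a flat CSV payload, apply a
# transformation, and load them into SQLite as a stand-in warehouse.
# Table and column names are hypothetical, for illustration only.
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract-transform-load a CSV payload; returns rows loaded."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_orders "
        "(order_id TEXT, amount_cents INTEGER)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        # Transform: normalise currency strings like "12.50" to integer cents.
        rows.append((rec["order_id"], int(round(float(rec["amount"]) * 100))))
    conn.executemany("INSERT INTO staging_orders VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    payload = "order_id,amount\nA1,12.50\nA2,3.00\n"
    print(run_etl(payload, conn))  # number of rows loaded
```

In a real AWS pipeline the extract step would typically read from S3 and the load step would target Redshift or a Glue-catalogued table, but the extract/transform/load shape is the same.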
Job Description You will have to work closely with your peers and team members in security domain presales engineering. You will be responsible for the following critical activities to deliver best-in-class solutions with your technical expertise. Analyze and review project inputs and specifications
Job Description What can you expect in a Director of Data Engineering role with TaskUs: Key Responsibilities: Manage a geographically diverse team of Managers/Senior Managers of Data Engineering responsible for the ETL to process, transform, and derive attributes for all operational data
Role Overview We are seeking an experienced GenAI Developer to design, build, and deploy scalable generative AI solutions for enterprise use cases. The ideal candidate will have strong hands-on experience in building production-grade AI/ML systems, with deep expertise in LLMs, RAG architectures, and
Job Description What you will do The primary responsibility for this role is to support pre-sales engineering for Building Automation System global projects as per country specific standards. Handling daily routine activities related to presales estimation support by preparing technical assessments.
Job Description What you will do The primary responsibility for this role is to lead Building Management System global projects detailed engineering as per country-specific standards from the India Engineering center. Independent execution of projects throughout the lifecycle, handling first-level escalations
Job Description Job Overview: 3-5 years of experience implementing Big Data solutions in cloud environments. Hands-on in implementing Big Data solutions using Airflow DAGs, Spark, Python, Hive, Trino, S3 buckets, etc. Experience with cloud data platforms like Snowflake or Databricks is a plus. Knowledge
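Many of these roles ask for hands-on Airflow DAG work. The core idea of a DAG is that tasks declare upstream dependencies and the scheduler runs them in topological order. The sketch below illustrates only that idea using the standard library's `graphlib.TopologicalSorter` as a stand-in for Airflow's scheduler; the task names (`extract`, `transform`, `load`, `publish`) are hypothetical, and real Airflow code would instead wire operators with `upstream >> downstream`:

```python
# Sketch of the dependency-ordering idea behind an Airflow DAG.
# Airflow itself is not used; the stdlib TopologicalSorter stands in
# for the scheduler. Task names are illustrative only.
from graphlib import TopologicalSorter

# Each key lists the tasks that must finish before it can start,
# mirroring Airflow's `upstream >> downstream` wiring.
dag = {
    "transform": {"extract"},
    "load": {"transform"},
    "publish": {"load", "transform"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # "extract" comes first, "publish" last
```

In Airflow the same shape would be expressed with operators and the DAG context manager, but the scheduling contract (no task starts before its upstreams finish) is the one shown here.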
Job Description Roles & Responsibilities: Chunking (primarily focused on VectorDB storage). Document Parsing and OCR. Document Parsing with VLMs (Vision Language Models). Function Calling with LLMs. Retrieval Augmented Generation. Traditional Search (BM25, NER-based parsers, keyword-based search indexing)
Job Description • Translate customer requirements into scalable data architecture and high-performance cloud data solutions • Design and implement data ingestion pipelines using Snowpipe and SnowSQL • Build and manage data platforms using AWS services such as EMR, Redshift, S3, Kinesis, Glue, Athena
Role: AI Data Engineer Location: Rai Durg, Hyderabad Work Mode: Hybrid (3 days work from office) Experience: 5-8 years (minimum 5 years as an AI Data Engineer) Mandatory Skills: DVC (Data Version Control), Airflow, Apache Spark, Flink and Kafka, advanced-level Python, and AI logic and
₹5,40,000 – ₹8,60,000 per year