Senior Data Engineer | AWS | Azure | Databricks
I am a Data Engineering professional with over 8 years of industry experience developing large-scale data processing and analytics systems and creating end-to-end data solutions. I earned a Master's degree in Computer Science from Roosevelt University. Before that, I worked as a Data Engineer at Publicis Sapient, building scalable ETL processes and data migrations and implementing automated workflows to ingest, transform, and load data from multiple sources into aggregated datasets for large-scale retail entities (>1 TB daily).
Before joining Sapient, I worked as a Data Engineer/Data Analyst at DIATOZ Solutions, where I focused on a retail project migrating on-premises databases to AWS using on-demand services such as Lambda, Glue, and Step Functions, and on generating reports and building dashboards for timely business insights.
Over the course of my career, I have developed strong and varied expertise in PySpark, SQL, Terraform, Docker, data warehousing, data modeling, ETL pipelines, project management, reporting, big data, cloud computing, and consulting. I am dedicated to helping businesses leverage real-time data through scalable end-to-end solutions, with a keen interest in the rapidly evolving fields of machine learning and deep learning.
My skillset is a blend of various tools and technologies, including but not limited to:
📍 Hadoop Ecosystem (HDFS, Hive, Sqoop, HBase)
📍 Scala / Python
📍 Spark (PySpark, Databricks)
📍 Snowflake
📍 SQL (SQL Server, Oracle, MySQL)
📍 Tableau, Power BI
📍 Azure Cloud (ADF, ADLS/Blob, Synapse and other related services)
📍 AWS (Glue, Step Functions, Lambda, RDS, SNS, S3, and other related services)
Master's in Computer Science at Roosevelt University