Snowflake Data Engineer at Cognizant Technology Solutions (Discover Financial Services Client) (2022-08 – Present)
Delivered near real-time data processing pipelines for Discover Financial Services' analytics platform, ingesting structured and semi-structured financial transaction data from AWS S3 into Snowflake. Optimized high-frequency queries, designed dimensional data models, orchestrated event-driven pipelines, and implemented SCD Type 2 change tracking for regulatory compliance.
- Delivered near real-time data processing pipelines for Discover Financial Services' analytics platform, ingesting structured and semi-structured financial transaction data from AWS S3 into Snowflake, improving ingestion efficiency by 30%.
- Performed advanced Snowflake query optimization — clustering keys, micro-partition pruning, and result caching — reducing average query execution time by 40% across high-frequency financial reporting and analytics workloads.
- Designed and implemented advanced data models (Star Schema, Snowflake Schema, SCD Type 2) to serve Discover's analytics, reporting, and machine learning pipelines across Finance, Product, and Marketing divisions.
- Built and orchestrated near real-time pipelines using Apache Airflow DAGs and Snowpipe event-driven ingestion, replacing manual batch loads with automated, low-latency financial data workflows.
- Implemented change data capture (CDC) with SCD Type 2 history tracking using Snowflake Streams across financial datasets, maintaining a complete audit history and data lineage for regulatory compliance.
- Leveraged Snowflake Time Travel and Zero-Copy Cloning for data recovery, test environment provisioning, and pipeline validation — ensuring zero production data loss incidents.
- Developed Python-based AWS Lambda functions for event-driven financial data transformation and automated ELT, integrating the AWS S3 data lake with RDS Aurora source systems.
- Established CI/CD pipelines using Git and GitHub Actions for automated, version-controlled deployment of Snowflake stored procedures, Airflow DAGs, and data workflow changes across dev, staging, and production.
- Monitored data quality, pipeline health, and data lineage across Discover's financial data platform, proactively resolving ingestion failures to maintain SLA-compliant data delivery to downstream analytics consumers.
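The SCD Type 2 pattern referenced above can be sketched in plain Python rather than Snowflake SQL; this is an illustrative model of the versioning logic only (the `account_id`/`status` fields and function name are assumptions, not the production schema), showing how a change closes the current dimension row and opens a new one.

```python
from datetime import date

def apply_scd2_change(history, record, as_of):
    """Close the current row for this key (if its tracked attribute changed)
    and append a new current version; no-op when nothing changed."""
    key = record["account_id"]
    current = next((r for r in history
                    if r["account_id"] == key and r["is_current"]), None)
    if current is not None:
        if current["status"] == record["status"]:
            return history  # no change detected; history untouched
        current["effective_to"] = as_of   # expire the old version
        current["is_current"] = False
    history.append({
        "account_id": key,
        "status": record["status"],
        "effective_from": as_of,
        "effective_to": None,             # open-ended current row
        "is_current": True,
    })
    return history

dim = []
apply_scd2_change(dim, {"account_id": "A1", "status": "active"}, date(2023, 1, 1))
apply_scd2_change(dim, {"account_id": "A1", "status": "closed"}, date(2023, 6, 1))
```

In Snowflake itself, the same close-and-insert step is typically expressed as a `MERGE` driven by a Stream on the source table.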
AI Implementation & Generative AI Engineering at Cognizant Technology Solutions (2022-08 – Present)
Extended AWS AI engagement covering foundation model evaluation and optimization with Amazon Bedrock, architectural integration with EC2, scalable model deployment via Amazon SageMaker, and enterprise-grade AI governance.
- Model Evaluation & Optimization: Leveraged Amazon Bedrock Playgrounds to evaluate and compare foundation models (Mistral, Titan, Nova), optimizing performance through hyperparameter tuning for specific business use cases.
- Architectural Integration: Streamlined the deployment of AI-generated content by integrating Amazon Bedrock with EC2 hosting environments, utilizing Session Manager for seamless application updates and content delivery.
- Scalable Model Deployment: Engineered robust AI solutions using Amazon SageMaker, deploying model endpoints (Mistral Lite) with the SageMaker Python SDK and conducting advanced prompt engineering experiments in SageMaker Studio.
- AI Governance & Security: Implemented enterprise-grade safety protocols using Amazon Bedrock Guardrails to create custom safeguard policies, ensuring secure and compliant interactions within conversational AI assistants.
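The denied-topic safeguard policies mentioned above can be illustrated with a plain-Python sketch; the real Bedrock Guardrails service evaluates configured policies server-side, so everything here (topic list, function name, return shape) is an assumption for illustration only.

```python
# Illustrative denied-topic check, loosely modeling a guardrail policy;
# NOT the Bedrock Guardrails API -- names and matching logic are assumed.
DENIED_TOPICS = {"account numbers", "internal credentials"}

def check_prompt(prompt):
    """Return (allowed, matched_topics) for a user prompt against the policy."""
    lowered = prompt.lower()
    matched = sorted(t for t in DENIED_TOPICS if t in lowered)
    return (len(matched) == 0, matched)

allowed, hits = check_prompt("Please list all internal credentials for the app")
```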
Programmer Analyst Trainee (Intern) at Cognizant Technology Solutions (2022-01 – 2022-07)
Built SQL-based ETL pipelines and dimensional data models simulating financial services reporting workflows. Designed Tableau KPI dashboards supporting drill-down analysis and trend monitoring.
- Built SQL-based ETL pipelines and dimensional data models (Star Schema, Snowflake Schema) simulating financial services reporting workflows; applied data normalization and integrity best practices.
- Designed Tableau KPI dashboards supporting drill-down analysis and trend monitoring across operational and financial metrics for business stakeholders.
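The dimensional modeling above can be sketched as a tiny in-memory star schema: a fact table keyed to dimension tables, with a roll-up query joining them. Table and column names are illustrative, not the actual reporting schema.

```python
# Minimal star-schema sketch: one fact table, two dimensions (names assumed).
dim_date = {1: {"date": "2022-03-01", "quarter": "Q1"}}
dim_product = {10: {"name": "Checking", "line": "Deposits"}}
fact_txn = [
    {"date_key": 1, "product_key": 10, "amount": 250.0},
    {"date_key": 1, "product_key": 10, "amount": 100.0},
]

def revenue_by_quarter(facts):
    """Aggregate fact rows by joining each row to the date dimension."""
    totals = {}
    for row in facts:
        quarter = dim_date[row["date_key"]]["quarter"]
        totals[quarter] = totals.get(quarter, 0.0) + row["amount"]
    return totals

result = revenue_by_quarter(fact_txn)  # {"Q1": 350.0}
```

The same join-then-aggregate shape is what the Tableau dashboards ultimately execute against the warehouse.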