IT Specialist at First Horizon (2025-09 – Present)
IT Specialist role focused on data analysis, SQL query optimization, and dashboard development
- Wrote advanced SQL queries to extract, cleanse, and transform large-scale datasets from relational databases, streamlining data preparation for modeling and reporting initiatives.
- Performed exploratory data analysis and statistical modeling utilizing Python libraries such as pandas, NumPy, scikit-learn, and Prophet to identify trends, seasonality, and business drivers across multiple domains.
- Engineered and deployed interactive dashboards in Power BI and Microsoft Fabric to visualize analytical insights, KPIs, and data narratives for business stakeholders and executive teams.
- Partnered with cross-functional teams to gather business requirements, establish success metrics, and ensure analytical deliverables aligned with organizational objectives and data governance standards.
- Automated data workflows and developed reusable Python scripts and ETL pipelines to support scalable and efficient model training, testing, and deployment processes.
- Leveraged cloud platforms including Microsoft Azure and Databricks for collaborative data processing, distributed computation, and model experimentation within secure, production-ready environments.
Technology Business Office Intern at First Horizon (2025-03 – 2025-08)
Internship focused on dashboard development, data reporting, and process improvement support
- Constructed and maintained dashboards using Power BI and Excel to support executive-level decision-making.
- Collaborated with cross-functional technology teams to support internal process improvement initiatives and data reporting.
- Employed SAS Enterprise Guide to automate data extraction, clean datasets, and generate reports for operational analysis.
- Assisted in gathering and analyzing business requirements to streamline workflows across data and reporting platforms.
- Documented system integrations, technology assets, and process flows to support IT governance and audit readiness.
- Participated in Program Increment (PI) Planning sessions, collaborating with cross-functional teams to align project objectives and timelines.
- Coordinated with vendors and internal teams to support enhancements in technology service delivery and data access.
Senior Data Engineer at Larsen & Toubro (Client: Citibank) (2022-07 – 2023-08)
Senior Data Engineer role managing ETL pipelines and data optimization for global banking systems
- Directed the development of scalable ETL pipelines utilizing Ab Initio and SQL, processing over 1TB of daily transactional data across global banking systems.
- Optimized SQL performance and indexing strategies, reducing query latency by 40% for high-priority analytical reports.
- Authored automated shell scripts to manage job dependencies and execute tasks using enterprise schedulers such as Control-M.
- Developed and implemented dashboards in Power BI for monitoring operational performance metrics, providing actionable insights to key stakeholders.
- Collaborated with enterprise teams to improve data accessibility and quality assurance processes, raising team productivity by 20%.
- Executed rigorous unit and regression testing on data pipelines, decreasing data quality incidents by 30%.
- Streamlined operational workflows through Python automation, achieving a 25% increase in efficiency.
Data Engineer at Accenture (Client: Blue Cross Blue Shield, NC) (2018-11 – 2022-07)
Data Engineer role focused on healthcare data pipeline development and reporting solutions
- Delivered interactive Power BI dashboards and custom reporting solutions, equipping leadership with critical business insights.
- Constructed over 100 parameterized SQL scripts for the extraction, transformation, and validation of healthcare claims data, boosting data pipeline efficiency by 25%.
- Established a data analytics framework to satisfy reporting and compliance mandates within the healthcare sector.
- Automated report generation and distribution using shell scripting and scheduler tools, saving 30 hours of manual effort per month.
- Played an integral role in migrating legacy workloads to Azure Data Factory and SQL Server, enhancing overall scalability and maintainability.
- Partnered with cross-organizational teams to identify data challenges and propose innovative solutions for improved operational efficiency.
- Instituted quality control procedures in compliance with regulatory and organizational standards.
- Designed Python validation scripts for data quality assurance checks, ensuring 99.9% data accuracy prior to deployment.