Summary

Experience in ETL, data management, and migration to AWS serverless cloud services.


Hands-on experience in Data Integration Tools including Pentaho Data Integration.
Experience in data warehousing, with substantial domain knowledge of real estate.
Working knowledge of the ETL development lifecycle, including extraction, transformation, loading, and job scheduling.
Exposure to large data volumes in major database systems including PostgreSQL.
Strong analytical and design skills; a team player with a hard-working, professional attitude.

Working on the migration of a system to serverless AWS services.


Projects

Scrum Management Tool

Work Experience

Senior Data Engineer (AWS)
Addo AI
Dec 2023 - Present | Lahore, Pakistan

Senior Cloud Data Engineer
Odyssey Solutions
Nov 2022 - Jan 2024 | Lahore, Pakistan

Data Engineer | ETL | DWH | AWS
Kavtech Solutions
Nov 2020 - Nov 2022 | Lahore, Pakistan








Migration of Existing Architecture to AWS Serverless Architecture
• Migrated the existing data pipeline to a serverless architecture built on AWS Lambda, S3, Athena, Redshift, CloudWatch, SQS, and AWS Glue (illustrative sketch after this list).
• Implemented ETL jobs and transformations in Pentaho Data Integration to load data from different sources into a pre_staging table, cleanse it, move it to the staging area, and then load it into the target table.
• Performed data wrangling and change data capture (see the CDC sketch after this list).
• Developed a complete ETL pipeline in Pentaho Data Integration, covering extraction from tabular and non-tabular data sources, merging of data streams, data cleansing, data validation, email notifications, and error handling.
• Scheduled ETL jobs using a custom-built scheduler in Pentaho (see the scheduling sketch after this list).
• Provided support for and optimization of running ETL processes.
• Collaborated with the customer's technical team to gather technical requirements such as performance, maintainability, and scalability.
• Produced design documentation.
• Managed the approval and acceptance process for designs and implementation in cooperation with the client.
• Resolved ongoing maintenance issues and bug fixes; monitored daily scheduled jobs and tuned the performance of transformations and jobs.
• Queried data in PostgreSQL using SQL and PL/pgSQL.
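
The serverless ingestion step can be pictured with a minimal sketch, assuming an SQS-triggered Lambda that lands messages in S3 for later querying with Athena. The bucket, prefix, and handler names here are hypothetical and not taken from the actual project.

```python
# Hypothetical sketch: an SQS-triggered Lambda that lands incoming records in S3
# so they can later be queried with Athena. Names below are illustrative.
import json
import boto3
from datetime import datetime, timezone

s3 = boto3.client("s3")

LANDING_BUCKET = "example-landing-bucket"   # assumption: replace with the real bucket
LANDING_PREFIX = "pre_staging/"             # assumption: mirrors the pre_staging layer

def handler(event, context):
    """Write each SQS message body to S3 as a timestamped JSON object."""
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])  # assumes JSON message bodies
        key = (
            LANDING_PREFIX
            + datetime.now(timezone.utc).strftime("%Y/%m/%d/")
            + record["messageId"]
            + ".json"
        )
        s3.put_object(
            Bucket=LANDING_BUCKET,
            Key=key,
            Body=json.dumps(payload).encode("utf-8"),
        )
    return {"written": len(records)}
```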
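A minimal sketch of watermark-based change data capture against PostgreSQL, assuming a hypothetical source.orders table with an updated_at column; the real pipeline's tables and CDC mechanism may differ.

```python
# Hypothetical sketch of watermark-based change data capture in PostgreSQL.
# Table and column names (source.orders, updated_at) are illustrative.
import psycopg2

def fetch_changes(conn_dsn: str, last_watermark):
    """Return rows modified since the last successful load, plus the new watermark."""
    with psycopg2.connect(conn_dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT id, payload, updated_at
                FROM source.orders
                WHERE updated_at > %s
                ORDER BY updated_at
                """,
                (last_watermark,),
            )
            rows = cur.fetchall()
    # Advance the watermark to the newest change seen in this batch.
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark
```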
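One simple way to drive a Pentaho job on a schedule from outside PDI is to wrap Kitchen in a small loop, as sketched below; the paths and interval are assumptions, and the custom scheduler mentioned above may instead have been built inside Pentaho itself.

```python
# Hypothetical sketch of a minimal scheduler that launches a Pentaho job via Kitchen
# on a fixed interval. Install path and job file are assumptions.
import subprocess
import time

KITCHEN = "/opt/pentaho/data-integration/kitchen.sh"  # assumption: PDI install path
JOB_FILE = "/etl/jobs/load_pre_staging.kjb"           # assumption: job file path

def run_job_forever(interval_seconds: int = 3600):
    """Run the Pentaho job, report its exit code, then sleep until the next cycle."""
    while True:
        result = subprocess.run(
            [KITCHEN, f"-file={JOB_FILE}", "-level=Basic"],
            capture_output=True,
            text=True,
        )
        print(f"kitchen exited with {result.returncode}")
        time.sleep(interval_seconds)
```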












Skills: Apache Airflow · Pentaho · Snowflake Cloud · Data Warehouse Architecture · Python (Programming Language) · Extract, Transform, Load (ETL) · SQL · Amazon Web Services (AWS) · PostgreSQL






MERN Stack Developer
Novatore Solutions
Mar 2019 - Apr 2019 | Lahore, Pakistan


Education

COMSATS Institute of Information Technology
Bachelor's Degree, Bachelor of Science
Software Engineering
CGPA 3.3/4
2019

Skills

Proficient: Airflow
Intermediate: Business Intelligence
Proficient: Data Warehouse Architecture
Proficient: ETL Tools
Proficient: GitHub
Proficient: Pentaho
Proficient: PostgreSQL

Languages

Proficient: Urdu
Proficient: English