The Data Engineer will use various methods to extract raw data and transform it into data lakes and data warehouses, striving for efficiency by aligning data systems with business goals. To succeed in this position, the candidate should have strong analytical skills and the ability to extract, load, and transform data from various sources into data lakes and data warehouses. Required skills also include familiarity with several programming languages (preferably Python or R), SQL, building ETL or ELT pipelines, and managing data lakes, data warehouses, and data marts.

Duties/Responsibilities:

  • Extract data from multiple sources (cloud and on-premises) and ingest it into a data lake (AWS S3) through AWS services, APIs, or connection protocols such as ODBC and JDBC.
  • Clean, cleanse, and transform data.
  • Build and maintain data lakes, data warehouses, and data marts on AWS as per the business requirements.
  • Build and maintain the data catalog in AWS Glue.
  • Build data pipelines and workflows in AWS Glue to ingest raw data into the data lake and load transformed, cleaned data into the data warehouse.
  • Conduct complex data analysis and report on results.
  • Explore ways to enhance data quality and reliability.
  • Evaluate business needs and objectives.
  • Interpret trends and patterns.
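The ingest-and-transform duties above follow a standard extract-transform-load pattern. A minimal sketch in plain Python, using the standard-library sqlite3 module as a stand-in for an ODBC/JDBC-accessible source and a dict keyed like S3 object paths as a stand-in for the data lake (table, column, and path names are all hypothetical):

```python
import sqlite3

# Hypothetical source: sqlite3 stands in for an ODBC/JDBC-accessible database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "  Acme ", 120.0), (2, None, 75.5), (3, "Globex", None)],
)

# Extract: pull the raw rows out of the source system.
raw_rows = conn.execute("SELECT id, customer, amount FROM orders").fetchall()

# Transform: trim whitespace and drop records missing required fields --
# the kind of cleaning a Glue job would perform before loading.
clean_rows = [
    {"id": rid, "customer": cust.strip(), "amount": amt}
    for rid, cust, amt in raw_rows
    if cust is not None and amt is not None
]

# Load: write to a dict keyed like S3 object paths (stand-in for the data lake).
data_lake = {f"s3://data-lake/orders/{row['id']}.json": row for row in clean_rows}

print(sorted(data_lake))
```

In a real AWS Glue job the extract and load steps would go through Glue's PySpark APIs against actual S3 buckets; the shape of the pipeline is the same.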

Required Skills/Abilities:

  • Previous experience as a data engineer or in a similar role.
  • Technical expertise with data models, data scraping, data cleansing, and segmentation techniques.
  • Knowledge and understanding of Amazon Web Services such as AWS Glue (Crawlers, Jobs, Databases, Workflows), AWS S3, AWS AppFlow, AWS Athena, AWS Lambda, etc.
  • Knowledge and experience in connecting to multiple data sources using AWS services, APIs, or connection protocols such as ODBC and JDBC.
  • Knowledge of and experience with Python and PySpark.
  • Knowledge of and experience with SQL and Spark SQL.
  • Knowledge of MS Excel and ability to build various views using pivot tables.
  • Great numerical, statistical, and analytical skills.
  • A data engineering certification is a plus.
  • Knowledge and experience with Beautiful Soup, Selenium, or Scrapy is a plus.
  • Knowledge and experience with Terraform is a plus.
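The SQL/Spark SQL and pivot-table skills listed above center on grouped aggregation. A minimal sketch using the standard-library sqlite3 module (the table and column names are hypothetical) of the kind of query involved:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("North", "widget", 100.0),
        ("North", "gadget", 50.0),
        ("South", "widget", 75.0),
        ("South", "widget", 25.0),
    ],
)

# Grouped aggregation: the same GROUP BY shape applies in SQL, Spark SQL,
# and an Excel pivot table summarizing amount by region.
totals = conn.execute(
    """
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    ORDER BY region
    """
).fetchall()

print(totals)  # [('North', 150.0), ('South', 100.0)]
```

The identical query text would run unmodified as Spark SQL via `spark.sql(...)`; only the connection layer differs.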

Education and Experience:

Bachelor’s degree in Computer Science or equivalent required.

2+ years of progressive experience working with AWS services.

Note:

Job Timing: 6:00 PM to 3:00 AM (Mon to Fri)
Candidate must be fluent in spoken English.

Job Details

Total Positions: 1 Post
Job Shift: Third Shift (Night)
Job Type:
Job Location:
Gender: Male
Minimum Education: Bachelors
Career Level: Experienced Professional
Minimum Experience: 2 Years
Apply Before: Jan 04, 2023
Posting Date: Dec 03, 2022

Horizon Technologies

Information Technology · 101-200 employees - Karachi

Horizon Technologies is an established IT services company with years of experience providing high-quality and cost-effective web development, IT support, and surveillance solutions. Our expertise lies in custom web, mobile, and software development; surveillance (CCTV); time attendance and access control; IT consultancy and infrastructure; and BPO and contact center work, along with recruitment and IT maintenance services. We are a one-stop IT and office automation service provider where all your IT and business augmentation requirements are met under the one umbrella of Horizon Technologies.

