This role is responsible for developing and implementing data warehouse solutions for company-wide applications and for managing large sets of structured and unstructured data.

  • The candidate will be expected to analyze complex customer requirements and work with the data warehouse architect to gather, define, and document data transformation rules and their implementation.
  • Data Engineers will be expected to have a broad understanding of the data acquisition and integration space and be able to weigh the pros and cons of different architectures and approaches.
  • You will have a chance to learn and work with multiple technologies and Thought Leaders in the domain.

Responsibilities

  • Translate the business requirements into technical requirements
  • Develop ETL jobs using native and third-party tools and utilities
  • Write and optimize complex SQL and shell scripts
  • Design and develop code, scripts, and data pipelines that leverage structured and unstructured data
  • Build data ingestion pipelines and ETL processes, including low-latency data acquisition and stream processing
  • Design and develop processes/procedures for integration of data warehouse solutions in an operative IT environment.
  • Monitor performance and advise on any necessary configuration and infrastructure changes.
  • Travel to customer sites for short-, medium-, or long-term engagements as required.
  • Create and maintain the technical documentation required to support solutions.

Skills And Qualifications

  • B.S. / M.S. in Computer Science or a related field
  • Hands-on experience with at least one ETL tool, such as Informatica, DataStage, or Talend
  • Strong RDBMS concepts and SQL development skills
  • Knowledge of data modeling and data mapping
  • Experience with Data Integration from multiple data sources
  • Solid grasp of data warehouse and ETL concepts
  • Understanding of one or more business areas and industries, e.g. telecom, retail, or financial services
  • Good knowledge of Big Data technologies such as Pig, Hive, Spark, Kafka, and NiFi
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
  • Experience with any of the Hadoop distributions such as Cloudera/Hortonworks
  • Experience in at least one development or scripting language, e.g. Java, Groovy, or Python
  • Good understanding of, and basic working experience with, at least one cloud provider (AWS, Azure, or Google Cloud) and its native tools.
  • Good understanding of Agile delivery methodologies.
  • Solid understanding of the DevOps technology stack and standard tools/practices such as Linux, Docker, Jenkins, and Git.
  • Training/Certification on any Hadoop distribution will be a plus.
  • Good communication and analytical skills
  • Ability to work in a dynamic and collaborative team environment, demonstrating excellent interpersonal skills

Job Details

Total Positions: 15 Posts
Job Shift: First Shift (Day)
Job Type:
Job Location:
Gender: No Preference
Age: 24 - 36 Years
Minimum Education: Bachelors
Career Level: Experienced Professional
Experience: 3 - 6 Years (data engineering experience is mandatory)
Apply Before: Nov 29, 2021
Posting Date: Oct 17, 2021

Teradata

Information Technology · 11-50 employees - Islamabad, Lahore, Rawalpindi
