Job Description

This role is responsible for the development and implementation of data warehouse solutions for company-wide applications and for managing large sets of structured and unstructured data.

  • The candidate will be expected to analyze complex customer requirements and work with the data warehouse architect to gather, define, and document data transformation rules and their implementation.
  • Data Engineers will be expected to have a broad understanding of the data acquisition and integration space and be able to weigh the pros and cons of different architectures and approaches.
  • You will have the chance to learn and work with multiple technologies and thought leaders in the domain.

Responsibilities

  • Translate the business requirements into technical requirements
  • ETL development using native and third-party tools and utilities
  • Write and optimize complex SQL and shell scripts
  • Design and develop code, scripts, and data pipelines that leverage structured and unstructured data
  • Build data ingestion pipelines and ETL processing, including low-latency data acquisition and stream processing
  • Design and develop processes and procedures for integrating data warehouse solutions into an operational IT environment
  • Monitor performance and advise on any necessary configuration and infrastructure changes
  • Create and maintain the technical documentation required to support solutions
  • Coordinate with customers to understand their requirements
  • Provide direct support to the project lead and the solution delivery team.
  • Willingness to travel to customer sites for short-, medium-, or long-term engagements

Skills And Qualifications

  • B.S. / M.S. in Computer Science or a related field
  • Hands-on experience with one or more ETL tools such as Informatica, DataStage, or Talend
  • Strong grasp of ETL architecture design and development
  • Strong RDBMS concepts and SQL development skills
  • Good understanding of data modeling and mapping
  • Experience with Data Integration from multiple data sources
  • Working experience in one or more business domains such as telecom, retail, or finance
  • Good knowledge of big data technologies such as Pig, Hive, Spark, Kafka, and NiFi
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
  • Experience with any of the Hadoop distributions such as Cloudera/Hortonworks
  • Working experience with at least one cloud service provider (AWS, Azure, or Google Cloud) and hands-on experience with its native tools
  • Experience in projects using Agile delivery is mandatory.
  • Solid understanding of the DevOps technology stack and standard tools/practices such as Linux, Docker, Jenkins, and Git
  • Experience with Teradata tools and technologies will be a plus.
  • Training/Certification on Teradata will be a plus.
  • Good communication and presentation skills, both verbal and written
  • Proven experience in customer-facing roles for large engagements and in working with delivery teams
  • Ability to solve problems with a creative and logical mindset
  • Must be self-motivated, analytical, detail-oriented, organized, and committed to excellence in all tasks

Job Details

Total Positions:
5 posts
Job Shift:
First Shift (Day)
Job Type:
Gender:
No Preference
Age:
35 - 46 years
Minimum Education:
Bachelor's
Degree Title:
BCS/BS/MCSC/MS
Career Level:
Experienced Professional
Experience:
5 - 9 years (Data Engineering experience is mandatory)
Apply Before:
Nov 29, 2021
Posting Date:
Oct 17, 2021

Teradata

· 11-50 employees - Islamabad, Lahore, Rawalpindi
