Data Engineer

Islamabad, Pakistan


Job Description

Activities To be Performed:

  • The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams.
  • The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
  • The person will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure that optimal data delivery architecture is consistent across ongoing projects.
  • They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
  • The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Design, implement, and maintain data pipelines and data-driven solutions.
  • Build a data platform at scale using AWS and Snowflake to solve complex business use cases.
  • Conceptualize and design the best-fit solution for the desired data delivery requirements.
  • Work with stakeholders, including managers and the Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
  • Work with data and analytics experts to strive for greater functionality in our data systems that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

Must-Have:

  • Experience with integration of data from multiple data sources
  • Strong knowledge of data cleaning and various ETL/ELT techniques and frameworks for development, including merging and normalizing related data sources.
  • Ability to assemble large, complex data sets that meet functional and non-functional business requirements.
  • Hands-on experience blending data sets from different data venues.
  • Advanced working knowledge of SQL, including query authoring and experience with relational databases, as well as working familiarity with a variety of databases.
  • Experience in a Data Engineer or Business Intelligence role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, and experience with the following software/tools:
  • Experience with relational SQL databases, including SQL Server and Postgres
  • Experience with AWS cloud services: EC2, EMR, RDS
  • Hands-on experience with cloud data lakes: Snowflake, Redshift
  • Experience with ETL/ELT tools such as Matillion, Talend, StreamSets, NiFi
  • Experience with Tableau, or comparable experience with another BI visualization tool such as QlikView or Domo, is a plus
  • Experience with object-oriented or functional scripting languages: C#, Python, Java, C++, Scala, etc.
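The "merging and normalizing related data sources" skill above can be illustrated with a minimal sketch. The table and column names here (`crm`, `billing`) are hypothetical, and SQLite stands in for a real warehouse such as Snowflake or Postgres:

```python
import sqlite3

# Two hypothetical source tables standing in for separate data venues.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE crm (customer_id INTEGER, name TEXT)")
conn.execute("CREATE TABLE billing (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO crm VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO billing VALUES (?, ?)",
                 [(1, 120.0), (1, 80.0), (2, 50.0)])

# Blend the sources: one normalized row per customer with total spend.
rows = conn.execute("""
    SELECT c.customer_id, c.name, SUM(b.amount) AS total
    FROM crm c JOIN billing b ON c.customer_id = b.customer_id
    GROUP BY c.customer_id, c.name
    ORDER BY c.customer_id
""").fetchall()
print(rows)  # [(1, 'Acme', 200.0), (2, 'Globex', 50.0)]
```

In a production pipeline the same join-and-aggregate pattern would typically run inside the warehouse or an ETL tool rather than in application code.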

  • Experience building and optimizing ‘big data’ pipelines, architectures, and data sets.
  • Demonstrated experience building data integrations using web data sources (REST, JSON, JavaScript, SOAP, RSS, etc.).
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience scheduling/automating scripts.
  • Experience using Python/AWS to extract, clean, and manipulate large structured and unstructured datasets to construct data staging layers for applying ML algorithms.
  • Have built processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
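The extract-clean-stage pattern referred to above can be sketched in a few lines. The JSON payload and field names here are made up for illustration; a real extract would come from one of the web data sources listed (REST, SOAP, etc.):

```python
import json

# Hypothetical raw extract from a REST/JSON web data source.
raw = '[{"id": "1", "price": " 10.5 "}, {"id": "2", "price": null}]'

def clean(record):
    """Normalize types for the staging layer: ids to int, prices to float."""
    price = record.get("price")
    return {
        "id": int(record["id"]),
        "price": float(price) if price is not None else None,
    }

staged = [clean(r) for r in json.loads(raw)]
print(staged)  # [{'id': 1, 'price': 10.5}, {'id': 2, 'price': None}]
```

The staged records now have consistent types and can be loaded into a warehouse table or fed to downstream ML steps.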

Nice to Have:

  • AWS Certified Solutions Architect/Developer/Big Data is a plus
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.

Required Skills

Java, Scala, Hadoop, ETL, Big Data, Kafka, NiFi, PySpark, SQL, Spark, Python, BI Visualization


Industry

Information Technology

Category

Software & Web Development

Job Type

Contract

Minimum Education

Bachelors

Career Level

Experienced Professional

Minimum Experience

4 Years

Total Positions

1