We are looking for an Intermediate Analytics Engineer to join our team in Islamabad. The ideal candidate combines a strong analytical mindset with excellent problem-solving abilities, and demonstrates our values of entrepreneurial spirit, honesty, innovation, collaboration, and customer focus through relevant experience. This role requires expertise in the Data Analytics domain and familiarity with Agile methodologies. You will join a 40-member team, including the department head, and report directly to the Practice Lead.
Key Responsibilities
- Design and maintain scalable ETL/ELT pipelines using Informatica BDM and DEI.
- Optimize mappings, workflows, and sessions to improve performance, scalability, reliability, and maintainability.
- Integrate and transform structured and unstructured data from multiple sources such as RDBMS, Hadoop, Hive, Spark, and cloud data lakes.
- Identify gaps in data analytics processes and implement improvements to enhance performance and quality.
- Build and support data warehousing and analytics solutions aligned with business requirements.
- Contribute to the practice backlog by enhancing standards, assets, and team learning.
- Deliver on functional backlogs across operations, development, and reporting.
- Collaborate with cross-functional teams to translate business rules into technical mappings.
- Implement CI/CD, version control, and automation for Informatica workflows.
- Optimize job performance through partitioning, caching, and resource tuning.
- Maintain comprehensive documentation of data flows, mappings, transformation logic, and technical specifications.
- Ensure adherence to data governance, security, and compliance standards.
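To give candidates a feel for the pipeline work described above, here is a minimal extract-transform-load sketch in Python. All function names and data are hypothetical illustrations of the pattern; in practice this logic would be expressed as Informatica BDM/DEI mappings rather than hand-written code.

```python
# Minimal ETL sketch: extract rows, cleanse/standardize them, load into a target.
# Everything here is a hypothetical illustration, not an Informatica API.

def extract():
    # Stand-in for reading from an RDBMS, Hive table, or data lake file.
    return [
        {"order_id": 1, "amount": "120.50", "country": "pk"},
        {"order_id": 2, "amount": "75.00", "country": "ae"},
    ]

def transform(rows):
    # Cleanse and standardize: cast string amounts to floats,
    # normalize country codes to upper case.
    return [
        {
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "country": r["country"].upper(),
        }
        for r in rows
    ]

def load(rows, target):
    # Idempotent load keyed on order_id, so a rerun does not duplicate rows.
    for r in rows:
        target[r["order_id"]] = r
    return target

warehouse = {}
load(transform(extract()), warehouse)
```

The idempotent load step matters for reliability: if a session is retried after a partial failure, rerunning it leaves the target in the same state.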
Required Qualifications
- At least 4 years of experience in the Data Analytics domain.
- Proven experience working in an Agile environment.
- Hands-on knowledge of data lake and data warehousing concepts.
- Strong expertise in shell scripting.
- Demonstrated proficiency in ETL development, particularly with Informatica BDM (Big Data Management) and Informatica DEI (Data Engineering Integration).
- Ability to integrate data from diverse sources, including structured, semi-structured, and unstructured formats such as logs, Hadoop/HDFS, Hive, Spark, and open table formats.
- Advanced SQL skills for data transformation, aggregation, and optimization.
- Solid understanding of data warehousing and data lake architecture, including star/snowflake schemas, Type-2 Slowly Changing Dimensions (SCD), partitioning, and change data capture (CDC).
- Familiarity with DevOps practices such as CI/CD pipelines, version control for ETL/BDM jobs, and process automation aligned with Agile methodologies.
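One concept from the list above, the Type-2 Slowly Changing Dimension, can be sketched in plain Python. The schema and data are hypothetical; in a real warehouse this would be implemented in SQL or as an ETL-tool mapping. The key idea: instead of overwriting a changed attribute, close the current row and append a new versioned row, preserving history.

```python
from datetime import date

def apply_scd2(dimension, key, new_attrs, as_of):
    """Type-2 SCD sketch: close the current row for `key` if its attributes
    changed, then append a new open-ended version dated `as_of`."""
    current = next(
        (row for row in dimension
         if row["key"] == key and row["end_date"] is None),
        None,
    )
    if current is not None:
        if all(current.get(k) == v for k, v in new_attrs.items()):
            return dimension  # no attribute changed; keep the current version
        current["end_date"] = as_of  # close out the old version
    dimension.append({"key": key, **new_attrs,
                      "start_date": as_of, "end_date": None})
    return dimension

# Hypothetical customer dimension with one open row.
customers = [{"key": 42, "city": "Lahore",
              "start_date": date(2023, 1, 1), "end_date": None}]
apply_scd2(customers, 42, {"city": "Islamabad"}, date(2025, 6, 1))
# customers now holds two versions: the closed "Lahore" row and the
# open "Islamabad" row.
```

Queries against the dimension can then reconstruct the attribute value that was current on any given date by filtering on the start/end date range.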
Preferred Qualifications and Benefits
Experience with cloud platforms and big data distributions such as Azure, AWS, GCP, or Cloudera is a plus.

Joining this organization means becoming part of one of Pakistan’s leading employers, inspired by visionary leadership and a unique professional culture. The company fosters a flourishing lifestyle alongside continuous learning and development opportunities. Core values emphasize entrepreneurial and innovative mindsets, professional collaboration, and customer obsession. As one of the largest private sector organizations in Pakistan, the company is committed to transforming the lives of over 75 million customers. This role offers the opportunity to contribute to a transformative journey, empowering millions to thrive in a digital economy.
The application deadline for this position is November 11, 2025.