About The Role:
At Shopdev, we love to craft high-end technology products for our clients. Our fast-paced Agile programming environment is exciting and well integrated; we work closely with industry experts to deliver meaningful experiences using cutting-edge technology.
We are looking for a Sr. Data Engineer with 4+ years of experience - ideally in the E-Commerce domain. Senior Data Engineers develop Data Warehouse solutions based on the analysis of business goals, objectives, needs, and existing data systems infrastructure. You will be participating in exciting projects covering the end-to-end data lifecycle – from raw data integrations with primary and third-party systems, through advanced data modeling, to state-of-the-art data visualization and development of innovative data products.
- Provide hands-on development within a team of Data Engineers, BI Analysts, and Data Architects.
- Design and implement data products enabling data-driven features or business solutions.
- Work on data warehousing projects and support data modelling activities as part of building the data warehouse.
- Develop, implement, and optimize ETL processes that merge data from disparate sources for consumption by data analysts, data scientists, business owners, and decision makers.
- Develop and test batch and real-time data pipelines using serverless and managed cloud-based services.
- Facilitate data transformation, normalization, cleansing, aggregation, workflow management, and business rule application.
- Detect data quality issues, identify their root causes, implement fixes, and design data audits to capture issues.
- Build data dashboards and advanced visualizations for cloud data platforms.
Job Requirements and Skills:
- B.S. degree in Information Technology, Computer Science, or Engineering, or equivalent experience.
- 4+ years of overall work experience in data engineering.
- At least 2 years of hands-on experience using Snowflake is mandatory (specific hands-on Snowflake experience will be verified at the interview), including Snowflake utilities such as SnowSQL, Snowpipe, etc.
- Strong understanding of data modeling (e.g. conceptual, logical, and physical model design; experience with Operational Data Stores, Enterprise Data Warehouses, Data Marts, Data Lakes, etc.).
- Experience with databases such as Teradata, Postgres, Oracle, or SQL Server.
- Strong experience with one or more ETL tools such as Informatica, Talend, or Matillion.
- Development experience with big data technologies such as Spark, Kafka, and message queues.
- Fluency in at least one modern scripting language (Python or R).
- Must have experience with a wide range of persistence stores, such as columnar MPP, relational, and NoSQL/document databases.
- Must have experience implementing custom and/or third-party ETL/ELT pipelines on a managed data warehouse/analytics platform.
- Must have experience with the design, implementation, and operation of data-intensive distributed systems.