Position Summary:
Work Timings: 7:30 PM - 3:30 AM
We are seeking a skilled Data Engineer to join our dynamic team. The Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible to all users.
Responsibilities:
- Design and build scalable data pipeline architecture that can handle large volumes of data
- Develop ELT / ETL pipelines to extract, load and transform data from various sources into our data warehouse
- Optimize and maintain the data infrastructure to ensure high availability and performance
- Collaborate with data scientists and analysts to identify and implement improvements to our data pipeline and models
- Develop and maintain data models to support business needs
- Ensure data security and compliance with data governance policies
- Identify and troubleshoot data quality issues
- Automate and streamline processes related to data management
- Stay up to date with emerging data technologies and trends to ensure the continuous improvement of our data infrastructure and architecture
- Analyze the data products and requirements to align with data strategy
- Assist cross-functional business partners, including the consumer insights, supply chain, and finance teams, by extracting and researching data
- Enhance the efficiency, automation, and accuracy of existing reports
- Follow best practices in data querying and manipulation to ensure data integrity
Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field
- 8+ years of experience as a Snowflake Data Engineer or in a related role
- Strong hands-on Snowflake experience building, maintaining, and documenting data pipelines
- Expertise in Snowflake concepts such as RBAC management, virtual warehouses, file formats, streams, zero-copy cloning, and time travel, and an understanding of how to use these features
- Strong SQL development experience, including SQL queries and stored procedures
- Strong knowledge of no-code / low-code ELT / ETL tools such as Informatica or SnapLogic
- Well versed in data standardization, cleansing, enrichment, and modeling
- Proficiency in one or more programming languages such as Python, Java, or C#
- Experience with cloud computing platforms such as AWS, Azure, or GCP
- Knowledge of ELT / ETL processes, data warehousing, and data modeling
- Familiarity with data security and governance best practices
- Strong problem-solving and analytical skills, with hands-on experience improving process performance
- Strong communication and collaboration skills
(ref: hirist.tech)