The Role
As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and workflows. You will work closely with data analysts, data scientists, and other stakeholders to ensure data availability, quality, and reliability, and contribute to our data-driven success.
Key Responsibilities
- Data Modeling and Transformation: Design, develop, and maintain data models and transformations using dbt to create analysis-ready datasets.
- ETL Automation: Build and optimize ETL pipelines using Snowflake and Python, ensuring efficient data extraction, transformation, and loading processes.
- Workflow Automation: Implement and manage data workflows using Apache Airflow, scheduling and orchestrating data-related tasks and jobs.
- Data Quality: Establish data quality checks and validation procedures to ensure data accuracy and integrity.
- Collaboration: Collaborate closely with data engineering, data science, and business teams to gather requirements and deliver data solutions.
- Documentation: Maintain comprehensive documentation for data pipelines, models, workflows, and automation processes.
- Security: Implement and adhere to data security best practices within Snowflake and other data-related systems.
- Performance Optimization: Monitor and optimize data pipelines and workflows for performance and scalability.
- Troubleshooting: Diagnose and resolve data pipeline issues, errors, and performance bottlenecks.
Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field (Master’s preferred).
- 3+ years of relevant experience.
- Proficiency in dbt, Snowflake, Python, and Apache Airflow.
- Strong SQL skills for data manipulation and querying.
- Previous experience in data engineering and ETL development.
- Knowledge of data warehousing concepts and best practices.
- Excellent problem-solving and communication skills.
- Ability to work collaboratively in a team-oriented environment.
- Attention to detail and a commitment to delivering high-quality data solutions.
Preferred Qualifications
- Snowflake certification(s).
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with version control systems like Git.
- Strong scripting skills for automation tasks.
- Previous experience with big data technologies (e.g., Spark, Hadoop).