We are looking for a Data Engineer with at least 5 years of experience who can handle the entire data lifecycle, from extraction and transformation to loading and analysis. You'll use your skills in data transformation, database management, and scripting to build robust data pipelines in a fast-paced, agile environment.
Core Responsibilities
- Data Transformation & Databases: You must have solid experience with a data transformation tool like Informatica and be proficient in writing SQL queries for databases such as Oracle and MySQL.
- Agile & DevOps: You should be comfortable with agile development practices, including Scrum, continuous integration, and DevOps.
- Big Data & Cloud: The role requires familiarity with the Big Data stack (Hadoop, Hive, Spark, etc.). Knowledge of AWS and its services is a significant plus.
- Scripting & Visualization: You should have knowledge of Python or shell scripting. Experience with a visualization tool like Looker, Tableau, or QlikView is also a plus.
- System Development: Experience building high-scale, distributed systems and data pipelines using technologies like Apache Kafka or AWS Lambda is also beneficial.
Skills Required
Data Transformation, SQL, Informatica, Big Data, Python, Scrum