Job Summary :
The Data Engineer will provide technical expertise in analysis, design, development, rollout and maintenance of data integration initiatives.
This role will contribute to implementation methodologies and best practices, as well as work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs.
This position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, training and initiatives through mentoring and coaching.
Candidate should exhibit qualities such as innovation, critical thinking, optimism / positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.
Candidate must have experience in Snowflake, Matillion and data warehousing :
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Matillion and Snowflake.
- Create functional and technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
- Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm the ability to meet business needs.
- Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration.
- Ensure proper creation and execution of methodology, training, templates, resource plans and engagement review processes.
- Coordinate and consult with the project manager, client business staff, client technical staff and project developers on data architecture best practices and other data-related matters at the project or business unit level.
- Work with the reporting team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and reporting methods, defines security, and meets usability and scalability best practices.
Required Qualifications :
- 6+ years of industry implementation experience with data integration tools such as Matillion and Snowflake.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Matillion.
- Work with data scientists and analysts to understand data needs and create effective data workflows.
- Create and maintain data storage solutions, including Snowflake.
- Bachelor's degree or equivalent experience; Master's degree preferred.
- Strong experience in big data frameworks and working experience in Spark, Hadoop or Hive (incl. derivatives like PySpark (preferred), Spark Scala or Spark SQL) or similar, along with experience in libraries / frameworks to accelerate code development.
- Strong experience in orchestration and working experience in Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar.
- Understanding of on-premises and cloud infrastructure architectures (e.g. Snowflake).
- Experience with Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA or similar, with experience in CI / CD using one or more code management platforms.
Preferred Skills & Experience :
- Can contribute to internal / external data integration proofs of concept.
- Demonstrates the ability to create new and innovative solutions to previously unencountered problems.
- Ability to work independently on projects as well as collaborate effectively across teams.
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
- Strong team-building, interpersonal, analytical, and problem identification and resolution skills.
- Experience working with multi-level business communities.
- Deals effectively with all team members and builds strong working relationships / rapport with them.
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution.