We are looking for a skilled Senior Technical Consultant with 3-5 years of experience and expertise in SQL, SSIS, Python, and PySpark to join our team. The ideal candidate will be proficient in building scalable interfaces, performance tuning, and data cleansing and validation strategies, leveraging the defined tech stack for data processing and data movement.
What you'll do:
- Advanced Execution & Data Management: Oversee and manage intricate project tasks, providing insight and direction on advanced data ingestion, transformation, validation, and publishing
- Review and analyse the data provided by the customer, along with its technical/functional intent and interdependencies
- Engage proactively with functional teams, ensuring a thorough understanding of end-to-end data flows as they relate to the technical integration
- Build data ingress and egress pipelines, handle large volumes of data, and develop data transformation functions using tools and languages such as SSIS, Python, PySpark, and SQL
- Integrate various data sources, such as Teradata, SAP ERP, SQL Server, Oracle, Sybase, ODBC connectors, and flat files, via API or batch
- Production Deployment and Hypercare: Assist with production deployment tasks; assist with triage of issues, testing, and identifying root cause; carry out timely response and resolution of batch automation disruptions to meet customer SLAs with accurate and on-time results
- Technical Leadership & Coding Oversight: Guide and review the code developed by junior consultants, ensuring alignment with best practices
- Incorporate o9 ways of working and embed industry standards for smoother project execution
What you should have:
- 3+ years' experience in data architecture, data engineering, or a related field, with a strong focus on data modelling, ETL processes, and cloud-based data platforms
- Hands-on experience with SSIS packages, Python, PySpark, and SQL, along with workflow management tools like Airflow and SSIS
- Experience working with Parquet, JSON, RESTful APIs, HDFS, Delta Lake, and query frameworks like Hive and Presto
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
- Working experience with version control platforms, e.g. GitHub, Azure DevOps
- Familiarity with Agile methodology
- Proactive mindset and the right attitude to embrace the agility of learning
- Excellent verbal and written communication skills

Good to have:
- Hands-on experience with Delta Lake
- Experience with supply chain planning applications
- Experience with Amazon Web Services (AWS), Azure, and Google Cloud infrastructures

Skills Required:
Data Engineering, ETL Processes, SSIS Packages, Data Lake, Python, SQL, Azure DevOps