We are looking for an enthusiastic and highly skilled Senior Data Engineer to join our growing team and play a key role in shaping complex data-centric solutions that power smarter decisions for our clients and internal teams.
As part of our data engineering team, you’ll build and maintain scalable data systems and pipelines that manage the acquisition, storage, processing, and transformation of large datasets from diverse sources. From data mapping and validation to geocoding, metadata management, and automation, you’ll help deliver the backbone of our analytics and decision-making platforms.
This is a hands-on role where you’ll design and optimize modern data infrastructure, ensuring reliability, scalability, and performance. You’ll collaborate closely with product, engineering, and analytics teams to deliver solutions end-to-end — from concept to deployment — in a fast-paced, agile environment.
What You’ll Do:
- Design, build, and maintain robust, scalable data pipelines and architectures.
- Work with large-scale structured and unstructured datasets, implementing complex transformations.
- Develop and optimize ETL/ELT workflows with modern orchestration frameworks.
- Ensure data quality, governance, security, and reliability across platforms.
- Deliver data structures that support predictive modeling, analytics, and reporting.
- Implement monitoring, alerting, and performance optimization for data systems.
- Collaborate with stakeholders to integrate and deploy data-driven solutions.
- Mentor peers and contribute to architecture decisions for cloud-native data environments.
What We’re Looking For:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 5+ years working with large-scale data systems and pipelines.
- Advanced SQL and strong data modeling expertise.
- Hands-on experience with PostgreSQL, MS SQL Server, and modern cloud data warehouses (Snowflake, Redshift).
- Strong Python programming skills for automation and optimization.
- Experience with ETL/ELT frameworks and orchestration tools (Airflow, dbt).
- Exposure to cloud platforms (preferably AWS) and DevOps practices (CI/CD, Docker, Kubernetes, Terraform).
- Ability to leverage AI-assisted coding tools (GitHub Copilot, ChatGPT, etc.) to boost efficiency.
- Excellent problem-solving and communication skills with a collaborative mindset.