Description:
We are looking for a Data Modeller with at least 3 years of professional experience designing scalable data models and writing complex queries in Postgres and MongoDB, including complex SQL in Postgres. The ideal candidate must also have hands-on expertise in Python and its data libraries, along with strong working knowledge of AWS data services and AWS networking fundamentals such as VPCs, subnets, routing, peering, and inbound/outbound rule configuration.
Key Responsibilities:
- Design, develop, and maintain conceptual, logical, and physical data models across relational (Postgres) and NoSQL (MongoDB) systems.
- Write and optimize complex SQL queries, stored procedures, and views in Postgres; build advanced queries and aggregation pipelines in MongoDB.
- Develop and automate ETL/ELT data pipelines using Python and libraries such as pandas, NumPy, SQLAlchemy, PyMongo, and boto3 (see the sketch after this list).
- Manage and optimize AWS data services; experience with other cloud providers such as Azure or GCP is a plus.
- Configure network routing, inbound/outbound rules, and security groups for secure data access.
- Monitor performance, troubleshoot issues, and ensure scalability, data quality, and security.
- Collaborate with data engineers, analysts, and business teams to translate requirements into effective data solutions.
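To give candidates a concrete picture of the pipeline work described above, here is a minimal Python sketch combining the libraries named in this posting (pandas, SQLAlchemy, PyMongo, boto3). All connection strings, table, collection, and bucket names are hypothetical placeholders, not references to any actual system.

```python
# Illustrative ETL sketch: extract from Postgres, aggregate in MongoDB,
# and land the joined result in S3. Every name below is a placeholder.
import pandas as pd
import boto3
from sqlalchemy import create_engine
from pymongo import MongoClient

# Extract: pull a relational table into a DataFrame (hypothetical DSN).
engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/sales")
orders = pd.read_sql("SELECT customer_id, amount, created_at FROM orders", engine)

# Transform: a typical MongoDB aggregation pipeline, counting active
# events per customer (hypothetical database and collection names).
client = MongoClient("mongodb://localhost:27017")
pipeline = [
    {"$match": {"status": "active"}},
    {"$group": {"_id": "$customer_id", "events": {"$sum": 1}}},
]
# Assumes the collection is non-empty so the cursor yields rows.
events = pd.DataFrame(client["analytics"]["events"].aggregate(pipeline))

# Load: join the two sources and write the result to S3 as CSV.
merged = orders.merge(events, left_on="customer_id", right_on="_id", how="left")
boto3.client("s3").put_object(
    Bucket="example-data-lake",            # hypothetical bucket
    Key="curated/customer_summary.csv",
    Body=merged.to_csv(index=False),
)
```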
Requirements:
Required Skills & Experience:
- Minimum 3 years of experience in data modelling, database development, and SQL programming.
- Expertise in Postgres (complex queries, indexing, optimization) and MongoDB (schema design, aggregation).
- Strong hands-on experience with Python and related libraries (pandas, NumPy, SQLAlchemy, PyMongo, boto3); shell scripting is a must.
- Proficiency in AWS cloud services: RDS, S3, CloudWatch, VPC, IAM, ASM, EC2, EventBridge, and other data-related services.
- Experience with routing, peering, inbound/outbound rules, and security group configuration (see the sketch after this list).
- Strong problem-solving, analytical, and communication skills.
- Knowledge of data governance and compliance frameworks.
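As an illustration of the networking side of the role, here is a minimal boto3 sketch that opens an inbound Postgres port on a security group. The region, security group ID, and CIDR range are hypothetical placeholders.

```python
# Minimal sketch: authorize app-tier access to Postgres (port 5432)
# on a security group. All identifiers below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",       # hypothetical security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5432,                 # Postgres default port
        "ToPort": 5432,
        "IpRanges": [{
            "CidrIp": "10.0.1.0/24",      # e.g. an app-tier subnet
            "Description": "App-tier access to RDS",
        }],
    }],
)
```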