🔍 We're Hiring: Senior Data Engineer
We are looking for a highly skilled Senior Data Engineer to help build and scale our data pipelines and infrastructure. If you are passionate about modern data architectures, cloud platforms, and high-performance engineering, we want to meet you.
🔧 Key Technical Responsibilities
- Design and build scalable ETL / ELT pipelines for large structured & unstructured datasets.
- Develop and optimize data workflows using modern orchestration tools (preferably Dagster, Airflow, or Prefect).
- Architect and maintain data warehouses and data lake solutions.
- Implement data quality, validation, and monitoring systems.
- Optimize database performance, query tuning, and large-scale data processing.
- Collaborate on data infrastructure for high availability and disaster recovery.
🛠 Core Technical Skills Required
- Strong expertise in SQL and one programming language (Python, Java, or Scala).
- Hands-on experience with big data technologies: Spark, Hadoop, Kafka, etc.
- Experience with cloud platforms: AWS / GCP / Azure.
- Experience with data warehousing (Snowflake, Redshift, BigQuery).
- Exposure to vector databases (Pinecone, Weaviate, Milvus, Qdrant).
- Strong understanding of data modeling, database design, and performance optimization.
- Experience with CI/CD, version control (Git), and scalable deployment practices.
- Familiarity with standalone servers, hybrid cloud environments, and on-prem infrastructure.
✨ Preferred Technical Exposure
- Real-time streaming & event-driven architectures.
- Data governance, data security, and compliance frameworks.
- Containerization tools: Docker / Kubernetes.
- MLOps tools and support for machine learning workflows.
- BI & visualization platforms.

#Hiring #SeniorDataEngineer #DataEngineerJobs #DataEngineering #BigData #ETL #DataPipelines #DataInfrastructure #CloudJobs #AWS #Azure #GCP #PythonDeveloper #Spark #Kafka #Snowflake #Redshift #BigQuery #MachineLearning #AIJobs #TechJobs #EngineeringJobs #NowHiring #Careers #JobOpportunity