Job Summary:
We are seeking an experienced Big Data Architect to lead the design and implementation of scalable, secure, and high-performing big data solutions. The ideal candidate will possess a deep understanding of big data technologies, data modeling, and cloud-based data services. You will collaborate with cross-functional teams, including data scientists, engineers, and business stakeholders, to transform business requirements into effective technical solutions. This is a hands-on leadership role that includes architecture design, technology evaluation, mentorship, and client-facing responsibilities during pre- and post-sales engagements.

Responsibilities:
- Data Architecture Design: Design and implement scalable and fault-tolerant data architectures for batch and streaming systems.
- Data Modeling: Develop efficient data models and database structures to support analytics and business processes.
- Data Integration: Integrate data from various structured and unstructured sources, including legacy systems.
- Performance Optimization: Monitor and fine-tune data infrastructure for optimal performance.
- Data Security & Governance: Ensure data security, compliance, and governance across platforms.
- Collaboration: Work closely with data scientists, engineers, and business teams to gather requirements and deliver solutions.
- Technology Evaluation: Research, evaluate, and recommend emerging data technologies and tools.
- Mentorship: Mentor junior team members and promote best practices in data engineering and architecture.
- Problem Solving: Diagnose complex data challenges and provide robust solutions.
- Pre/Post Sales Support: Engage in client interactions to support technical sales and solutioning.

Skills Required:
- Total Experience: 10-12 years in IT, with at least 2 years as a Big Data Architect.
- Big Data Technologies: Hands-on experience with Hadoop, Spark, Hive, HBase, Kafka, etc.
- Programming: Proficient in Python, Java, Scala, and Spark.
- Cloud Platforms: Experience with AWS, Azure, or GCP-based data services (e.g., EMR, Databricks, BigQuery).
- Database Expertise: Strong knowledge of SQL and NoSQL databases (MongoDB, Cassandra, etc.).
- ETL Pipelines: Proven experience designing and implementing scalable ETL workflows.
- Security & Authentication: Understanding of LDAP, Kerberos, Active Directory, and SAML configuration in distributed environments.

Preferred Skills:
- API Design: Experience in building and integrating APIs for data exchange.
- Data Visualization: Familiarity with visualization tools (Tableau, Power BI, or similar).
- DevOps & Automation: Experience with CI/CD, containerization (Docker/Kubernetes), and automation tools.
- Soft Skills: Strong communication, collaboration, and stakeholder management abilities.
- Compliance & Governance: Knowledge of data privacy laws and regulatory requirements (GDPR, HIPAA, etc.).
(ref: hirist.tech)