About the Job: HEROIC Cybersecurity (HEROIC.com) is seeking an experienced Elasticsearch Pipeline Developer to design, optimize, and maintain large-scale search and analytics infrastructures that power our AI-driven cybersecurity intelligence platform.
You will be responsible for architecting and managing high-performance Elasticsearch clusters and data ingestion pipelines that process billions of cybersecurity data points sourced from the surface, deep, and dark web.
This role combines strong technical expertise in Elasticsearch, data engineering, and distributed systems, helping HEROIC achieve real-time indexing, search, and analytics performance at global scale, in service of our mission to make the internet safer through intelligent, data-driven cybersecurity insights.
What you will do:
- Design, deploy, and manage scalable Elasticsearch clusters supporting petabyte-scale cybersecurity datasets.
- Build and optimize data ingestion pipelines using tools such as Logstash, Beats, Kafka, or custom Python pipelines.
- Develop efficient indexing and querying strategies to enable fast search and analytics across diverse data types.
- Configure and maintain index mappings, analyzers, tokenizers, and relevance tuning for optimized search accuracy.
- Implement and automate data transformation and enrichment workflows for ingestion from multiple data sources.
- Monitor and troubleshoot cluster health, performance, and capacity planning using Elasticsearch APIs and Kibana.
- Manage index lifecycle policies, snapshots, and replication strategies to ensure high availability and reliability.
- Work with the backend team to deliver search-ready, structured datasets for advanced analytics and threat detection.
- Integrate Elasticsearch with APIs, microservices, and external systems to support HEROIC's platform ecosystem.
- Automate infrastructure provisioning and scaling using Docker, Kubernetes, and cloud platforms (AWS/GCP).
- Continuously improve data pipeline reliability, latency, and throughput through proactive tuning and optimization.
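To make the ingestion and enrichment responsibilities above concrete, here is a minimal Python sketch of a transformation step that prepares documents for Elasticsearch's _bulk API. The record fields, index name, and function names are illustrative assumptions, not HEROIC's actual pipeline:

```python
# Hypothetical enrichment step for a bulk-ingestion pipeline.
# Record structure and the index name "breach-records" are assumptions.
import hashlib
import json
from datetime import datetime, timezone


def enrich(record: dict) -> dict:
    """Normalize a raw record and add a content-derived hash for idempotent IDs."""
    out = dict(record)
    out["email"] = out.get("email", "").strip().lower()
    out["ingested_at"] = datetime.now(timezone.utc).isoformat()
    # Hash the original record so re-ingesting the same data yields the same _id.
    out["doc_hash"] = hashlib.sha1(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return out


def to_bulk_actions(records, index="breach-records"):
    """Yield action/document pairs in the NDJSON order the _bulk endpoint expects."""
    for rec in records:
        doc = enrich(rec)
        yield {"index": {"_index": index, "_id": doc["doc_hash"]}}
        yield doc
```

Because each `_id` is derived from the record's content, retrying a failed batch re-indexes (overwrites) rather than duplicates documents, which simplifies at-least-once delivery from sources like Kafka.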
Requirements
- Bachelor's Degree in Computer Science, Information Technology, or a related field
- Minimum 4 years of professional experience with Elasticsearch in production environments
- Deep knowledge of Elasticsearch architecture, including shards, replicas, nodes, and cluster scaling
- Hands-on experience with Logstash, Beats, Kafka, or Python-based ETL pipelines for large-scale data ingestion
- Strong understanding of index design, query performance optimization, and relevance tuning
- Proficiency in Python, Java, or Scala for pipeline development and automation scripting
- Solid experience with Kibana for visualization, monitoring, and troubleshooting
- Familiarity with NoSQL and relational databases (Cassandra, MongoDB, PostgreSQL) and data modeling for search
- Experience with CI/CD pipelines, Git, and DevOps workflows for deployment and monitoring
- Strong analytical, debugging, and problem-solving skills in distributed data systems
- Excellent English communication skills (written and verbal)
- Prior experience in cybersecurity, threat intelligence, or large-scale data analytics (preferred but not required)
Benefits
- Position Type: Full-time
- Location: India (Remote – Work from anywhere)
- Salary: Competitive salary based on experience
- Other Benefits: PTOs & National Holidays
- Professional Growth: Work with cutting-edge AI, cybersecurity, and SaaS technologies
- Culture: Fast-paced, innovative, mission-driven team
About Us: HEROIC Cybersecurity (HEROIC.com) is building the future of cybersecurity. Unlike traditional solutions, HEROIC takes a predictive and proactive approach to intelligently secure users before an attack or threat occurs. Our work environment is fast-paced, challenging, and exciting. At HEROIC, you'll collaborate with a team of passionate, driven individuals dedicated to making the world a safer digital place.
Skills Required
Kibana, Cassandra, PostgreSQL, Logstash, Kafka, Beats, Git, GCP, Docker, Elasticsearch, MongoDB, Python, Kubernetes, AWS