Company Description
ThreatXIntel is a startup cybersecurity company focused on delivering advanced, tailored solutions that protect businesses and organizations from cyber threats. Our expertise spans cloud security assessment, web and mobile security testing, and DevSecOps. We are committed to providing affordable, custom solutions that cater specifically to the needs of businesses of all sizes, ensuring high-quality protection for digital assets. With a proactive approach to security, ThreatXIntel helps clients identify and address vulnerabilities before they can be exploited, allowing businesses to operate with confidence and peace of mind.
Role Description
We are seeking an experienced Freelance Senior Data Engineer with strong expertise in Vectr, Cribl, Azure Data Factory (ADF), Databricks (PySpark), and large-scale ETL/ELT development. The consultant will play a key role in designing, building, and optimizing cloud-native data pipelines, integrating disparate data sources, and ensuring secure, high-quality data processing across the organization.
This is a senior-level role requiring 8+ years of data engineering experience along with deep Azure and Python capabilities.
Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Databricks (PySpark)
- Develop and manage API integrations for seamless data exchange across internal and external systems
- Write, tune, and optimize SQL queries, stored procedures, and transformation logic
- Build, secure, and maintain data platforms including Azure Data Lake and Azure SQL Database
- Implement Python-based automation, orchestration, and data processing workflows
- Leverage Vectr and Cribl for pipeline observability, log analytics, and data flow monitoring
- Troubleshoot and optimize pipeline performance, ensuring reliability and scalability
- Perform unit testing, integrate with automated test frameworks, and collaborate with QA teams
- Ensure data governance, compliance, and security alignment with enterprise and industry standards
- Work independently while communicating effectively with cross-functional teams
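To give candidates a concrete sense of the transformation work described above, here is a minimal pure-Python sketch (hypothetical record schema and field names) of the kind of deduplication and normalization logic that would typically live inside an ADF or PySpark pipeline stage; a production version would operate on Spark DataFrames rather than plain dictionaries.

```python
# Sketch of a transformation step for a hypothetical customer feed:
# deduplicate by 'id', normalize email casing, and drop rows missing an id.
# In a real pipeline this logic would run as a Databricks (PySpark) job.

def transform(records):
    """Return cleaned records: unique non-null ids, lowercased trimmed emails."""
    seen = set()
    cleaned = []
    for rec in records:
        rec_id = rec.get("id")
        if rec_id is None or rec_id in seen:
            continue  # skip rows with no id or a duplicate id
        seen.add(rec_id)
        cleaned.append({**rec, "email": rec.get("email", "").strip().lower()})
    return cleaned

raw = [
    {"id": 1, "email": "  Alice@Example.COM "},
    {"id": 1, "email": "alice@example.com"},  # duplicate id, dropped
    {"email": "no-id@example.com"},           # missing id, dropped
    {"id": 2, "email": "Bob@Example.com"},
]
print(transform(raw))
```

The same shape of logic maps directly onto PySpark's `dropDuplicates` and column expressions once the data volume requires distributed processing.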
Required Technical Skills
- Strong proficiency in designing and building ETL/ELT pipelines
- Hands-on expertise with Azure Data Factory (ADF)
- Practical experience developing data workflows using Databricks (PySpark)
- Advanced skills in Python for automation, orchestration, and data processing
- Strong SQL development skills, including query optimization and stored procedures
- Experience working with Azure Data Lake and Azure SQL Database
- Ability to build and manage API integrations for data exchange
- Working knowledge of Vectr for analytics and data security visibility
- Working knowledge of Cribl for log routing, observability, and pipeline monitoring
- Familiarity with data governance, security controls, and cloud best practices