About the Role
We are seeking a highly skilled Risk Data Pipeline Engineer to design, build, and maintain scalable data pipelines supporting risk management, market data, and trading analytics. The ideal candidate will have strong programming expertise, hands-on experience with data engineering workflows, and a deep understanding of financial instruments and risk concepts.
You’ll work closely with Risk, Quant, and Technology teams to ensure accurate, real-time data flow for risk computations, exposure monitoring, and regulatory reporting.
Key Responsibilities
Design, develop, and maintain end-to-end data pipelines for risk and P&L systems.
Integrate and transform data from multiple sources: trading systems, market data feeds, position data, reference data, etc.
Ensure data quality, consistency, and timeliness across all risk data layers.
Collaborate with Risk, Quant, and DevOps teams to support risk calculations and analytics.
Build automated validation, monitoring, and alerting systems for critical data flows.
Optimize ETL workflows and improve data processing efficiency and scalability.
Contribute to the design and implementation of data models and schemas for risk systems.
Troubleshoot data and performance issues in production environments.
Required Skills & Experience
Strong programming skills in Python and SQL (experience with Pandas, PySpark, or Airflow is a plus).
Hands-on experience with data pipeline tools (e.g., Airflow, Luigi, Kafka, AWS Glue, or similar).
Strong understanding of database systems: relational (PostgreSQL, MySQL, Oracle) and/or NoSQL (MongoDB, Redis).
Exposure to risk management concepts: VaR, exposure, sensitivities, stress testing, etc.
Familiarity with financial data types: trades, positions, reference data, pricing data, and market feeds.
Experience working in Unix/Linux environments with shell scripting knowledge (e.g., Bash).
Strong analytical and problem-solving skills, attention to detail, and an ownership mindset.
Preferred / Nice-to-Have Skills
Knowledge of cloud platforms (AWS, GCP, or Azure) and data lake/data warehouse architectures.
Familiarity with quant/risk systems or risk engines (Murex, Calypso, proprietary risk systems).
Understanding of fixed income, derivatives, and equities.
Experience in distributed data processing (Spark, Hadoop).
Prior experience at hedge funds, investment banks, or fintech firms.
Education
Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related discipline.
Strong academic background with exposure to data structures, algorithms, and statistics.
Why Join Us
Work with global Risk, Quant, and Technology teams on mission-critical systems.
Be part of a high-performance engineering culture driving real-time financial insights.
Opportunity to shape and optimize data-driven risk infrastructure in a dynamic financial environment.
Data Pipeline • Vizianagaram, Andhra Pradesh, India