About Us
Taiyo.AI is the world's first infrastructure intelligence platform. We are building the largest universal
and industry standard database of opportunities (tenders, projects, news) and threats (economy,
climate, geopolitics, finance, logistics, etc.) for real assets. Taiyo.AI has been instrumental in shaping
how infrastructure companies (infra investors, engineering, procurement, construction, and infra
insurers) benchmark new project development opportunities, get a panoramic and dynamic view of
external risks, predict prices, identify drivers, and mitigate supply-side disruptions. We are seeking a
candidate who is willing to learn and contribute to emerging technology and policy.
Responsibilities:
1. Work on data sourcing
2. Build and run web scrapers (BeautifulSoup, Selenium, etc.)
3. Manage the data normalization and standards validation
4. Parametrize and automate the scrapers
5. Develop and execute the processes for monitoring data sanity and checking for data availability
and reliability
6. Understand the business drivers and build insights through data
7. Work with the stakeholders at all levels to establish current and ongoing data support and
reporting needs
8. Ensure continuous data accuracy and recognize data discrepancies in systems that require
immediate attention / escalation
9. Work with, and become an expert in, the company's data warehouse and other data storage tools,
understanding the definition, context, and proper use of all attributes and metrics
10. Create dashboards based on business requirements
11. Work on distributed systems: scale, cloud, caching, CI/CD (continuous integration and
deployment), distributed logging, data pipelines, and REST APIs
Requirements:
1. Candidates completing their graduation in 2025
2. Familiarity with cloud platforms (e.g., AWS, Azure, GCP)
Benefits:
Apply here: https://docs.google.com/forms/d/e/1FAIpQLSeHeVmkZ2h2KtQq2uisokZFrGNEVJT0UhWa8fBLZT27TyA08g/viewform?usp=sharing&ouid=109654441412058597015
Data Engineer • Nagpur, IN