Freelance Data Architect

Leading MNC • Ranchi, Jharkhand, India
10 hours ago
Job description

Looking for a Freelance Data Architect to join a team of rockstar developers. The candidate should have a minimum of 12 years of experience.

There are multiple openings. If you're looking for a freelance / part-time opportunity (alongside your day job) and a chance to work with the top 0.1% of developers in the industry, this one is for you! You will report to IIT / BITS graduates with 10+ years of development experience and work with Fortune 500 companies (our customers).

Company Background - We are a fast-growing multinational software company with offices in Florida & New Delhi. Our clientele spans the US, Australia & APAC. To give you a sense of our growth rate, we've added 70+ employees in the last 6 weeks alone and expect another 125+ by the end of Q4 2025.

Key Responsibilities :

  • Design, develop, and maintain end-to-end ETL / ELT pipelines on GCP using tools like Dataflow, Composer, or Cloud Run (see the sketch after this list).
  • Build and manage data models and data warehouses in Snowflake ensuring performance and scalability.
  • Write efficient Python and SQL scripts for data extraction, transformation, and loading processes.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
  • Implement best practices for data quality, governance, and security across GCP and Snowflake environments.
  • Monitor and optimize pipeline performance, troubleshoot data issues, and ensure timely data availability.
  • Automate repetitive data workflows and contribute to continuous integration & deployment (CI / CD) for data pipelines.
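
To give candidates a concrete feel for the pipeline work described above, below is a minimal sketch of a Composer (Airflow 2.x) DAG that loads raw files from Cloud Storage into BigQuery and then runs a SQL transform. It assumes the apache-airflow-providers-google package is installed; all project, bucket, dataset, and table names are illustrative placeholders, not an actual client setup.

# Illustrative Composer / Airflow 2.x DAG: GCS -> BigQuery load, then a SQL transform.
# Bucket, dataset, and table names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_elt",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw CSV exports from a GCS bucket into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-raw-bucket",    # assumption: raw files land here
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Run an ELT-style transform inside BigQuery to build a reporting table.
    transform = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example_project.marts.daily_orders` AS
                    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
                    FROM `example_project.staging.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform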

Must-Have Skills :

  • Strong proficiency in Python and SQL for data engineering tasks.
  • Hands-on experience with Google Cloud Platform (BigQuery, Dataflow, Composer, Cloud Storage, Pub/Sub).
  • Expertise in Snowflake — schema design, data warehousing, and performance tuning.
  • Experience designing and maintaining ETL / ELT pipelines using modern tools and frameworks.
  • Solid understanding of data modelling techniques (star / snowflake schemas, normalization, denormalization); see the sketch below this list.
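
As a concrete illustration of the Snowflake schema design and star-schema modelling called out above, below is a minimal sketch that uses snowflake-connector-python to create a small customer dimension and orders fact table. The connection parameters, warehouse, and column definitions are hypothetical placeholders; a real model would be driven by client requirements.

# Minimal star-schema sketch in Snowflake via snowflake-connector-python.
# All identifiers and connection parameters are hypothetical placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",   # assumption: a dedicated transform warehouse
    database="ANALYTICS",
    schema="MARTS",
)

try:
    cur = conn.cursor()
    # Dimension table: one row per customer, with denormalised attributes.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS dim_customer (
            customer_key  NUMBER AUTOINCREMENT PRIMARY KEY,
            customer_id   STRING NOT NULL,
            customer_name STRING,
            region        STRING
        )
    """)
    # Fact table: one row per order, keyed to the customer dimension.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS fact_orders (
            order_id     STRING NOT NULL,
            customer_key NUMBER REFERENCES dim_customer (customer_key),
            order_date   DATE,
            amount       NUMBER(12, 2)
        )
        CLUSTER BY (order_date)
    """)
finally:
    conn.close()
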
Good-to-Have Skills :

  • Experience with CI / CD tools (Git, Jenkins, Cloud Build).
  • Familiarity with Airflow or other orchestration frameworks.
  • Knowledge of data governance and security best practices.
  • Exposure to data visualization tools (e.g., Looker, Tableau, Power BI).
What we need :

  • ~35 hours of work per week.
  • 100% remote from our side.
  • You will be paid out every month.
  • Minimum 4 years of experience.
  • Please apply only if your current job is 100% remote.
  • If you do well, this engagement will continue long-term.