Description:
We are seeking a highly skilled Technical Architect - Data Science to design and lead the implementation of complex, end-to-end data and AI architectures.
This permanent role, based in Leicester, requires proven experience, including 3-4+ years specifically in a Data Science Technical Architect role (or equivalent seniority).
You will be instrumental in defining ML pipelines, MLOps frameworks, and optimizing solutions across cloud and big-data stacks.
Key Responsibilities & Architectural Deliverables:
1. Data & AI Architecture Design:
- Platform Definition: Design and define the core components of the enterprise data platform, covering data ingestion, processing, storage, and analytics.
- MLOps Strategy: Architect and implement scalable model deployment frameworks and MLOps practices to transition machine learning models from experimentation to production reliably and securely.
- Technology Evaluation: Actively evaluate and select appropriate tools and technologies across the big-data and cloud landscape to meet performance and scalability requirements.
2. Governance, Optimization & Leadership:
- Governance & Security: Define and enforce stringent data governance, security, and compliance protocols across all data and ML architectures.
- Performance & Cost: Continuously optimize solutions for cost-efficiency and performance, particularly concerning data storage (Data Warehouses/Lakes) and processing.
- Mentorship: Serve as a subject matter expert, providing technical leadership and mentoring development and data science teams on architectural best practices.
Core Technology Stack:
- Programming Languages: Proficiency in key languages for data engineering and science, including Python, R, SQL, Java, and/or Scala.
- Big Data Processing: Experience with processing frameworks like Spark, Hive, Kafka, and Flink.
- Data Warehousing/Lakes (Mandatory): Architectural experience with modern data warehouse and data lake solutions such as Snowflake, Databricks, Redshift, BigQuery, and/or Synapse.
- Orchestration: Experience with workflow orchestration and data transformation tools like Airflow and dbt.
Required Skills & Experience Summary:
- ML/DS Libraries: Expertise in libraries like NumPy, Pandas, TensorFlow (TF), PyTorch, and XGBoost.
- Data Platforms: Practical experience architecting solutions on Snowflake, Databricks, Redshift, BigQuery, or Synapse.
- Orchestration: Experience with Airflow and/or dbt.
Preferred/Nice to Have:
- MLOps Tools: Direct implementation experience with MLOps platforms like MLflow, Kubeflow, DVC, or TFX.
- DevOps: Familiarity with CI/CD practices and containerization (Docker/Kubernetes - K8s).
- BI/Analytics: Exposure to Business Intelligence tools (Power BI, Tableau, or Looker).
(ref: hirist.tech)