The ideal candidate's favorite words are learning, data, scale, and agility. You will leverage strong collaboration skills and the ability to extract valuable insights from highly complex datasets to ask the right questions and find the right answers.
Responsibilities
- Solve Real-World Problems: Dive into challenging business questions and uncover solutions using advanced analytics and machine learning.
- Create Impact: Design predictive models and tools that help shape strategic decisions across the organization.
- Work with Cutting-Edge Tech: Leverage the latest in AI, big data, and cloud computing to process and analyze complex datasets.
- Visualize Your Vision: Build dashboards and data stories that captivate and inform stakeholders at all levels.
- Collaborate Across Teams: Partner with experts in engineering, design, and business to bring your ideas to life.

Environment
- Meaningful Work: Work on high-impact projects that tackle real-world challenges.
- Ownership & Autonomy: We empower you to take the lead on projects.
- Cutting-Edge Environment: Experiment with the latest technologies and methodologies in data science and AI.
- Cloud Culture: A carefully crafted culture built for efficiency and transparency.

Qualifications
- Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.).
- At least 1-2 years of experience in quantitative analytics or data science.
- Programming: Proficiency in Python for data analysis and machine learning.
- Statistical Analysis: Strong understanding of probability, hypothesis testing, and inferential statistics.
- Machine Learning: Experience with supervised and unsupervised algorithms, using libraries like scikit-learn, TensorFlow, or PyTorch.
- Data Wrangling: Expertise in cleaning, transforming, and preprocessing data.
- Data Visualization: Ability to create impactful visuals using tools like Tableau or Power BI, or libraries like Matplotlib and Seaborn.
- SQL: Advanced querying and database management skills for relational databases.
- Big Data Tools: Familiarity with Hadoop, Spark, or similar technologies for large-scale data processing.
- Cloud Platforms: Experience with AWS, Azure, or Google Cloud for data storage and deployment.
- Data Engineering Basics: Knowledge of ETL pipelines and workflow orchestration tools like Apache Airflow.
- Version Control: Proficiency in Git for collaborative development and version tracking.