8-10 years of experience in design, architecture, or development in Analytics and Data Warehousing.
Experience building end-to-end solutions on Big Data platforms using Spark and Scala programming.
5 years of solid experience building ETL pipelines with the Spark/Scala programming framework, with knowledge of UNIX shell scripting and Oracle SQL / PL-SQL development.
Experience with Big Data platforms for ETL development on the AWS cloud platform.
Excellent Python-based framework development skills are mandatory.
Should have experience with Oracle SQL database programming, SQL performance tuning, and relational model analysis.
Extensive experience with Teradata data warehouses and Cloudera Hadoop. Proficient across Enterprise Analytics / BI / DW / ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, and Hive.