· Degree in computer science or a numerate subject (e.g. engineering, sciences, or mathematics), or a Bachelor's/Master's degree with 6 years of experience.
· Hands-on development experience with PySpark, Scala Spark, and distributed computing.
· 4 to 6 years' experience developing and implementing applications.
· 4 to 6 years' experience designing and developing in Python.
· 4 to 6 years' experience with the Hadoop platform (Hive, HDFS, and Spark).
· 3 to 5 years' experience with Unix shell scripting.
· 3 to 5 years' experience with SQL.
· 2 to 3 years' experience with Spark programming.
· Knowledge of microservices architecture and cloud platforms is an added advantage.
· Knowledge of Java and Scala is an added advantage.