Skills Needed:
• Minimum of 3-5 years of work experience on AWS Big Data projects and related tools such as Glue, EMR, PySpark, Spark SQL, Athena, Snowflake, Kinesis, and Lambda
• Minimum of 8 years of IT work experience
• Experience delivering Big Data analytics projects
• Experience developing large enterprise data warehouses and business intelligence solutions
• Experience creating APIs using Python
• Experience with Hadoop MapReduce, PySpark, Spark SQL, and Python
• Strong experience with Hadoop and Big Data ecosystem file formats such as CSV, JSON, and Parquet
• Experience delivering logical and physical data models for large enterprise data warehouses, including dimensional and ER modeling
• Experience diagnosing performance issues and recommending strategic technical solutions to resolve them
• Prior database development experience is mandatory
• Exposure to financial markets is a big plus
Technology Platform experience:
EMR, PySpark, Python, Spark SQL, ANSI SQL, Glue, Snowflake, AWS Lambda, AWS Step Functions, Vault, and Kinesis