What you will do
- Write efficient code in the technology chosen for the project, for example Apache Spark or Apache Beam
- Explore new technologies and learn new techniques to solve business problems creatively
- Collaborate with engineering and business teams to build better data products and services
- Deliver projects collaboratively with the team and provide timely updates to customers
What we are looking for
- Exposure to Apache Spark
- Understanding of ETL concepts using PySpark and Spark SQL
- Exposure to SQL queries and stored procedures to work on challenging projects with a mentor-oriented leader
You will be preferred if you have
- Prior experience working with AWS EMR and Apache Airflow
- AWS Certified Big Data – Specialty, Azure, or Snowflake certification
- Cloudera or Hortonworks Certified Big Data Engineer
- Understanding of DataOps Engineering
