What You’ll Be Doing:
- Build complex ETL code
- Build complex queries across relational and NoSQL databases such as Oracle, SQL Server, MariaDB, MySQL, and MongoDB
- Work on Data and Analytics Tools in the Cloud
- Develop code in Python, Scala, and R
- Work with technologies such as Spark, Databricks, Hadoop, and Kafka
- Apply AWS, GCP, and/or Azure public cloud skills and experience
- Build complex Data Engineering workflows
- Create complex data solutions and build data pipelines
- Establish credibility and build strong relationships with our customers, enabling them to become cloud advocates
- Capture and share industry best practices amongst the community
- Attend industry events and present valuable insights to the community
Qualifications & Experience:
- 3+ years of design and implementation experience with distributed applications
- 2+ years of experience in database architectures and data pipeline development
- Demonstrated knowledge of software development tools and methodologies
- Presentation skills with a high degree of comfort speaking with executives, IT management, and developers
- Excellent communication skills, with the ability to pitch conversations at the right level for the audience
- Technical degree required; Computer Science or Math background desired
- Demonstrated ability to adapt to new technologies and learn quickly
- AWS, GCP, and/or Azure public cloud skills and experience
