Job Responsibilities:

  • Develop and maintain data pipelines and ETL/ELT processes using Python
  • Design and implement scalable, high-performance applications
  • Work collaboratively with cross-functional teams to define requirements and deliver solutions
  • Develop and manage near real-time data streaming solutions using Pub/Sub or Beam
  • Contribute to code reviews, architecture discussions, and continuous improvement initiatives
  • Monitor and troubleshoot production systems to ensure reliability and performance

Basic Qualifications:

  • 5+ years of professional software development experience with Python
  • Strong understanding of software engineering best practices (testing, version control, CI/CD)
  • Experience building and optimizing ETL/ELT processes and data pipelines
  • Proficiency with SQL and database concepts
  • Experience with data processing frameworks (e.g., Pandas)
  • Understanding of software design patterns and architectural principles
  • Ability to write clean, well-documented, and maintainable code
  • Experience with unit testing and test automation
  • Experience working with any cloud provider (GCP is preferred)
  • Experience with CI/CD pipelines and Infrastructure as Code
  • Experience with containerization and orchestration technologies such as Docker and Kubernetes
  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)
  • Proven track record of delivering complex software projects
  • Excellent problem-solving and analytical thinking skills
  • Strong communication skills and ability to work in a collaborative environment

Preferred Qualifications:

  • Experience with GCP services, particularly Cloud Run and Dataflow
  • Experience with stream processing technologies (Pub/Sub)
  • Familiarity with workflow orchestration tools (e.g., Airflow)
  • Experience with data visualization tools and libraries
  • Knowledge of CI/CD pipelines with Gitlab and infrastructure as code with Terraform
  • Familiarity with data platforms like Snowflake, BigQuery, or Databricks
  • GCP Data Engineer certification

Job Summary

Company: ShyftLabs
Location: Noida, Uttar Pradesh
Type: Full-Time
Level: Mid-level
Domain: Software Engineering