Data Enterprise Architect

Noida, Uttar Pradesh | Full-Time | Lead | Other


Job Responsibilities:

  • Plans and analyzes user requirements, procedures, and problems to automate processing or improve existing systems.
  • Prepares As-Is, To-Be, and gap analyses, SWOTs, charts, diagrams, and tables that depict the present and proposed systems in terms of costs, benefits derived, and tasks accomplished.
  • Plans and schedules studies and system implementations.
  • Drives and leads initiatives that span functional agency systems, other IT groups, and other state entities, coordinating, planning, and scheduling during the project development and implementation stages.
  • Provides options for determining and implementing architectural solutions.
  • Designs and implements new or revised methods that effectively meet agency business needs.
  • Evaluates current and future solutions, applications, and technologies, and establishes overall requirements and planning objectives.
  • Develops, or provides expertise in the development of, agency Information Technology (IT) operations and management information system plans.
  • Oversees the development, analysis, and revision of design procedures, program codes, test procedures, and quality standards.
  • Plays a critical role in translating business strategy into technical strategy and defines end-to-end technology architectures that support the strategy.
  • Provides architectural expertise on strategic planning actions and policy decisions related to the agency’s systems, and makes recommendations concerning the direction of the agency’s computer and management information systems.
  • Analyzes and defines agency disaster recovery responsibilities and procedures.
  • Articulates the desired future state, understands the current state, identifies gaps between the two, and develops approaches to close those gaps.
  • Assists in the preparation of information technology planning and justification.
  • Brings strong experience in developing real-time data visualization platforms and tools.
  • Owns and aggressively drives forward MDM, Data Governance, Big Data, and Cloud Data Management.
  • Develops data integration processes using the ETL platform (e.g., BigQuery, Informatica, Snowflake).
  • Applies solid experience with emerging and traditional data stack components, such as batch and real-time data ingestion, ETL, ELT, orchestration tools, on-prem and cloud data warehouses, Python, and structured, semi-structured, and unstructured databases.
  • Creates business solutions that utilize the CDI platform, ensuring high-quality, scalable solutions.
  • Works with the EA (Enterprise Architecture) team on standards, the product roadmap, and architecture decisions based on the enterprise blueprint.
  • Assists operational teams in troubleshooting within a mission-critical runtime environment.
  • Monitors CDI services and runtime components, maintaining consistently high performance, throughput, and availability.
  • Collaborates with global team members in EIM and IT to deliver MDM/Governance team deliverables and capabilities that drive continuous improvement.
  • Drives excellence and ensures continuous improvement in the quality of deliverables.
  • Defines and manages project deliverables intended to optimize the use of data within the organization.
  • Collaborates closely with the Data Engineering and Product teams to execute the agreed roadmap for data ingestion, integration, and transformation.

Basic Qualifications:

  • At least 8 years’ experience architecting and designing data pipelines and data integrations.
  • At least 3 years’ experience architecting and managing large data sets and CDI platforms.
  • 3+ years developing Big Data solutions (architectures, deployments, and operations, including design and estimation).
  • 4+ years of public cloud (GCP, AWS, Azure) design and implementation experience.
  • 3+ years using GCP Pub/Sub to build data integration pipelines.
  • 6+ years developing in Python.
  • 3+ years’ experience with Cloud Dataflow and other GCP data engineering tools.
  • Expertise in data models, data pipeline concepts, and cloud-based infrastructure disciplines such as Kubernetes, containerization, etc.
  • Deep working knowledge of data technologies such as Snowflake, Kafka, dbt, Airflow, and Kubernetes.
  • In-depth experience with solutions for data ingestion/pipelines, data migration, data testing and validation, data cleansing, and data modeling.

Job Summary

Company: ShyftLabs
Location: Noida, Uttar Pradesh
Type: Full-Time
Level: Lead
Domain: Other