JOB DETAIL

The position is not currently open for new applicants.
 
Consultant
Date Posted: 26-04-2022
 
Job Summary
  • Skills
    Data Engineer
  • SME Group
    DGTL-South
  • Designation
    Consultant
  • Level
  • Grade
    S2
  • Location
    Bangalore
  • City
    Bangalore
  • Job Title
    Consultant
  • Educational Qualification
    B.E/B.Tech
  • Work Mode

Job Description

1. Data Engineer for Bangalore with 2 to 3 Years of Experience

Key responsibilities of the role include:

  • Contribute to the delivery of innovative and engaging data engineering solutions.
  • Understand business and technical requirements, provide subject matter expertise, and implement data engineering techniques.
  • Conduct data discovery activities, perform root cause analysis, and recommend remediation of data quality issues.
  • Apply strong organizational and time management skills, prioritizing and completing multiple complex projects with full ownership.
  • Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
  • Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modelling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations.
  • Develop ETL/ELT frameworks in line with industry best practice.
  • Develop data solutions in alignment with the data security and data governance rules of the various markets (countries).
  • Develop a strong understanding of the client's Data & Analytics strategy and ensure the team's priorities are aligned with it.
  • Develop an execution plan to build future-fit skills through training on data engineering principles.
  • Build and manage relationships with stakeholders within D&A and the business.
  • Drive operational excellence within the team.
  • Ensure adherence to SLAs and drive user satisfaction.
  • Apply design thinking and the Agile Scrum approach.

Skills and experience:

  • 2 to 3 years of relevant experience in big data solutions and cloud computing.
  • Experience developing data pipelines to transform, aggregate, and/or process data on the Azure Databricks platform.
  • Experience creating data pipelines using Azure Data Factory, PolyBase, and U-SQL.
  • Hands-on experience implementing Azure cloud data warehouses, Azure and NoSQL databases, and hybrid data ingestion scenarios.
  • Experience creating tabular models (DAX) in Visual Studio.
  • Hands-on experience with Azure DevOps/GitHub for code maintenance and deployment.
  • Experience working with senior business stakeholders.

Education:

  • A master's or bachelor's degree in Computer Science is preferred.

 

2. Data Engineer for Bangalore with 3 to 5 Years of Experience

Key responsibilities of the role include:

  • Spearhead product development by engaging with product owners and key business stakeholders to understand business needs.
  • Collaborate with enterprise architects, data architects, ETL (Extract, Transform, Load) developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities.
  • Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modelling, data virtualization, self-service data preparation & analytics, AI (Artificial Intelligence) enablement, and API (Application Programming Interface) integrations.
  • Participate in deep architectural discussions to build confidence while building innovative solutions and migrating existing data applications to the Azure platform.
  • Optimize the data integration platform to sustain performance under increasing data volumes.
  • Create training content for new joiners and SMEs.
  • Support the data architecture and data governance functions in continually expanding their capabilities.
  • Manage and work with third-party data engineers.
  • Lead the creation of design patterns for effective data engineering practices across functions.
  • Implement business and IT (Information Technology) data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools.
  • Work with business and application/solution teams to implement data strategies and lead the build of data flows.
  • Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
  • Develop ETL frameworks in line with industry best practice.
  • Develop data solutions in alignment with the data security and data governance rules of the various markets (countries).
  • Develop a deep understanding of the client's Data & Analytics strategy and ensure the team's priorities are aligned with it.
  • Work closely with D&A markets, functions, business teams, Geo IT, and any spoke teams on data engineering requirements.
  • Build and manage relationships with stakeholders within D&A and the business.
  • Drive operational excellence and set an example for the team.
  • Ensure adherence to SLAs (Service Level Agreements) and drive user satisfaction.
  • Mentor the team on continuous improvement, including improved ways of working across engagement, tracking, simplification, and standardization.
  • Apply design thinking and the Agile Scrum approach.

Skills and experience:

  • 3 to 5 years of relevant experience in big data solutions and cloud computing.
  • Experience developing data pipelines to transform, aggregate, and/or process data on the Azure Databricks platform.
  • Experience creating data pipelines using Azure Data Factory, PolyBase, and U-SQL.
  • Hands-on experience implementing Azure cloud data warehouses, Azure and NoSQL databases, graph databases (e.g., Neo4j, Gremlin), and hybrid data ingestion scenarios.
  • Experience creating tabular models (DAX) in Visual Studio.
  • Hands-on experience with Azure DevOps/GitHub for code maintenance and deployment.
  • Experience working with senior business stakeholders.

Education:

  • A master's or bachelor's degree in Computer Science or Information Technology is preferred.
