Service Manager (Senior Hadoop Developer) (1 Year Contract)

in Singapore
Permanent, Full time
This role includes:
  • Oversee the BAU operations of data platforms, including the Enterprise Data Warehouse, Enterprise Data Lake, ETL and data transformation, Hortonworks Hadoop, Informatica PowerCenter, BDM, EDC and SAS EG
  • Review all application events and actions taken by support team to ensure events are properly handled
  • Coordinate between internal technology stakeholders (infrastructure, security and applications teams) to troubleshoot incidents
  • Perform trend analysis on similar incidents and conduct the necessary root cause analysis to prevent future occurrences.
  • Track and plan all compliance, audit and patching activities to ensure timely remediation of action items.
  • Provide cover for L2/L3 support where necessary.
  • Provide L2/L3 support for in-house developed solutions using Hadoop, Denodo and/or Informatica BDM, covering the following activities:
  • Address user enquiries and ensure users are able to perform their tasks
  • Investigate issues escalated from L1 through log extraction and analysis, configuration and code troubleshooting, etc.
  • Log tickets with product principals (where applicable) if issues cannot be resolved and track the tickets to ensure timely follow-up to resolve issues
  • Perform root cause analysis of issues
  • Propose and implement fixes to reported issues
  • Support audit activities
  • Responsible for technical design, application development and testing to deliver data solutions of superior quality covering the end-to-end data lifecycle
  • Design robust, low-latency applications that support high transaction volumes
  • Perform end-to-end application development/enhancement encompassing web applications, databases and API endpoints
  • Perform code reviews and provide critical suggestions for fixes and improvements
  • Use configuration management and integration/build automation tools to manage, test and deploy application code.
  • Provide support to SIT and UAT, investigate and resolve technical issues reported in projects or issue resolution.
  • Support issue analysis and fix activities during test phases, as well as production issue resolution.
  • Fix and tune Java-based applications
  • Plan and commission production system implementation.
  • Develop and review technical documents and other System Development Life Cycle (SDLC) related documents.
  • Recommend best-in-class solution architecture, identifying functional as well as non-functional parameters.
  • Provide recommendations on improvements and changes that can be made to existing solutions to achieve near-term quick wins
  • Provide detailed architecture analysis and design, and direction on development activities.

The ideal candidate should possess:
  • Tertiary Qualification in Information Security, Information Technology, Computer Science, Engineering (Computing/Telecommunication) or equivalent
  • At least 5 years of hands-on experience in handling data-related projects (e.g. big data, data warehouse, business intelligence, master data management)
  • At least 3 years of working experience in project management and vendor management
  • Minimum 1 year of experience in managing a Hadoop or Informatica platform
  • Minimum 5 years of experience demonstrating a high degree of proficiency in designing and engineering complex data technical architectures and detailed data models and designs
  • Experience in data warehouses and data lakes with big data technologies (e.g. Hadoop, MapReduce, Hive, Spark)
  • Deep understanding of data modelling in Hadoop-based data warehouses (e.g. Data Vault, Star Schema)
  • Experience using various ETL technologies such as Informatica (PowerCenter and Big Data Management), SAS EG, etc.
  • Experience with Continuous Integration, Continuous Delivery and Test-Driven Development; experience with the following DevOps tools is an added advantage:
  • Ansible
  • Bitbucket
  • Jenkins
  • SonarQube
  • Nexus
  • Flow
  • Selenium / Micro Focus UFT / LoadRunner
  • Exposure to information management tools for metadata management, data cataloguing, master data management and data quality management
  • Certifications in relevant skills (e.g. project management, big data etc.) will be an advantage
  • Strong client and project management abilities coupled with excellent communication, written, analytical, organisational and problem-solving skills
Only shortlisted candidates will be contacted by the KPMG Talent Acquisition team, and personal data collected will be used for recruitment purposes only.
