
Kubernetes DevOps Engineer

Capgemini

Singapore, Singapore

About Capgemini


Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With a strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.


Responsibilities


  • Collaborate with the Bank's core platform team to install the in-house packaged on-premises Kubernetes distribution.
  • Configure Azure Kubernetes Service (AKS) on the cloud platform.
  • Deploy Apache Spark, Apache Ranger, and Apache Kafka on Kubernetes.
  • Utilize Azure DevOps (ADO) or recommended tools for deployment automation.
  • Manage data storage by integrating ADLS Gen2 with hot and cold storage tiers via the S3 protocol.
  • Integrate with the Bank's Central Observability Platform (COP) and create Grafana-based monitoring dashboards for Spark jobs and Kubernetes pods.
  • Implement data encryption at rest using transparent data encryption (TDE) via the Bank's CaaS (Cryptography as a Service) and configure on-wire encryption (TLS) for intra- and inter-cluster communication.
  • Enforce RBAC (Role-Based Access Control) via Apache Ranger for Datahub, Spark, and Kafka.
  • Work alongside the Bank's central security team for platform control assessments.
  • Facilitate data file transfer, including transferring 5 PB of data within the existing Hadoop platform and migrating less than 1 PB of data from the brownfield Azure HDInsight (ADLS Gen2) environment to the greenfield GDP AKS (ADLS Gen2) environment.
  • Implement backup and disaster recovery strategies for ADLS and Isilon.
  • Tokenize personal data using Protegrity.


Requirements


  • Proficiency in Azure DevOps (ADO) and deployment automation tools.
  • Strong experience with IT infrastructure, Kubernetes, and Microsoft Azure.
  • Expertise in Microsoft development technologies, Apache Spark, and cloud computing.
  • Knowledge of major cloud providers and containerization.
  • Experience with public cloud platforms and software development.
  • Familiarity with virtualization, web application frameworks, and web development.


Preferred Qualifications


  • Previous experience working with hybrid big data platforms.
  • Strong understanding of data encryption techniques and security compliance.
  • Experience with monitoring and observability tools, particularly Grafana.
  • Familiarity with data file transfer technologies and strategies.
  • Proven track record in implementing backup and disaster recovery strategies.
  • Knowledge of tokenization techniques and tools like Protegrity.


What You'll Love About Working Here


We promote Diversity & Inclusion, as we believe diversity of thought fuels excellence and innovation.

At Capgemini, you are the architect of your own career growth. We equip people to maximize their full potential by providing a wide array of career growth programs that empower them to get the future they want.

Capgemini fosters impactful experiences for its people that bring out the best in them, for themselves, for the company, and for their clients.
