
Data Platform Engineer

Infosys

Bangalore Urban, Karnataka, India

Summary

Python + Airflow Data Engineer

We are seeking a highly skilled Python / Airflow Data Engineer with 3 to 6 years of experience and a strong background in AWS technologies. The ideal candidate will have a deep understanding of Apache Airflow and its integration within the AWS ecosystem, enabling efficient data pipeline orchestration and management.


Responsibilities

  • Design, develop, and maintain complex data pipelines using Python for efficient data processing and orchestration.
  • Collaborate with cross-functional teams to understand data requirements and architect robust solutions within the AWS environment.
  • Implement data integration and transformation processes to ensure optimal performance and reliability of data pipelines.
  • Optimize and fine-tune existing Airflow data pipelines to improve efficiency, scalability, and maintainability.
  • Troubleshoot and resolve issues related to data pipelines, ensuring smooth operation and minimal downtime.
  • Work closely with AWS services like S3, Glue, EMR, Redshift, and other related technologies to design and optimize data infrastructure.
  • Develop and maintain documentation for data pipelines, processes, and system architecture.
  • Stay updated with the latest industry trends and best practices related to data engineering and AWS services.
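At its core, the orchestration work described above means expressing a pipeline as a directed acyclic graph of tasks and executing them in dependency order. As a minimal, library-free sketch (the task names `extract_s3`, `transform_glue`, `load_redshift`, and `notify` are hypothetical examples, and Airflow itself adds scheduling, retries, and operators on top of this idea):

```python
from collections import deque

# Hypothetical pipeline: task name -> list of upstream dependencies.
# Airflow builds an equivalent DAG from operator chains like a >> b.
PIPELINE = {
    "extract_s3": [],
    "transform_glue": ["extract_s3"],
    "load_redshift": ["transform_glue"],
    "notify": ["load_redshift"],
}

def run_order(dag):
    """Return tasks in a valid execution order (Kahn's algorithm)."""
    indegree = {task: len(deps) for task, deps in dag.items()}
    downstream = {task: [] for task in dag}
    for task, deps in dag.items():
        for dep in deps:
            downstream[dep].append(task)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in downstream[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(dag):
        raise ValueError("cycle detected: not a valid DAG")
    return order

print(run_order(PIPELINE))
# ['extract_s3', 'transform_glue', 'load_redshift', 'notify']
```

The cycle check matters in practice: Airflow rejects DAG files with circular dependencies at parse time for the same reason this sketch raises on them.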


Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Proficiency in Python and SQL for data processing and manipulation.
  • A minimum of 5 years of experience in data engineering, specifically working with Apache Airflow and AWS technologies.
  • Strong knowledge of AWS services, particularly S3, Glue, EMR, Redshift, and AWS Lambda.
  • Understanding of Snowflake is preferred.
  • Experience with optimizing and scaling data pipelines for performance and efficiency.
  • Good understanding of data modeling, ETL processes, and data warehousing concepts.
  • Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.
  • Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders.



Preferred Qualifications:

  • AWS certification(s) related to data engineering or big data.
  • Experience working with big data technologies like Snowflake, Spark, Hadoop, or related frameworks.
  • Familiarity with other data orchestration tools in addition to Apache Airflow.
  • Knowledge of version control with Git and hosting platforms like Bitbucket.
