We are seeking a highly motivated Data Engineer to join a dynamic Enterprise Data team.
This role involves designing and developing robust data pipelines, working with cloud-native technologies, and contributing to innovative solutions that power next-generation digital services.
The ideal candidate will have strong experience in Python or Java, cloud platforms, and modern data engineering tools.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines to support ETL/ELT use cases.
- Build applications and platforms using Python and/or Java.
- Implement and optimize workflows using Apache Hop, Apache Beam, Spark, and Airflow.
- Configure and deploy containerized applications in public cloud environments (Azure/AWS).
- Apply Infrastructure-as-Code practices using tools like Terraform, CloudFormation (CFT), or ARM templates.
- Integrate CI/CD tools such as GitHub, Artifactory, and Jenkins into development workflows.
- Configure alerts and monitoring to ensure platform reliability and availability.
- Collaborate with global engineering teams to deliver innovative solutions.
- Solve complex problems and adapt to evolving requirements.
- Communicate effectively with stakeholders and influence technical decisions.
Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Hands-on experience with Python and/or Java development.
- Experience designing and maintaining data pipelines and workflows.
- Strong knowledge of cloud platforms (Azure or AWS).
- Practical experience with containerization and deployment automation.
- Proficiency with Infrastructure-as-Code tools (Terraform, CFT, ARM).
- Experience with CI/CD tools (GitHub, Jenkins, Artifactory).
- Strong problem-solving and communication skills.
Preferred Qualifications
- Certification in Azure or AWS.
- Experience with data movement platforms such as SnapLogic or Informatica.
- Familiarity with virtual databases and data replication platforms.
Certifications
Preferred: Azure or AWS Cloud Certification