Developer

Tata Consultancy Services • Detroit, MI, US
Job type
  • Full-time

Job description

DataStage Platform Engineer

Must Have Technical / Functional Skills

  • 3-5 years of experience in DataStage, including designing, developing, and optimizing ETL workflows.
  • Strong experience with Cloud Pak for Data (IBM) and cloud-based platforms such as AWS, Azure, or Google Cloud.
  • Proven experience in platform and infrastructure engineering, particularly in cloud environments.
  • Expertise in Data Engineering and ETL processes with a strong background in database technologies (SQL, NoSQL).
  • Proficiency in programming languages such as Python, Java, or Shell scripting for data operations.
  • Familiarity with DevOps practices, tools (e.g., Jenkins, Docker, Kubernetes), and infrastructure automation (e.g., Terraform).
  • Knowledge of data security, governance, and compliance best practices.
  • Strong troubleshooting, problem-solving, and analytical skills.
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.

Roles & Responsibilities

  • Design, develop, and optimize DataStage ETL workflows for large-scale data processing tasks.
  • Work with Cloud Pak for Data services to architect and deploy data solutions, ensuring scalability, security, and high availability.
  • Participate in cloud infrastructure management for data pipelines using various cloud services, ensuring efficiency and cost optimization.
  • Provide expertise in data engineering best practices for integrating and managing data on hybrid cloud platforms (e.g., IBM Cloud Pak for Data).
  • Build and support data platforms by integrating tools and technologies across the data engineering stack.
  • Design and implement DevOps pipelines for automation of data operations in cloud environments.
  • Ensure data governance, security, and compliance standards are maintained throughout the engineering lifecycle.
  • Troubleshoot, diagnose, and resolve any issues with data processing pipelines or infrastructure components.
  • Collaborate with cross-functional teams, including data scientists, architects, and business stakeholders, to optimize data solutions.

Salary Range: $100,000 - $125,000 a year

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Engineering and Information Technology
Industries: IT Services and IT Consulting
