
Lead Cloud Data Engineer – GCP & SAP Integration

Saanvi Technologies
Dearborn Heights, MI, United States
Full-time

Please review the requirements below and reply with an updated resume. This is a W2, hybrid role.

Lead Cloud Data Engineer

Dearborn, MI
GCP certification preferred
Ford experience preferred
Hybrid, with up to 4 days a week on site

Position Description:

The Materials Management Platform (MMP) is a multi-year initiative to transform Ford's Materials Requirement Planning and Inventory Management capabilities.

This is part of a larger Industrial Systems IT transformation effort. This position is responsible for designing and deploying a data-centric architecture in GCP for the Materials Management Platform, which will exchange data with multiple applications, both modern and legacy, across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.

Skills Required:

Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs.

Build ETL pipelines to ingest data from heterogeneous sources into our system.

Develop data processing pipelines in programming languages such as Java and Python to extract, transform, and load (ETL) data.

Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.

Deploy and manage SQL and NoSQL databases, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.

Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.

Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.

Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.

Troubleshoot and resolve issues related to data processing, storage, and retrieval.

Promptly address code quality issues using SonarQube, Checkmarx, Fossa, and Cycode throughout the development lifecycle.

Implement security measures and data governance policies to ensure the integrity and confidentiality of data.

Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.

Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.

Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.

Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.

Experience Required:

8 years of professional experience in:
o Data engineering, data product development, and software product launches
o At least three of the following languages: Java, Python, Spark, Scala, SQL, with performance tuning experience

4 years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using:
o Data warehouses like Google BigQuery
o Workflow orchestration tools like Airflow
o Relational database management systems like MySQL, PostgreSQL, and SQL Server
o Real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
o Microservices architecture to deliver large-scale, real-time data processing applications
o REST APIs for compute, storage, operations, and security
o DevOps tools such as Tekton, GitHub Actions, Git, GitHub, Terraform, and Docker
o Project management tools like Atlassian JIRA

Automotive experience is preferred.

Support in an onshore/offshore model is preferred.

Excellent at problem solving and prevention.

Knowledge of and practical experience with agile delivery.

Experience Preferred:

Experience in IDoc processing, APIs, and SAP data migration projects. Experience working in an SAP S/4HANA environment.

Education Required:

Requires a bachelor's or foreign-equivalent degree in computer science, information technology, or a technology-related field.

Education Preferred:

Master's degree preferred
