
Data Engineer

Akkodis
Dearborn, MI, United States
$72-$75 an hour
Full-time

Akkodis is seeking a GCP Data Engineer for a contract position with a client located in Dearborn, MI (hybrid). The ideal candidate has strong hands-on experience with real-time data streaming, GCP Pub/Sub, Tekton, and PostgreSQL.

Onsite on day one; hybrid is okay.

Contract Position

Pay Range: $72-$75/hr

Client Update: Looking for 10+ years of experience

Top Five Skills:

GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs

Python/Java/Spark/Scala and Hadoop

Terraform/Tekton

Relational database management systems such as MySQL, PostgreSQL, and SQL Server, with strong SQL skills

Real-time data streaming platforms such as Apache Kafka and GCP Pub/Sub (see the sketch below)
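
For illustration only (not part of the client's requirements), here is a minimal Python sketch of a streaming Pub/Sub consumer; the project and subscription names are hypothetical:

from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

# Hypothetical identifiers, for illustration only.
PROJECT_ID = "my-gcp-project"
SUBSCRIPTION_ID = "orders-sub"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Process the payload, then ack so Pub/Sub does not redeliver it.
    print(f"Received: {message.data.decode('utf-8')}")
    message.ack()

# Streaming pull: Pub/Sub delivers messages to the callback on background threads.
streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull_future.result(timeout=60.0)  # listen for up to a minute
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()  # wait for shutdown to complete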

Required Skills:

Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs.
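
As a rough sketch of the kind of BigQuery work this implies (the project, dataset, and table names below are invented for the example):

from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

# Aggregate daily totals from a hypothetical orders table.
query = """
    SELECT order_date, SUM(amount) AS total
    FROM `my-gcp-project.analytics.orders`
    GROUP BY order_date
    ORDER BY order_date
"""
for row in client.query(query).result():
    print(row.order_date, row.total)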

Build ETL pipelines to ingest data from heterogeneous sources into our system.

Develop data processing pipelines using programming languages such as Java and Python to extract, transform, and load (ETL) data.
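
One common way to express such a pipeline is Apache Beam, which runs on the Dataflow service mentioned above. A minimal sketch, assuming a CSV of orders in Cloud Storage and a BigQuery destination (all names hypothetical):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv(line: str) -> dict:
    # Naive split for illustration; a production pipeline would handle
    # quoting, schema drift, and malformed rows.
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_csv)
        | "Load" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )

The same code runs locally on the DirectRunner for testing, or on Dataflow by passing --runner=DataflowRunner in the pipeline options.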

Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.

Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
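
For the NoSQL side, a minimal Firestore sketch (the collection and document IDs are made up for the example):

from google.cloud import firestore

db = firestore.Client(project="my-gcp-project")  # hypothetical project

# Write a document, then read it back.
doc_ref = db.collection("customers").document("cust-123")
doc_ref.set({"name": "Ada Lovelace", "tier": "gold"})

snapshot = doc_ref.get()
if snapshot.exists:
    print(snapshot.to_dict())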

Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.

Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.

Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
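
For example, pipelines can emit structured records to Cloud Logging, where log-based metrics and alerting policies can flag failures; the logger name and fields below are illustrative:

from google.cloud import logging as cloud_logging

client = cloud_logging.Client(project="my-gcp-project")  # hypothetical project
logger = client.logger("etl-pipeline")

# Structured entries are queryable in Logs Explorer and can drive
# log-based metrics and alerts.
logger.log_struct(
    {"event": "load_complete", "rows": 10_000, "duration_s": 42.5},
    severity="INFO",
)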

Troubleshoot and resolve issues related to data processing, storage, and retrieval.

Promptly address code quality issues using SonarQube, Checkmarx, FOSSA, and Cycode throughout the development lifecycle.

Implement security measures and data governance policies to ensure the integrity and confidentiality of data.

Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.

Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.

Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.

Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.

If you are interested in this job, click APPLY NOW. For other opportunities available at Akkodis, visit our website. If you have questions about the position, please contact Nitish Kumar at [email protected].

Equal Opportunity Employer/Veterans/Disabled

Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits, and a 401(k) plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs.

Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State, or local law; and Holiday pay upon meeting eligibility criteria.

Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client.

To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit
