
Data Engineer with GCP (Google Cloud Platform) Experience (Only W2 role & strictly no C2C Accepted) 6.26.24

Systems Technology Group, Inc. (STG)
Dearborn, MI, United States
Temporary


Description : STG is an SEI CMMI Level 5 company with several Fortune 500 and State Government clients. STG has an opening for a Data Engineer with GCP (Google Cloud Platform) experience.

Please note that this project assignment is with our own direct clients. We do not go through any vendors. STG only does business with direct end-clients.

This is expected to be a long-term position. STG will provide immigration and permanent residency sponsorship assistance to those candidates who need it.

Job Description :

A position is open for a Data Engineer on the GDI&A Customer 360 team. The successful candidate will be responsible for designing and developing the transformation and modernization of big data solutions on GCP, integrating native GCP services and third-party data technologies, as well as building new data products in GCP.

We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design and develop the right solutions with an appropriate combination of GCP and third-party technologies for deployment on GCP.

Responsibilities :

Work as part of an implementation team from concept to operations, providing deep technical subject-matter expertise for the successful deployment of the Data Platform

Implement methods for automation of all parts of the pipeline to minimize labor in development and production

Identify, develop, evaluate, and summarize Proof of Concepts to prove out solutions

Test and compare competing solutions and report out a point of view on the best solution

Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP

Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services : BigQuery, Dataflow (Apache Beam), Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer (Apache Airflow), Cloud SQL, Compute Engine, Cloud Functions, and App Engine

Migrate existing Big Data pipelines into Google Cloud Platform. Build new data products in GCP.

Skills Required :

Minimum 3 years of in-depth experience in Java / Python

Minimum 2 years of experience building data engineering pipelines / data warehouse systems, with the ability to understand ETL principles and write complex SQL queries

Minimum 5 years of GCP experience working on GCP-based Big Data deployments (batch / real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, and Dataproc

Minimum 2 years of development experience with data warehousing and the Big Data ecosystem: Hive (HQL) and the Oozie scheduler, plus ETL tools such as IBM DataStage and Informatica IICS with Teradata

Minimum 1 year of experience deploying Google Cloud services using Terraform

Skills Preferred :

Understands cloud as a way to operate, not a place to host systems

Understands data architectures and design independent of the technology

Experience with Python and shell scripting preferred

Exceptional problem-solving and communication skills, and the ability to manage multiple stakeholders

Experience working with Agile and Lean methodologies

Experience with Test-Driven Development

Education Required :

Bachelor’s degree in computer science, data science, engineering, statistics, operations research, or another quantitative field.

Data Engineer with Java / Python and GCP (Google Cloud Platform) experience. A great opportunity to experience a corporate environment while driving personal career growth.

Resume Submittal Instructions : Interested / qualified candidates should email their Word-formatted resumes to Vasavi Konda at vasavi.konda(.@)stgit.com and / or call (Two-Four-Eight) Seven-One-Two Six-Seven-Two-Five (248.712.6725). In the subject line of the email, please include : First and Last Name - Data Engineer with Java / Python with GCP (Google Cloud Platform) Experience.

For more information about STG, please visit us at .

Sincerely,

Vasavi Konda

Recruiting Specialist

Opportunities don't happen, you create them.

Systems Technology Group (STG)

3001 W. Big Beaver Road, Suite 500

Troy, Michigan 48084

Phone : (Two-Four-Eight) Seven-One-Two Six-Seven-Two-Five (248.712.6725) (O)

Email : vasavi.konda(.@)stgit.com
