GCP Data Engineer Consulting Role
Location: Onsite in Dallas, TX
Duration: 12 Months
US Required: Yes
Role Overview
We're looking for an experienced GCP Data Engineer to design, develop, and implement scalable data pipelines and solutions using Google Cloud Platform services. The ideal candidate will have strong experience in ETL development, CI/CD automation, and working with large datasets in cloud environments.
Responsibilities
- Build and manage data pipelines using GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, etc.
- Use PySpark, Spark SQL, and Python to process large datasets
- Create and optimize complex SQL queries for business data insights
- Develop monitoring systems for proactive data issue detection
- Implement CI/CD pipelines using Jenkins, GitHub, and GitHub Actions
- Follow GCP architecture best practices and conduct code and design reviews
Skills & Experience
- Hands-on experience with Compute Engine, Kubernetes Engine, Dataproc, and Cloud Functions
- Proficiency in PySpark, DataFrames, and PyTest
- Strong SQL and data modeling skills
- Background in ETL and data lifecycle management
- Familiarity with Agile and DevOps practices
Hiring Timeline
Resume Deadline: May 16, 2025