Title: GCP Data Engineer
Location: Richardson, Dallas, TX (Day 1 onsite)
Subcontract rate: $75-80/hr
The pay range for this role is $125k - $130k per annum including any bonuses or variable pay. Tech Mahindra also offers benefits like medical, vision, dental, life, disability insurance and paid time off (including holidays, parental leave, and sick leave, as required by law). Ask our recruiters for more details on our Benefits package. The exact offer terms will depend on the skill level, educational qualifications, experience, and location of the candidate.
Job Description
Need 12+ years of experience with PySpark, Python, proactive monitoring and alert mechanisms, and datacore.
Experience with GCP services such as Compute Engine, Dataproc, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, and Dataflow.
Cloud Composer and ETL experience: working with large data sets using PySpark, Python, Spark SQL, DataFrames, and PyTest.
Develop and implement proactive monitoring and alerting mechanisms for data issues (a minimal sketch follows this list).
Familiarity with CI/CD pipelines and automation tools such as Jenkins, GitHub, and GitHub Actions.
Able to write complex SQL queries for computing business results.
Develop architecture recommendations based on GCP best practices and industry standards.
Work through all stages of the data solution life cycle: analyze and profile data; create conceptual, logical, and physical data model designs; and architect and design ETL, reporting, and analytics solutions.
Conduct technical reviews and ensure that GCP solutions meet functional and non-functional requirements.
Strong knowledge of GCP architecture and design patterns.
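For illustration only, here is a minimal PySpark sketch of the kind of data-quality check, alert hook, and PyTest coverage referenced in the requirements above; the column names, logger, and threshold are hypothetical placeholders, not part of this role's actual codebase.

```python
# Minimal sketch of a proactive data-quality check with an alert hook.
# All column names, logger names, and thresholds are hypothetical examples.
import logging

from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

logger = logging.getLogger("claims_quality")


def null_rate(df: DataFrame, column: str) -> float:
    """Return the fraction of rows where `column` is NULL."""
    total = df.count()
    if total == 0:
        return 0.0
    nulls = df.filter(F.col(column).isNull()).count()
    return nulls / total


def check_and_alert(df: DataFrame, column: str, threshold: float = 0.01) -> bool:
    """Log an alert-style error if the NULL rate exceeds the threshold.

    In a real pipeline this log line (or a Pub/Sub message) would feed an
    alerting policy; here we only log and return the pass/fail result.
    """
    rate = null_rate(df, column)
    if rate > threshold:
        logger.error("DATA_QUALITY_ALERT column=%s null_rate=%.4f", column, rate)
        return False
    logger.info("column=%s null_rate=%.4f OK", column, rate)
    return True


# A PyTest-style unit test for the check itself.
def test_check_and_alert_flags_missing_values():
    spark = SparkSession.builder.master("local[1]").appName("dq-test").getOrCreate()
    df = spark.createDataFrame([(1, "A"), (2, None), (3, None)], ["claim_id", "ndc"])
    assert check_and_alert(df, "ndc", threshold=0.5) is False
    spark.stop()
```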
Business Logic & Workload Processing - Data Engineer Responsibilities
Developing Workloads for Business Logic Execution
Design and implement scalable workloads in Google Cloud Platform (GCP) to process complex business rules for Rx claim pricing and drug coverage analysis.
Business Rule Integration
Collaborate with stakeholders to translate regulatory and business requirements into executable rules. These rules determine drug pricing based on plan configurations, drug coverage, and pharmacy-specific factors.
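As a hedged illustration of what an executable pricing rule could look like, the sketch below encodes coverage, copay, and a pharmacy dispensing fee; every field name, value, and the pricing logic itself is an invented placeholder, not actual plan configuration.

```python
# Hypothetical sketch of a plan-configuration-driven pricing rule.
# Field names, values, and the pricing formula are illustrative only.
from dataclasses import dataclass


@dataclass
class PlanConfig:
    covered_ndcs: frozenset[str]       # drug codes covered by the plan
    copay: float                       # flat member copay for covered drugs
    pharmacy_dispensing_fee: float     # pharmacy-specific fee


def price_claim(ndc: str, ingredient_cost: float, plan: PlanConfig) -> dict:
    """Apply a simple coverage + pricing rule to one claim line."""
    if ndc not in plan.covered_ndcs:
        # Not covered: member pays full cost, plan pays nothing.
        return {"covered": False, "member_pays": ingredient_cost, "plan_pays": 0.0}
    total = ingredient_cost + plan.pharmacy_dispensing_fee
    member = min(plan.copay, total)
    return {"covered": True, "member_pays": member, "plan_pays": total - member}


# Usage with placeholder values: a covered drug costs the member only the copay.
plan = PlanConfig(covered_ndcs=frozenset({"00000-0000-00"}), copay=10.0,
                  pharmacy_dispensing_fee=1.5)
assert price_claim("00000-0000-00", 42.0, plan)["member_pays"] == 10.0
```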
Workload Orchestration
Leverage Cloud Composer, Dataflow, and BigQuery to build automated, efficient workflows (sketched after this list) that:
Ingest and validate data
Apply rule-based logic at scale
Output regulatory-compliant pricing files
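A minimal Cloud Composer sketch of that ingest, validate, apply-rules, publish shape, written against the Airflow 2.x PythonOperator API; the DAG id, schedule, and task bodies are placeholders.

```python
# Skeleton Cloud Composer (Airflow 2.x) DAG mirroring the ingest -> validate ->
# apply-rules -> publish flow. DAG id, schedule, and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest(**context):
    pass  # e.g. load raw claim files from Cloud Storage into BigQuery staging


def validate(**context):
    pass  # e.g. run data-quality checks and fail the task on violations


def apply_pricing_rules(**context):
    pass  # e.g. launch a Dataflow/Dataproc job that applies the business rules


def publish(**context):
    pass  # e.g. write the regulatory-compliant pricing file back to Cloud Storage


with DAG(
    dag_id="rx_pricing_pipeline",      # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_rules = PythonOperator(task_id="apply_pricing_rules",
                             python_callable=apply_pricing_rules)
    t_publish = PythonOperator(task_id="publish", python_callable=publish)

    t_ingest >> t_validate >> t_rules >> t_publish
```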
Dynamic Rule Processing
Support rule versioning and updates to ensure accurate real-time processing across large data sets.
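One minimal way to express rule versioning is to select the latest rule version effective on or before the claim date; the data structures and formula identifiers below are hypothetical.

```python
# Hypothetical sketch of picking the rule version in effect on a claim date.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class RuleVersion:
    version: int
    effective_from: date
    formula: str  # identifier of the pricing formula to apply (placeholder)


def active_rule(versions: list[RuleVersion], claim_date: date) -> Optional[RuleVersion]:
    """Return the latest rule version effective on or before the claim date."""
    eligible = [v for v in versions if v.effective_from <= claim_date]
    return max(eligible, key=lambda v: v.effective_from, default=None)


# Example: a claim dated 2024-06-15 picks up version 2, not the later version 3.
history = [
    RuleVersion(1, date(2024, 1, 1), "formula_v1"),
    RuleVersion(2, date(2024, 6, 1), "formula_v2"),
    RuleVersion(3, date(2024, 9, 1), "formula_v3"),
]
assert active_rule(history, date(2024, 6, 15)).version == 2
```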
Optimization & Monitoring
Tune workload performance for cost efficiency and monitor execution through GCP's built-in tools, ensuring timely delivery and data quality.
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.