Location: Phoenix, AZ
Pay Range: $30-$39 per hour
Duration: 6+ months
Required Skills & Qualifications
Candidates must meet the following qualifications to be considered:
- Minimum of 5 years of experience designing, developing, and maintaining scalable ETL/ELT pipelines using PySpark, Airflow, and GCP-native tools
- Expertise in building and optimizing data warehouses and analytics solutions in BigQuery
- Experience implementing and managing workflow orchestration with Airflow/Cloud Composer
- Strong ability to write complex SQL queries for data transformations, analytics, and performance optimization
- Hands-on programming experience in Python
- Expertise in Hadoop and Spark architecture
- Experience in UNIX shell scripting
- Ability to collaborate with analysts and product teams to deliver reliable data solutions
- Ability to troubleshoot, debug, and resolve production issues in large-scale data pipelines
- Willingness to contribute to best practices, reusable frameworks, and automation for data engineering
For immediate consideration, please click APPLY.