Data Engineer General
V2Soft
Dearborn, MI, United States
Full-time
V2Soft (www.v2soft.com) is a global company headquartered in Bloomfield Hills, Michigan, with locations in Mexico, Italy, India, China, and Germany.
At V2Soft, our mission is to provide high-performance technology solutions that solve real business problems. We become our customers' true partner, enabling both parties to enjoy success.
We are committed to promoting diversity in the workplace, and believe it has a positive effect on our company and the customers we serve.
Skills Required:
- Infrastructure as Code: Design, build, and maintain scalable and reliable infrastructure on GCP using Infrastructure as Code (IaC) tools such as Terraform and Deployment Manager.
- Automate the provisioning and management of cloud resources to ensure consistency and repeatability.
- Continuous Integration and Continuous Deployment (CI/CD):
- Implement and manage CI/CD pipelines using tools like Jenkins, GitLab CI, or Cloud Build to facilitate seamless code integration and deployment.
- Ensure automated testing and monitoring are integrated into the CI/CD process to maintain high-quality code and rapid delivery cycles.
- Data Pipeline Management:
- Collaborate with data engineers to design and optimize data pipelines on GCP using tools such as Apache Airflow, Cloud Composer, and Cloud Dataflow.
- Implement monitoring and alerting solutions to detect and resolve issues in data pipelines promptly.
- Cloud Platform Expertise:
- Utilize GCP services such as Cloud Storage, Cloud Run, and Cloud Functions to build scalable and cost-effective solutions.
- Implement best practices for cloud security, cost management, and resource optimization.
- Collaboration and Communication: Work closely with data engineers, data scientists, and other stakeholders to understand their requirements and provide the necessary infrastructure and tooling support.
- Foster a culture of collaboration and continuous improvement within the team.
- Monitoring and Incident Management:
- Implement robust monitoring, logging, and alerting solutions using tools like Cloud Monitoring (formerly Stackdriver), Prometheus, and Grafana.
- Manage and respond to incidents, ensuring minimal downtime and quick resolution of issues.
- Documentation and Training:
- Create and maintain comprehensive documentation for infrastructure, CI/CD pipelines, and operational procedures.
- Provide training and support to team members on DevOps best practices and GCP services.
Skills Preferred:
- Technical Skills: Proficiency in Infrastructure as Code (IaC) tools such as Terraform, Deployment Manager, or CloudFormation.
- Strong knowledge of CI/CD tools and practices, including Jenkins, GitLab CI, and Cloud Build.
- Experience with data pipeline tools and frameworks such as Apache Airflow, Cloud Composer, and Cloud Dataflow.
- Familiarity with GCP services, including Cloud Storage, Cloud Run, Cloud Functions, and BigQuery.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work independently and as part of a team in a fast-paced, dynamic environment.
Experience Required:
- Minimum of 5 years of experience in DevOps or infrastructure engineering, with a strong focus on data warehousing.
- At least 2 years of hands-on experience working with Google Cloud Platform (GCP).
Education Required:
Bachelor's degree in Computer Science, Information Technology, or a related field.
V2Soft is an Equal Opportunity Employer (EOE).
Visit https://www.v2soft.com/careers to view all of our open opportunities and to learn more about our benefits.