We are looking for an experienced and innovative Cloud Engineer with a strong background in designing, developing, and maintaining scalable data pipelines and ETL processes.
The ideal candidate will have deep expertise in leveraging Microsoft Fabric and Azure's data and analytics resources, alongside proficiency in integrating data from diverse sources, including on-premises systems, third-party applications, and other cloud platforms.
If you are passionate about optimizing data workflows, implementing CI/CD pipelines, ensuring data integrity and security, and staying ahead of industry trends, we want to hear from you.
Join us to play a pivotal role in shaping our data-driven future.
Responsibilities:
- Design, develop, migrate, and maintain scalable data pipelines and ETL processes on Microsoft Fabric, utilizing Azure data and analytics resources.
- Design and implement ETL processes to extract, transform, and load data from various sources, including on-premises systems, external third-party applications, and other cloud providers.
- Create data processing workflows and pipelines to support data analytics, machine learning, and other analytic-driven applications.
- Implement CI/CD pipelines to automate data engineering processes using Azure DevOps or GitHub.
- Collaborate with cross-functional teams to understand data requirements and implement effective data integration solutions.
- Set up monitoring solutions using Azure Monitor, Azure Log Analytics, Fabric Data Activator, and other Azure services to track data pipelines and analytics workflows, ensuring data quality, integrity, and performance.
- Set up and configure Microsoft security solutions such as Purview and Sentinel to monitor and alert on security and compliance best practices throughout the data and analytics lifecycle.
- Conduct thorough testing and validation of data solutions to ensure the accuracy, completeness, and reliability of the delivered data products.
- Implement, optimize, and maintain cost-effective data and analytics solutions by leveraging Azure Cost Management tools and best practices.
- Stay updated on industry regulations and implement necessary changes to maintain compliance.
Qualifications:
- 5-10 years of experience in data engineering, particularly on Azure, AWS, and/or Google Cloud Platform (GCP).
- Proficiency in cloud services, including Azure Data Factory, AWS Glue, Google Cloud Dataflow, Azure Databricks, AWS EMR, Google Cloud Dataproc, and storage solutions like Azure Blob Storage, AWS S3, and Google Cloud Storage.
- Experience with Microsoft SQL Server tools (SSIS, SSRS, and SSAS).
- Experience with data engineering processes using Apache Spark, Python, R, T-SQL, or KQL.
- Experience with generating Semantic Models.
- Familiarity with Microsoft Power BI for data visualization.
- Passion for staying updated on industry trends and emerging data engineering and cloud computing technologies.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.
About us
Donivia Overseas is part of Donivia Group, which operates across immigration, manpower, apparel, logistics, textiles, business consulting, digital marketing, and education.
Our branches and subsidiaries are located in Saudi Arabia, the UAE, the US, and the UK, with our head office in India. Donivia Overseas specializes in end-to-end facilitation of global mobility for skilled workers, bridging talent with opportunity.
We conduct exhaustive personal and professional background assessments and document verification. As a leading recruitment firm, we are actively sourcing top talent for a reputable partner based in the United States.