Job Description
This is a remote position.
About Us
DataTeams helps technology companies find skilled, AI-enabled data professionals matched to specific skill requirements.
What You Will Help Us With as a Snowflake Engineer:
We are looking for a talented Snowflake Engineer to architect and implement large-scale data intelligence solutions using the Snowflake Data Warehouse.
The ideal candidate will have extensive experience in designing, operationalizing, and managing robust data and analytics solutions within the Snowflake environment.
A deep understanding of data engineering best practices, coupled with a proactive problem-solving approach in a cloud-based setting, is essential.
Join us in leveraging data to drive insights and support impactful business decisions!
Requirements
Experience & Skills We’re Looking For:
- Extensive experience designing and implementing modern cloud-based strategies with the Snowflake Data Cloud to automate data ingestion pipelines.
- Strong proficiency in SQL, plus Python, Scala, or Java, for data processing and analysis in the context of Snowflake.
- Deep understanding of cloud services (AWS, Azure, or GCP) and their data offerings.
- Ability to develop and optimise ETL pipelines for seamless data extraction, loading, and transformation using a combination of Python and Snowflake's SnowSQL.
- Thoroughly document implementations to ensure clarity for future reference, including requirements, processes, and testing conditions.
- Experience working with diverse external datasets, including data cleaning, joining, and transformation.
- Experience with dbt and orchestration tools is highly valued.
- Familiarity with advanced analytics techniques and data modelling concepts.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience designing and implementing a full-scale data warehouse solution based on Snowflake.
- Professional knowledge of AWS Redshift for comparative analysis and data warehousing solutions.
- Exceptional leadership and communication skills, with the ability to convey information clearly and provide constructive feedback, including addressing mistakes.
Preferred Qualifications:
- Additional certifications in data engineering or related fields.
- Experience with machine learning and AI workflows.
- Experience with Cloudera Impala or Apache Kudu, demonstrating proficiency in managing and optimising distributed data storage and querying systems.
- Knowledge of Apache workflow tools, such as Apache Airflow or Apache NiFi, showcasing the ability to design, implement, and manage complex data pipelines.
- Familiarity with data visualisation and reporting tools (e.g., Power BI, Tableau).
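To give a flavour of the data cleaning, joining, and transformation work described above, here is a minimal plain-Python sketch. All dataset names and fields are hypothetical; in a real pipeline this logic would typically live in Snowflake SQL, SnowSQL scripts, or dbt models rather than raw Python.

```python
# Toy clean/join/transform step; names and fields are illustrative only.

def clean(rows):
    """Drop records missing an id and normalise whitespace/casing in names."""
    return [
        {**r, "name": r["name"].strip().title()}
        for r in rows
        if r.get("id") is not None
    ]

def join(left, right, key="id"):
    """Inner-join two lists of dicts on a shared key."""
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

# Hypothetical source datasets
customers = [
    {"id": 1, "name": "  ada lovelace "},
    {"id": None, "name": "bad row"},       # dropped by clean()
    {"id": 2, "name": "alan turing"},
]
orders = [{"id": 1, "total": 120.0}, {"id": 2, "total": 75.5}]

enriched = join(clean(customers), orders)
print(enriched)
```

The same clean-then-join shape maps directly onto staging and mart layers in a Snowflake/dbt workflow.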