Job Description
We are looking for a skilled Data Engineer to join our team in Johnson City, Texas. In this role, you will design and optimize data solutions to enable seamless data transfer and management in Snowflake. You will work collaboratively with cross-functional teams to enhance data accessibility and support data-driven decision-making across the organization.
Responsibilities:
- Design, develop, and implement ETL solutions to facilitate data transfer between diverse sources and Snowflake.
- Optimize the performance of Snowflake databases by designing efficient data structures and applying clustering keys where appropriate.
- Develop and maintain automated, scalable data pipelines within the Snowflake environment.
- Deploy and configure monitoring tools to ensure optimal performance of the Snowflake platform.
- Collaborate with product managers and agile teams to refine requirements and deliver solutions.
- Create integrations to accommodate growing data volume and complexity.
- Enhance data models to improve accessibility for business intelligence tools.
- Implement systems to ensure data quality and availability for stakeholders.
- Write unit and integration tests while documenting technical work.
- Automate testing and deployment processes in Snowflake within Azure.
Requirements:
- Minimum of 3 years of experience in data engineering or a related field.
- Proficiency in Apache Spark, Python, and Apache Hadoop.
- Strong knowledge of Snowflake, including schema design and optimization.
- Experience with ETL processes and tools such as Azure Data Factory and Azure Databricks.
- Familiarity with data modeling techniques and T-SQL.
- Hands-on experience with Apache Kafka and JSON formats.
- Understanding of Azure Data Lake and Microsoft Azure technologies.
- Ability to troubleshoot and resolve data-related issues effectively.