Summary:
Senior Data Engineer with 17+ years of experience across Data Engineering, Analytics, and Data Science, including 15+ years in global financial institutions such as Scotiabank, Bank of America, and HSBC. Extensive expertise in building and managing ETL pipelines, designing data models, and developing enterprise-grade data platforms using DBT, BigQuery, Snowflake, Python, and Airflow (Composer).
Led end-to-end ETL architecture and cloud migration initiatives from on-prem to GCP and Snowflake, ensuring scalable, high-performance pipelines aligned with data governance and quality standards. Delivered critical financial data pipelines and reconciliations, collaborating with Accounting, Treasury, and Product teams to enable strategic, data-driven decision-making.
Certified in GCP Architecture, AWS Cloud, and Spark/Hadoop, with deep hands-on experience in Python, SQL, Spark, Hive, Docker, Kubernetes, Jenkins, and more. Strong track record of mentoring teams, improving pipeline reliability, and driving efficient execution across cross-functional projects.
Role: Data Engineer
Contract duration: 8-9 months
Location of Work: Remote (client is US-based); may occasionally need to visit the EY Canada office in person.
Job duties:
- Work with Product and Engineering to drive new initiatives, supporting new product launches that fuel business growth
- Help define data models and data contracts, and scale the build-out of data quality checks and contract enforcement
- Collaborate with stakeholders across Accounting, Treasury, and other Finance functions to extract financial data and insights, and provide data-driven recommendations that improve the efficiency of financial processes
- Act as a strategic partner to functional teams; design and scope project plans; execute, test and implement data and reporting solutions
- Initiate, develop and maintain data pipelines and data models that power dashboards and data products with outstanding craftsmanship
- Work with the broader Data and Engineering teams to find ways to scale our platform through better systems and automation
- Develop interactive dashboards and visualizations to present financial insights to stakeholders
- Partner with peers and stakeholders to define and execute roadmaps that can help the function scale
Job requirements:
- 5+ years of relevant experience
- Self-starter with the ability to multitask; highly collaborative in working with cross-functional teams
- Strong experience with data processing (ETL); fluency in complex SQL and basic programming, with high proficiency in a scripting language (preferably Python)
- Strong experience building data pipelines
- Knowledge of data modeling, architecture, and best practices
- Experience using Apache Airflow or similar orchestration systems
- Strong understanding of key finance principles and products within the Accounting and Treasury domains, and eagerness to learn
- Able to independently create documentation for new data projects and requests, and build collaboration within the team
- Demonstrate our core cultural values: clear communication, positive energy, continuous learning, acting like an owner, and efficient execution
- Bachelor's degree in Engineering, Computer Science, or equivalent practical experience
Good to have experience:
- Snowflake Migration & Business Intelligence
- LLM RAG Application
- SAP Document Generation
- Data Analytics & Machine Learning
- Operation Simulation
- Data Strategy, Data Architecture & MDM
- Fixing transaction data quality issues