
Senior Data Engineer – Interac Corp.

At Interac, we design and deliver products and solutions that give Canadians control over their money so
they can get more out of life. But that’s not all. Whether we’re leading real-time money movement,
driving innovative commerce solutions like open payments for transit systems, or making advancements
in new areas like verification and open banking, we are playing a key role in shaping the future of the
digital economy in Canada.

The Senior Data Engineer will be responsible for designing, developing, and supporting the
implementation of high-quality and sustainable data fulfillment solutions to improve business outcomes.
You’ll work with several teams to develop and maintain data pipelines to improve workflows and
automate processes wherever possible.

You’ll be responsible for:

• Designing, developing, implementing, and monitoring highly scalable processes to ingest and process high-volume transactions while ensuring high performance and quality.

• Collaborating with various teams to translate business and analytics requirements into a data fulfillment strategy, including building data pipelines, data aggregations, infrastructure, and tooling to support business initiatives.

• Overseeing the design and maintenance of data pipelines and contributing to the continual enhancement of the data engineering architecture.

• Working with other developers, engineers, data scientists, and business stakeholders to continuously explore new capabilities and technologies to drive innovation.

• Collaborating with the team to meet performance, scalability, and reliability goals.

• Developing standards for data processes and automating routine tasks while ensuring that the timing of automated jobs does not conflict with application processes.

• Writing tests and thorough documentation for processes and tooling.

• Supporting data transformation testing and production implementation as required.

• Adapting to new technologies and frameworks, sometimes leading the investigation into their usefulness to the team.

• Supporting the team with problem analysis and resolution.

• Maintaining and expanding existing systems, tooling, and infrastructure.

• Managing and proactively monitoring data flows to ensure data integrity and performance.

• Participating in rotating on-call support.

• Mentoring and coaching a talented team of engineers to maintain best practices in data engineering.
You bring:

• 5+ years of hands-on experience in data or software engineering, preferably including experience leading teams.

• A university degree in Computer Science or Engineering, or an equivalent combination of education and experience.

• A solid foundation in data structures, distributed systems, algorithms, data modeling, data pipeline processes, data integration patterns, and software design.

• Eligibility to work for Interac Corp. in Canada in a full-time capacity.

• Hands-on experience implementing dimensional models in SQL, as well as experience with RDBMS (Oracle) and Big Data (Hive, Impala) environments and with the Apache Spark framework (PySpark preferred).

• A demonstrated Agile mindset and strong experience in problem-solving within Agile environments.

• Excellent interpersonal and communication skills, with the ability to articulate complex technical concepts to a varied audience.

• Experience building out scalable infrastructure to fit the needs of a growing company.

• Experience with data warehousing, operational data stores, large-scale implementations, and ETL/ELT/data stream processing.

• Knowledge of and proven experience with:

o Cloud tools and services (AWS preferred)

o Cloud-based Data Warehousing tools (AWS Redshift preferred)

o ETL or Data Integration tools (Talend preferred)

o Oracle PL/SQL, SQL, Python, and Java

• Exposure to Big Data technologies, including development, performance tuning, and standard operational activities.

• Awareness that security is one of the most critical aspects of any data-related project.

• Experience in data ingestion and processing, including third-party APIs.

• Experience with GitHub and CI/CD practices is an asset.

• Experience with testing frameworks such as Jest, pytest, or equivalent.

• Strong organization, collaboration, and relationship-building skills.
