Big Data Solutions Architect – Expert

JOB DESCRIPTION

Location: Hybrid | Downtown Toronto

Duration: 12 months

Our client, a leading financial institution in Downtown Toronto, is looking for a Big Data Solutions Architect – Expert to collaborate with data engineers, data scientists, business analysts, and IT teams to implement the data strategy in the cloud. The successful candidate will have the opportunity to work with one of the Top 5 Banks in Canada.

Typical Day in Role:

  • Build scalable data platform infrastructure using cloud-native technologies to enable Data Engineering, Data Science, and Reporting to meet business needs.
  • Design and oversee the extraction, transformation, and loading (ETL) processes, ensuring efficient data flow from source systems and on-premises data lakes into the cloud.
  • Ensure that the data solutions are optimized for performance and reliability.
  • Implement security and recovery measures to protect data quality and ensure compliance with relevant policies and regulations.
  • Design and implement automated procedures for the effective management of data throughout its lifecycle, from creation and storage to archiving and deletion.
  • Maintain comprehensive documentation for the cloud data architecture, including the development of big data standards, best practices, and methodologies.
  • Stay abreast of the latest trends and advancements in big data technologies.
  • Provide guidance and training to other team members on big data concepts and cloud data platforms.
  • Apply design thinking and an agile mindset when working with various stakeholders to continuously experiment, iterate, and deliver solutions that enable end users to extract insights and value from data.

Must-Have Skills:

  • 3-5 years of architecture experience with both the Databricks and Snowflake platforms, designing and implementing big data solutions, including migration toward a hybrid architecture.
  • Professional certification in designing solutions for Azure and/or AWS public clouds.
  • Strong knowledge of ETL processes, OLAP/OLTP systems, and SQL and NoSQL databases.
  • Experience building batch and real-time data pipelines leveraging big data technologies such as Spark, Airflow, NiFi, Kafka, Cassandra, and Elasticsearch.
  • Expertise in DataOps/DevOps practices for deploying and monitoring automated data pipelines and data lifecycle management.
  • Proficiency in writing and optimizing SQL queries and in at least one programming language such as Java, Scala, and/or Python.
  • A continuous-learning mindset and enjoyment of working on open-ended problems.

Nice-To-Have Skills:

  • System administration experience including Docker and Kubernetes platforms.
  • Experience with OpenShift, S3, Trino, Ranger and Hive.
  • Knowledge of machine learning and data science concepts and tools.
  • Experience with BI tools.

Soft Skills Required: N/A

Education: Highest Education

Our client is committed to creating an inclusive environment where all team members and clients feel they belong. We seek applicants with a wide range of abilities and provide an accessible candidate experience. We advocate for you and welcome everyone regardless of race, color, religion, national origin, sex, physical or mental disability, or age.