
Big Data Application Developer – Expert

  • Location: Toronto, ON
  • Project Type: Information Technology (IT)


An initial AI screening will be conducted for this role. Please complete the AI screening when submitting your application.

JOB DESCRIPTION

Location: Hybrid | Downtown, Toronto

Duration: 12 months

Our client, a leading financial institution in Downtown Toronto, is looking for a Big Data Application Developer – Expert who stays informed about emerging technologies and trends in the data engineering domain. The successful candidate will have the opportunity to work with one of the Top 5 Banks in Canada.

Typical Day in Role:

  • Build scalable data platform infrastructure using cloud native technologies to enable Data Engineering, Data Science and Reporting.
  • Enable the deployment of batch and event-driven data pipelines using the latest technologies, such as Spark, NiFi, Kafka, and Airflow, across cloud platforms (Databricks, Azure, Snowflake).
  • Support the implementation of DevOps, DataOps, and MLOps tools and processes to improve the team’s efficiency in delivering, maintaining, and monitoring high-quality data products.
  • Ensure data security standards are applied across all data platform environments.
  • Provide L3 support for all components of the data platform tech stack.
  • Perform proofs of concept to implement a hybrid cloud data strategy, combining on-prem systems such as S3, OpenShift, Trino, and Spark with cloud-based solutions such as Databricks, Azure, and Snowflake.
  • Apply design thinking and an agile mindset when working with other engineers and business stakeholders to continuously experiment, iterate, and deliver on new initiatives.

Must-Have Skills:

  • 3-5 years of hands-on experience in big data processing using Spark, Python, NiFi, Kafka, Airflow, and cloud data platforms (Databricks, Snowflake).
  • 2-3 years of system administration experience, including Docker and Kubernetes platforms.
  • Strong knowledge of UNIX-like operating systems and shell scripting skills.
  • Proficiency in writing and optimizing SQL queries, and in at least one programming language such as Java, Scala, and/or Python.
  • Experience with DevOps tools, automation, and DataOps practices for developing data flows and enabling the continuous use of data.
  • A continuous-learning mindset and enjoyment of working on open-ended problems.

Nice-To-Have Skills:

  • Experience with OpenShift, Airflow, S3, Trino, Ranger, and Hive.
  • Knowledge of networking, information security and vulnerability management.
  • Knowledge of creating dashboards using Prometheus and Grafana.
  • Knowledge about data science tools and libraries.

Education:

  • Highest Education

FP Inc. is committed to creating an inclusive environment where all team members and clients feel like they belong. In accordance with the requirements set out in the Employment Standards Act, FP Inc. hereby declares that AI is utilized in the screening process for this position. The hourly compensation range for this role is $65/hr to $86/hr. We seek applicants with a wide range of abilities, and we provide an accessible candidate experience. We advocate for you and welcome anyone regardless of race, colour, religion, national origin, sex, physical or mental disability, or age.