Fintal Partners
Base pay range: $175,000.00/yr - $225,000.00/yr
A leading global trading organization is looking to bring on a Senior Data Engineering Specialist to help strengthen the backbone of its real-time and large-scale data ecosystem. This group sits at the intersection of engineering, quantitative research, and high-performance trading — meaning the work you do will directly support complex strategies operating across global markets.
If you enjoy working on massive datasets, architecting resilient systems, and partnering with developers and quants to solve meaningful problems, this is a place where your ideas truly shape the next generation of trading technology.
What You’ll Work On
Architect large-scale data environments, including modern streaming and storage ecosystems built on technologies like Kafka, Hadoop, and Dremio.
Develop and optimize advanced data pipelines using Java, Python, Spark, and Flink, designed for both high throughput and low latency.
Enhance data modeling and ingestion layers, ensuring smooth integration across research, engineering, and trading teams.
Drive reliability and availability of mission-critical datasets used across the firm’s analytics and trading functions.
Deploy, scale, and manage containerized workloads using Kubernetes and Docker across distributed environments.
Monitor and tune system performance using tools such as Prometheus, Grafana, Alertmanager, and related observability platforms.
Troubleshoot complex production issues, applying strong statistical reasoning, root-cause analysis, and systems-level thinking.
Automate repetitive workflows with Unix scripting (Bash, Python) to improve efficiency across teams.
Serve as a key technical advisor, helping stakeholders understand best practices in data engineering, architecture, and scaling.
What You Bring
Several years of hands‑on experience in a mature data engineering environment supporting demanding workloads.
Deep familiarity with distributed streaming systems and experience building or maintaining real-time applications.
Strong foundation working with modern big‑data storage layers and distributed computation frameworks.
Proficiency in Java, Python, and SQL for building and optimizing data workflows.
Experience working with containerized deployments and orchestrators in production environments.
Comfort with monitoring, alerting, and observability tooling for mission‑critical systems.
A problem‑solver’s mindset: the ability to diagnose issues, trace root causes, and design durable fixes.
Solid experience with scripting and Linux‑based environments.
Seniority Level Mid‑Senior level
Employment Type Full‑time
Job Function Capital Markets