Radley James
Headhunter specialising in Technology and Quant Trading at Radley James
Senior Data Engineer - High-Frequency Trading - Chicago
A High-Frequency Trading firm based in Chicago is seeking an adept Data Engineer to join its global team and help architect and deploy a brand-new data platform while optimizing its existing data infrastructure and research workflows.
Responsibilities
Assist in architecting and deploying a brand-new data platform
Translate experimental research workflows into scalable, production-grade systems
Help shape global data strategy
Work with Quant Researchers and traders to unlock predictive trading insights
Design, build, and maintain ETL/ELT pipelines with Spark, Databricks, and in-house tools
Optimize and performance-tune research workflows
Define and document best practices for Data Engineering
Requirements
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline
5+ years of experience in Data Engineering
Proficiency with Python scripting
Strong experience with Apache Spark
Strong experience with Databricks
Experience with Delta Lake
Experience with Kafka
Strong knowledge of AWS cloud-native infrastructure
Strong knowledge of cloud cost optimization
Proficiency with relational databases
Strong verbal and written communication skills
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology, Engineering, and Finance
Industries
Capital Markets, IT System Data Services, and Data Infrastructure and Analytics