Apex Fintech Solutions LLC
WHO WE ARE
Apex Fintech Solutions (AFS) powers innovation and the future of digital wealth management by processing millions of transactions daily to simplify, automate, and facilitate access to financial markets for all. Our robust suite of fintech solutions enables us to support clients such as Stash, Betterment, SoFi, and Webull, along with more than 20 million of our clients' customers.
Collectively, AFS creates an environment in which companies with the biggest ideas in fintech are empowered to change the world. As a global organization, we have offices in Austin, Dallas, Chicago, New York, Portland, Belfast, and Manila.
If you are seeking a fast-paced and entrepreneurial environment where you'll have the opportunity to make an immediate impact, and you have the guts to change everything, this is the place for you.
AFS Industry Awards
2021, 2020, 2019, and 2018 Best Wealth Management Company - presented by Fintech Breakthrough Awards
2021 Most Innovative Companies - presented by Fast Company
2021 Best API & Best Trading Technology - presented by Global Fintech Awards
ABOUT THIS ROLE
At Apex Fintech Solutions, we're transforming how businesses leverage data to drive strategic decisions in the fintech and wealth tech community. Our Risk & Regulatory Data Products team is at the heart of this mission, building and maintaining scalable cloud infrastructure that powers our analytics capabilities and supports our clearing and custody services.
As a Senior Data Engineer on our team, you will design, build, and optimize robust, cloud-native data pipelines that transform raw financial data into valuable business insights. This role focuses initially on building data products that enable reporting and analytics for our Trade Processing System, in addition to building out performant data systems for a wide variety of use cases in the Margin, Risk, Regulatory Tech, and Securities Lending domains. You'll work closely with these cross-functional teams, along with Data Experience and Analytics, to understand requirements and implement scalable solutions that support our growing data needs.
This role offers an exciting opportunity to work with cutting-edge cloud technologies while solving complex data challenges in a fast-paced financial services environment.
Key Responsibilities
Data Pipeline Development
Design, develop, and maintain scalable ETL/ELT pipelines for our enterprise data warehouse and data lake
Implement robust data processing systems that support our Data Products
Build systems to handle real-time and batch data updates using change data capture (CDC) tools such as HVR, Dataflow, and Datastream (a minimal sketch of this pattern follows this list)
Build and extend existing systems that process, store, and move streaming and batch data from SQL-based sources to centralized GCP storage and/or cloud data warehousing platforms
Build complex views that enable convenient and intuitive access to datasets
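To give a concrete flavor of this work, here is a minimal sketch of a CDC-style batch upsert into BigQuery, assuming change records (for example, landed by Datastream) already sit in a staging table. Every project, dataset, table, and column name below is a hypothetical placeholder rather than part of our actual stack.

```python
# Minimal sketch: merge CDC change records into a reporting table in BigQuery.
# Assumes a change table already exists (e.g., landed by Datastream); all
# project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

MERGE_SQL = """
MERGE `example-project.reporting.trades` AS target
USING (
  -- keep only the latest change record per trade_id
  SELECT * EXCEPT(row_num) FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY trade_id ORDER BY change_ts DESC) AS row_num
    FROM `example-project.cdc_landing.trades_changes`
  )
  WHERE row_num = 1
) AS source
ON target.trade_id = source.trade_id
WHEN MATCHED AND source.op_type = 'DELETE' THEN DELETE
WHEN MATCHED THEN UPDATE SET
  target.quantity = source.quantity,
  target.price = source.price,
  target.updated_at = source.change_ts
WHEN NOT MATCHED AND source.op_type != 'DELETE' THEN
  INSERT (trade_id, quantity, price, updated_at)
  VALUES (source.trade_id, source.quantity, source.price, source.change_ts)
"""

def run_cdc_merge() -> None:
    """Apply the latest CDC changes to the reporting table."""
    job = client.query(MERGE_SQL)
    job.result()  # block until the merge completes
    print(f"Merge finished; {job.num_dml_affected_rows} rows affected.")

if __name__ == "__main__":
    run_cdc_merge()
```

In practice a job like this would typically be orchestrated (for example, from Airflow) and parameterized per source table; the sketch only shows the core dedupe-and-merge shape.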
Quality & Optimization
Implement comprehensive data quality checks and monitoring to ensure accuracy and reliability (see the sketch after this list)
Optimize existing data workflows for improved performance, cost efficiency, and scalability
Ensure system quality, stability, and maintainability across all solutions
Build and maintain data models that support transactional workloads, analytics, and reporting needs
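As an illustration of what a lightweight quality check might look like, the following Python sketch validates row counts, null rates, and freshness in BigQuery. Table names and thresholds are hypothetical; a real implementation would be tailored to each dataset and wired into monitoring.

```python
# Minimal sketch of a post-load data quality check against BigQuery.
# Table names and thresholds are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

def check_trades_quality(max_null_rate: float = 0.01, max_staleness_hours: int = 24) -> None:
    """Fail loudly if the reporting table is empty, stale, or has too many NULL prices."""
    sql = """
    SELECT
      COUNT(*) AS row_count,
      SAFE_DIVIDE(COUNTIF(price IS NULL), COUNT(*)) AS null_price_rate,
      TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(updated_at), HOUR) AS hours_since_update
    FROM `example-project.reporting.trades`
    """
    row = next(iter(client.query(sql).result()))

    if row.row_count == 0:
        raise ValueError("Data quality check failed: reporting.trades is empty")
    if row.null_price_rate > max_null_rate:
        raise ValueError(f"Data quality check failed: {row.null_price_rate:.2%} NULL prices")
    if row.hours_since_update > max_staleness_hours:
        raise ValueError(f"Data quality check failed: no updates for {row.hours_since_update} hours")

if __name__ == "__main__":
    check_trades_quality()
```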
Support & Maintenance
Optimize existing data workflows for improved performance and efficiency
Build, maintain, and extend data models that support real-time transactions, analytics, and reporting needs
Work with a continuous improvement mindset
Provide Level 2 support and be part of on-call rotations
Collaboration & Communication
Communicate effectively with teams across many product domains
Participate in code reviews and contribute to best practices for data engineering
Produce architectural diagrams and technical specifications for data processes and workflows
Compliance & Security
Ensure security and privacy of data in accordance with financial-sector compliance requirements
Implement appropriate access controls and data governance practices (a short access-control sketch follows this list)
Support data operations across development, testing, and production environments
Perform regular audits and attestations for data protection and compliance
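For illustration, the sketch below shows one way to append a read-only access entry to a BigQuery dataset from Python. The dataset and group names are hypothetical, and in practice such grants are often managed declaratively (for example, with Terraform) rather than in application code.

```python
# Minimal sketch: grant a Google group read-only access to a BigQuery dataset.
# Dataset ID and group email are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

def grant_readonly_access(dataset_id: str, group_email: str) -> None:
    """Append a read-only access entry for a Google group to a dataset's ACL."""
    dataset = client.get_dataset(dataset_id)  # e.g., "example-project.reporting"
    entries = list(dataset.access_entries)
    entries.append(
        bigquery.AccessEntry(
            role="READER",
            entity_type="groupByEmail",
            entity_id=group_email,
        )
    )
    dataset.access_entries = entries
    client.update_dataset(dataset, ["access_entries"])  # persist only the ACL change

if __name__ == "__main__":
    grant_readonly_access("example-project.reporting", "risk-analysts@example.com")
```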
Leadership
Gather requirements from stakeholders and translate them into technical solutions
Advocate for architectural designs, explain tradeoffs, and recommend optimal paths forward
Serve as technical lead on a team of data engineers
Coordinate across Data Product teams to align tool usage and encourage best practices
Required Qualifications
Education & Experience
Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field
5+ years of experience in data engineering, cloud data engineering, or similar roles
Experience in financial services strongly preferred
Technical Skills
Cloud Platforms: Expert-level knowledge of Google Cloud Platform (GCP); GCP Data Engineer certification strongly preferred
Programming: Expert proficiency in SQL and Python; familiarity with Java is a plus
Data Processing: Experience with ETL/ELT tools and frameworks (Apache Airflow, AWS Glue, dbt, GCP Dataflow)
Databases: Strong experience with relational database systems (PostgreSQL, SQL Server) and cloud data warehouses such as BigQuery and Snowflake
Infrastructure: Experience with CI/CD systems, Infrastructure as Code (Terraform), and Kubernetes
Data Replication: Experience with change data capture (CDC) tools like HVR, Datastream, Dataflow, Kinesis
Version Control: Proficiency with Git and modern CI/CD development practices
AI & Machine Learning Skills
Prompt Engineering: Demonstrated experience in crafting effective prompts for various AI models and use cases
Large Language Models (LLMs): Hands-on experience with multiple LLMs (e.g., GPT-4, Claude, Gemini, open-source models)
AI-Assisted Development: Proven experience using agentic AI tools to co-author development work, including code generation and debugging
AI Integration: Understanding of how to integrate AI capabilities into data processing workflows and automation
Core Competencies
Strong understanding of data warehousing concepts and dimensional modeling
Knowledge of distributed systems and data architecture patterns
Understanding of batch vs. streaming data processing
Experience with data modeling and schema design
Strong analytical and problem-solving skills
Work Environment
This role operates in a hybrid capacity, requiring on-site collaboration three days per week.
#engineering #full-time #mid-senior #LI-MJ1 #APEX
Our Rewards
We offer a robust package of employee perks and benefits, including healthcare benefits (medical, dental, and vision, plus an EAP), competitive PTO, 401(k) match, parental leave, and HSA contribution match. We also provide our employees with a paid subscription to the Calm app and offer generous external learning and tuition reimbursement benefits. At AFS, we offer a hybrid work schedule for most roles, giving employees the flexibility to split time between working from home and one of our primary offices.
EEO Statement
Apex Fintech Solutions is an equal opportunity employer that does not discriminate on the basis of race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, veteran status, marital status, or any other protected characteristic. Our hiring practices ensure that all qualified applicants receive fair consideration without regard to these characteristics.
Disability Statement
Apex Fintech Solutions is committed to creating an inclusive and accessible workplace for all candidates, including those with disabilities. We are dedicated to ensuring equal employment opportunities and providing reasonable accommodations to qualified individuals with disabilities. If you require reasonable accommodations to participate in the application or interview process, please submit your request via the Candidate Accommodation Requests Form. We will work with you to provide the necessary accommodations to ensure your full participation in our hiring process.