Data Freelance Hub
This is a Senior Data Engineer contract role (6+ months) at a competitive pay rate. Key skills include batch and streaming data processing, AWS proficiency, Python expertise, and Terraform experience.
Location: Fort Mill, SC
Responsibilities
Work as a Senior Engineer on the “T-Bar” (Transaction Books and Records) product, a critical data store for positions, transactions, and tax lots.
Lead the transition from a legacy data center to a cloud-native AWS environment, modernizing infrastructure and processes.
Must‑Have Requirements
Experience with batch and streaming data processing, including intraday use cases with trading partners and micro‑batches (a micro‑batch reader is sketched after this list).
CI/CD and developer discipline: disciplined commit practices, with commits tied to Jira tickets for traceability and clean cherry‑picking of changes.
AWS familiarity: experience in AWS environments, particularly with data pipelines and foundational data layers.
Python: strong skills in Python for data processing and logic conversion, e.g., rewriting stored procedures from legacy systems (see the conversion sketch after this list).
Infrastructure as Code (IaC): experience with Terraform for deploying and managing infrastructure, especially for carving out separate AWS accounts.
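To make the micro‑batch requirement concrete, here is a minimal sketch of an intraday micro‑batch reader. It assumes, hypothetically, that the streaming source is an AWS Kinesis stream; the actual stream names, shard layout, and checkpointing approach are not specified in this posting.

```python
import boto3

# Hypothetical illustration: poll one Kinesis shard in small, bounded
# batches. Stream name, shard id, and batch size are assumptions, not
# details taken from the posting.
kinesis = boto3.client("kinesis")

def read_micro_batch(stream_name, shard_id, shard_iterator=None, limit=500):
    """Fetch one micro-batch of records and return them with the iterator
    to resume from, so a caller can poll on an intraday cadence."""
    if shard_iterator is None:
        shard_iterator = kinesis.get_shard_iterator(
            StreamName=stream_name,
            ShardId=shard_id,
            ShardIteratorType="TRIM_HORIZON",
        )["ShardIterator"]
    resp = kinesis.get_records(ShardIterator=shard_iterator, Limit=limit)
    return resp["Records"], resp["NextShardIterator"]
```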
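For the logic‑conversion point, a minimal sketch of what rewriting a cursor‑style stored procedure in Python can look like. The tax‑lot fields here (position_id, quantity, unit_cost, status) are invented for illustration, not taken from the T‑Bar data model.

```python
from collections import defaultdict
from decimal import Decimal

def position_cost_basis(tax_lots):
    """Replace a row-by-row cursor loop from a legacy stored procedure
    with a single pass that aggregates open tax lots into per-position
    cost basis. Field names are hypothetical."""
    totals = defaultdict(Decimal)  # Decimal() == Decimal("0")
    for lot in tax_lots:
        if lot["status"] == "OPEN":
            totals[lot["position_id"]] += (
                Decimal(str(lot["quantity"])) * Decimal(str(lot["unit_cost"]))
            )
    return dict(totals)
```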
Nice‑to‑Have Skills
Orchestration tools: familiarity with Airflow, Dagster, or similar tools for centralized orchestration; the current setup includes Step Functions, EventBridge, and an in-house eventing system (an Airflow sketch follows this list).
AI/ML integration: exposure to AI tools such as GitHub Copilot or Cursor for code development, unit test generation, or data quality checks, e.g., anomaly detection (a simple check is sketched after this list).
Data governance: experience with data catalogs, producing/consuming assets, or tools like AWS DataZone for governance and contract‑based testing.
Event‑driven architecture: understanding of event‑driven systems; the team aims to move toward this model in the future.
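For the orchestration point, a minimal Airflow 2.x sketch as one assumed example of "centralized orchestration"; since the posting says the current setup uses Step Functions and EventBridge, the DAG id, task names, and schedule here are hypothetical.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_positions():
    ...  # pull intraday position deltas (placeholder)

def load_positions():
    ...  # write to the foundational data layer (placeholder)

# Hypothetical DAG: id, schedule, and task names are illustrative only.
with DAG(
    dag_id="tbar_positions_intraday",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_positions)
    load = PythonOperator(task_id="load", python_callable=load_positions)
    extract >> load
```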
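And for the data‑quality point, one simple form an automated anomaly check can take: a z‑score test on daily row counts. The metric and threshold are assumptions for illustration, not details from the posting.

```python
import statistics

def row_count_is_anomalous(history, today, z_threshold=3.0):
    """Flag today's load when its row count sits more than z_threshold
    standard deviations from the trailing history. Metric and threshold
    are illustrative."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > z_threshold

# Example: a sudden drop against a stable trailing week trips the check.
assert row_count_is_anomalous([1000, 1010, 990, 1005, 995], 100)
```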