Weekday AI
This role is for one of Weekday's clients.
Salary range: $180k–$250k
Experience: 6+ YoE
We are seeking a highly skilled Backend Engineer with deep expertise in AI-driven application development, strong programming skills in Python and TypeScript, and a proven track record of designing and maintaining ETL pipelines. This role sits at the core of our data and AI platform, powering the scalable, high-performance systems that enable innovative product features and business intelligence solutions. You will collaborate closely with data scientists, frontend developers, and product teams to design, build, and optimize backend systems that handle complex data flows and deliver real-time AI-driven functionality.

Key Responsibilities
AI-Driven Backend Development
- Build and optimize backend services to support AI/ML-powered applications.
- Collaborate with data scientists to integrate AI models into production systems with efficiency, scalability, and reliability.
- Design APIs and data workflows that seamlessly deliver AI insights to frontend and external systems.

Python & TypeScript Programming
- Write clean, maintainable, and efficient backend code in Python and TypeScript.
- Develop robust server-side logic, APIs, and data processing workflows.
- Implement automated testing and ensure high-quality software releases.

ETL Pipeline Development
- Design, develop, and maintain ETL pipelines for large-scale data ingestion, transformation, and delivery.
- Optimize data pipelines for performance, scalability, and fault tolerance.
- Work with structured and unstructured datasets from multiple sources to feed AI models and analytics platforms.

System Architecture & Performance Optimization
- Contribute to architectural decisions for backend systems, ensuring modularity, maintainability, and scalability.
- Implement best practices for performance tuning, security, and observability.
- Monitor and troubleshoot backend systems, ensuring high uptime and reliability.

Collaboration & Documentation
- Work cross-functionally with data engineering, frontend, and product teams to align backend solutions with business needs.
- Create and maintain technical documentation for backend systems, APIs, and data workflows.

Required Skills & Experience
- 6–10 years of experience in backend engineering or related roles.
- Proven expertise in AI-driven application development, including integrating and deploying machine learning models into production systems.
- Strong proficiency in Python for backend development and data processing.
- Solid experience with TypeScript for server-side development (Node.js environment preferred).
- Hands-on experience designing and managing ETL pipelines for large-scale data systems.
- Experience with relational and non-relational databases (e.g., PostgreSQL, MongoDB).
- Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization (Docker, Kubernetes).
- Strong understanding of API design principles (REST, GraphQL) and asynchronous data processing.
- Proficiency in performance optimization, scalability strategies, and distributed systems.
- Excellent problem-solving skills, attention to detail, and ability to work in fast-paced environments.

Preferred Qualifications
- Experience with data orchestration frameworks such as Apache Airflow, Prefect, or Dagster.
- Familiarity with real-time data streaming technologies (e.g., Kafka, Kinesis).
- Exposure to AI/ML model lifecycle management tools (MLflow, Kubeflow, Vertex AI).
- Understanding of data governance, compliance, and security best practices.