Bricklayer AI Inc.
Principal Engineer (Backend and Data Architecture)
Bricklayer AI Inc., Atlanta, Georgia, United States, 30383
About Bricklayer AI Inc.
Bricklayer AI is the first multi‑agent, LLM‑based AI solution that brings autonomous AI specialists and human experts together into a single collaborative and effective security team. Our automated AI Security Analyst, Threat Intelligence Analyst, and Incident Responder autonomously handle complex incidents with minimal human oversight. They learn continuously, collaborate seamlessly, and enable human teams to build a stronger defense.
Description
Our Principal Engineer (Backend and Data Architecture) will serve as both the hands‑on technical leader and the architect for Bricklayer AI’s backend systems and data infrastructure, personally implementing solutions while mentoring engineers across the organization. This is an individual contributor role with significant technical influence; it reports directly to the CTO and is roughly 80% hands‑on development and 20% mentoring and architecture guidance. The ideal candidate will personally write high‑performance code, architect complex data systems, and scale our technical capabilities as we grow from processing thousands to millions of security events per second. You’ll be in the code daily while serving as the technical authority for our most critical backend challenges.
Location:
This position is based in Atlanta, GA and requires full‑time in‑office work at our Atlanta R&D Center
Responsibilities
Hands‑on System Architecture:
Personally design and implement highly scalable backend architectures, writing production code for real‑time cybersecurity data processing and AI agent orchestration
Performance Engineering:
Lead by example in performance optimization, personally implementing solutions to achieve sub‑100ms API response times and optimize data pipeline throughput for millions of security events per second
Data Architecture Implementation:
Personally architect and implement data lakes, streaming architectures, and storage solutions, writing the code that processes cybersecurity data patterns and AI workloads
Database Development:
Design, implement, and optimize database schemas, indexing strategies, and query patterns through hands‑on development for both transactional and analytical workloads
Hands‑on API Development:
Personally design and implement high‑performance REST and GraphQL APIs, writing the code that supports complex cybersecurity workflows and integrations
Pipeline Implementation:
Build and optimize ETL/ELT pipelines through hands‑on development, personally implementing solutions for ingesting, processing, and transforming large‑scale cybersecurity data streams
Caching & Storage:
Implement advanced caching strategies, data partitioning, and storage optimization for multi‑tenant SaaS architecture
AI/ML Integration:
Architect backend systems that efficiently serve LLM requests, manage vector embeddings, and support real‑time agent decision‑making
Technical Mentorship:
Mentor senior engineers through code reviews, architecture discussions, and hands‑on pairing while personally establishing backend engineering best practices through implementation
Innovation & Research:
Evaluate cutting‑edge technologies and implement proof‑of‑concepts to push the boundaries of cybersecurity data processing
Qualifications
Required
8+ years of hands‑on backend engineering experience with 3+ years focused on personally implementing high‑performance data systems
Proven track record of personally building systems that process millions of events per minute with sub‑second latency
Experience scaling technical implementations as companies grow from thousands to millions of transactions
Expert‑level proficiency in Python and hands‑on experience with at least one high‑performance language (Go, Rust, or Java)
Deep hands‑on expertise in database technologies including PostgreSQL, ClickHouse, and distributed databases
Advanced hands‑on knowledge of data pipeline frameworks (Apache Kafka, Apache Spark, Apache Flink)
Strong background in API design and development at scale (REST, GraphQL, gRPC)
Expert knowledge of caching technologies (Redis, Memcached) and search engines (Elasticsearch, OpenSearch)
Experience with vector databases (Pinecone, Weaviate, ChromaDB) and embedding systems
Deep understanding of cloud data services on AWS (RDS, OpenSearch, SQS, EKS, S3) and Azure
Proven track record of optimizing application performance, query performance, and system throughput
Strong knowledge of microservices architecture, event‑driven systems, and distributed computing
Experience with container orchestration and service mesh technologies
Preferred
Advanced degree in Computer Science, Data Engineering, or related field
Experience in cybersecurity data processing, SIEM systems, or security analytics platforms
Background with machine learning infrastructure and LLM serving architectures
Knowledge of stream processing, complex event processing, and real‑time analytics
Experience with data mesh architectures and modern data stack technologies
Understanding of time‑series databases and IoT data processing patterns
Contributions to open‑source data engineering or backend projects
Experience with observability and monitoring of high‑throughput data systems
Knowledge of data privacy, security, and compliance requirements (GDPR, SOC 2)
Build a stronger defense with your automated AI security team.
HQ: 3101 Wilson Blvd Suite 500 Arlington, VA 22201