Point Wild
Point Wild helps customers monitor, manage, and protect against the risks associated with their identities and personal information in a digital world. Backed by WndrCo, Warburg Pincus, and General Catalyst, Point Wild is dedicated to creating the world’s most comprehensive portfolio of industry-leading cybersecurity solutions. Our vision is to become THE go-to resource for every cyber protection need individuals may face - today and in the future.
Join us for the ride!
Lat61 Mission
The Lat61 platform will power the next generation of cybersecurity and AI-enabled decision-making. As a Data Engineer on this team, you will help deliver:
Multi-Modal Data Ingestion: Bringing together logs, telemetry, threat intel, identity data, cryptographic assets, and third-party feeds into a unified lakehouse.
AI Agent Enablement: Supporting Retrieval-Augmented Generation (RAG) workflows, embeddings, and feature stores to fuel advanced AI use cases across Point Wild products.
Analytics & Decision Systems: Providing real-time insights into risk posture, compliance, and security events through scalable pipelines and APIs.
Future-Proofing for Quantum: Laying the groundwork for automated remediation and transition to post-quantum cryptographic standards.
Your work won’t just be about pipelines and data models - it will directly shape how enterprises anticipate, prevent, and respond to cybersecurity risks in an era of quantum disruption.
About the Role
We are seeking a highly skilled Data Engineer with deep experience in Databricks and modern lakehouse architectures to join the Lat61 platform team. This role is critical in designing, building, and optimizing the pipelines, data structures, and integrations that power Lat61.
You will collaborate closely with data architects, AI engineers, and product leaders to deliver a scalable, resilient, and secure foundation for advanced analytics, machine learning, and cryptographic risk management use cases.
Responsibilities
Build and optimize data ingestion pipelines on Databricks (batch and streaming) to process structured, semi-structured, and unstructured data.
Implement scalable data models and transformations leveraging Delta Lake and open data formats (Parquet, Delta).
Design and manage workflows with Databricks Workflows, Airflow, or equivalent orchestration tools.
Implement automated testing, lineage, and monitoring frameworks using tools such as Great Expectations and Unity Catalog.
Build integrations with enterprise and third-party systems via cloud APIs, Kafka/Kinesis, and connectors into Databricks.
Partner with AI/ML teams to provision feature stores, integrate vector databases (Pinecone, Milvus, Weaviate), and support RAG-style architectures.
Optimize Spark and SQL workloads for speed and cost efficiency across multi-cloud environments (AWS, Azure, GCP).
Apply secure-by-design data engineering practices aligned with Point Wild’s cybersecurity standards and evolving post-quantum cryptographic frameworks.
Qualifications
At least 5 years in Data Engineering with strong experience building production data systems on Databricks.
Expertise in PySpark, SQL, and Python.
Strong hands-on expertise with core AWS services (e.g., S3, IAM, Kinesis).
Strong knowledge of Delta Lake, Parquet, and lakehouse architectures.
Experience with streaming frameworks (Structured Streaming, Kafka, Kinesis, or Pub/Sub).
Familiarity with dbt for transformation and analytics workflows.
Strong understanding of data governance and security controls (Unity Catalog, IAM).
Exposure to AI/ML data workflows (feature stores, embeddings, vector databases).
Detail-oriented, collaborative, and comfortable working in a fast‑paced, innovation‑driven environment.
Bonus Points
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Data Engineering experience in a B2B SaaS organization.
Why Join Us?
Opportunity to build a next‑generation lakehouse platform at the intersection of cybersecurity, cryptography, and AI.
A role with direct impact on how global enterprises defend against quantum‑era risks.
Collaborative, mission‑driven culture with a focus on innovation and agility.
A chance to shape the future of data + AI across the Point Wild portfolio.
Company Culture & Impact
As part of Point Wild, you will solve real customer problems and see your impact every day. You will accelerate your career in a fast‑paced, growth‑oriented environment with opportunities to learn new technologies, products, and markets. We prioritize inclusion; no employee or applicant will face discrimination or harassment based on protected categories. Point Wild is committed to being an inclusive community where all feel welcome.
Important Privacy Information
Important privacy information for United States-based job applicants can be found here.