
Agentic Platform Engineer
Signature IT World Inc, Santa Clarita, CA, United States
Role: Senior Agentic Platform Engineer
Location: Austin, TX / SCV, CA
Type: Contract
We are seeking a Senior Agentic Platform Engineer to serve as the lead architect and implementer for our Intelligence Layer. This role is designed for a senior individual contributor who can operate with high autonomy to design, develop, and deploy production-ready solutions that optimize both internal data operations and external-facing intelligence.
You will be responsible for the end-to-end delivery of the connective tissue between our multi-agent architecture and our knowledge graph, supporting the existing team by accelerating technical milestones and hardening the CI/CD processes required for reliable AI deployment.
Core Delivery Categories
- Agentic Systems & Orchestration: Lead the build-out of a multi-agent architecture using LangGraph. You will design cyclic, stateful workflows, implement persistence, and manage Human-in-the-Loop (HITL) checkpoints.
- Graph-Native Intelligence: Autonomously design the "Ontology-to-Schema" pipeline, mapping OWL/SKOS enterprise ontologies into TigerGraph. You will develop high-performance GSQL and architect multigraph memory systems for agentic reasoning.
- Trusted Data Materialization: Own the materialization of data products in Snowflake using dbt, specifically focusing on the DQ rules engine and automated Trust Score computation.
- Internal & External Skill Development: Design and deploy agents tailored for Internal Data Ops (automating metadata harvest and DQ remediation) as well as external-facing skills that provide governed, high-trust insights to end users.
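To illustrate the first category above, here is a minimal, framework-agnostic sketch of a cyclic, stateful workflow with a HITL checkpoint. All names are hypothetical and the loop is hand-rolled in plain Python for clarity; a production build would use LangGraph's StateGraph with a checkpointer rather than this.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a cyclic, stateful agent workflow with a
# Human-in-the-Loop (HITL) checkpoint. Real implementations would use
# LangGraph's StateGraph/persistence; this is illustration only.

@dataclass
class AgentState:
    question: str
    draft: str = ""
    revisions: int = 0
    approved: bool = False
    history: list = field(default_factory=list)

def draft_node(state: AgentState) -> AgentState:
    # Stand-in for an LLM call that produces or refines a draft answer.
    state.revisions += 1
    state.draft = f"answer v{state.revisions} to: {state.question}"
    state.history.append(state.draft)
    return state

def hitl_checkpoint(state: AgentState, reviewer) -> AgentState:
    # The workflow pauses here until a human reviewer approves/rejects.
    state.approved = reviewer(state.draft)
    return state

def run(question: str, reviewer, max_revisions: int = 3) -> AgentState:
    # Cycle: draft -> review -> (back to draft) until approved or capped.
    state = AgentState(question=question)
    while not state.approved and state.revisions < max_revisions:
        state = draft_node(state)
        state = hitl_checkpoint(state, reviewer)
    return state

# A reviewer that approves only the second revision forces one full
# cycle back through the drafting node.
result = run("What is our churn rate?", reviewer=lambda d: "v2" in d)
print(result.revisions, result.approved)  # 2 True
```

The point of the sketch is the shape of the control flow: state carried across iterations, a cycle rather than a DAG, and an explicit human gate between machine steps.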
Technical Expertise & Experience Requirements
- 10+ years of senior software and data engineering experience, with a proven track record of production delivery within a global enterprise environment.
- Advanced Agentic Orchestration (2+ years): Deep, hands-on mastery of LangGraph (StateGraph, Command, and Persistence) and LangChain.
- Multi-LLM Mastery: Expert implementation of frontier models (including Anthropic Claude, OpenAI GPT, and Llama) and the Model Context Protocol (MCP) for standardized tool-calling and context injection across model providers.
- TigerGraph & GSQL Specialist (5+ years): Expert-level proficiency in GSQL development, including writing distributed graph algorithms and optimizing complex sub-queries.
- Knowledge Modeling: Direct experience modeling enterprise ontologies using OWL, SKOS, or RDF and successfully mapping them to Labeled Property Graph (LPG) schemas.
- Analytics Engineering Mastery (5+ years): Expert-level dbt (Core/Cloud) and Snowflake architecture, with specific experience building automated Data Quality (DQ) monitors and trust-score pipelines.
- Development Stack: High proficiency in Python (specifically asynchronous programming, FastAPI, and Pydantic) and advanced SQL.
- Internal Data Ops Optimization: Demonstrated experience building agents and skills specifically designed to automate Data Governance and Data Operations (e.g., automated glossary curation, schema discovery, and policy enforcement).
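The asynchronous tool-calling skill listed above can be sketched with nothing but the standard library. This is illustrative only: the tool names, payloads, and registry are hypothetical, and it uses stdlib asyncio in the spirit of MCP-style standardized tool contracts rather than the MCP SDK itself.

```python
import asyncio

# Hypothetical tool registry illustrating standardized, concurrent
# tool-calling. Tool names and return payloads are invented.
TOOLS = {}

def tool(name):
    """Register an async callable under a stable tool name."""
    def wrap(fn):
        TOOLS[name] = fn
        return fn
    return wrap

@tool("graph.lookup")
async def graph_lookup(entity: str) -> dict:
    await asyncio.sleep(0)  # stand-in for a TigerGraph/GSQL call
    return {"entity": entity, "neighbors": ["a", "b"]}

@tool("dq.trust_score")
async def trust_score(table: str) -> dict:
    await asyncio.sleep(0)  # stand-in for a Snowflake/dbt metric fetch
    return {"table": table, "score": 0.97}

async def dispatch(calls: list[tuple[str, dict]]) -> list[dict]:
    # Fan independent tool calls out concurrently; results keep call order.
    tasks = [TOOLS[name](**args) for name, args in calls]
    return await asyncio.gather(*tasks)

results = asyncio.run(dispatch([
    ("graph.lookup", {"entity": "Acme"}),
    ("dq.trust_score", {"table": "dim_customer"}),
]))
print(results)
```

The registry-plus-dispatcher shape is what makes tool-calling "standardized": agents address tools by name and schema, never by implementation.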
CI/CD, DevOps & Process Optimization
- Spec-Driven Development: Champion a "Spec-First" approach to AI development, ensuring agent behaviors, tool contracts, and data schemas are defined via rigorous specifications (e.g., OpenAPI, AsyncAPI, or custom DSLs) before implementation.
- AI-Optimized CI/CD: Support the team in designing and implementing robust CI/CD pipelines tailored for GenAI, focusing on model-agnostic deployment patterns and high-frequency delivery cycles.
- Process Engineering: Optimize team development workflows to support iterative AI loops, including the implementation of specialized observability for agentic traces and automated feedback loops for data quality.
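A minimal sketch of the "Spec-First" idea above: the tool contract exists as data before any implementation, and calls are validated against it. The schema shape, tool name, and parameters are hypothetical; a production pipeline would validate against OpenAPI/JSON Schema documents with real tooling rather than this hand-rolled check.

```python
# Hypothetical spec-first tool contract: defined as data up front,
# enforced before any call reaches an implementation.
SPEC = {
    "tool": "publish_insight",
    "params": {
        "audience": {"type": str, "required": True},
        "trust_score": {"type": float, "required": True},
        "dry_run": {"type": bool, "required": False},
    },
}

def validate(spec: dict, args: dict) -> list[str]:
    """Return a list of contract violations (empty means conformant)."""
    errors = []
    for name, rule in spec["params"].items():
        if name not in args:
            if rule["required"]:
                errors.append(f"missing required param: {name}")
        elif not isinstance(args[name], rule["type"]):
            errors.append(f"bad type for {name}")
    for name in args:
        if name not in spec["params"]:
            errors.append(f"unknown param: {name}")
    return errors

print(validate(SPEC, {"audience": "sales", "trust_score": 0.9}))     # []
print(validate(SPEC, {"audience": "sales", "trust_score": "high"}))  # ['bad type for trust_score']
```

Because the contract is plain data, the same spec can gate CI (reject builds whose tools drift from it) and runtime (reject malformed agent calls), which is the payoff of defining behavior before implementation.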
Preferred Experience
- Unstructured Data & Vectors: Experience with unstructured data management and the implementation of vector databases (e.g., Pinecone, Weaviate, or Snowflake Cortex Search) within RAG architectures.
- Enterprise Metadata Management: Hands-on experience with DataHub or similar data catalog and metadata management solutions to drive automated discovery.
- Domain Expertise: Familiarity with B2B and B2C sales data processes and associated tooling (e.g., Salesforce), including experience navigating CRM schemas for agentic tool-calling.
- Governance & Security: Familiarity with data privacy and security frameworks (GDPR, SOC 2) as they apply to autonomous agents and Large Language Models.
- Community Engagement: Contributions to open-source agentic frameworks or participation in the development of the Model Context Protocol (MCP) ecosystem.