Lead Graph Developer
Qode - Philadelphia, Pennsylvania, United States
Overview
Location: Pittsburgh, PA / Cleveland, OH (Hybrid)
Employment Type: Full-Time
Job Summary:
We are seeking an experienced Lead Graph Developer to architect, design, and implement cutting-edge graph-based data solutions. In this role, you will lead the development of large-scale graph data models, integrate graph databases with enterprise systems, and build graph-powered applications to support complex querying, analytics, and recommendation use cases. You will collaborate with data scientists, software engineers, and business stakeholders to deliver performant and scalable graph solutions using technologies like Neo4j, Amazon Neptune, TigerGraph, or RDF triple stores.
Key Responsibilities:
- Lead the architecture, design, and implementation of graph-based data solutions.
- Model complex relationships using Labeled Property Graphs (LPG) or RDF/OWL, depending on the use case.
- Develop and optimize queries using Cypher, Gremlin, SPARQL, or similar graph query languages (see the query sketch after this list).
- Collaborate with engineering teams to integrate graph databases with APIs, microservices, or existing data platforms.
- Work closely with stakeholders to define graph use cases (e.g., recommendation engines, fraud detection, knowledge graphs).
- Implement data ingestion pipelines to load, transform, and maintain graph datasets.
- Ensure scalability, performance, and security best practices in graph solutions.
- Mentor junior developers and promote best practices in graph modeling and query design.
- Contribute to proof-of-concepts and evaluate new tools and frameworks in the graph space.
- Document architecture, design decisions, and graph schemas for reusability and governance.
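As an illustration of the kind of query development and recommendation use cases named above, here is a minimal sketch using the Neo4j Python driver to run a Cypher co-purchase query. The connection details and the (:User)-[:PURCHASED]->(:Product) schema are hypothetical examples, not part of this posting.

```python
from neo4j import GraphDatabase

# Placeholder connection details; adjust for your environment.
URI = "bolt://localhost:7687"
AUTH = ("neo4j", "password")

# Simple co-purchase recommendation: products bought by other users
# who bought the same products as the target user, ranked by frequency.
RECOMMEND = """
MATCH (u:User {id: $user_id})-[:PURCHASED]->(:Product)
      <-[:PURCHASED]-(other:User)-[:PURCHASED]->(rec:Product)
WHERE other <> u AND NOT (u)-[:PURCHASED]->(rec)
RETURN rec.name AS product, count(*) AS score
ORDER BY score DESC
LIMIT 5
"""

def recommend(user_id: str):
    """Return up to five recommended products for the given user."""
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            result = session.run(RECOMMEND, user_id=user_id)
            return [(record["product"], record["score"]) for record in result]

if __name__ == "__main__":
    for product, score in recommend("u-123"):
        print(product, score)
```

The same pattern could be expressed in Gremlin or SPARQL against the other graph stores listed in this posting; the query shape, not the specific language, is the point.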
Required Skills & Qualifications:
- 10+ years of experience in software development or data engineering roles.
- 3+ years of hands-on experience with graph database technologies (Neo4j, Amazon Neptune, TigerGraph, Stardog, etc.).
- Proficient in one or more graph query languages: Cypher, Gremlin, SPARQL.
- Strong knowledge of data modeling, schema design, and semantic technologies (RDF/OWL a plus).
- Experience with Python, Java, or Scala for integration and data pipeline development.
- Familiarity with data integration tools, ETL pipelines, and APIs for graph ingestion (a minimal ingestion sketch follows this list).
- Strong understanding of cloud services (AWS, GCP, or Azure), especially managed graph solutions.
- Experience working in Agile development teams and version-controlled environments (Git, CI/CD).
- Excellent problem-solving, communication, and leadership skills.
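For context on the ingestion and pipeline work mentioned above, the following is a minimal, hypothetical sketch of a batched CSV-to-graph load in Python using an idempotent Cypher MERGE. The file layout, schema, and connection details are illustrative assumptions, not requirements of the role.

```python
import csv
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # placeholder connection details
AUTH = ("neo4j", "password")

# MERGE keeps the load idempotent: nodes and relationships are created once.
LOAD_PURCHASES = """
UNWIND $rows AS row
MERGE (u:User {id: row.user_id})
MERGE (p:Product {id: row.product_id})
MERGE (u)-[:PURCHASED]->(p)
"""

def _write_batch(tx, rows):
    tx.run(LOAD_PURCHASES, rows=rows)

def load_purchases(csv_path: str, batch_size: int = 1000) -> None:
    """Stream purchase records from a CSV file into the graph in batches."""
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session, open(csv_path, newline="") as f:
            batch = []
            for row in csv.DictReader(f):  # expects user_id,product_id columns
                batch.append(row)
                if len(batch) >= batch_size:
                    session.execute_write(_write_batch, batch)
                    batch = []
            if batch:
                session.execute_write(_write_batch, batch)

if __name__ == "__main__":
    load_purchases("purchases.csv")
```

In production, the same batching approach is typically wrapped in an orchestrated ETL job rather than run as a standalone script.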
Preferred Qualifications:
- Experience with knowledge graphs, entity resolution, or ontology design.
- Familiarity with machine learning or graph-based AI use cases.
- Prior experience with graph visualization tools (Bloom, Gephi, Linkurious).
- Certifications in Neo4j, AWS Neptune, or relevant cloud/graph platforms.
Technologies / Tools:
- Graph Databases: Neo4j, Amazon Neptune, TigerGraph, JanusGraph
- Languages: Cypher, Gremlin, SPARQL, Python, Java, Scala
- Cloud Platforms: AWS (Lambda, S3, Glue), GCP, Azure
- DevOps: Git, Jenkins, Docker, Kubernetes
- Visualization: Neo4j Bloom, Gephi, GraphXR