Data Freelance Hub

Data Engineer SME with Security Clearance

Data Freelance Hub, Arlington, Virginia, United States, 22201

This role is for a Data Engineer SME with an active TS/SCI security clearance, focused on designing data pipelines for AI/ML. Required skills include Apache Airflow, Python, and ElasticSearch. It is an onsite position in Crystal City, VA, with competitive pay.

Job Description:

We are seeking a Data Engineering SME to design, build, and operate data pipelines that ingest, store, and process high‑volume, multi‑source data, primarily for modern AI/ML workloads. You will partner with software, analytics, and product teams to create model‑ready datasets (features, embeddings, and prompts), implement scalable storage layers (data lakehouse and vector stores), and enable low‑latency retrieval for query, inference, and RAG.

Responsibilities:

Orchestrate streaming and batch pipelines, optimize compute for GPU/CPU workloads, enforce data quality and governance, and instrument observability.

Design, develop, and implement scalable data pipelines and ETL processes using Apache Airflow, with a focus on data for AI applications.

Develop messaging solutions utilizing Kafka to support real‑time data streaming and event‑driven architectures.

Build and maintain high‑performance data retrieval solutions using ElasticSearch/OpenSearch.

Implement and optimize Python‑based data processing solutions.

Integrate batch and streaming data processing techniques to enhance data availability and accessibility.

Ensure adherence to security and compliance requirements when working with classified data.

Work closely with cross‑functional teams to define data strategies and develop technical solutions aligned with mission objectives.

Deploy and manage cloud‑based infrastructure to support scalable and resilient data solutions.

Optimize data storage, retrieval, and processing efficiency.

Required Skills & Experience:

Experience with Apache Airflow for workflow orchestration.

Strong programming skills in Python.

Experience with ElasticSearch/OpenSearch for data indexing and search functionalities.

Understanding of vector databases, embedding models, and vector search for AI applications.

Expertise in event‑driven architecture and microservices development.

Hands‑on experience with cloud services (e.g., MinIO), including data storage and compute resources.

Strong understanding of data pipeline orchestration and workflow automation.

Knowledge of Linux environments and database optimization techniques.

Strong understanding of version control with Git.

Due to U.S. Government contract requirements, only U.S. citizens are eligible for this role.

Nice to Have Skills:

Proficiency in Kafka for messaging and real‑time data processing.

Understanding of LLM prompt engineering and associated ETL applications.

Knowledge of Apache Superset for data visualization and analytics.

Familiarity with Kubernetes for container orchestration.

Exposure to Apache Spark for large‑scale data processing.

Education & Certifications:

Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent experience). Advanced degrees are a plus.

Security Clearance:

An active TS/SCI security clearance is REQUIRED, and candidates must have or be willing to obtain a CI polygraph. Candidates without this clearance will not be considered.

Equal Opportunity Employer / Individuals with Disabilities / Protected Veterans
