Leidos Inc

ETL Developer (SME)

Leidos Inc, Bethesda, Maryland, US 20811


Overview

Leidos has a new and exciting opportunity for an ETL Developer SME in our National Security Sector's (NSS) Cyber & Analytics Business Area (CABA). Our team works in Security Engineering, Computer Network Operations (CNO), Mission Software, Analytical Methods and Modeling, Signals Intelligence (SIGINT), and Cryptographic Key Management. Leidos offers competitive benefits, including Paid Time Off, 11 paid holidays, a 401(k) with company matching and immediate vesting, flexible schedules, discounted stock purchase plans, technical upskilling, education and training support, paid parental leave, and more. Join us and make a difference in National Security!

Job Description

We have an IMMEDIATE NEED for an ETL Developer to lead and implement cutting‑edge data flow solutions centered on Apache NiFi. The candidate will provide technical expertise and support in the design, development, implementation, and testing of customer tools and applications that Extract, Transform, and Load (ETL) data into an enterprise Data Lake. The candidate will be responsible for defining architectural best practices, optimizing performance in large‑scale environments, and mentoring junior developers, ensuring the delivery of robust, scalable, and secure data flow solutions that meet critical customer needs. Working within a DevOps framework, the ETL Developer participates in and directs major project deliverables through all phases of the software development lifecycle.

Responsibilities

Architecting and leading complex NiFi data pipeline design: design and develop enterprise‑level ETL architectures and implement NiFi data pipelines for large‑scale data ingestion, transformation, and processing from diverse sources.

Performance optimization and tuning: lead the optimization of NiFi data flows, including processor tuning, memory management, and load balancing, ensuring optimal performance for batch and real‑time processing.

Advanced troubleshooting and problem resolution: identify, diagnose, and resolve complex NiFi data flow issues, including performance bottlenecks, data discrepancies, and integration failures.

Integrating with big data and cloud technologies: integrate NiFi with databases, big data ecosystems, and cloud platforms (e.g., AWS, OCI, Azure), drawing on experience with services such as Kafka, Elasticsearch, S3, and SQS/SNS.

Defining best practices and standards: lead the establishment of best practices for NiFi development, deployment, security, and governance, ensuring adherence to enterprise data management policies.

Documentation and knowledge sharing: create and maintain documentation for NiFi data flows, mappings, architectures, and standard operating procedures to promote efficient team operations.

Collaboration and communication: collaborate with data architects, data engineers, application/service developers, and other stakeholders to translate business requirements into robust technical solutions and communicate complex technical concepts.

Mentorship and team leadership: mentor junior developers, provide technical guidance, conduct code reviews, and foster a collaborative learning environment.
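The responsibilities above revolve around the classic extract‑transform‑load pattern that NiFi flows implement. As a rough, hypothetical illustration of that pattern (not a Leidos or NiFi artifact; the `extract`, `transform`, and `load` names are invented for this sketch), a minimal pipeline might look like:

```python
import json

# Hypothetical ETL sketch: parse raw JSON lines, normalize a field,
# and append the results to an in-memory stand-in for a data lake.

RAW_LINES = [
    '{"id": 1, "name": " Alice ", "ts": "2024-01-01"}',
    '{"id": 2, "name": "Bob", "ts": "2024-01-02"}',
]

def extract(lines):
    """Extract: parse each raw JSON line into a record."""
    return [json.loads(line) for line in lines]

def transform(records):
    """Transform: trim and lowercase the name field."""
    return [{**r, "name": r["name"].strip().lower()} for r in records]

def load(records, lake):
    """Load: append records to the target store; return the count."""
    lake.extend(records)
    return len(records)

lake = []
loaded = load(transform(extract(RAW_LINES)), lake)
print(loaded, lake[0]["name"])  # → 2 alice
```

In a real NiFi deployment each of these steps would typically be a processor (e.g., a record reader, a transform processor, and a put processor), with NiFi handling queuing, back pressure, and provenance between them.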

Basic Qualifications

Extensive experience designing, developing, and managing complex NiFi data flow solutions in large‑scale enterprise environments.

Extensive knowledge of NiFi architecture, processors, and configurations, with hands‑on experience with NiFi Registry and clustering for high availability and scalability.

Proficiency in Java and Python for custom NiFi processor development and scripting for automation.

Proficiency writing and optimizing complex queries, and experience with relational and NoSQL databases (e.g., Postgres, Elasticsearch, DynamoDB).

In‑depth experience with real‑time streaming and API integration (REST) for seamless data connectivity.

In‑depth experience with cloud platforms like AWS, Azure, or OCI and related data services.

Strong ability to analyze complex data challenges, identify root causes, and implement effective solutions.

Strong ability to collaborate with cross‑functional teams, articulate technical concepts clearly, and provide effective mentorship.

Master's degree with 15+ years of relevant experience or Doctorate with 13+ years of relevant experience. Will consider experience in lieu of a degree.

Active TS/SCI with polygraph security clearance is required.
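The query‑optimization skill listed above often comes down to knowing how indexing changes a query plan. A small, self‑contained sketch using Python's built‑in sqlite3 module (the table and index names are illustrative only) shows the idea:

```python
import sqlite3

# Hypothetical sketch: adding an index turns a full table scan
# into an index search, which EXPLAIN QUERY PLAN makes visible.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, source TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, f"sensor-{i % 10}", "x") for i in range(1000)],
)

QUERY = "SELECT * FROM events WHERE source = 'sensor-3'"

# Before indexing: SQLite reports a scan over the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchone()[-1]

conn.execute("CREATE INDEX idx_events_source ON events(source)")

# After indexing: the plan searches the index instead.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchone()[-1]

print(plan_before)  # e.g. "SCAN events"
print(plan_after)   # e.g. "SEARCH events USING INDEX idx_events_source (source=?)"
```

The same reasoning carries over to Postgres (`EXPLAIN ANALYZE`) and to NoSQL stores, where the equivalent decision is choosing partition and sort keys so lookups avoid full scans.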

Preferred Qualifications

Extensive experience deploying ETL solutions in an AWS environment.

Relevant certifications in data engineering or cloud platforms (e.g., AWS Certified Big Data – Specialty, Google Cloud Professional Data Engineer).


At Leidos, we don’t want someone who “fits the mold” — we want someone who melts it down and builds something better. This is a role for the restless, the over‑caffeinated, and the ones who ask, “what’s next?” before the dust settles on “what’s now.”

If you’re already scheming step 20 while everyone else is still debating step 2… good. You’ll fit right in.
