Leidos

Principal ETL Developer

Leidos, Bethesda, Maryland, US, 20811


Overview

Leidos has a new and exciting opportunity for a Principal ETL Developer in our National Security Sector's (NSS) Cyber & Analytics Business Area (CABA). Our team leads in Security Engineering, Computer Network Operations (CNO), Mission Software, Analytical Methods and Modeling, Signals Intelligence (SIGINT), and Cryptographic Key Management. Leidos offers competitive benefits, including Paid Time Off, 11 paid Holidays, 401K with a 6% company match and immediate vesting, Flexible Schedules, Discounted Stock Purchase Plans, Technical Upskilling, Education and Training Support, Parental Paid Leave, and more. Join us and make a difference in National Security!

Job Description

We have an immediate need for an ETL Developer to play a pivotal role in shaping, leading, and implementing cutting-edge data flow solutions centered around Apache NiFi. The candidate will provide technical expertise and support in the design, development, implementation, and testing of customer tools and applications in support of Extracting, Transforming, and Loading (ETL) data into an enterprise Data Lake. The candidate will be responsible for defining architectural best practices, optimizing performance in large-scale environments, and mentoring junior developers, ensuring the delivery of robust, scalable, and secure data flow solutions that drive critical customer needs. Working within a DevOps framework, the ETL Developer participates in and directs major deliverables of projects through all aspects of the software development lifecycle.

Responsibilities

- Architecting complex NiFi data pipelines: Design and develop enterprise-level ETL architectures and implement NiFi data pipelines for large-scale data ingestion, transformation, and processing from diverse sources.
- Performance optimization and tuning: Optimize NiFi data flows, including processor tuning, memory management, and load balancing, ensuring optimal performance for batch and real-time processing.
- Advanced troubleshooting and problem resolution: Identify, diagnose, and resolve complex NiFi data flow issues, including performance bottlenecks, data discrepancies, and integration failures.
- Integrating with big data and cloud technologies: Seamlessly integrate NiFi with various databases, big data ecosystems, and cloud platforms (e.g., AWS, OCI, Azure), demonstrating expertise in relevant services (e.g., Kafka, Elasticsearch, S3, SQS/SNS).
- Defining best practices and standards: Establish best practices for NiFi development, deployment, security, and governance, ensuring adherence to enterprise-wide data management policies.
- Documentation and knowledge sharing: Create and maintain comprehensive documentation for NiFi data flows, mappings, architectures, and standard operating procedures, ensuring knowledge transfer and promoting efficient team operations.
- Collaboration and communication: Collaborate effectively with data architects, data engineers, application/service developers, and other stakeholders to translate business requirements into robust technical solutions, and communicate complex technical concepts to both technical and non-technical audiences.
- Mentorship and team leadership: Mentor junior developers, provide technical guidance, conduct code reviews, and foster a collaborative learning environment.
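
By way of illustration only (not part of the posting), the sketch below shows the general shape of the custom NiFi processor development referenced above, written against NiFi's standard processor API. The package, class name, property, and attribute key are hypothetical placeholders, and the example is a minimal sketch rather than a representation of any Leidos codebase.

// Minimal sketch of a custom NiFi processor; all names below are illustrative.
package com.example.nifi; // hypothetical package

import java.util.List;
import java.util.Set;

import org.apache.nifi.annotation.documentation.CapabilityDescription;
import org.apache.nifi.annotation.documentation.Tags;
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;
import org.apache.nifi.processor.util.StandardValidators;

@Tags({"example", "etl", "transform"})
@CapabilityDescription("Tags incoming FlowFiles with a source-system attribute.")
public class TagSourceSystemProcessor extends AbstractProcessor {

    // User-configurable property shown in the processor's configuration dialog.
    public static final PropertyDescriptor SOURCE_SYSTEM = new PropertyDescriptor.Builder()
            .name("Source System")
            .description("Value written to the 'source.system' FlowFile attribute.")
            .required(true)
            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
            .build();

    // Single outgoing relationship for successfully tagged FlowFiles.
    public static final Relationship REL_SUCCESS = new Relationship.Builder()
            .name("success")
            .description("FlowFiles that were tagged successfully.")
            .build();

    @Override
    protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
        return List.of(SOURCE_SYSTEM);
    }

    @Override
    public Set<Relationship> getRelationships() {
        return Set.of(REL_SUCCESS);
    }

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        FlowFile flowFile = session.get();
        if (flowFile == null) {
            return; // nothing queued on this trigger
        }
        // Stamp the FlowFile with a lineage-friendly attribute and route it onward.
        String source = context.getProperty(SOURCE_SYSTEM).getValue();
        flowFile = session.putAttribute(flowFile, "source.system", source);
        session.transfer(flowFile, REL_SUCCESS);
    }
}

In practice, a processor like this would be packaged as a NAR, deployed to the cluster, and used within flows that are version-controlled in NiFi Registry.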
Basic Qualifications

- In-depth experience designing, developing, and managing complex NiFi data flow solutions in large-scale enterprise environments.
- In-depth knowledge of NiFi architecture, processors, and configurations, along with hands-on experience with NiFi Registry and clustering for high availability and scalability.
- Proficiency in programming languages such as Java and Python for custom NiFi processor development and scripting for automation.
- Proficiency writing and optimizing complex queries, along with experience managing relational and NoSQL databases (e.g., Postgres, Elasticsearch, DynamoDB).
- Direct experience with real-time streaming and API integration (REST) for seamless data connectivity.
- Direct experience with cloud platforms such as AWS, Azure, or OCI and related data services.
- Strong ability to analyze complex data challenges, identify root causes, and implement effective solutions.
- Strong ability to collaborate effectively with cross-functional teams, articulate technical concepts clearly, and provide effective mentorship.
- Bachelor's degree with 12 or more years of prior relevant experience, or Master's degree with 10 or more years of relevant experience. Additional years of experience may be substituted in lieu of a degree.
- Active TS/SCI with polygraph security clearance is required.

Preferred Qualifications

- In-depth experience deploying ETL solutions in an AWS environment.

Additional notes: This role expects a candidate capable of operating within a security-sensitive environment and adhering to enterprise data governance policies.
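
As a small, hedged illustration of the "scripting for automation" and REST integration items above, the sketch below polls a NiFi instance's flow-status endpoint over its REST API. The host, port, absence of TLS/authentication, and the exact endpoint path are assumptions made for illustration, not details taken from this posting.

// Minimal sketch: query NiFi's REST API for controller-level flow status.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class NiFiFlowStatusCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Assumed endpoint on a local, unsecured NiFi instance (illustrative only).
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/nifi-api/flow/status"))
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // A real automation script would parse this JSON and compare queued
        // FlowFile counts or active-thread counts against alert thresholds.
        System.out.println("HTTP " + response.statusCode());
        System.out.println(response.body());
    }
}

A check like this would typically feed monitoring or CI/CD gates rather than replace NiFi's built-in UI and provenance tooling.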

Original Posting

September 11, 2025

Pay Range

$126,100.00 - $227,950.00

The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include the responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
