Amches, Inc.
Role and Responsibilities
The Cloud Software Engineer develops, maintains, and enhances complex and diverse Big-Data Cloud systems based on documented requirements, and directly contributes to all stages of back-end processing, analysis, and indexing. The role provides expertise in Cloud Computing and the Hadoop ecosystem, including implementing Java applications, distributed computing, information retrieval (IR), and object-oriented design, working individually or as part of a team. Responsibilities include reviewing and testing software components for adherence to design requirements, documenting test results, resolving software problem reports, and applying software development and design methodologies appropriate to the development environment. The engineer also provides input to the software components of system design, including hardware/software trade-offs, software reuse, use of Commercial Off-the-Shelf (COTS) / Government Off-the-Shelf (GOTS) products in place of new development, and requirements analysis and synthesis from the system level down to individual software components.

We are looking for a senior-level engineer skilled in DevOps, particularly cloud solutioning and administration, capable of designing and implementing cloud-based solutions; maintaining and securing Linux-based operating systems; and designing, managing, and securing data flows using NiFi. The candidate should be technically savvy, self-motivated, and adaptable, with strong communication, leadership, and problem-solving skills, plus knowledge of Agile processes and workflows, to help us serve our customers effectively.

Required and Nice-to-Have Skills
- Experience with cloud platforms such as AWS and Microsoft Azure
- Knowledge of containerization using Docker, Kubernetes, or other container orchestration tools
- Ability to troubleshoot and resolve complex issues
- Good communication skills for collaboration with cross-functional teams and stakeholders
- Familiarity with agile development methodologies such as Scrum (nice to have)
- Experience with the Elastic Stack (indexing, searching, and managing data)
- Familiarity with NiFi management
- Ansible scripting

Mandatory Skills
Twenty-four years of experience in software engineering in programs of similar scope and complexity, with a Bachelor's degree in Computer Science or a related field; four years must be in Big-Data cloud technologies and/or distributed computing. Four years of cloud software engineering experience on such projects may substitute for a bachelor's degree. A Master's degree may substitute for two years of experience. A Cloudera Certified Hadoop Developer certification may substitute for one year of cloud experience.

The following cloud-related experience is required:
- Two years of Cloud and/or Distributed Computing Information Retrieval (IR)
- One year implementing code interacting with a Cloud Big Table
- One year implementing code interacting with a Cloud Distributed File System
- One year implementing complex MapReduce analytics
- One year implementing code interacting with Cloud Distributed Coordination Frameworks
- One year architecting Cloud Computing solutions
- One year debugging problems with Cloud-based Distributed Computing Frameworks
- One year managing multi-node Cloud-based installations
- Experience in Computer Network Operations
- Experience with Utility Computing, Network Management, Virtualization (VMware or VirtualBox), and Cloud Computing
- Multi-node management and installation experience with Cloud and Distributed Computing across multiple nodes, using Python, CFEngine, Bash, Ruby, or related technologies
- Experience in Information Assurance: securing cloud-based and distributed applications via industry-standard techniques (firewalls, PKI certificates, server authentication) with corporate authentication services
- Experience in Information Technology: object-oriented design and programming, Java, Eclipse or similar, Maven, RESTful web services
- Cloud and Distributed Computing technologies: experience with YARN, J2EE, MapReduce, Zookeeper, HDFS, HBase, JMS, concurrent programming, multi-node implementations/installations, and related technologies
- Cloud and Distributed Computing Information Retrieval: experience with HDFS, HBase, Apache Lucene, Apache Solr, and MongoDB
- Ingesting, parsing, and analyzing disparate data sources and formats (XML, JSON, CSV, binary formats, Avro, etc.)
- Aspect-Oriented Design and Development
- Debugging and profiling cloud and distributed installations: JVM memory management, profiling Java applications
- Unix/Linux (CentOS)
- Experience in SIGINT disciplines, geolocation/emitter identification, signal applications, and joint program dataflow architectures
- Experience with CentOS, Linux/RedHat, and configuration management tools such as Subversion, ClearCase/UCM, or similar

Optional Skills
- In-depth knowledge of Information Retrieval; assisting the software development team in designing, developing, and testing cloud information retrieval
- Designing and implementing complex workflows that manage Cloud MapReduce analytics
- Interacting with Cloud Distributed Coordination Frameworks
- Leading one or more software development tasks and ensuring alignment with the software development process
- Recommending improvements to documentation and process standards
- Serving as a subject matter expert for Cloud Computing and Hadoop-related technologies
- Debugging issues with cloud-based distributed computing frameworks
- Managing multi-node cloud-based installations
- Delegating programming and testing tasks and monitoring performance
- Choosing software development processes in coordination with customers
- Recommending new technologies and processes for complex cloud projects
- Ensuring quality control of all developed or modified software
- Architecting solutions to complex cloud software engineering problems for efficient data processing and retrieval

Requirements
TS/SCI with Full-Scope Polygraph required

Benefits Snapshot
- 401(k): up to 3% discretionary profit-sharing contribution, plus a 100% match on the first 7% of pay
- PTO: 20 days per year
- Healthcare, dental, and vision coverage, free for a single participant
- $50,000 of life insurance provided; additional voluntary life insurance available
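For candidates unfamiliar with the MapReduce analytics experience called for above, the programming model can be summarized in a few lines. The following is purely an illustration (not part of the job requirements): a single-process Python word count that mimics the map, shuffle, and reduce phases a Hadoop job distributes across a cluster.

```python
from collections import defaultdict

def map_phase(documents):
    """Mapper: emit (word, 1) pairs, analogous to a Hadoop Mapper."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle(pairs):
    """Group values by key, as the framework does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reduce_phase(grouped):
    """Reducer: sum the counts emitted for each word."""
    return {key: sum(values) for key, values in grouped}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["the"])  # 3
```

In a real Hadoop deployment the same mapper and reducer logic runs in parallel across HDFS blocks on many nodes, with YARN scheduling the tasks and the framework performing the shuffle.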
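The mandatory skills also include ingesting and parsing disparate data formats (XML, JSON, CSV, and others). As a small illustrative sketch only, with hypothetical field names (`id`, `name`) not taken from the posting, records from three formats can be normalized into a common shape using the Python standard library:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def parse_record(raw, fmt):
    """Normalize one raw record (JSON, CSV, or XML) into a dict.
    The 'id'/'name' schema is illustrative only."""
    if fmt == "json":
        return json.loads(raw)
    if fmt == "csv":
        # Headerless CSV row; supply the column names explicitly.
        return next(csv.DictReader(io.StringIO(raw), fieldnames=["id", "name"]))
    if fmt == "xml":
        root = ET.fromstring(raw)
        return {child.tag: child.text for child in root}
    raise ValueError(f"unsupported format: {fmt}")

records = [
    ('{"id": "1", "name": "alpha"}', "json"),
    ("2,beta", "csv"),
    ("<rec><id>3</id><name>gamma</name></rec>", "xml"),
]
parsed = [parse_record(raw, fmt) for raw, fmt in records]
print([r["name"] for r in parsed])  # ['alpha', 'beta', 'gamma']
```

Production dataflows would typically route this kind of normalization through NiFi processors or MapReduce/Avro pipelines rather than a single script, but the per-record parsing step is the same idea.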