Overview
Base-2 Solutions is seeking a Cloud Software Engineer skilled in DevOps, particularly cloud solutioning and administration: designing and implementing cloud-based solutions; maintaining and securing Linux-based operating systems; and designing, managing, and securing data flow using NiFi. The ideal candidate is technically savvy with strong communication and leadership skills, self-motivated, and adaptable, demonstrates sound problem-solving, and knows Agile processes and workflows, helping us continue to succeed, grow, and best serve our customers.

This role develops, maintains, and enhances complex and diverse Big-Data Cloud systems based on documented requirements. It contributes to all stages of back-end processing, analyzing, and indexing, and provides expertise in Cloud Computing and the Hadoop Ecosystem, including implementing Java applications, Distributed Computing, Information Retrieval, and Object Oriented Design. The candidate may work individually or as part of a team; reviews and tests software components for adherence to design requirements and documents test results; resolves software problem reports; and uses software development and design methodologies appropriate to the development environment. The role also provides input to system design, including hardware/software trade-offs, software reuse, COTS/GOTS options, and requirements analysis from the system level down to individual software components.

Responsibilities
Develops, maintains, and enhances complex Big-Data Cloud systems based on documented requirements.
Directly contributes to all stages of back-end processing, analyzing, and indexing.
Provides expertise in Cloud Computing and the Hadoop Ecosystem, including implementing Java applications, Distributed Computing, Information Retrieval, and Object Oriented Design.
Works individually or as part of a team.
Reviews and tests software components for adherence to design requirements and documents test results.
Resolves software problem reports.
Utilizes software development and software design methodologies appropriate to the development environment.
Provides input to the software components of system design, including hardware/software trade-offs, software reuse, and requirements analysis from the system level down to individual software components.

Required Skills
Experience with cloud platforms such as Amazon Web Services (AWS) and Microsoft Azure.
Knowledge of containerization using Docker, Kubernetes, or other container orchestration tools.
Ability to troubleshoot and resolve complex issues.
Good communication skills for collaborating with cross-functional teams and stakeholders.

The following Cloud-related experience is required:
Two (2) years of Cloud and/or Distributed Computing Information Retrieval (IR).
One (1) year of experience implementing code that interacts with Cloud Big Table.
One (1) year of experience implementing code that interacts with a Cloud Distributed File System.
One (1) year of experience implementing complex MapReduce analytics (see the sketch after this list).
One (1) year of experience implementing code that interacts with Cloud Distributed Coordination Frameworks.
One (1) year of experience architecting Cloud Computing solutions.
One (1) year of experience debugging problems with Cloud-based Distributed Computing Frameworks.
One (1) year of experience managing multi-node Cloud-based installations.

Experience in Computer Network Operations: Utility Computing, Network Management, Virtualization (VMware or VirtualBox), Cloud Computing.
Multi-Node Management and Installation: management and installation of Cloud and Distributed Computing on multiple nodes; Python, CFEngine, Bash, Ruby, or related technologies.
Experience in Information Assurance: securing Cloud-based and Distributed applications through industry-standard techniques such as firewalls, PKI certificates, and server authentication, with experience in corporate authentication services.
Experience in Information Technology: Object Oriented Design and Programming, Java, Eclipse or a similar development environment, Maven, RESTful web services.
Cloud and Distributed Computing Technologies (at least one, or a combination, of the following): YARN, J2EE, MapReduce, ZooKeeper, HDFS, HBase, JMS, Concurrent Programming, multi-node implementation/installation, and other applicable technologies.
Cloud and Distributed Computing Information Retrieval (at least one, or a combination, of the following): HDFS, HBase, Apache Lucene, Apache Solr, MongoDB.
Ingesting, parsing, and analysis of disparate data sources and formats: XML, JSON, CSV, binary formats, Sequence or Map Files, Avro, and related technologies.
Aspect Oriented Design and Development.
Debugging and profiling Cloud and Distributed installations: JVM memory management, profiling Java applications.
UNIX/Linux, CentOS.
Experience in SIGINT: at least one SIGINT collection discipline area (FORNSAT, CABLE, Terrestrial/Microwave, Overhead, ELINT); geolocation, emitter identification, and signal applications; joint program collection platforms and dataflow architectures; signals characterization analysis.
Other: CentOS, Linux/RedHat; configuration management tools such as Subversion, ClearQuest, or Razor.
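As a rough illustration of the MapReduce and HDFS experience listed above, the following is a minimal sketch of the canonical Hadoop word-count job in Java. The posting does not reference any specific analytic; the class names and the input/output paths passed on the command line are purely illustrative.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (token, 1) for every token in an input line.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts emitted for each token.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    // Reusing the reducer as a combiner is safe because summation is associative and commutative.
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input path (illustrative)
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output path (illustrative)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A job like this would typically be packaged with Maven and submitted to the cluster with something like "hadoop jar wordcount.jar WordCount /input /output", where both paths live in HDFS.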
Qualifications
Twelve (12) years of software engineering experience in programs and contracts of similar scope, type, and complexity is required; four (4) of those years must be in programs utilizing Big-Data cloud technologies and/or Distributed Computing.
A Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required.
Four (4) years of cloud software engineering experience on projects with similar Big-Data systems may be substituted for the Bachelor's degree.
A Master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience.
Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience.

Desired Skills
Familiarity with agile development methodologies such as Scrum.
Experience with Elastic stack solutioning, including indexing, searching, and managing data.
Familiarity with Niagarafiles (NiFi) management.
Ansible scripting.
Provide in-depth knowledge of Information Retrieval, assisting the software development team in designing, developing, and testing Cloud Information Retrieval.
Implement complex workflows that manage Cloud MapReduce analytics.
Implement code that interacts with Cloud Distributed Coordination Frameworks (see the sketch after this list).
Oversee one or more software development tasks and ensure the work is completed in accordance with the constraints of the software development process being used on any particular project.
Make recommendations for improving documentation and software development process standards.
Serve as a subject matter expert for Cloud Computing and corresponding technologies, including Hadoop, assisting the software development team in designing, developing, and testing Cloud Computing systems.
Debug problems with Cloud-based Distributed Computing Frameworks.
Manage multi-node Cloud-based installations.
Delegate programming and testing responsibilities to one or more teams and monitor their performance.
Select the software development process in coordination with the customer and systems engineering.
Recommend new technologies and processes for complex cloud software projects.
Ensure quality control of all developed and modified software.
Architect solutions to complex Cloud Software Engineering problems, such as efficiently processing and retrieving large amounts of data.
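For the Cloud Distributed Coordination Framework items above, the following is a minimal ZooKeeper registration sketch in Java. The ensemble connect string, the /workers parent znode, and the payload are hypothetical, and the sketch assumes the parent znode already exists.

import java.nio.charset.StandardCharsets;
import java.util.concurrent.CountDownLatch;

import org.apache.zookeeper.CreateMode;
import org.apache.zookeeper.Watcher;
import org.apache.zookeeper.ZooDefs;
import org.apache.zookeeper.ZooKeeper;

public class WorkerRegistration {

  public static void main(String[] args) throws Exception {
    // Block until the session with the ensemble is actually established.
    CountDownLatch connected = new CountDownLatch(1);

    // "zk1:2181,zk2:2181,zk3:2181" is a placeholder connect string.
    ZooKeeper zk = new ZooKeeper("zk1:2181,zk2:2181,zk3:2181", 15000, event -> {
      if (event.getState() == Watcher.Event.KeeperState.SyncConnected) {
        connected.countDown();
      }
    });
    connected.await();

    // Register this worker as an ephemeral, sequentially numbered znode under
    // the (assumed pre-existing) /workers parent. The node is removed
    // automatically when the session ends, i.e. if the process dies.
    String path = zk.create(
        "/workers/worker-",
        "host-and-port-info".getBytes(StandardCharsets.UTF_8),
        ZooDefs.Ids.OPEN_ACL_UNSAFE,
        CreateMode.EPHEMERAL_SEQUENTIAL);

    System.out.println("Registered at " + path);
    // In a real service the session stays open for the life of the worker.
  }
}

Ephemeral sequential znodes of this kind are the usual building block for group membership and leader election, since each node disappears as soon as the owning session ends.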