Cloud Software Engineer (Various Levels)

Minimum Clearance Required to Start:
Top Secret SCI w/Polygraph
Job Description:

Ready to be part of a team that tackles defense and data challenges? Do you want to enhance your skills developing, maintaining, and enhancing complex and diverse Big-Data Cloud systems? Parsons is now hiring experienced Cloud Software Engineers with a desire to work on projects that change the world.

Parsons' extensive experience in this field, combined with your knowledge, will propel your career forward, with opportunities for advancement based on top performance. We offer training, development, and opportunities to work on marquee projects as you and our fast-paced business grow and evolve. We need Cloud Software Engineers who are versatile, enthusiastic about working in highly flexible, team-oriented environments, and who have exceptional communication, analytical, and organizational skills.

PROJECT:

Parsons is sourcing candidates for a project within the Department of Defense to provide a wide range of skills and expertise to perform technology development, testing, integration of hardware and software applications, and associated documentation and training.

RESPONSIBILITIES:

Parsons is seeking Cloud Software Engineers of various experience levels to perform the following:
  • Develop, maintain, and enhance complex and diverse Big-Data Cloud systems based upon documented requirements.
  • Directly contribute to all stages of back-end processing, analyzing, and indexing.
  • Provide expertise in Cloud Computing and the Hadoop ecosystem, including implementing Java applications, Distributed Computing, Information Retrieval (IR), and Object-Oriented Design.
  • Review and test software components for adherence to the design requirements and document test results.
  • Resolve software problem reports. Utilize software development and software design methodologies appropriate to the development environment.
  • Provide specific input to the software components of system design to include hardware/software trade-offs, software reuse, use of Commercial Off-the-shelf (COTS)/Government Off-the-shelf (GOTS) in place of new development, and requirements analysis and synthesis from system level to individual software components.
  • Work individually or as part of a team.

REQUIRED CAPABILITIES:

LEVEL 1:
  • Hadoop/Cloud Certification
  • Understanding of Cloud Scalability
  • Experience developing and deploying analytics within a heterogeneous schema environment.
  • Experience developing and deploying data-driven analytics, event-driven analytics, and sets of analytics orchestrated through rules engines.
  • Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application program interfaces and other technical specifications.
  • Experience with linguistics (grammar, morphology, concepts).

LEVEL 2:

All capabilities described in Level 1, plus:
  • Experience with the LDAP protocol, configuration management, and cluster performance management (e.g., Nagios).
  • Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.

LEVEL 3:

All capabilities described in Level 1 and Level 2, plus:
  • Experience with data formats/techniques such as ASDF, XML (Schema, XSL/T, XQuery), streaming parsers (StAX or SAX), DOM, protobuf, or Avro.
  • Experience developing and deploying:
    • analytics that include foreign language processing
    • analytic processes that incorporate/integrate multi-media technologies, including speech, text, image and video exploitation
    • analytics that function on massive data sets, for example, more than a billion rows or larger than 10 petabytes
    • analytics that employ semantic relationships (i.e., inference engines) between structured and unstructured data sets
    • analytics that identify latent patterns between elements of massive data sets, for example, more than a billion rows or larger than 10 petabytes
    • analytics that employ techniques commonly associated with Artificial Intelligence, for example, genetic algorithms.

QUALIFICATIONS:

LEVEL 1:
  • Four (4) years of general experience in software development/engineering, computer science, computer engineering, mathematics, or a related discipline.
  • A bachelor's degree in computer science, engineering, mathematics, or a related discipline may be substituted for four (4) years of general experience.
  • One (1) year of experience developing software with high-level languages such as Java.
  • Work or academic experience with distributed, scalable Big Data stores (NoSQL) such as HBase, CloudBase/Accumulo, or Big Table, as well as with Source Code Management tools (e.g., Git, Stash, or Subversion).
  • One (1) year of experience with the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop and Hive.
  • Excellent verbal and written communications skills.

LEVEL 2:
  • Five (5) years of general experience in software development/engineering.
  • A bachelor's degree in computer science, engineering, mathematics, or a related discipline may be substituted for four (4) years of general experience.
  • Three (3) years of experience developing software with high-level languages such as Java, C, or C++.
  • Three (3) years of experience developing software in UNIX/Linux (Red Hat versions 3-5+) operating systems.
  • Three (3) years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
  • Two (2) years of experience with distributed, scalable Big Data stores (NoSQL) such as HBase, CloudBase/Accumulo, or Big Table.
  • Two (2) years of experience with the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, and Pig.
  • Experience with Source Code Management tools (e.g., Git, Stash, or Subversion) and with serialization formats such as JSON and/or BSON.
  • Experience in the design and development of at least one Object-Oriented system, developing RESTful services and the Ruby on Rails framework, and developing solutions that integrate and extend FOSS/COTS products.
  • Good technical writing skills; must have generated technical documents in support of a software development project.
  • Excellent verbal and written communications skills.

LEVEL 3:
  • Eight (8) years of general experience in software development/engineering.
  • A bachelor's degree in computer science, engineering, mathematics, or a related discipline may be substituted for four (4) years of general experience.
  • Six (6) years of experience developing software with high-level languages such as Java.
  • Four (4) years of experience with distributed, scalable Big Data stores (NoSQL) such as HBase, CloudBase/Accumulo, or Big Table, as well as four (4) years of experience with the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, and Pig.
  • Three (3) years of experience developing software in UNIX/Linux (Red Hat versions 3-5+) operating systems.
  • Three (3) years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
  • Experience with Source Code Management tools (e.g., Git, Stash, or Subversion), serialization formats such as JSON and/or BSON, and developing RESTful services and the Ruby on Rails framework.
  • Experience in the design and development of at least one Object-Oriented system, as well as experience developing solutions that integrate and extend FOSS/COTS products.
  • Technical writing skills; must have generated technical documents in support of a software development project.
  • Experience with build processes and tools (e.g., Maven, Ant, Jenkins, yum, Puppet, RPM).
  • Experience showing successful collaboration with customers on project management tasks.
  • Excellent verbal and written communications skills.
Must be able to obtain, maintain and/or currently possess a security clearance.
