Systems Engineer

Washington, DC | Leidos


Join us at Leidos, where your most important work is ahead!

At Leidos, everything we do is built on our commitment to do the right thing for our customers, our employees, and our communities. Our mission is to make the world safer, healthier, and more efficient through information technology, engineering, and science. 

Leidos employees enjoy great benefits such as paid time off (PTO), flexible schedules, discounted stock purchase plans, career growth, unlimited education and training support, paid parental leave, and more! You will have the opportunity to work with some of the best leadership in the industry at every level, where supporting employees and the work they do is what matters.

This position requires a current and active TS/SCI with Polygraph security clearance. Sponsorship for clearance processing is not available for this position.

Our National Security Sector is seeking a Systems Engineer to support an enterprise IT program for an Intelligence Community client. The client is seeking support for the migration of legacy data, the development of new capabilities, and the extension and maintenance of a bulk data pipeline.

Primary Responsibilities:

Our client requires support for DevOps operations, including migration of legacy data, development of new system capabilities, extension and maintenance of a bulk data pipeline, and enhancements to multiple mission applications in the client’s cloud computing environment. Core hours are 9:00 AM – 3:00 PM, Monday – Friday, with schedule flexibility outside core hours. Weekend or after-hours support may be required for operational issues, deployments, and critical activities.

In this role you will:

  • Evaluate, format, and maintain data from multiple sources; develop and manage data processing workflows in the customer’s pipeline; and evaluate, extract, and process system and event logs.
  • Develop and maintain processing pipelines and documentation; assist with identifying, classifying, and verifying incoming dataset types; and identify and implement COTS and open-source data enhancement capabilities.
  • Code scripts and software modules on an as-needed basis to triage and convert datasets that arrive in non-standard formats to fit a de-normalized row-and-column data model (a brief illustrative sketch follows this list).
  • Store, standardize, extract, reference, and validate data; develop and deploy processing pipelines to cloud analytics platforms; and develop requirements to support data analytics customers and integration partners.
  • Develop software standards and practices documentation and support Agile development.
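
To give a flavor of the de-normalization work described above, here is a minimal PySpark sketch that flattens a nested JSON dataset into a row-and-column model. The bucket paths, schema, and column names are hypothetical placeholders for illustration, not details of the actual program.

```python
# Illustrative only: paths, schema, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("denormalize-example").getOrCreate()

# Read a nested JSON dataset that arrived in a non-standard format.
raw = spark.read.json("s3://example-bucket/incoming/events/")

# Flatten nested event records into a de-normalized row/column model:
# one row per event, with the parent record's fields repeated.
flat = (
    raw
    .withColumn("event", F.explode("events"))  # one row per nested event
    .select(
        F.col("record_id"),
        F.col("source_system"),
        F.col("event.timestamp").alias("event_ts"),
        F.col("event.type").alias("event_type"),
        F.col("event.payload").alias("event_payload"),
    )
)

# Write out as columnar files for downstream analytics.
flat.write.mode("overwrite").parquet("s3://example-bucket/standardized/events/")
```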

Basic Qualifications:

  • Active TS/SCI with Polygraph security clearance.
  • Typically requires BS degree and 12+ years of prior relevant experience or Masters with 10+ years of prior relevant experience. May possess a Doctorate in technical domain. Additional years of prior relevant experience may be substituted for a degree.
  • Demonstrated experience utilizing big data processing technologies such as Spark, PySpark, and Python.
  • Demonstrated experience with data mapping, extraction, transformation, and loading.
  • Demonstrated experience building analytic reports in tools such as CloudWatch and Kibana.
  • Demonstrated experience using AWS Step Functions to coordinate ETL pipelines.
  • Demonstrated experience processing and converting OS and data logs into reports and metrics dashboards (a brief log-parsing sketch follows this list).
  • Demonstrated experience with Regular Expressions (RegEx).
  • Demonstrated experience with SQL, MySQL, and PostgreSQL.
  • Demonstrated experience processing data file types such as XML and JSON.
  • Demonstrated experience with IDEs and data modeling through notebooks and Visual Studio.
  • Demonstrated experience performing ETL on data from disparate structured and unstructured formats into enriched, query-friendly structured data in indexed files.
  • Demonstrated experience performing extensive data review and data quality analysis.
  • Demonstrated experience developing ETL design documentation including source and target mapping and data dictionary information.
  • Demonstrated experience interfacing with customers and integration partners for gaining and clarifying detailed objectives.
  • Demonstrated experience supporting Agile development by contributing to tasking definition, scope and review.
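
As a rough illustration of the log-processing and RegEx qualifications above, the sketch below parses syslog-style lines into structured rows and rolls them up into a small metrics report with pandas. The log format and field names are invented for the example.

```python
# Illustrative only: the log format and field names are invented.
import re
import pandas as pd

# Hypothetical syslog-style lines.
LOG_LINES = [
    "2024-05-01T12:00:01Z host-a sshd[101]: Failed password for user alice",
    "2024-05-01T12:00:05Z host-b sshd[102]: Accepted password for user bob",
    "2024-05-01T12:01:09Z host-a sshd[103]: Failed password for user carol",
]

# Named groups keep the extraction readable and self-documenting.
PATTERN = re.compile(
    r"(?P<ts>\S+)\s+(?P<host>\S+)\s+(?P<proc>\w+)\[(?P<pid>\d+)\]:\s+"
    r"(?P<outcome>Failed|Accepted) password for user (?P<user>\w+)"
)

# Extract structured rows from the raw lines.
rows = [m.groupdict() for line in LOG_LINES if (m := PATTERN.match(line))]
df = pd.DataFrame(rows)

# Roll raw events up into a small report: outcomes per host.
report = df.groupby(["host", "outcome"]).size().unstack(fill_value=0)
print(report)
```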

Preferred Qualifications:

  • Demonstrated experience deploying capabilities on the Databricks unified analytics platform.
  • Demonstrated experience in optimizing Databricks Delta Tables for query, merge, and stream operations (a brief Delta merge sketch follows this list).
  • Demonstrated experience with CI/CD using Jenkins.
  • Demonstrated experience in joining multiple complex data sets using Spark.
  • Demonstrated experience in tuning Spark streaming and batch jobs for cluster utilization and speed.
  • Demonstrated experience in deploying complex, notebook-based pipelines.
  • Demonstrated experience in Python data analysis libraries such as Pandas.
  • Demonstrated experience utilizing Cloud services such as Lambda, SNS/SQS, or EC2.
  • Demonstrated experience with DevOps tools including CloudWatch, Lambda, SQS, DynamoDB, and RDS.
  • Demonstrated experience working with Elastic and the Elasticsearch/Logstash/Kibana (ELK) stack.
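
As a hedged sketch of the Delta Lake work named above, the example below upserts a batch of updates into a Delta table with a merge, assuming a Databricks or Delta-enabled Spark environment. Table paths and join keys are hypothetical.

```python
# Illustrative only: table paths and keys are hypothetical; requires the
# delta-spark package (or a Databricks runtime with Delta built in).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-merge-example").getOrCreate()

# Incoming batch of updated records.
updates = spark.read.parquet("s3://example-bucket/staging/entity_updates/")

# Upsert into the target Delta table keyed on entity_id.
target = DeltaTable.forPath(spark, "s3://example-bucket/gold/entities/")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.entity_id = u.entity_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# On Databricks, a follow-up OPTIMIZE ... ZORDER BY compacts small files and
# co-locates rows for faster lookups on the merge key, e.g.:
# spark.sql("OPTIMIZE delta.`s3://example-bucket/gold/entities/` ZORDER BY (entity_id)")
```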

Original Posting Date:


While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.

Pay Range:

$122,200.00 – $220,900.00

The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.

