What are the responsibilities and job description for the Cleared Software Developer Mid level (TS/SCI) position at All Points Logistics?
Software Engineering
Chantilly, Virginia
Direct Hire
Sep 8, 2025
*US CITIZENSHIP REQUIRED*
*TOP SECRET CLEARANCE WITH SCI ELIGIBILITY REQUIRED*
*THIS POSITION IS CONTINGENT UPON AWARD*
DESCRIPTION OF POSITION/DUTIES:
All Points is seeking a Top Secret Cleared Software Developer to support the Federal Bureau of Investigation (FBI) in Chantilly, VA. This position requires an active, current, and fully adjudicated US Department of Defense Top Secret Clearance with SCI eligibility. This position is contingent upon contract award (expected October 2025).
The selected candidate will support IT networks, systems, and initiatives with activities that include information systems operations, information systems development and modernization, and IT security. The team you will join is also responsible for the development, implementation, and subsequent operations and maintenance of new applications each fiscal year. The operation of existing applications and new application initiatives are critical to the success of the FBI mission. Additionally, thousands of personnel in the law enforcement, intelligence, legal, and administrative communities use these services daily to perform mission-critical work. These applications require responsive, effective, and specialized technical expertise to limit disruption to FBI operations.
*THIS POSITION IS CONTINGENT UPON AWARD*
Duties may include but are not limited to:
- Responsible for developing, analyzing, and maintaining software applications for a Federal Law Enforcement customer.
- Utilize expertise in data analysis, data query, data aggregation, data visualization, and software development to implement business intelligence and data warehousing solutions that enable cyber professionals to effectively analyze case data and advance investigations.
- Proficiency in Power BI, data analysis, data query, data aggregation, data visualization, software development, system architecture, data transformation, data management, and big data technology.
- Capable of utilizing Extract, Transform, Load (ETL) processes.
- Experienced in writing complex SQL statements, scripts, stored procedures, triggers, and views to implement business intelligence and data warehousing solutions.
- Solid foundation in, and application of, data warehousing and reporting platform concepts with ability to recommend solutions and drive best practices.
- Familiarity with multiple business intelligence environments is preferred.
- Develop ETL mappings to extract data from a source data system (databases, files, etc.), transform the data based on business requirements and load the data into a target data system (databases, files, etc.).
- Conduct performance tuning of the mappings developed to ensure that the data is extracted, transformed, and loaded in a reasonable amount of time based on the requirements.
- Develop Business Intelligence (BI) reports using BI software and perform unit testing of the reports developed.
- Work on multiple phases of the development life cycle (requirements, design, development, testing, and deployment) of BI dashboards and reports.
- Develop required documentation in support of the development life cycle.
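The ETL duties above can be illustrated with a minimal sketch. The table and column names here are hypothetical, and in-memory SQLite stands in for whatever source and target systems the program actually uses:

```python
import sqlite3

# Hypothetical source/target systems: in-memory SQLite databases.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

src.execute("CREATE TABLE cases (id INTEGER, opened TEXT, status TEXT)")
src.executemany("INSERT INTO cases VALUES (?, ?, ?)",
                [(1, "2025-01-02", "open"), (2, "2025-01-05", "CLOSED")])

tgt.execute("CREATE TABLE case_facts (case_id INTEGER, opened TEXT, is_open INTEGER)")

# Extract: pull rows from the source system.
rows = src.execute("SELECT id, opened, status FROM cases").fetchall()

# Transform: apply a business rule (normalize status into a flag).
transformed = [(cid, opened, 1 if status.lower() == "open" else 0)
               for cid, opened, status in rows]

# Load: write the transformed rows into the target system.
tgt.executemany("INSERT INTO case_facts VALUES (?, ?, ?)", transformed)
tgt.commit()

print(tgt.execute("SELECT * FROM case_facts").fetchall())
# → [(1, '2025-01-02', 1), (2, '2025-01-05', 0)]
```

A production mapping would add the performance tuning the duties mention, e.g. batching extracts and indexing the target table, but the extract-transform-load shape is the same.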
Summary: Responsible for guiding and facilitating SAFe principles, planning and managing the work, and upholding team values while promoting alignment, collaboration, and delivery across a large number of agile teams, currently organized into two Agile Release Trains. The first aligns teams focused on enabling operational insight by increasing data discoverability through enterprise search and query. The second aligns teams focused on enabling the operational analysis and triage of data through visualization, enrichment, and analytics within Global Mission Analytics (GMA).
EDUCATION:
- Bachelor’s degree from an accredited college in a related discipline
- 5-7 years of relevant experience
- Experience with Atlassian Software products (JIRA, Confluence, Service Desk, etc.)
- Experience with ELK stack, OpenSearch, SonarQube, Cypress, PowerShell, C#, and Databricks
- Experience with Docker, SQL, Angular, Spring Boot, NiFi, AWS, Python, Scala, shell scripting, and XML processing
- Experience in AWS solution architecture
- Maintaining the Apache Hadoop ecosystem, especially utilizing HBase, MapReduce, and Spark.
- ETL processes utilizing Linux shell scripting, Perl, Python, and Apache Airflow.
- AWS services such as CloudWatch, CloudTrail, ELB, EMR, KMS, SQS, SNS, and Systems Manager.
- Experience in mobile platform development, such as mobile web, Android, and microservices
- Working knowledge of proxies and proxy creation
- Vue.js, ASP.NET (C#), Node.js, React, JavaScript, HTML, CSS, PostgreSQL, Liquibase, Elasticsearch, and Git.
- Vue.js, .NET, Postgres, Oracle DB
- AWS CloudFormation
- Maintenance of geospatial applications such as ESRI ArcGIS and Geoserver
- Assume responsibility for the quarterly OpenStreetMap refresh and deployment, MongoDB database administration, Elasticsearch geospatial repository development and administration, and performance tuning.
- Re-establish OpenStreetMap, OpenAddresses, and GeoNames feeds as a source of current geospatial data by importing these datasets into Elasticsearch.
- AngularJS, BootstrapVue, Chart.js, CoordinateSharp, D3.js, jQuery, jTable, Leaflet.js, Microsoft .NET Framework 4.6, Microsoft .NET 8, Newtonsoft Json.NET, NGA GeoTrans modules, PDFsharp, Telerik, Vue.js, Oracle Instant Client
- Ability to deliver an advanced visual analytic application to include developing data analytics for desktop and web-developed visual analytic software; facilitating the bulk analysis of relational information utilizing advanced graphics, visualization, and high-performance computing
- Experience utilizing big data technologies and knowledge of data science models to analyze large amounts of data and create valuable insights
- Experience integrating third-party components, introducing new custom capabilities, modifying existing code, recommending process improvements, and automating operations and maintenance tasks
- Expertise in the architecture and utilization of large-scale data processing, computationally intensive statistical modeling, and advanced analytics
- Databricks and Lakehouse architectures
- AWS OpenSearch
- Experience with AWS, Splunk, Databricks, and other Oracle/SQL-based platforms.
- Experience with Python, Microsoft VBA, and Databricks
- Apache NiFi (Niagara Files)
- Apache Tika
- Databricks and Lakehouse architecture
- Elasticsearch
- AWS SQS
- Informatica and custom software components
- Elasticsearch and OpenSearch
- .NET, C#, JavaScript, Java, and Python
- Terraform
- Experience developing and maintaining components in AWS
- Spring Boot framework
- Experience in Entity Resolution, developing and implementing methods to identify and link records across different data sources that represent the same real-world entity (e.g., person, organization, product).
- Experience with Natural Language Processing (NLP) models for tasks like text classification, sentiment analysis, and entity recognition.
- Experience with Natural Language Generation (NLG) models for tasks like text summarization.
- Databricks, Spark, PyTorch, TensorFlow, Python, Scala, CoreNLP, and Transformer foundational models
- Experience as a Scrum participant and with software release processes
- Available to work after hours when mission requires
- Communicate work using SMART (Specific, Measurable, Achievable, Relevant, Time-bound) criteria
- Agile methodologies experience
- Understanding of data warehousing and business intelligence concepts
- Ability to solve problems involving several options in a given situation
- US Citizenship required
- Active, current, and fully adjudicated (no interim) US Department of Defense Top Secret Clearance with SCI eligibility. May be required to take CI Polygraph.
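The entity-resolution requirement above (linking records across sources that refer to the same real-world entity) can be sketched minimally. The records and the normalized name-plus-DOB matching rule here are illustrative assumptions, not the program's actual method:

```python
import re

# Hypothetical records from two separate data sources.
source_a = [{"id": "A1", "name": "Smith, John", "dob": "1980-04-01"},
            {"id": "A2", "name": "Doe, Jane",   "dob": "1975-09-30"}]
source_b = [{"id": "B1", "name": "john smith",  "dob": "1980-04-01"},
            {"id": "B2", "name": "Jane Q. Doe", "dob": "1990-12-12"}]

def key(rec):
    # Normalize the name: lowercase, strip punctuation, sort tokens so
    # "Smith, John" and "john smith" produce the same key; pair with DOB.
    tokens = re.sub(r"[^a-z ]", " ", rec["name"].lower()).split()
    return (" ".join(sorted(tokens)), rec["dob"])

# Index source A by key, then link matching records from source B.
index = {key(r): r["id"] for r in source_a}
links = [(index[key(r)], r["id"]) for r in source_b if key(r) in index]
print(links)  # → [('A1', 'B1')]
```

Real entity resolution layers fuzzy matching, blocking, and scored comparisons on top of this, but the core idea of keying records into a comparable form and linking across sources is the same.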
ALL POINTS IS AN EQUAL OPPORTUNITY AND AFFIRMATIVE ACTION EMPLOYER