Data Architect (61031777)
Posted: April 13, 2026
National Geospatial-Intelligence Agency
Department of Defense
Location: Saint Louis, Missouri
Salary: $76,573 - $158,322 per year
Type: Full-Time
This job involves building and maintaining data systems that handle large amounts of information, creating pipelines to move and process data from various sources, and working with other teams to support capabilities such as AI models.
It's ideal for someone with a technical background in programming and data handling who enjoys solving complex problems in a structured government environment.
Good fits include recent graduates with relevant degrees or professionals with hands-on experience in data integration.
Data Engineers develop, construct, test, optimize, and maintain microservice architectures.
Data engineers employ Continuous Integration (CI)/Continuous Deployment (CD) methods to deploy containerized solutions and data pipelines to support various capabilities (e.g., machine learning/AI models, data cleaning and preparation).
They solve problems associated with data access and integration.
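To make the pipeline work above concrete, here is a minimal sketch in Python (one of the languages named later in this posting) of an extract-transform-load step. The endpoint, bucket, key, and field names are invented for illustration, and the sketch assumes the widely used third-party requests and boto3 libraries rather than any Agency tooling.

import json

import boto3     # AWS SDK; used here for the S3 load step
import requests  # HTTP client; used here for the API extract step

def extract(url: str) -> list[dict]:
    """Pull raw records from a hypothetical source API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(records: list[dict]) -> list[dict]:
    """Drop incomplete records and normalize the name field."""
    return [
        {"id": r["id"], "name": r["name"].strip().lower()}
        for r in records
        if r.get("id") and r.get("name")
    ]

def load(records: list[dict], bucket: str, key: str) -> None:
    """Write the cleaned records to S3 as one JSON object."""
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=json.dumps(records))

if __name__ == "__main__":
    raw = extract("https://example.gov/api/records")               # hypothetical endpoint
    load(transform(raw), "example-bucket", "records/clean.json")   # hypothetical bucket/key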
MANDATORY QUALIFICATION CRITERIA: For this particular job, applicants must meet all competencies reflected under the Mandatory Qualification Criteria, to include education (if required).
Online resumes must demonstrate qualification by providing specific examples and associated results in response to the mandatory criteria specified in this vacancy announcement:
1. Demonstrated experience utilizing standards and best practices for data interfaces such as extract-transform-load (ETL) processes and application programming interfaces (APIs).
2. Demonstrated experience working with data storage, access, and exchange technologies, e.g., S3, FTP, WFS, FeatureServer, Elasticsearch, SDKs, etc.
3. Demonstrated proficiency in applied programming and/or manipulation of data with any programming language such as Python, R, or Java.
EDUCATION REQUIREMENT:
A. Education: Bachelor's degree from an accredited college or university in Computer Science, Data Science, Engineering, Information Science, Information Systems Management, Mathematics, Operations Research, Physical Sciences, Statistics, Technology Management, or a degree that provided a minimum of 24 semester hours in one or more of the fields identified above and required the development or adaptation of applications, systems, or networks.
-OR-
B. Combination of Education and Experience: A minimum of 24 semester (36 quarter) hours of coursework in any area listed in option A, plus experience in designing, implementing, and maintaining scalable data pipelines, ETL processes, and data integration solutions, or in a related field that demonstrates the ability to successfully perform the duties associated with this work. As a rule, every 30 semester (45 quarter) hours of coursework is equivalent to one year of experience; candidates should show that their combination of education and experience totals 4 years. For example, 60 semester hours of qualifying coursework (2 years of credit) plus 2 years of hands-on data pipeline experience would meet the 4-year total.
-OR-
C. Experience: A minimum of 4 years of experience in designing, implementing, and maintaining scalable data pipelines, ETL processes, and data integration solutions, or in a related field that demonstrates the ability to successfully perform the duties associated with this work.
-AND-
IT-related experience demonstrating each of the four competencies: Attention to Detail, Customer Service, Oral Communication, and Problem Solving.
DESIRABLE QUALIFICATION CRITERIA: In addition to the mandatory qualifications, experience in the following is desired:
1. Demonstrated experience implementing, configuring, or designing databases with one or more current database technologies, e.g., relational (Oracle, MySQL, PostgreSQL, etc.), NoSQL (MongoDB, Accumulo, Elastic, etc.), or graph (AllegroGraph, JanusGraph, Neo4j, etc.).
2. Demonstrated experience with data modeling and/or schema development (see the sketch after this list).
3. Demonstrated experience with coding and scripting languages such as Python, JSON, BASH, and/or PowerShell.
4. Certification in cloud-native services such as Amazon Web Services (AWS), containers, Kubernetes, DevSecOps pipeline tooling, and/or microservices architecture.
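As a small, non-authoritative illustration of the schema-development item above, the sketch below defines and populates a toy relational table using Python's built-in sqlite3 module; the table and column names are invented for the example.

import sqlite3

# Toy schema: a catalog of data sources with a uniqueness constraint.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE data_source (
        id          INTEGER PRIMARY KEY,
        name        TEXT NOT NULL UNIQUE,
        format      TEXT NOT NULL,   -- e.g., 'csv', 'json', 'wfs'
        last_loaded TEXT             -- ISO-8601 timestamp; NULL until first load
    )
    """
)
conn.execute(
    "INSERT INTO data_source (name, format) VALUES (?, ?)",
    ("example_feed", "json"),
)
conn.commit()
print(conn.execute("SELECT name, format FROM data_source").fetchall())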
Major Duties:
1. Utilize a variety of languages and tools (e.g., scripting languages) to build data pipelines to pull together information from different source systems (see the sketch after this list).
2. Collaborate with data architects, modelers, and IT team members on project goals; ensure systems meet Agency requirements and industry practices.
3. Design, construct, install, test, and maintain highly scalable data management systems.
4. Develop data set processes for data discovery, modeling, mining, and production.
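As a rough sketch of duty 1, the snippet below pulls records from two hypothetical source systems (a CSV export and a JSON feed) and merges them on a shared id key; the file names and fields are invented for the example.

import csv
import json

def merge_sources(csv_path: str, json_path: str) -> list[dict]:
    """Combine records from two hypothetical source systems on a shared 'id' key."""
    with open(csv_path, newline="") as f:
        by_id = {row["id"]: dict(row) for row in csv.DictReader(f)}
    with open(json_path) as f:
        for record in json.load(f):      # expects a list of JSON objects
            key = str(record["id"])      # CSV keys are strings; normalize
            by_id.setdefault(key, {}).update(record)
    return list(by_id.values())

# Example usage (hypothetical files):
# combined = merge_sources("inventory.csv", "feed.json")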