IT Specialist (APPSW) - Kafka Engineer - Direct Hire
Social Security Administration
Posted: February 5, 2026
Other Agencies and Independent Organizations
Salary: $143,913 - $197,200 per year
Type: Full Time
Base salary range (before locality pay): $104,604 - $135,987
Typical requirements: 1 year specialized experience at GS-13. Senior expert or supervisor.
Note: Actual salary includes locality pay (15-40%+ depending on location).
This job involves building and managing data systems using Kafka technology to handle real-time information for the Social Security Administration's operations, including feeding data to AI models.
The role requires working with teams to ensure these systems are reliable and efficient, while troubleshooting issues and mentoring others.
It's a good fit for experienced IT professionals with strong skills in data pipelines and a passion for scalable software solutions in a government setting.
IT Specialist (APPSW) Kafka Engineer positions are being filled through the Office of Personnel Management's delegated Direct Hire Authority, open to all U.S. citizens.
Selections made under this bulletin will be processed as new appointments to the civil service; current civil service employees who are selected will therefore receive new appointments.
Under the provisions of the Direct Hire Authority, Veterans Preference and the "Rule of Many" do not apply.
Resumes exceeding two pages in length will not be considered; please see the new resume guidance for more information.
Duties
- Design, develop, and maintain robust Kafka-based applications and data pipelines that support SSA's business operations, including the delivery of real-time or near-real-time data to AI/ML models.
- Collaborate with development, operations, and infrastructure teams to deliver reliable, scalable, and high-performing Kafka solutions.
- Ensure the availability, reliability, and performance of Kafka clusters and related systems.
- Work closely with architects, data engineers, and stakeholders to define requirements and deliver solutions.
- Troubleshoot and resolve issues in Kafka applications, ensuring minimal downtime and optimal performance.
- Document code, design decisions, processes, configurations, and best practices for future reference and team knowledge sharing.
- Mentor junior developers and share Kafka expertise, fostering a culture of learning and growth.
- Stay current with the latest Kafka releases, features, and ecosystem advancements.
- Perform statistical analysis to monitor team performance, improve processes, and ensure customer satisfaction.
- Define and set SLAs for projects, ensuring high standards of service delivery.
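As a rough illustration of the key-based routing the pipelines above rely on: Kafka's default partitioner hashes each record's key to pick a partition, so all events for the same key stay on one partition and preserve order. The sketch below mimics that behavior in Python with a plain CRC32 hash (Kafka itself uses murmur2); the record fields and partition count are illustrative only, not drawn from this announcement.

```python
import json
import zlib

def serialize_value(record: dict) -> bytes:
    """JSON-encode a record as the message value (JSON is one of the
    serialization formats named in this announcement)."""
    return json.dumps(record, separators=(",", ":"), sort_keys=True).encode("utf-8")

def pick_partition(key: bytes, num_partitions: int) -> int:
    """Simplified stand-in for Kafka's default partitioner: hash the key,
    then take it modulo the partition count. (Kafka uses murmur2; CRC32
    is used here only to keep the sketch dependency-free.)"""
    return zlib.crc32(key) % num_partitions

# Records sharing a key always map to the same partition, preserving order.
claim_event = {"claim_id": "A123", "status": "received"}  # hypothetical record
payload = serialize_value(claim_event)
assert pick_partition(b"A123", 12) == pick_partition(b"A123", 12)
```

In a real producer this routing happens inside the Kafka client library; the point of the sketch is only why choosing a stable key matters for ordering guarantees.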
READ ALL SECTIONS OF THIS ANNOUNCEMENT IN ITS ENTIRETY. THIS INFORMATION IS CRUCIAL TO SUBMITTING A SUCCESSFUL APPLICATION. Applicants must qualify for the series and grade of the posted position.
Experience must be IT related; the experience may be demonstrated by paid or unpaid experience and/or completion of specific, intensive training (for example, IT certification), as appropriate.
Your resume must provide sufficient experience and/or education, knowledge, skills, abilities, and proficiency of any required competencies to perform the specific position for which you are applying.
To qualify for the 2210 IT Specialist series, the applicant must demonstrate the following competencies: Attention to Detail - Is thorough when performing work and conscientious about attending to detail.
Customer Service - Works with clients and customers (that is, any individuals who use or receive the services or products that your work unit produces, including the general public, individuals who work in the agency, other agencies, or organizations outside the Government) to assess their needs, provide information or assistance, resolve their problems, or satisfy their expectations; know about available products and services; and is committed to providing quality products and services.
Oral Communication - Expresses information (for example, ideas or facts) to individuals or groups effectively, taking into account the audience and nature of the information (for example, technical, sensitive, controversial); makes clear and convincing oral presentations; listens to others, attends to nonverbal cues, and responds appropriately.
Problem Solving - Identifies problems; determines accuracy and relevance of information; uses sound judgment to generate and evaluate alternatives, and to make recommendations.
Minimum Qualifications: Grade 14
To qualify at the GS-14 level, you must have at least 52 weeks of specialized experience at the GS-13 level, or equivalent, in the following:
- Designing, developing, and maintaining scalable, fault-tolerant data pipelines using Apache Kafka
- Managing and administering Kafka clusters throughout the Systems Development Life Cycle (SDLC), including upgrades and patching
- Leading large-scale projects, serving as a Product Owner or Agile/Scrum team lead
- Demonstrating strong programming skills in Java, with Python experience as a plus
- Utilizing Kafka APIs (Producer, Consumer, Streams, Connect) for event-driven and microservices-based solutions
- Applying knowledge of serialization formats (Avro, Protobuf, JSON) and schema registry/data governance, including Hackolade for data modeling
- Optimizing producer/consumer performance and handling large-scale data ingestion
- Implementing unit and integration testing for Kafka applications
- Configuring and tuning Kafka clusters for performance, reliability, and scalability
- Monitoring and troubleshooting Kafka clusters using tools such as Prometheus and Grafana
- Supporting hybrid integration architecture patterns
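To make the schema-governance qualification above concrete, here is a minimal producer-side validation sketch. It uses a hand-rolled schema dictionary with hypothetical field names (not any actual SSA schema); a real deployment would enforce this through a schema registry with Avro or Protobuf rather than ad-hoc checks.

```python
# Illustrative required-field schema; all names here are assumptions,
# not taken from the announcement.
REQUIRED_FIELDS = {"claim_id": str, "status": str, "updated_at": str}

def validate(record: dict) -> list[str]:
    """Return a list of violations; an empty list means the record conforms
    and is safe to serialize and produce."""
    errors = []
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"wrong type for {field}: expected {ftype.__name__}")
    return errors
```

Rejecting malformed records before they reach a topic is the same goal a schema registry serves, just enforced centrally instead of in each producer.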
Minimum Qualifications: Grade 15
To qualify at the GS-15 level, you must have at least 52 weeks of specialized experience at the GS-14 level, or equivalent, in the following:
- Leading the design, development, and implementation of enterprise-scale, fault-tolerant data pipelines using Apache Kafka
- Providing expert-level management and administration of Kafka clusters throughout the Systems Development Life Cycle (SDLC), including upgrades and patching
- Overseeing large-scale, cross-functional projects as a senior Product Owner or Agile/Scrum leader, ensuring alignment with organizational goals
- Demonstrating advanced proficiency in Java programming, with experience in Python as a plus
- Architecting event-driven and microservices-based solutions leveraging Kafka APIs (Producer, Consumer, Streams, Connect)
- Establishing and enforcing best practices for serialization formats (Avro, Protobuf, JSON) and schema registry/data governance, including Hackolade for data modeling
- Directing the optimization of producer/consumer performance and large-scale data ingestion strategies
- Leading the implementation of unit and integration testing frameworks for Kafka applications
- Managing hybrid integration architecture patterns and ensuring reliability, scalability, and performance of Kafka clusters
- Overseeing monitoring and troubleshooting activities using tools such as Prometheus and Grafana
- Providing technical guidance and mentorship to teams on Kafka cluster setup, configuration, and tuning
- Ensuring compliance with organizational standards and data governance policies
PLEASE NOTE: This specialized experience is REQUIRED and must be explicitly documented/described in your resume, or you will be disqualified from further consideration.
Qualification standards and additional information for this position can be found here: http://www.opm.gov/policy-data-oversight/classification-qualifications/general-schedule-qualification-standards/2200/information-technology-it-management-series-2210-alternative-a/