Requirements:
Must have:
- Bachelor's degree in a relevant field (or equivalent experience) and 4-8 years of professional experience; or a master's degree and 2-6 years of related professional experience.
- Minimum of 4 years in software engineering, data engineering, or business/data analysis, preferably within Agile/Scrum frameworks.
- Practical experience in software development using Python, Java, SQL, and familiarity with JavaScript and HTML.
- Proficient with distributed version control using Git and hosting platforms such as Bitbucket.
- Strong skills in data analytics and visualization tools, such as Power BI, Tableau, and the ELK stack (Elasticsearch, Logstash, Kibana).
- Experience in designing, developing, and optimizing ETL processes and data pipelines, with a focus on integrating with event streaming platforms like Kafka.
- Solid foundation in data modeling, integration, and analytics to drive data-centric projects.
- Capability in implementing system integrations, particularly with Kafka and Elastic platforms.
- Understanding of networking and internet protocols, especially in network-centric or data-driven settings.
- Proficient in software development and deployment in UNIX/Linux command line environments.
- Excellent communication and teamwork skills, able to effectively collaborate with cross-functional teams and stakeholders.
- Experience with Agile project management tools, specifically JIRA and Confluence.
- Active DoD Secret security clearance prior to commencement of employment.
- Active Security+ Certification (or other applicable DoD 8570 IAT II certification) prior to commencement of employment.
Responsibilities:
- Contribute to data engineering initiatives by facilitating the integration and enhancement of DISN network topology data for advanced analytics.
- Engage in technical discussions with both internal and external stakeholders to aid solution design and execution.
- Create, test, and implement data pipelines and integration solutions across distributed systems and cloud environments, utilizing Python, JavaScript, Java, and SQL.
- Assist in requirements gathering and collaborate with stakeholders to design and deploy data enrichment pipelines, merging various data sources into Confluent (Kafka) and Elastic platforms.
- Design and maintain Kibana visualizations and dashboards for operational insights.
- Support integrations of the Kafka system with Elasticsearch/Logstash and other systems.
- Collaborate actively within Agile scrum teams, contributing to team outputs and knowledge sharing with colleagues.
- Communicate and coordinate effectively with team members located across different geographical regions to achieve project goals.
- Troubleshoot and address installation, infrastructure, and system-related issues; report and manage technical risks.
- Create and maintain technical documentation, including compliance with DoD standards, interface documentation, and security artifacts.
- Ensure that solutions adhere to DoD security standards and guidelines, while supporting platform reliability by addressing operational challenges as they arise.
Company:
We are the Leidos Digital Modernization Group, seeking a Data Engineering Specialist to join our Global Management System (GMS) Team for the Global Solutions Management – Operations II (GSM-O II) contract. This role involves supporting the Defense Information Systems Network (DISN) within the Department of Defense. We are committed to innovation, aiding the Defense Information Systems Agency's (DISA) mission, and transforming operational processes. Our team works in a dynamic Agile environment located near Scott AFB or Ft. Meade.
Benefits of working at Leidos include competitive compensation, comprehensive health and wellness programs, income protection, paid leave, and retirement plans. Join us as we strive to outthink and disrupt conventional processes in pursuit of excellence.