Barbaricum is a rapidly growing government contractor providing leading-edge support to federal customers, with a particular focus on Defense and National Security mission sets. We leverage more than 15 years of support to stakeholders across the federal government, with established and growing capabilities across Intelligence, Analytics, Engineering, Mission Support, and Communications disciplines. Founded in 2008, our mission is to transform the way our customers approach constantly changing and complex problem sets by bringing to bear the latest in technology and the highest caliber of talent.
Headquartered in Washington, DC's historic Dupont Circle neighborhood, Barbaricum also has a corporate presence in Tampa, FL, Bedford, IN, and Dayton, OH, with team members across the United States and around the world. As a leader in our space, we partner with firms in the private sector, academic institutions, and industry associations with a goal of continually building our expertise and capabilities for the benefit of our employees and the customers we support. Through all of this, we have built a vibrant corporate culture diverse in expertise and perspectives with a focus on collaboration and innovation. Our teams are at the frontier of the Nation's most complex and rewarding challenges. Join us.
Barbaricum is seeking a Data Engineer to support the Department of Defense’s Chief Data and Artificial Intelligence Officer (CDAO) in accelerating the DoD’s adoption of data, analytics, and AI. The Search Portfolio serves the fundamental need to accelerate decision advantage through information accessibility, information retrieval, and insight extraction. The Portfolio will sustain the GAMECHANGER, Contract Search, and JBook Search applications on the Advana platform as the platform upgrades and evolves. This role focuses on developing, maintaining, and optimizing the data pipelines and infrastructure that enable data-driven operations and insights within defense and intelligence applications.
Responsibilities
- Support the configuration and ingestion of designated structured, unstructured, and semi-structured data repositories into capabilities that satisfy mission partner requirements, and support a data analytics and DevOps pipeline that drives rapid delivery of functionality to the client.
- Maintain all operational aspects of data transfers, accounting for the security posture of the underlying infrastructure and the supported systems and applications, and monitor the health of the environment through a variety of health-tracking capabilities.
- Automate configuration management, leverage tooling, and stay current on data extract, transform, and load (ETL) technologies and services.
- Work under general guidance, demonstrate initiative in independently developing solution approaches, review architecture, and identify areas for automation, optimization, right-sizing, and cost reduction to support the overall health of the environment.
- Apply knowledge of data engineering technologies and services, leverage expertise in databases and a variety of approaches to structuring and retrieving data, understand cloud architectural constructs, and support the establishment and maintenance of cloud environments programmatically using vendor consoles.
- Engage with multiple functional groups to understand client challenges, prototype new ideas and technologies, help create solutions that drive the next wave of innovation, and design, implement, schedule, test, and deploy full features and components of solutions.
- Maintain an existing collection of web scraping tools used as the initial step of the ETL process.
- Identify and implement scalable and efficient coding solutions.
Qualifications
- Must have an active DoD Top Secret clearance and be able to obtain a TS/SCI clearance with scope.
- Bachelor’s degree plus 5-7 years of experience, or a Master’s degree plus 3 years of experience.
- Experience with Big Data systems, including Apache Spark / Databricks.
- Experience with ETL processes.
- Experience with Amazon Web Services (AWS), Microsoft Azure, or MilCloud 2.0.
- Experience applying DoD Security Technical Implementation Guides (STIGs) and automating that process.
- Experience with multiple coding languages.
Additional Information
For more information about Barbaricum, please visit our website at www.barbaricum.com. We will contact candidates directly to schedule interviews. No phone calls please.