• ETL Developer

    Job Locations US-VA-Reston
    Information Technology
  • Overview

    ETL Developer

    Government Project

    Location: Herndon, VA

    Long Term -- 6-Month Contract to Hire


    US Citizen, Green Card, or GC-EAD - Clearable to Public Trust

    In-Person Interview Mandatory



    The team is responsible for the development, maintenance, and enhancement (DM&E) and operation of selected Federal Acquisition IT systems. The program supports the operations of multiple business applications, as well as the development of new applications across different technologies. The software development team is responsible for the software design and implementation of web applications supporting multiple business lines. Successful candidates are modern web development specialists experienced in translating business requirements into software architecture. In addition to strong software development skills, ideal candidates have demonstrated experience working on an Agile Scrum team.



    • Perform ETL job design, development, and automation activities with minimal supervision
    • Support manual ETL tasks for various development and implementation projects
    • Work with a team of talented engineers to enhance a highly scalable, fault-tolerant, and responsive big data platform with next-generation streaming data technologies
    • Troubleshoot, monitor, and coordinate defect resolution related to ETL processing
    • Support data scientists by designing and maintaining ETL processes across various data assets
    • Help prove concepts, evaluate technologies, and contribute ideas that can turn into actionable implementations
    • Work with multiple platforms, architectures, and diverse technologies, gathering requirements and designing and developing ETL processes to transform information from multiple sources
    • Perform ETL development, and articulate and implement best practices for ETL reusability and modularity, primarily in Pentaho PDI as well as tools such as Apache NiFi and Kafka
    • Serve as a key member of the team tasked with transforming a cloud platform from a traditional enterprise data architecture to a streaming architecture based on open-source and Hadoop technologies


    Required Skills:

    • Bachelor’s degree in Computer Science or related discipline
    • 2+ years of experience with Pentaho PDI development is essential
    • Hands-on experience with Apache NiFi data services development
    • Hands-on experience developing and integrating Kafka into data feeds / data flows
    • Ability to design complex data feeds and experience integrating web services
    • Experience in Python-based development
    • Experience in Linux scripting is essential

    Desired Skills:

    • Experience with AWS services, specifically Lambda functions, Data Pipeline, and Glue, is an advantage
    • Working experience with AWS Redshift and MongoDB preferred
    • Self-motivated to learn and enhance cloud-based engineering and automation for data management

