Principal Software Engineer (Data Engineering)

Nava
Full-time
Remote
United States
$135,900 - $171,000 USD yearly

About Nava: Nava is a consultancy and public benefit corporation working to make government services simple and effective. Since 2013, federal, state, and local agencies have trusted Nava to help solve technology modernization challenges.

Role Overview: You will modernize data architectures and pipelines for critical government programs, working with stakeholders to design and implement data models, improve pipelines, and strengthen data security. The role involves processing terabytes of data and providing subject-matter expertise on data integration, from raw data through consumable analytics.

Location: Remote position available in: Alabama, Arizona, California, Colorado, DC, Florida, Georgia, Illinois, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Missouri, Nevada, New Jersey, New York, North Carolina, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, Tennessee, Texas, Utah, Virginia, Washington, Wisconsin.

Key Responsibilities:

  • Document and maintain data strategies, logical/physical data models, and data architecture roadmaps
  • Standardize data ingestion and processing pipelines to scale with increasing utilization
  • Audit and reverse-engineer business rules in legacy systems for data platform integration
  • Implement large-scale data ecosystems within cloud platforms, with appropriate data governance
  • Work with cross-functional teams to translate business requirements into technical specifications
  • Design, develop, test, and deploy data engineering solutions in cloud platforms
  • Develop automated testing, monitoring, alerting, and CI/CD for production systems
  • Maintain security and privacy standards across all aspects of data pipelines

Qualifications:

  • 7+ years of data engineering experience
  • 3+ years of cloud data architecture experience (AWS preferred) and big data technologies
  • Experience with Kimball's dimensional modeling methodology
  • Proficiency building ETL/ELT pipelines in Python or Java
  • Data lakehouse architecture experience
  • Advanced SQL and relational database expertise
  • Experience with data cleaning, modeling, and protecting sensitive data
  • Proficiency building data integrations using API and file-based protocols
  • Databricks experience preferred
  • Must be legally authorized to work in the United States without visa sponsorship

Compensation & Benefits: $135,900 - $171,000 USD annually. Benefits include comprehensive health coverage, 401(k) match (4%), an annual performance bonus, parental leave, sabbatical leave after continuous service, and a professional development budget.