Senior Databricks Engineer

Koniag Government Services

Full-Time
Senior
$140k – $160k/yr
Remote
Posted Yesterday

Tech Stack

Python · JavaScript · Go · Java · Elasticsearch · Azure · REST · Spark · Git · CI/CD

Job Description

Koniag Emerging Technologies, LLC, a Koniag Government Services company, is seeking a Senior Databricks Engineer with a Secret security clearance to support KET and our government customer in Washington, DC. This is a remote position. We offer competitive compensation and an extraordinary benefits package including health, dental, and vision insurance, 401K with company matching, flexible spending accounts, paid holidays, three weeks paid time off, and more.

**Essential Functions, Responsibilities & Duties** may include, but are not limited to:

The Senior Databricks Engineer will be responsible for designing, developing, and deploying scalable data infrastructure while driving key data engineering initiatives. Leveraging the Databricks platform and deep expertise in modern data warehousing and the Medallion Architecture, the engineer will build data pipelines supporting the Department of State’s “4Ps” data lake. The ideal candidate will be able to support the following functions:

* **Cross-Functional Collaboration:** Work closely with project managers, team members, and business owners to gather requirements, communicate progress, and ensure data meets customer needs.
* **Architecture & Design:** Architect and implement scalable, highly available data pipelines using Databricks, adhering strictly to the Medallion Architecture (Bronze, Silver, Gold layers) to ensure data quality, consistency, and optimized reporting.
* **Data Quality, Governance & Monitoring:** Establish comprehensive data quality standards and implement monitoring dashboards that enable business owners and responsible parties to track and proactively address data issues.
* **Pipeline Development:** Write clean, efficient, production-ready PySpark and SQL code for complex extraction, transformation, and loading (ETL/ELT) processes, including Delta Live Tables (DLT) and structured streaming pipelines.
* **API Integration:** Design and build robust data ingestion pipelines that reliably pull from various third-party external systems using RESTful API integrations.
* **Automated Deployments (CI/CD):** Implement infrastructure and pipelines as code. Use Databricks Asset Bundles (DABs) integrated with Azure DevOps to manage and automate deployments across Development, Staging, and Production environments safely and reliably.
* **Code Reusability:** Develop, package, and maintain custom Python libraries and modules that establish common frameworks used across multiple data projects and teams.
* **Data Warehousing:** Apply foundational and advanced data warehousing concepts (e.g., dimensional modeling, slowly changing dimensions) to optimize data storage and retrieval for downstream analytics and business intelligence.
* **Performance Tuning:** Monitor, troubleshoot, and optimize Databricks clusters and Spark jobs to ensure maximum efficiency and cost-effectiveness.
* **External BI & Native Dashboarding:** Securely share data with external BI tools (such as Power BI or SAP BusinessObjects) and develop dashboards and capabilities natively within the Databricks environment.
* **Mentorship & Leadership:** Serve as a technical SME for the data engineering team, conducting code reviews, establishing best practices, and mentoring other staff members.

**Required Skills:**

* Knowledge of object-oriented programming (OOP) concepts and proficiency in one or more programming or scripting languages (Java, C#, Python, JavaScript, PowerShell)
* Experience: 8+ years of dedicated data engineering experience, including 3+ years in a senior capacity focused on the Databricks ecosystem.
* Core Languages: Expert-level proficiency in PySpark and complex SQL.
* Python Library Management: Proven ability to create, package, and manage custom Python libraries (e.g., building wheel files) for scalable reuse across multiple data pipelines and projects.
* API Data Ingestion: Deep understanding of HTTP REST methods (GET, POST, PUT, etc.), with hands-on experience extracting, paginating, and processing data from external API sources.
* Frameworks & Methodology: Deep theoretical and practical understanding of modern data warehousing concepts and hands-on experience building out the Medallion Architecture.
* DevOps & CI/CD: Proven experience automating Databricks deployments. You must have hands-on experience configuring and deploying Databricks Asset Bundles (DABs) using Azure DevOps (creating YAML pipelines, managing service principals, and handling multi-workspace deployments).
* Cloud Platforms: Strong familiarity with the broader cloud ecosystem surrounding Databricks (e.g., Azure Data Lake Storage Gen2, Azure Key Vault).
* Version Control: Proficiency with Git-based version control workflows.
* Concise, comprehensive written and oral communication skills, both for technical discussion and for supporting customer usage.

**Clearance Requirement:**

* Secret security clearance

**Desired Skills:**

* Experience with a wide range of data products, BI tools, and visualization libraries such as Tableau, Power BI, Jupyter Notebooks, and D3.js, and with analytics search tools such as Elasticsearch, Solr, and Lucene
* Experience working with HR application (PeopleSoft) data for reporting purposes.
* 3+ years of .NET, Java, or similar OOP experience
* Experience writing Oracle PL/SQL queries, packages, and procedures.
* Experience architecting Data Vault 2.0 solutions.

**Our Equal Employment Opportunity Policy**

The company is an equal opportunity employer.
The company shall not discriminate against any employee or applicant because of race, color, religion, creed, ethnicity, sex, sexual orientation, gender or gender identity (except where gender is a bona fide occupational qualification), national origin or ancestry, age, disability, citizenship, military/veteran status, marital status, genetic information, or any other characteristic protected by applicable federal, state, or local law. We are committed to equal employment opportunity in all decisions related to employment, promotion, wages, benefits, and all other privileges, terms, and conditions of employment.

The company is dedicated to seeking all qualified applicants. If you require an accommodation to navigate or apply for a position on our website, please get in touch with Heaven Wood via e-mail at accommodations@koniag-gs.com or by calling 703-488-9377 to request accommodations.

Koniag Government Services (KGS) is an Alaska Native Owned corporation supporting the values and traditions of our native communities through an agile employee and corporate culture that delivers Enterprise Solutions, Professional Services, and Operational Management to Federal Government Agencies. As a wholly owned subsidiary of Koniag, we apply our proven commercial solutions to a deep knowledge of Defense and Civilian missions to provide forward-leaning technical, professional, and operational solutions. KGS enables successful mission outcomes for our customers through solution-oriented business partnerships and a commitment to exceptional service delivery. We ensure long-term success with a continuous improvement approach while balancing the collective interests of our customers, employees, and native communities. For more information, please visit www.koniag-gs.com.

Equal Opportunity Employer/Veterans/Disabled. Shareholder Preference in accordance with Public Law 88-352.
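As a rough illustration of the paginated REST ingestion pattern this role calls for, here is a minimal Python sketch. It is an assumption-laden example, not part of the posting: `fetch_page` is a hypothetical stand-in for a real HTTP GET against a third-party API (in practice you would call the endpoint with an offset/limit query string and land the raw responses in a Bronze-layer table).

```python
# Minimal sketch of offset-based pagination for a data-ingestion pipeline.
# fetch_page simulates GET /items?offset=..&limit=.. against an in-memory
# source; a real pipeline would issue HTTP requests and handle retries.

def fetch_page(records, offset, limit):
    """Return one page of records, empty when the source is exhausted."""
    return records[offset:offset + limit]

def ingest_all(records, page_size=2):
    """Pull pages until the source is exhausted, accumulating raw rows
    (the Bronze landing step in a Medallion-style pipeline)."""
    out, offset = [], 0
    while True:
        page = fetch_page(records, offset, page_size)
        if not page:
            break
        out.extend(page)
        offset += len(page)
    return out

source = [{"id": i} for i in range(5)]
rows = ingest_all(source)
print(len(rows))  # 5 rows landed across 3 pages
```

Stopping on an empty page (rather than trusting a reported total count) keeps the loop robust when the upstream system adds or removes records mid-extract.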
