Data Engineer - Kiewit Nuclear Solutions
Remote, US | Atlanta, GA | Idaho Falls, ID | Lenexa, KS | Albuquerque, NM | Aiken, SC | Oak Ridge, TN | Dallas, TX | Houston, TX | Seattle, WA
Requisition ID: 180210
Job Level: Mid Level
Home District/Group: Kiewit Nuclear Solutions
Department: Technology Group
Market: Nuclear
Employment Type: Full Time
Position Overview
The Technical Systems Analyst / Data Engineer supports enterprise-scale project data platforms that power analytics, reporting, and application integration across Kiewit’s business. This role operates within the Digital Project Execution (DPE) team and is responsible for designing, building, and optimizing modern data pipelines and cloud-based data architectures.
This position combines hands-on data engineering with cross-functional collaboration. The individual will work closely with Project Managers, Digital Execution Managers, Product Managers, Architects, Business Analysts, and core Technology teams to deliver reliable, scalable, and high-performance data solutions.
The ideal candidate is both technically strong and business-aware, capable of translating complex project workflows into durable, well-architected data systems.
District Overview
This position is within the Kiewit Technology Group, supporting Kiewit Nuclear Solutions.
Location
This is a remote position that may require some travel.
Responsibilities
Data Engineering & Architecture
- Design, build, and maintain scalable ETL/ELT pipelines using Azure Databricks (PySpark, Spark SQL) as the primary platform.
- Develop and optimize batch, streaming, and micro-batch processing architectures.
- Implement data ingestion workflows using Azure Data Factory, Azure Data Lake, SQL Server, and Snowflake.
- Design data models that support analytics, reporting, and database-driven application integration.
- Optimize performance of large-scale data systems, including query tuning and workload optimization.
Platform & Operations
- Support day-to-day operations of enterprise data platforms.
- Troubleshoot end-to-end data pipeline issues across ingestion, transformation, storage, and consumption layers.
- Implement data quality checks, validation rules, and monitoring processes.
- Contribute to architectural standards, governance frameworks, and documentation.
Collaboration & Business Enablement
- Partner with project teams to understand complex business rules and translate them into scalable data solutions.
- Support visualization and analytics teams by delivering clean, structured, high-performance datasets.
- Contribute to continuous improvement of data processes, standards, and best practices.
- Stay aligned with evolving Kiewit data strategies and emerging industry technologies.
Qualifications
Technology stack
Primary:
- Azure Databricks (PySpark, Spark SQL)
- Azure Data Lake
- Azure Data Factory
- SQL Server
- Snowflake
Supporting:
- Python
- Git (source control)
- Data modeling and warehousing frameworks
Required qualifications
- 6+ years of experience in Data Engineering, Data Analytics, Data Warehousing, or related roles.
- Experience in Azure-based architectures.
- Strong proficiency in advanced SQL development and performance optimization.
- Hands-on experience building ETL/ELT pipelines in a cloud environment.
- Familiarity with EPC/construction industry workflows is highly preferred.
- Experience with Databricks or Spark-based distributed data processing.
- Experience troubleshooting complex data workflows across multiple systems.
- Strong understanding of data modeling principles (relational and dimensional).
- Experience documenting functional requirements, technical designs, and data workflows.
- Excellent written and verbal communication skills.
Preferred qualifications
- Familiarity with CI/CD and version control practices (Git).
- Experience implementing data quality frameworks and governance processes.
- Experience supporting enterprise-scale analytics environments.
Bonus experience
- Knowledge of engineering design tool ecosystems (Hexagon Smart Suite).
- Experience supporting data environments tied to project lifecycle systems.
- Exposure to large-scale project-based data environments.
Other requirements
- Regular, reliable attendance.
- Work productively and meet deadlines in a timely manner.
- Communicate and interact effectively and professionally with supervisors, employees, and others, individually or in a team environment.
- Perform work safely and effectively; understand and follow oral and written instructions, including warning signs, equipment use, and other policies.
- Work during normal operating hours to organize and complete work within given deadlines; work overtime and weekends as required.
- Field roles only: may work at various locations, and conditions may vary.
Base Compensation: $136,000/yr - $142,000/yr
(Actual compensation is subject to variation due to such factors as education, experience, skillset, and/or location)
We offer our full-time staff employees a comprehensive benefits package that’s among the best in our industry, including top-tier medical, dental, and vision plans covering eligible employees and dependents, voluntary wellness and employee assistance programs, life insurance, disability, retirement plans with matching, and generous paid time off.
Equal Opportunity Employer, including disability and protected veteran status.