Lead, Data Engineer

Remote

Job ID 18662
L3Harris is dedicated to recruiting and developing diverse, high-performing talent who are passionate about what they do. Our employees are unified in a shared dedication to our customers’ mission and quest for professional growth. L3Harris provides an inclusive, engaging environment designed to empower employees and promote work-life success. Fundamental to our culture is an unwavering focus on values, dedication to our communities, and commitment to excellence in everything we do.

L3Harris Technologies is the Trusted Disruptor in the defense industry. With customers’ mission-critical needs always in mind, our employees deliver end-to-end technology solutions connecting the space, air, land, sea and cyber domains in the interest of national security.

Job Title: Lead, Data Engineer (Remote)

Job Code: 18662

Job Location: Remote Opportunity 

Job Schedule: 9/80 – Employees work 9 out of every 14 days, totaling 80 hours worked, and have every other Friday off

Job Description:

The L3Harris Enterprise Data, Analytics, and Automation team is seeking a Lead Data Engineer with extensive experience managing enterprise-level data life cycle processes. This role includes overseeing data ETL/ELT pipelines, ensuring adherence to data standards, maintaining data frameworks, conducting data cleansing, orchestrating data pipelines, and ensuring data consolidation. The selected individual will play a pivotal role in maintaining ontologies, building scalable data solutions, and developing dashboards that provide actionable insights for the enterprise.

This position will support the company’s modern data platform, focusing on data pipeline development and maintenance, platform design, documentation, and user training. The goal is to ensure seamless access to data for all levels of the organization, empowering decision-makers with clean, reliable data.

Essential Functions:

  • Design, build, and maintain robust data pipelines to ensure reliable data flow across the enterprise.
  • Maintain data pipeline schedules, orchestrate workflows, and monitor the overall health of data pipelines to ensure continuous data availability.
  • Create, update, and optimize data connections, datasets, and transformations to align with business needs.
  • Troubleshoot and resolve data sync issues, ensuring consistent and correct data flow from source systems.
  • Collaborate with cross-functional teams to uphold data quality standards and ensure accurate data is available for use.
  • Utilize Palantir Foundry to establish data connections to source applications, extract and load data, and design complex logical data models that meet functional and technical specifications.
  • Develop and manage data cleansing, consolidation, and integration mechanisms to support big data analytics at scale.
  • Build visualizations using Palantir Foundry tools and assist business users with testing, troubleshooting, and documentation creation, including data maintenance guides.

Qualifications:

  • Bachelor's degree with a minimum of 9 years of prior experience in data engineering, or a graduate degree with a minimum of 7 years of prior experience in data engineering. In lieu of a degree, a minimum of 13 years of prior related experience in data engineering.
  • 1 year of experience designing and developing data pipelines in PySpark, Spark SQL, or Code Build.
  • 1 year of experience building and deploying data synchronization schedules and maintaining data pipelines using Palantir Foundry.
  • 2 years of experience with data pipeline development or ETL tools such as Palantir Foundry, Azure Data Factory, SSIS, or Python.
  • 3 years of experience in data integration with Palantir.

Preferred Additional Skills:

  • Strong understanding of Business Intelligence (BI) and Data Warehouse (DW) development methodologies.
  • Experience with the Snowflake Cloud Data Platform, including data architecture, query optimization, and performance tuning.
  • Proficiency in Python, Pandas, Databricks, JavaScript, or other scripting languages for data processing and automation.
  • Experience with other ETL tools such as Azure Data Factory (ADF), SSIS, Informatica, or Talend is highly desirable.
  • Familiarity with connecting and extracting data from various ERP applications, including Oracle EBS, SAP ECC/S4, Deltek Costpoint, and more.

In compliance with pay transparency requirements, the salary range for this role in Colorado, Hawaii, Illinois, Washington State and New York State is $112,500 - $209,500. For California, Seattle and New York City, the salary range for this role is $129,500 - $240,500. This is not a guarantee of compensation or salary, as final offer amount may vary based on factors including but not limited to experience and geographic location. L3Harris also offers a variety of benefits, including health and disability insurance, 401(k) match, flexible spending accounts, EAP, education assistance, parental leave, paid time off, and company-paid holidays. The specific programs and options available to an employee may vary depending on date of hire, schedule type, and the applicability of collective bargaining agreements.
