About the company:
BETA Technologies is creating an electric transportation ecosystem that’s safe, reliable and sustainable. A relentlessly focused team is building an extensive charging infrastructure and ALIA, the world’s most technologically advanced electric vertical aircraft (EVA).
BETA’s platform and products are strikingly simple. Prioritization of safety and a pragmatic approach to certification drive elegant redundancy, appropriate diversity of implementation and simplicity of control. ALIA’s fixed-pitch propellers and centrally located batteries make it an inherently stable aircraft that is safe to fly and easy to maneuver.
We are looking for a Data Engineer to join our growing Data team. This role is responsible for expanding and optimizing our data and data pipeline architecture. The ideal candidate is an experienced data pipeline builder who enjoys optimizing data systems and building them from the ground up, and who is self-directed and comfortable supporting the data needs of multiple teams, domains, systems, and products. This person will work in a small team environment and support the development of new data products in the evolving world of eVTOL aircraft design.
Essential Duties and Responsibilities:
- Create and maintain optimal data pipeline and storage architectures
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Design, write, test, and deploy production-ready code
- Work with other members of the team to create data products that meet the needs of a growing and diverse company
- Build the infrastructure required for optimal collection, extraction, transformation, and loading of data from a variety of data sources, using cloud 'big data' technologies as appropriate
- Work with data and subject matter experts to strive for greater functionality in our data systems
- Mentor interns and junior engineers

Minimum Qualifications (Knowledge, Skills, and Abilities):
- Bachelor's or Master's degree in Computer Science, Statistics, Software Engineering, or a relevant field
- 3 years of experience in a Cloud/Big Data Engineering role
- Extensive experience architecting and programming large-scale software applications in Python
- Extensive experience with cloud platforms such as AWS or GCP
- Advanced working knowledge of SQL, including experience with relational databases and query authoring, as well as working familiarity with a variety of databases, including columnar (Redshift, BigQuery, etc.) and NoSQL
- Experience building and optimizing 'big data' pipelines, architectures, and data sets
- Experience working with message queuing, stream processing, and highly scalable 'big data' data stores
- Experience working with Git version control and CI/CD systems
- Strong project management and organizational skills
- Stellar troubleshooting skills, with the ability to spot issues before they become problems
- Excellent communication skills, both written and verbal
- Experience supporting and working with cross-functional teams in a dynamic environment

Preferred Qualifications (Knowledge, Skills, and Abilities):
- Experience developing Infrastructure as Code (IaC) using AWS CDK, CloudFormation, or Terraform
- Familiarity with LabVIEW, MATLAB, and/or Simulink
- Proficiency in building RESTful APIs and web services
- Experience with Apache big data tools such as Avro, Beam, Parquet, etc.