Lucid is the new generation of EV. Our relentless focus on innovation, luxury, and sustainability drives us into a reality where you no longer have to choose between doing great things, doing the right thing, and doing everything with the highest regard for efficiency and design. There were luxury cars, then EVs, now there’s Lucid.
Leading the future in luxury electric and mobility
At Lucid, we set out to introduce the most captivating luxury electric vehicles that elevate the human experience and transcend the perceived limitations of space, performance, and intelligence. Vehicles that are intuitive, liberating, and designed for the future of mobility.
We plan to lead in this new era of luxury electric by returning to the fundamentals of great design – where every decision we make is in service of the individual and environment. Because when you are no longer bound by convention, you are free to define your own experience.
Come work alongside some of the most accomplished minds in the industry. Beyond providing competitive salaries, we’re providing a community for innovators who want to make an immediate and significant impact. If you are driven to create a better, more sustainable future, then this is the right place for you.
The Principal Data Architect is responsible for defining and leading Data Architecture, Data Quality, and Data Governance at Lucid for both vehicle and operational data, ingesting, processing, and storing trillions of rows of data per day. This hands-on role tackles real big data problems that most standard tools on the market cannot handle. You will design solutions, write code and automation, define standards, and establish best practices across the company.
At Lucid, we don’t just welcome diversity - we celebrate it! Lucid Motors is proud to be an equal opportunity workplace. We are committed to equal employment opportunity regardless of race, color, national or ethnic origin, age, religion, disability, sexual orientation, gender, gender identity and expression, marital status, and any other characteristic protected under applicable State or Federal laws and regulations.
Notice regarding COVID-19 vaccination requirement as a condition of gainful employment within the United States
At Lucid, we prioritize the health and wellbeing of our employees, families, and friends above all else. In response to the novel Coronavirus and the increased transmissibility of recent variants, all new Lucid employees whose jobs will be based in the United States must provide, on their first day of employment, original documentation confirming that they have received the prescribed vaccine doses based on the manufacturer's guidelines.
Individuals seeking a medical and/or religious exemption from this requirement may be granted such an accommodation after submitting a formal request to, and subsequent review and approval by, our dedicated COVID-19 Response team.
To all recruitment agencies: Lucid Motors does not accept agency resumes. Please do not forward resumes to our careers alias or other Lucid Motors employees. Lucid Motors is not responsible for any fees related to unsolicited resumes.
Role
- Design, implement, and lead Data Architecture, Data Quality, and Data Governance across Lucid
- Define data modeling standards and foundational best practices
- Develop and evangelize data quality standards and practices
- Establish data governance processes, procedures, policies, and guidelines to maintain the integrity and security of the data
- Drive the successful adoption of organizational data utilization and a self-service data platform
- Create and maintain critical data standards and metadata that allow data to be understood and leveraged as a shared asset
- Develop standards and write template code for sourcing, collecting, and transforming data for streaming or batch processing (see the illustrative sketch after this list)
- Design data schemas, object models, and flow diagrams to structure, store, process, and integrate data
- Provide architectural assessments, strategies, and roadmaps for data management
- Apply hands-on subject matter expertise in the architecture and administration of Big Data platforms and Data Lake technologies (AWS S3/Hive), along with experience with ML and Data Science platforms
- Implement and manage industry best-practice tools and processes such as Data Lake, Delta Lake, S3, Spark ETL, Airflow, Hive Catalog, Ranger, Redshift, Spline, Kafka, MQTT, time-series databases, Cassandra, Redis, Presto, Kubernetes, Docker, CI/CD, and DevOps
- Translate big data and analytics requirements into data models that operate at large scale and high performance, and guide the data analytics engineers on these data models
- Define templates and processes for the design and analysis of data models, data flows, and integration
- Lead and mentor Data Analytics team members in best practices, processes, and technologies in data platforms
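To give a flavor of the kind of template code this role defines, below is a minimal sketch of a batch Spark ETL job in the stack named above (PySpark, S3, Parquet). The bucket paths, column names, and job structure are hypothetical placeholders for illustration only, not Lucid's actual pipelines.

```python
# Hypothetical batch ETL template sketch (paths and columns are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("vehicle-telemetry-batch-etl").getOrCreate()

# Source: raw JSON telemetry landed in an S3 data lake (placeholder path).
raw = spark.read.json("s3a://example-raw-bucket/vehicle-telemetry/date=2023-01-01/")

# Transform: deduplicate on a message key, normalize the event timestamp,
# and derive a partition column.
clean = (
    raw.dropDuplicates(["vehicle_id", "event_id"])
       .withColumn("event_ts", F.to_timestamp("event_time"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Sink: partitioned Parquet in a curated zone (placeholder path), to be
# registered in the Hive catalog by downstream tooling.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-curated-bucket/vehicle-telemetry/"))

spark.stop()
```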
Qualifications
- B.S. or M.S. in Computer Science, or equivalent
- 15+ years of hands-on experience in Data Warehouse, ETL, Data Modeling, and Reporting
- 7+ years of hands-on experience productionizing and deploying Big Data platforms and applications
- Hands-on experience working with relational/SQL databases, distributed columnar data stores/NoSQL databases, time-series databases, Spark Streaming, Kafka, Hive, Parquet, Avro, and more
- Extensive experience understanding a variety of complex business use cases and modeling the data in the data warehouse
- Highly skilled in SQL, Python, Spark, AWS S3, Hive Data Catalog, Parquet, Redshift, Airflow, and Tableau or similar tools
- Proven experience building a custom Enterprise Data Warehouse or implementing tools like Data Catalogs, Spark, Ranger, Presto, Tableau, Kubernetes, and Docker
- Knowledge of infrastructure requirements such as networking, storage, and hardware optimization, with hands-on experience with Amazon Web Services (AWS)
- Strong verbal and written communication skills, with the ability to work effectively across internal and external organizations and virtual teams
- Demonstrated industry leadership in the fields of Data Warehousing, Data Science, and Big Data-related technologies
- Strong understanding of distributed systems and container-based development using the Docker and Kubernetes ecosystem
- Deep knowledge of data structures and algorithms
- Experience working in large teams using CI/CD and agile methodologies