Data Engineering Intern (Fall)
Aircapture
City: Berkeley, CA
Contract type: Intern

At Aircapture we're creating and scaling a circular carbon economy to solve what we believe to be our lifetime's most pressing challenge: the climate crisis. We supply commercial and industrial customers with clean CO2 captured from our atmosphere to radically improve the environment, our economy, and our lives. We value building a team of people who represent diverse backgrounds, be it thought, education, gender, ethnicity, age, or sexual orientation, to reach our goals. Thank you for considering us.
As a part time Fall Data Engineering Intern you will play a key role in creating new data architectures and monitoring, optimizing, and automating existing data pipelines and models. You will work alongside our data analysis, test engineering and electrical engineering teams to improve Aircapture's data processing speed, efficiency, and cost. If you are excited to mitigate the impact of climate change at a groundbreaking climate technology startup, this is the role for you!
Pay Rate: $25 per hour
What You'll Do Here
- This role is onsite at our Berkeley headquarters, approximately 10+ hours per week; your work hours can be adjusted to fit your class schedule
- Partner with the test engineering and data analysis teams to understand Aircapture's data requirements and translate business needs into technical solutions
- Build and maintain scalable data pipelines to extract, transform, and load (ETL) operational data from Aircapture's projects and machines
- Develop and maintain dashboards, reports, and visualizations to communicate key insights and contribute to ongoing data-driven initiatives
- Serve as a point of contact for internal IT needs, working closely with external support partners when needed
- Set up and document IT scripts and services supporting core data systems, ensuring smooth operations and reliable data workflows
- Extract, transform, and load sensor and instrument data into databases to create analytics that inform decision making (see the brief sketch after this list)
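For illustration only, here is a minimal sketch of the kind of sensor-data ETL work described above, using Python with pandas and a local SQLite database; the file name, column names, and table name are hypothetical, not Aircapture's actual schema.

```python
# Hypothetical sketch: extract raw sensor readings, clean them, and load them
# into a database for analytics. Names and units are illustrative only.
import sqlite3
import pandas as pd

# Extract: read raw instrument data exported as CSV (hypothetical file).
raw = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])

# Transform: drop incomplete rows and derive a converted unit (kPa -> bar).
clean = raw.dropna(subset=["timestamp", "co2_ppm", "pressure_kpa"])
clean = clean.assign(pressure_bar=clean["pressure_kpa"] / 100.0)

# Load: append the cleaned data to a local SQLite table for downstream analysis.
with sqlite3.connect("operations.db") as conn:
    clean.to_sql("sensor_readings", conn, if_exists="append", index=False)
```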
What You'll Bring
- An open mind and curiosity to try new things; you're ready to share your perspective with thoughtfulness and conviction
- Currently entering senior year (or equivalent) in Computer Science, Data Science, or a related field, with at least one completed course, internship, or hands-on project in data engineering; or a recently completed bachelor's degree
- Proficiency in Python (pandas/NumPy) and SQL (PySpark a plus)
- Experience working in any cloud platform a plus (we use AWS)
- Basic understanding of data science concepts (data cleaning, feature engineering, regression, machine learning principles)
- Knowledge of Apache software (such as Spark, Airflow, and Kafka), Linux, and Docker a plus; if you don't know them, you can start learning them here! (A brief Airflow sketch follows this list.)
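As a rough picture of the pipeline-orchestration tools mentioned above, here is a minimal Apache Airflow (2.x-style) DAG sketch that would schedule an ETL step hourly; the DAG ID, schedule, and task callable are hypothetical, not an existing Aircapture pipeline.

```python
# Hypothetical Airflow 2.x sketch: run one ETL task every hour.
# The DAG ID, schedule, and run_etl callable are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_etl():
    # Placeholder for an extract-transform-load step such as the pandas sketch above.
    print("running sensor-data ETL")

with DAG(
    dag_id="sensor_etl_hourly",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_etl", python_callable=run_etl)
```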
How to apply
To apply for this job, sign in on our website and submit a resume. If you don't have an account yet, please register first.