Transforming our agriculture system is the single most important thing we can do to combat climate change.
That’s why FluroSat and Dagan have combined forces to launch Regrow, a company that empowers the food and agriculture industries to adopt, scale and monetize resilient agricultural practices.
Regrow is a multinational team of scientists, agronomists, engineers, and software developers committed to transforming the supply chain from farm to fork to ensure a prosperous future for people and planet.
We are a climate tech company committed to reversing climate change. How do we reach this lofty goal? By ushering the agriculture industry into a new era!
Founded by globally recognized innovators in science and ag technology, Regrow is unlocking the power and profitability of resilient agriculture across the supply chain — supporting industry leaders, from growers to global food brands. Regrow combines best-in-class agronomy, soil and carbon modeling, remote sensing, and AI to deliver customized, site-specific, scalable solutions to the agri-food industry. Our mission is to make agriculture resilient globally, on every acre and every farm.
Regrow serves over 100 organizations that have collectively invested more than $19M to help farmers adopt regenerative practices. These actions will abate more than 800k tonnes of CO2e, equivalent to the carbon sequestered by 934k acres of U.S. forests in one year.
Does this sound like your dream job? If so, apply for the position and we'll contact you. We appreciate your interest in Regrow. We are committed to fostering a diverse, inclusive environment and to promoting these values in everyone on our team.
What will you do?
- Design, build, test, and deploy scalable geospatial APIs, services, and data pipelines
- Collaborate with science teams, product managers, and other backend engineers to deliver complex geospatial data solutions
- Brainstorm features with engineering management, product managers, and domain experts based on your knowledge of the codebase
- Expertly review code, have your code reviewed, and mentor other engineers
- Break down work into clearly defined tasks that can be completed by other engineers

Qualifications:
- 5+ years of experience building and deploying high-quality, customer-facing production data pipelines and services, primarily in Python
- Experience working with raster and vector data structures, and proficiency in geospatial libraries/tools such as GDAL, Rasterio, Shapely, and QGIS
- Skilled at manipulating multi-dimensional raster data and large-scale geospatial arrays (e.g., using NumPy, xarray, or Dask)
- Exposure to geospatial data standards, including OGC-compliant services (WMS, WFS, WCS) and the SpatioTemporal Asset Catalog (STAC) specification for organizing and querying data
- Proven experience creating and deploying data processing workflows (Airflow, Argo, Kafka, etc.) that enable scalable, reproducible, and version-controlled geospatial data generation
- Working knowledge of REST APIs built with Python web frameworks such as FastAPI, Flask, or Django
- Strong understanding of relational databases (PostgreSQL with PostGIS, MySQL) and data warehouses such as BigQuery, with proficiency in querying geospatial data and working with indexes, geometry types, and geospatial joins
- Experience deploying apps/services on at least one major enterprise cloud platform (AWS, GCP, or Azure)
- Excellent English-language presentation and communication skills
- Well-versed in Kubernetes and Docker (building and scheduling containers)
- Proficient in profiling, debugging, tracing, and parallelizing/optimizing Python code