Beekin is on a mission to make housing fair, affordable, and efficient for millions of renters. Our platform, which leverages cutting-edge machine learning, has helped thousands of renters find enduring communities and happy homes. It has also added millions of dollars of profit to landlords' pockets, all through better data.
As our future colleague,
You will be an engineer. You will write and appreciate code that transforms noisy real-world data into high-signal models that stand the test of time.
You will be a storyteller. You will communicate your insights in a way that resonates with your partners - including Beekin's leadership - to turn theory into action.
You will be an entrepreneur. You will come to understand how real estate operates, and strive to make housing fair, transparent, and affordable.
A Beekin day for you could mean:
- Having thoughtful discussions with Product Managers to understand customers' data engineering requirements.
- Breaking complex requirements into smaller tasks for execution.
- Working efficiently with a solid sense of priorities; guiding your own learning and contributing to domain knowledge building.
- Mentoring and guiding team members on ETL/ELT processes in the cloud using tools like Airflow, Glue, Stitch, Cloud Data Fusion, and Dataflow.
- Working at an abstract level and building consensus; seeing from, and selling to, multiple viewpoints.
- Designing and implementing Data Lakes, Data Warehouses, and Data Marts in AWS.
- Writing efficient SQL and reading query execution plans to tune queries on engines like PostgreSQL.
- Performance-tuning OLAP/OLTP databases by creating indices, tables, and views.
- Writing Python and Scala scripts to orchestrate data pipelines.
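The orchestration work above can be sketched in plain Python. The following is a minimal, hypothetical extract-transform-load pipeline; all function names and data are illustrative, and in practice each step would be wrapped in a scheduler task (e.g., an Airflow operator):

```python
# Minimal ETL sketch: each step is a plain function so it could later be
# registered as a task in an orchestrator. All names and data are illustrative.

def extract(raw_rows):
    """Pull raw listing records (an in-memory stand-in for a real source)."""
    return [r for r in raw_rows if r is not None]

def transform(rows):
    """Normalize noisy fields: strip whitespace, title-case city, cast rent."""
    return [
        {"city": row["city"].strip().title(), "rent": float(row["rent"])}
        for row in rows
    ]

def load(rows, warehouse):
    """Append cleaned rows to a destination (a list standing in for a table)."""
    warehouse.extend(rows)
    return len(rows)

def run_pipeline(raw_rows, warehouse):
    """Run the steps in dependency order, as a DAG scheduler would."""
    return load(transform(extract(raw_rows)), warehouse)

warehouse = []
n = run_pipeline(
    [{"city": " austin ", "rent": "1450"}, None, {"city": "DENVER", "rent": "1800"}],
    warehouse,
)
# n == 2; warehouse now holds the two cleaned rows
```

Keeping each step a pure function makes the pipeline easy to unit-test before it is handed to a scheduler.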
What you'll bring:
- 5+ years of experience in data engineering.
- Strong Python and Scala programming ability, demonstrated through several hands-on projects.
- Interest in eventually leading a team of data engineers delivering data pipelines on public cloud infrastructure.
- Strong understanding of Data Engineering concepts including ETL, ELT, Data Lake, Data Warehousing, and Data Pipelines.
- Experience designing and implementing Data Pipelines, Data Lakes, Data Warehouses, and Data Marts that support terabyte-scale data.
- A clear understanding of database concepts like indexing, query performance optimization, views, and the various types of schemas.
- Hands-on SQL programming experience, including window functions, subqueries, and the various types of joins.
- Data modeling proficiency (e.g., relational, dimensional, columnar, big data).
- Proficiency with complex SQL, plus NoSQL experience.
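The SQL skills above (window functions, indexing) can be illustrated with a short, self-contained sketch. The table and data are hypothetical, and SQLite (3.25+) stands in for PostgreSQL here; the SQL itself is portable:

```python
import sqlite3

# Hypothetical rents table; demonstrates an index plus a window function
# (ROW_NUMBER) of the kind this role calls for.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rents (city TEXT, unit TEXT, rent REAL)")
conn.execute("CREATE INDEX idx_rents_city ON rents (city)")  # speeds city lookups
conn.executemany(
    "INSERT INTO rents VALUES (?, ?, ?)",
    [("Austin", "A1", 1450), ("Austin", "A2", 1600),
     ("Denver", "D1", 1800), ("Denver", "D2", 1700)],
)

# Cheapest unit per city: number rows within each city partition by rent,
# then keep only the first row of each partition.
rows = conn.execute("""
    SELECT city, unit, rent FROM (
        SELECT city, unit, rent,
               ROW_NUMBER() OVER (PARTITION BY city ORDER BY rent) AS rn
        FROM rents
    ) WHERE rn = 1
    ORDER BY city
""").fetchall()
# rows == [('Austin', 'A1', 1450.0), ('Denver', 'D2', 1700.0)]
```

On PostgreSQL, prefixing the query with `EXPLAIN` would show the execution plan and whether the index is used.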
What we offer:
- A career trajectory you can own
- Stock Option Plan
- Training & Development
- Work From Home (fully remote possible)
- Leave Package