Senior Data Platform Engineer
Shape the Future of Finance
Pagaya powers a leading artificial intelligence network that enables banks, fintechs, merchants, lenders, and other B2C businesses to provide their customers with greater access to financial services. We help partners grow their customer base while managing risk, all with a seamless customer experience.
Pagaya’s network gives our partners’ customers access to credit across the Auto, Credit Card, Personal Loan, and Point-of-Sale markets. We are also developing products in insurance, real estate, and more. Our network is fully automated and operates at scale - with the support of the Pagaya network, our partners have processed millions of applications, with a new application typically analyzed every second.
Let's create better outcomes together!
About the Role
The Data Engineering group is a cross-functional team responsible for all data activities, including integration, monitoring, quality, and accessibility.
The Data Platform Engineer will work on a variety of data projects, from building data infrastructure with Big Data tools and architectures to designing and engineering data pipelines on top of that infrastructure.
- Build out and operate our foundational data infrastructure, including storage (cloud data warehouse, S3 data lake), orchestration (Airflow), and processing (Spark, dbt).
- Create robust, automated pipelines that ingest and process structured data from source systems into analytical platforms, using batch and streaming mechanisms built on a cloud-native toolset.
- Develop and maintain data lake and data warehouse schemas, layouts, architectures, and non-relational databases for data access and advanced analytics.
- Use the right tool for the job to deliver testable, maintainable, and modern data solutions.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, and more.
- Work with other members of the data group, including data architects, data analysts, and data scientists.
- Use the tools and languages best suited to the job - complete flexibility in problem-solving, with novelty and creativity encouraged.
- Use of open-source projects and frameworks is encouraged.
- Work with a team of highly motivated, bright, fun, and creative people.
- Your intellectual curiosity and hard work will be welcome additions to our culture of knowledge sharing, transparency, and shared fun and achievement.
- Contribute to our software engineering culture of writing correct, maintainable, elegant, and testable code.
Requirements
- At least 4 years of experience as a Python developer.
- A data-oriented mindset.
- At least 1 year of experience with data warehousing (Snowflake, Redshift, BigQuery, etc.).
- Experience with AWS cloud services: EMR, EKS, Kinesis, EventBridge, DynamoDB, Lambda.
- Deep understanding of ETL, ELT, and data ingestion/cleansing, with strong engineering skills.
- Experience designing and building large-scale applications.
- Undergraduate degree in Computer Science, Computer Engineering, or similar disciplines from rigorous academic institutions.
Any of the below would be an advantage:
- Experience with data pipeline and workflow management tools: Airflow, Azkaban, Luigi, etc.
- Experience building, running, and testing dbt models and macros.
- Experience using data tools and frameworks such as Spark, Flink, Hadoop, Presto, Hive, or Kafka.
- Experience with Apache Iceberg and Project Nessie.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Familiarity with operating systems, especially UNIX, Linux, and macOS.
- Experience supporting and working with cross-functional teams in a dynamic environment.
Pagaya was founded in 2016 by seasoned research, finance, and technology entrepreneurs, and we are now 500+ strong in New York, Los Angeles, and Tel Aviv.
We move fast and smart, identifying new opportunities and building end-to-end solutions from AI models and unique data sources. Every Pagaya team member is solving new and exciting challenges every day in a culture based on partnership, collaboration, and community.
Join a team of builders who are working every day to enable better outcomes for our partners and their customers.
Our values are at the heart of everything we do. We believe great solutions are built through a great community.
- Continuous Learning: It’s okay not to know something yet, as long as you have the desire to grow and improve.
- Win for all: We exist to make sure all participants in the system win, which in turn helps Pagaya win.
- Debate and commit: Share openly, question respectfully, and once a decision is made, commit to it fully.
- The Pagaya way: Break systems down to their most foundational element, and rebuild them unique to Pagaya.
More than just a job
We believe health, happiness, and productivity go hand-in-hand. That's why we're continually looking to enhance the ways we support you with benefits programs and perks that allow every Pagayan to do the best work of their life.
Pagaya is an equal opportunity employer. We encourage diversity and actively seek applicants from all backgrounds, and we are committed to building a diverse workforce and an inclusive environment for all. Employment is decided on the basis of qualifications, skills, and business needs.