Who we are:
Shape a brighter financial future with us.
Together with our members, we’re changing the way people think about and interact with personal finance.
We’re a next-generation financial services company and national bank using innovative, mobile-first technology to help our millions of members reach their goals. The industry is going through an unprecedented transformation, and we’re at the forefront. We’re proud to come to work every day knowing that what we do has a direct impact on people’s lives, with our core values guiding us every step of the way. Join us to invest in yourself, your career, and the financial world.
The role:
We are seeking a Senior Data Engineer to join our Risk Data Team as a hands-on technical lead supporting Credit, Collections, and Fraud. This role blends deep production data engineering with formal technical and people leadership.
You will own architectural decisions for the Risk data platform, define modeling standards, elevate engineering rigor, and build scalable data systems that power risk decisioning across the organization. This role exists to ensure that Risk data pipelines are reliable, well-modeled, observable, and built with long-term maintainability in mind.
You will contribute directly to production data pipelines while setting standards for data modeling, dbt architecture, code quality, and observability. This is not an architect-only or strategy-only role: it requires hands-on execution and demonstrated ownership as a team leader.
What you’ll do:
Technical Leadership
- Serve as technical lead for the Risk Data Engineering team.
- Own architectural decisions and data modeling strategy across the Risk domain.
- Define naming conventions, modeling standards, and layered dbt architecture (staging → intermediate → marts).
- Lead architecture discussions and technical planning sessions.
- Conduct code reviews focused on maintainability, readability, and long-term scalability.
- Translate business priorities into well-scoped, production-ready technical deliverables.
Production Data Engineering
- Design and build production-grade Snowflake data models.
- Develop scalable dbt projects, including reusable macros and testing frameworks.
- Manage Apache Airflow DAGs, including idempotency, retry logic, and failure handling.
- Implement CI/CD best practices for dbt and data pipelines.
- Drive automation initiatives to reduce manual operational overhead.
Data Modeling
- Design dimensional and relational models aligned to business definitions.
- Apply modeling best practices including grain declaration, SCD strategies, and surrogate key management.
- Balance normalization and performance trade-offs.
- Evolve models safely as business requirements change.
- Ensure all models are clearly documented with lineage and business logic.
Data Quality & Observability
- Own the dbt testing framework (schema tests, custom tests, generic tests).
- Define and enforce freshness checks, SLA standards, and row-count validations.
- Implement monitoring and observability using Datadog.
- Proactively identify and reduce reliability incidents.
- Establish measurable data quality SLAs in partnership with stakeholders.
People Leadership
- Participate in hiring, onboarding, and team building.
- Run regular 1:1s and provide structured performance feedback.
- Develop engineers toward ownership and technical growth.
- Address underperformance early and constructively.
- Foster a culture of accountability, documentation, and engineering excellence.
Collaboration & Stakeholder Engagement
- Partner with Risk Data Product Managers, Data Science, ML, and business stakeholders.
- Communicate modeling decisions, trade-offs, and pipeline health clearly.
- Influence cross-functional technical direction across Risk and platform teams.
Operational Excellence
- Maintain scalable, secure data systems aligned with enterprise governance standards.
- Improve documentation practices including runbooks and architecture decision records.
- Contribute to workforce planning and technical roadmap discussions.
This role requires collaboration during core business hours. Remote candidates must be able to work cross-functionally with distributed teams.
What you’ll need:
- Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or related field (or equivalent work experience).
- 8+ years of hands-on data engineering experience.
- 2+ years of experience serving as a tech lead or leading engineers formally.
- Deep expertise in dimensional and relational data modeling, including SCD strategies and grain design.
- Advanced dbt experience, including layered architecture, macros, advanced testing, and semantic layer concepts.
- Strong hands-on Snowflake experience, including modeling and performance optimization.
- Production-level experience managing Apache Airflow DAGs.
- Advanced SQL skills, including query optimization and performance tuning.
- Strong Python skills for data pipeline development and automation.
- Demonstrated ownership of a data quality and monitoring framework.
- Experience working in regulated or high-accuracy environments.
- Experience participating in hiring, onboarding, and performance management.
- Strong communication skills and ability to influence cross-functional stakeholders.
Nice to have:
- Experience with Snowflake advanced capabilities (Snowpark, Cortex AI, ML functions).
- Familiarity with LLM tooling, RAG systems, or AI-assisted data workflows.
- Financial services experience (Credit, Fraud, Collections).
- AWS experience (S3, Glue, Lambda) and infrastructure-as-code familiarity.
- Experience implementing data governance frameworks at scale.
Compensation and Benefits
The base pay range for this role is listed below. The final base pay offer will be determined based on individual factors such as the candidate’s experience, skills, and location.
To view all of our comprehensive and competitive benefits, visit our Benefits at SoFi page!
SoFi provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion (including religious dress and grooming practices), sex (including pregnancy, childbirth and related medical conditions, breastfeeding, and conditions related to breastfeeding), gender, gender identity, gender expression, national origin, ancestry, age (40 or over), physical or medical disability, medical condition, marital status, registered domestic partner status, sexual orientation, genetic information, military and/or veteran status, or any other basis prohibited by applicable state or federal law.
The Company hires the best qualified candidate for the job, without regard to protected characteristics.
Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
SoFi is committed to an inclusive culture. As part of this commitment, SoFi offers reasonable accommodations to candidates with physical or mental disabilities. If you need accommodations to participate in the job application or interview process, please let your recruiter know or email accommodations@sofi.com.
Due to insurance coverage issues, we are unable to accommodate remote work from Hawaii or Alaska at this time.
Internal Employees
If you are a current employee, do not apply here; please navigate to our Internal Job Board in Greenhouse to apply to our open roles.