
Analytics Engineer


Risk Labs

📍 Remote · 💰 Competitive · 🕐 Posted 2 days ago

Tags: data-engineer · remote · uma · across · python · dbt · bigquery · airflow · sql · gcp

Job Description

Who is Risk Labs?

Risk Labs is the foundation and core team behind UMA and Across — decentralised protocols governed by community members across the globe. UMA's optimistic oracle (OO) can record any verifiable truth or data onto a blockchain. Across is leading the future of interoperability with its frontier intents-based architecture.

We are a remote-first, globally distributed team focused on building infrastructure that pushes crypto forward.

Why This Role Exists

This is the first analytics engineering role at Risk Labs. The transformation layer has reached a level of complexity that demands a dedicated owner. You'll sit within Data Engineering, reporting to the Platform Engineering Lead, and work most closely with the Data Analytics Lead and Product team.

We are serious about building a truly agentic data platform, and this hire is a prerequisite for that. Agentic systems are only as good as the data they run on.

What You'll Own

  • The Transformation Layer — Directly responsible individual (DRI) for everything between raw ingestion and the clean data layer. Own the modelling strategy.

  • Refactor and Legacy Migration — Audit, cut redundancy, and rebuild into something clean, traceable, and maintainable.

  • Data Quality and Testing — Design and own the approach to data quality: testing, alerting, and column-level lineage.

  • BigQuery Cost Optimisation — Own query and storage efficiency, and refactor materialisation strategies to reduce unnecessary spend.

  • Event Data and Product Observability — Build a robust event data model that gives meaningful observability across the full product suite.

Skills and Experience

Required

  • Deep expertise in data modelling across multiple time horizons, dimensions, and levels of granularity

  • Advanced SQL: performant, readable, and warehouse-aware

  • Experience owning a transformation layer in production, including a meaningful refactor or migration

  • Hands-on experience designing and implementing data quality frameworks: testing, alerting, and lineage

  • Experience with event data and product analytics tooling (Amplitude, Segment, or similar)

  • Experience with crypto data environments characterised by high normalisation and irregular schemas

Nice to Have

  • Experience with dbt

  • Familiarity with BigQuery: query optimisation, partitioning, clustering, materialisation strategies

  • Practical use of AI/LLM tooling to accelerate workflows

Tech Stack

BigQuery, dbt, Python, Airflow, Amplitude, Preset/Superset, Hex, GCP (Cloud Run, Cloud Build, Cloud Functions, Datastream)

Compensation and Benefits

  • Competitive compensation with a mix of salary, tokens, and equity

  • Paid in stablecoins or fiat, your choice

  • Unlimited vacation, and people actually take it

  • 100% remote

  • At least two company-wide offsites per year

  • Family care, training, and development support

Unchain Data provides Web3 data job aggregation as a common good. Jobs are posted by third parties and are not individually verified. Always exercise caution: never download software requested during a hiring process, avoid clicking unfamiliar links during interviews, verify that URLs are legitimate, and use trusted meeting tools such as Google Meet or Zoom.