Job Opening at Edge Delta


Principal Data Engineer



Seattle, WA

What We Do

Edge Delta is an advanced monitoring technology company that optimizes centralized monitoring solutions by intelligently summarizing machine data (logs, metrics, events, and traces). Edge Delta enhances customers' business continuity by providing limitless observability while simultaneously making their existing monitoring systems significantly more efficient.

We are building this platform the way startups should — with ruthless prioritization and a live, demanding customer base. By joining as a Principal Data Engineer, you will have the opportunity to make our vision a reality, one feature at a time.

What You Will Be Doing

  • Building, improving, maintaining, and scaling stream processing services.
  • Writing code. Reviewing code. Revising code.
  • Making every engineering decision with < 1% CPU core utilization and 100 MB of RAM top of mind.
  • Giving feedback on our standards. Holding your teammates to them.
  • Collaborating with teammates on major feature designs. Sometimes, you will own features, sometimes others will.
  • Helping our team grow organically. We value referrals. We value your feedback on candidates.
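To give a flavor of the resource-conscious stream processing described above, here is a minimal Go sketch of summarizing a stream of log lines into per-severity counts without buffering the lines themselves. The function name and log format are hypothetical, for illustration only; they are not Edge Delta's actual API.

```go
package main

import (
	"fmt"
	"strings"
)

// severityCounts consumes a stream of log lines and returns counts per
// severity level. Memory stays bounded: only a small map of counters is
// held, never the lines themselves.
// (Illustrative sketch; the name and "SEVERITY message" log format are
// assumptions, not Edge Delta's real interface.)
func severityCounts(lines <-chan string) map[string]int {
	counts := make(map[string]int, 4) // one entry per severity level
	for line := range lines {
		sev := "UNKNOWN"
		if i := strings.IndexByte(line, ' '); i > 0 {
			sev = strings.ToUpper(line[:i])
		}
		counts[sev]++
	}
	return counts
}

func main() {
	lines := make(chan string, 3)
	lines <- "ERROR disk full"
	lines <- "INFO service started"
	lines <- "ERROR upstream timeout"
	close(lines)
	fmt.Println(severityCounts(lines))
}
```

The same shape scales to any per-key aggregation whose keyspace is small relative to the data volume — the essence of trading bytes per second against a fixed memory budget.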

Who You Are

  • You are a software engineer. (We treat our data systems as software systems, and engineer them accordingly.)
  • You love working with data. (Small data. Big data. All the data.)
  • You are excited to optimize for bytes per second (not requests per second).
  • You know (or want to write software in) Go.
  • You love collecting data about your software as much as writing software that analyzes data. We measure everything. We make data-driven decisions.
  • You are collaborative. Nothing this hard can be accomplished by working alone. We work as a team.
  • Master's in Computer Science or equivalent experience (a PhD in CS is a huge plus).

Bonus Points

  • Experience developing observability tooling (log management, metrics monitoring, distributed tracing)
  • Familiarity with industry standard observability technologies (Splunk, New Relic, Elastic, DataDog, Sumo Logic, Prometheus, Jaeger, etc.)
  • Experience with performance optimization techniques for stream processing engines (Flink, Kinesis, Flume, Storm, etc.)
To apply, email your cover letter and resume to
