
The Edge is the Place to Be

Real-Time Analysis of Data in the Cloud Is Archaic, Expensive, and Limited. Welcome to the future.
Ozan Unlu
Founder & CEO
Jul 24, 2021
5 minute read

Like the adage that knowledge is power, the new truth is that data is valuable. However, data must be analyzed to unlock its value. In fact, idle data is worse than useless – it incurs cost while adding nothing. With data volumes expanding exponentially, corporations face increasingly significant challenges in unlocking that value, especially for DevOps, Security, and SRE teams.

That is our vision at Edge Delta: To help corporations unlock the value of data. And others see the value in what we’re doing. My co-founder, Fatih Yildiz, and I are very proud to announce a $15-million Series A round led by Menlo Ventures and Tim Tully, the former CTO of Splunk. Previous investors, MaC Venture Capital and Amity Ventures, are participating as well.

But let’s take a step back. Even five years ago, data volumes were comparatively small, and it was no problem to simply forward all observability data to the cloud, index it, then analyze it. But in that short time, gigabytes have turned into terabytes, and terabytes are turning into petabytes. Effectively analyzing those massive volumes of data in a reasonable timeframe is now a serious challenge: the traditional, centralized cloud model creates massive bottlenecks with severe financial and technical limitations.

That’s where Edge Delta comes in.

Edge Delta is changing the way that people analyze data. We firmly believe it makes no sense for companies to push and centralize all their data within the cloud before gaining any value from it. The operational requirements for real-time insights and decision making have far surpassed what can be supported with the centralized model. Waiting for all data to be compressed, encrypted, transmitted, unpacked, indexed, queried, dashboarded, and finally alerted on in the cloud is slow, inefficient, costly, and, at large enough volumes, simply impossible. Our premise is to start the analysis of data where it’s created (K8s containers, AWS EC2, Azure Functions, physical machines, network devices) rather than where it ultimately ends up – that’s how we’ve defined the Edge.

This take on machine data analytics lets corporations analyze machine data output (logs, metrics, events, traces, telemetry) at the source and allows for deeper, more sophisticated real-time analysis. The upsides of “on-the-edge” stream processing, rather than first dumping everything into a data lake, are clear: within seconds of deploying, your data visibility goes from 5 to 100 percent.

Edge Delta makes intelligent decisions to summarize, compress, and uplevel data at the edge, which allows enterprises to analyze all of their data without worrying about overages or crushing costs. Edge Delta typically results in an almost 90-percent improvement in TCO compared to traditional centralized monitoring solutions.
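
To make the idea concrete, here is a minimal, hypothetical sketch of edge-side summarization: a window of raw log lines is collapsed into per-level counts and a handful of error samples before anything leaves the host. The function and field names are illustrative assumptions, not Edge Delta’s actual agent or API.

```python
import re
from collections import Counter

# Hypothetical illustration of edge-side summarization: collapse raw log
# lines into a small summary payload at the source, so only a fraction of
# the data needs to be shipped to a central backend. Names and limits here
# are illustrative, not Edge Delta's agent.

LEVEL_PATTERN = re.compile(r"\b(DEBUG|INFO|WARN|ERROR)\b")

def summarize_window(lines):
    """Collapse a window of raw log lines into a compact summary."""
    counts = Counter()
    errors = []
    for line in lines:
        match = LEVEL_PATTERN.search(line)
        level = match.group(1) if match else "UNKNOWN"
        counts[level] += 1
        if level == "ERROR":
            errors.append(line)          # keep raw detail only for errors
    return {
        "total_lines": len(lines),
        "counts_by_level": dict(counts), # tiny metric payload vs. raw logs
        "error_samples": errors[:5],     # cap what gets forwarded upstream
    }

if __name__ == "__main__":
    window = [
        "2021-07-24T12:00:01 INFO  checkout ok",
        "2021-07-24T12:00:02 WARN  retrying payment gateway",
        "2021-07-24T12:00:03 ERROR payment gateway timeout",
    ]
    print(summarize_window(window))
```

The point of the sketch is the shape of the trade-off: the summary forwarded upstream is a few hundred bytes regardless of how many raw lines were produced, which is what makes analyzing 100 percent of the data economically feasible.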

And, suddenly, teams get complete visibility of all of their data. It is transformative.

We know that the strictly centralized cloud model is no longer viable because we’ve worked for the competitors. I was in the trenches with customers when I realized that the system was primitive and archaic. My past customers were frustrated; time and time again I heard, “It’s too expensive. It doesn’t work correctly. It’s not fast enough.”

Those costs, speeds, and perceptions have real-world implications – they lead to DevOps, Security, and SRE teams continually being told no. No, you can’t put data into Splunk, Datadog, New Relic, or Elastic, because we’re at our license limit. No, performance is already dreadful and our queries are getting queued. No, we don’t think that data is necessary to accomplish X and Y. Teams are forced to make an unnecessary choice between controlling ballooning analytics costs and serving their customers.

Meanwhile, there’s a notable disconnect. The CIO believes that the company is analyzing all of the data and is representing that to the board. Companies only have to ask one layer down to discover this is not the case. When DevOps, Security, and SRE teams are forced to try to predict the future and prioritize data by handpicking what gets analyzed and neglecting the rest, it’s the business that suffers.

Edge Delta believes that this choice is unnecessary. With our engineering backgrounds, it is painful and frustrating to see great teams forced to throw away massive amounts of data (and value) for budgetary reasons. It’s unscientific. It is, in fact, possible to derive all of the value from the data while also making rational decisions about costs. And Edge Observability is what makes this new reality possible.

We are only at the start of this journey. But right now, know this: we’re about a year into serving clients, and the results have been game-changing, reinforcing our core belief that it is time to break ties with a model that is slow, expensive, and severely limited.

The possibilities are enormous, the skies are blue, and we’re moving as fast as we can. After all, there’s a world of data out there that is waiting to be analyzed.

See the coverage on TechCrunch:
https://techcrunch.com/2021/06/25/edge-delta-raises-15m-series-a-to-take-on-splunk/

Read why Tim Tully at Menlo Ventures invested:
https://medium.com/@timtullydevnull/why-menlo-ventures-invested-in-edge-delta-1074028e1e58

Follow the press release:
https://www.prnewswire.com/news-releases/edge-delta-raises-15-million-series-a-funding-to-continue-changing-the-way-that-data-is-analyzed-301320237.html
