Industry

Living On The Edge: Fewer Servers. Less Code. More Security.

Today we’re releasing our integration with Cloudflare: an adaptive authentication layer implemented at the edge. This is not just yet another way of integrating Castle; it’s the codeless way of integrating Castle. The module is built on top of Cloudflare’s recent release of Apps with Workers, and the plan is to eventually open-source it and bring the same functionality to other serverless environments. If you’re already on Cloudflare, you can check out the Castle app here.

The increasing complexity of customer authentication

With the growing threat of account compromise and abuse, there’s been an uptick in new security and risk management solutions aimed at protecting your app and your users: credit card chargeback prevention, passport verification, spam prevention, bot detection, IP blacklists, and of course adaptive or risk-based authentication (often referred to as account takeover prevention in the fraud community).

Having access to an arsenal of anti-abuse tools is indeed powerful, but we’ve also seen how the adoption of these technologies is often too slow to keep up with the rapid rise of threats.

Managing user risk in a modern customer-facing app requires a considerable amount of collaboration between your security, fraud, product, and growth teams; they’ll need to work together on A/B testing different verification flows, measuring conversion at each step, and running experiments until they’ve found the ideal balance between security and user experience.

This need for iteration becomes especially problematic because one of the most common obstacles to implementing a new security and risk management solution is the considerable number of development cycles it consumes. Each API has its own way of ingesting user, event, and request parameters, as well as its own preference for how the risk response should be handled: CAPTCHA, MFA, email verification, or asking the user to contact support.

If you’re on the security or fraud team, you know how hard it can be to unlock engineering hours for a new proof of concept. And because evaluation and iteration require so much knowledge and so many resources, the initial implementation will typically remain untouched after the evaluation period.

The ideal scenario would be a standard for adaptive authentication; a framework like Active Directory where every API could simply plug into a predefined protocol. Unfortunately, such a standard doesn’t exist for the consumer internet; every web framework has its own way of handling authentication and authorization. There are plenty of options for customer authentication, both open source and commercial, but they all have their own hooks, flows, actions, and ways of integrating into the user journey.

The vision: Centralize the authentication logic to the edge

With the Cloudflare app, we’re taking the first step towards what we think could become a standard for adaptive authentication and customer risk management.

The idea is to move all the risk logic out of the web application into the request layer and intercept key business logic routes such as logins, email updates, and transactions. Once the request has been vetted by the adaptive authentication layer, the layer forwards it to your internal application and attaches a verdict that helps you determine whether to let the user proceed, put them through a verification flow, or deny the request and log them out.
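
To make the interception concrete, here’s a minimal sketch of what such an edge worker could look like, written in TypeScript against Cloudflare’s Workers service-worker API (types from @cloudflare/workers-types). The risk endpoint, header name, and verdict values are illustrative assumptions, not the actual Castle integration.

```typescript
// Sketch of an edge worker that vets sensitive routes before they reach the
// origin. The risk endpoint, header name, and verdict values below are
// hypothetical placeholders, not the actual Castle API.

const PROTECTED_ROUTES = ['/login', '/settings/email', '/transactions'];

addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);

  // Non-sensitive traffic passes straight through to the origin.
  if (!PROTECTED_ROUTES.includes(url.pathname)) {
    return fetch(request);
  }

  // Ask the risk engine for a verdict on this request (hypothetical endpoint).
  const riskResponse = await fetch('https://risk.example.com/v1/authenticate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      ip: request.headers.get('CF-Connecting-IP'),
      userAgent: request.headers.get('User-Agent'),
      path: url.pathname,
    }),
  });
  const { verdict } = (await riskResponse.json()) as { verdict: string };

  // Forward the original request to the origin with the verdict attached,
  // so the application can decide whether to proceed, challenge, or deny.
  const headers = new Headers(request.headers);
  headers.set('X-Auth-Verdict', verdict);
  return fetch(new Request(request, { headers }));
}
```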

This entirely relieves your web services of keeping track of all the risk scenarios you’d otherwise need to handle, for example, what to do when a user with admin rights tries to update their email address from a previously unseen device. Instead, the web service simply looks at the information attached by the gateway and issues the appropriate response.
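
On the application side, the service behind the gateway only needs to act on that verdict. Here’s a hypothetical Node/Express sketch in TypeScript; the X-Auth-Verdict header and its values match the illustrative worker above rather than any actual Castle contract.

```typescript
import express from 'express';

const app = express();

// Trust the verdict computed at the edge instead of re-implementing risk
// logic here. 'X-Auth-Verdict' and its values are illustrative only.
app.use((req, res, next) => {
  const verdict = req.header('X-Auth-Verdict');

  switch (verdict) {
    case 'deny':
      // e.g. terminate the session and force a fresh login
      return res.status(403).json({ error: 'Access denied, please sign in again.' });
    case 'challenge':
      // e.g. route the user into an MFA or email verification flow
      return res.redirect('/verify');
    default:
      return next(); // 'allow' (or no verdict): proceed as usual
  }
});

app.post('/settings/email', (req, res) => {
  // Business logic runs only for requests the edge layer has already vetted.
  res.json({ status: 'email updated' });
});

app.listen(3000);
```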

Cloudflare is our first integration that allows customers to centralize user risk management, but this serverless approach will work in pretty much any environment where we have write access to both the incoming and outgoing HTTP requests:

  • CDN: Cloudflare, Fastly, Lambda@Edge, StackPath. The advantage here is that you can block traffic at the edge before it reaches your app, and you’ll be able to protect all your services with the push of a button. That said, most CDNs aren’t yet at the same maturity level as Cloudflare when it comes to running serverless code inline in the request flow.
  • Web server: NGINX, Apache, IIS. Similar to the CDN approach, but in this case the adaptive authentication layer might sit behind a load balancer, and multiple deployments may be needed to cover all your services. On the other hand, you’re not locked into a specific CDN vendor.
  • Web Application Firewall (WAF): Imperva, F5, Barracuda. Similar to the web server implementation, but requires the WAF vendor to either support the framework or allow running custom code.
  • Web application: Rails/Rack, Python/Django, Node/Express. Ideal for a simple single-service application, and you’re less dependent on which CDN and web server sit in front of your app. The caveat is that you’ll need to install an authentication module in each service, so it’s less suitable if you have a microservice architecture (see the sketch after this list).
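
For the web application option above, the same logic can live inside each service as middleware that queries the risk engine directly, instead of reading a verdict computed upstream. A minimal sketch, again with a hypothetical endpoint, assuming Node 18+ for the global fetch and with error handling omitted for brevity:

```typescript
import express from 'express';

const app = express();

// In-app variant: the service queries the (hypothetical) risk endpoint itself,
// so each microservice needs its own copy of this middleware.
app.use(async (req, res, next) => {
  const response = await fetch('https://risk.example.com/v1/authenticate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ip: req.ip, userAgent: req.get('User-Agent'), path: req.path }),
  });
  const { verdict } = (await response.json()) as { verdict: string };

  if (verdict === 'deny') return res.status(403).end();
  if (verdict === 'challenge') return res.redirect('/verify');
  return next();
});
```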

We’re excited about where the serverless community is headed, and we see Cloudflare as a pioneer in the space and a first step toward delivering a robust solution that can be deployed from the edge. Serverless is a priority for them, and performance benchmarks keep showing that they’re spearheading the shift to edge computing. We’re proud to be among the first use cases of their Apps with Workers functionality, and we have some pretty groundbreaking innovations in the pipeline that will leverage Workers. Stay tuned!

Learn how our customers are protecting their online accounts with Castle.