
Hey jmqst011@gmail.com,

 

Your first ngrok endpoint is live, awesome! The next step many teams hit is scale: how to keep traffic flowing if one endpoint slows or restarts. ngrok Endpoint Pools solves this problem.

 

Let’s make your app more resilient: use Endpoint Pools with traffic policies to balance requests across multiple replicas of your service, using just the two ngrok http commands below.

Load balance anything, anywhere with ngrok's Endpoint Pools

Create an endpoint pool with two of your ngrok endpoints. First, fire up one endpoint:

ngrok http 8080 --url https://your-app.ngrok.dev --pooling-enabled

Then, run the same command again in a separate terminal (or another machine):

ngrok http 8080 --url https://your-app.ngrok.dev --pooling-enabled

Traffic to https://your-app.ngrok.dev will now be load balanced between the two.
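Want to see which replica handled each request? Before pointing ngrok at port 8080, you can have each instance serve a response that identifies it. Here is a minimal sketch in Python (the REPLICA_NAME value is an assumption for the demo; in practice each replica would set its own name and listen on port 8080 rather than an ephemeral port):

```python
import http.server
import threading
import urllib.request

# Assumption for the demo: set a different name on each replica,
# e.g. via an environment variable or config file.
REPLICA_NAME = "replica-1"

class ReplicaHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond with the replica's name so pooled traffic is traceable.
        body = f"served by {REPLICA_NAME}\n".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Port 0 picks a free port for this self-contained demo;
# a real replica would bind 8080 to match the ngrok command above.
server = http.server.HTTPServer(("127.0.0.1", 0), ReplicaHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
    print(resp.read().decode())  # prints "served by replica-1"
server.shutdown()
```

With one of these running behind each agent, repeated requests to the pooled URL will show responses coming from different replicas.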

 

That's it. Truly. 

 

When two or more endpoints share a URL, ngrok load-balances between them whether they're running on different machines, environments, networks, or even in different clouds.


Set up load balancing

 

With pools, you can:

  • Increase capacity to handle traffic surges
  • Swap endpoints in and out instantly, with no manual registration or deregistration
  • Tolerate failures in any replica or its environment

Take the next step and keep your app online to handle what the public internet throws at it.

 

<3

Team @ ngrok

GitHub
LinkedIn
X
YouTube

ngrok Inc., 2261 Market Street, STE 71467, San Francisco, California 94114, United States

Unsubscribe Manage preferences