Avoiding memory leaks with Spring Boot WebClient


If you're performing web requests with Spring Boot's WebClient, you may, just like us, have read that the URL of your request should be defined using a URI builder (e.g. Spring 5 WebClient):

webClient.get()
         .uri(uriBuilder -> uriBuilder.path("/v2/products/{id}")
                                      .build(productId))

If so, we suggest that you ignore what you read (unless hunting down hard-to-find memory leaks is your hobby) and construct the URI like this instead:

webClient.get().uri("/v2/products/{id}", productId)

In this blog post we'll explain how to avoid memory leaks with Spring Boot WebClient and why it's better to avoid the former pattern, using our personal experience as motivation.

How did we discover this memory leak?

A while back we upgraded our application to use the latest version of the Axle framework. Axle is the bol.com framework for building Java applications, like (REST) services and frontend applications. It relies heavily on Spring Boot, and this upgrade also involved updating from Spring Boot version 2.3.12 to version 2.4.11.

When running our scheduled performance tests, everything looked fine. Most of our application's endpoints still provided response times of under 5 milliseconds. However, as the performance test progressed, we noticed our application's response times rising up to 20 milliseconds, and after a long-running load test over the weekend, things got a lot worse. The response times skyrocketed to seconds. Not good.

After a long staring contest with our Grafana dashboards, which provide insights into our application's CPU, thread and memory usage, this memory usage pattern caught our eye:

[Figure: Grafana graph of the JVM heap size before, during, and after the performance test]

This graph shows the JVM heap size before, during, and after a performance test that ran from 21:00 to 0:00. During the performance test, the application created threads and objects to handle all incoming requests. So, the erratic line showing the memory usage during this period is exactly what we'd expect. However, when the dust from the performance test settles down, we'd expect the memory to also settle back down to the same level as before, but it is actually higher. Does anyone else smell a memory leak?

Time to call in MAT (Eclipse Memory Analyzer Tool) to find out what causes this memory leak.

What caused this memory leak?

To troubleshoot this memory leak we:

  • Restarted the application.
  • Performed a heap dump (a snapshot of all the objects that are in memory in the JVM at a certain moment).
  • Triggered a performance test.
  • Performed another heap dump once the test finished.

This allowed us to use MAT's advanced feature of detecting leak suspects by comparing two heap dumps taken some time apart. But we didn't need to go that far, since the heap dump from after the test was enough for MAT to find something suspicious:

[Figure: MAT leak suspects report]

Here MAT tells us that one instance of Spring Boot's AutoConfiguredCompositeMeterRegistry occupies almost 500MB, which is 74% of the total used heap size. It also tells us that it has a (concurrent) hashmap that is responsible for this. We're almost there!

With MAT's dominator tree feature, we can list the largest objects and see what they keep alive. That sounds useful, so let's use it to take a peek at what's inside this humongous hashmap:

[Figure: MAT dominator tree view of the hashmap's contents]

Using the dominator tree we were able to easily browse through the hashmap's contents. In the picture above we opened two hashmap nodes. Here we see lots of Micrometer timers tagged with "v2/products/…" and a product id. Hmm, where have we seen that before?

What does WebClient have to do with this?

So, it's Spring Boot's metrics that are responsible for this memory leak, but what does WebClient have to do with this? To find that out, you really need to understand what causes Spring's metrics to store all these timers.

Inspecting the implementation of AutoConfiguredCompositeMeterRegistry, we see that it stores the metrics in a hashmap named meterMap. So, let's put a well-placed breakpoint on the spot where new entries are added and trigger the suspicious call our WebClient performs to the "v2/product/{productId}" endpoint.

We run the application again and … gotcha! For each call the WebClient makes to the "v2/product/{productId}" endpoint, we saw Spring creating a new Timer for each unique instance of the product identifier. Each such timer is then stored in the AutoConfiguredCompositeMeterRegistry bean. That explains why we see so many timers with tags like these:

/v2/products/9200000109074941
/v2/products/9200000099621587
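
To illustrate the effect in isolation, here is a minimal sketch of what registering a timer per expanded URI amounts to. It uses a plain Micrometer SimpleMeterRegistry rather than Spring's auto-configured registry, and made-up product ids:

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Timer;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

import java.util.List;

public class TimerCardinalitySketch {

    public static void main(String[] args) {
        MeterRegistry registry = new SimpleMeterRegistry();

        // When the fully expanded URI ends up in the "uri" tag, every distinct
        // product id produces its own Timer instance inside the registry.
        for (long productId : List.of(9200000109074941L, 9200000099621587L)) {
            Timer.builder("http.client.requests")
                 .tag("uri", "/v2/products/" + productId)
                 .register(registry);
        }

        // Grows with every unique product id the application ever requests.
        System.out.println(registry.getMeters().size()); // 2
    }
}

With a templated URI, both requests would instead share a single timer tagged "/v2/products/{id}".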

How can you fix this memory leak?

Before we identify when this memory leak could affect you, let's first explain how one would fix it. We mentioned in the introduction that by simply not using a URI builder to construct WebClient URLs, you can avoid this memory leak. Now we will explain why that works.

After a little online research we came across this post (https://rieckpil.de/expose-metrics-of-spring-webclient-using-spring-boot-actuator/) by Philip Riecks, in which he explains:

"As we usually want the templated URI string like "/todos/{id}" for reporting and not multiple metrics e.g. "/todos/1337" or "/todos/42". The WebClient offers several ways to construct the URI […], which you can all use, except one."

And that one method is using the URI builder, coincidentally the one we were using:

webClient.get()
         .uri(uriBuilder -> uriBuilder.path("/v2/products/{id}")
                                      .build(productId))

Riecks continues in his post that "[w]ith this solution the WebClient doesn't know the URI template origin, as it gets passed the final URI."

So the solution is as simple as using one of those other methods to pass in the URI, such that the WebClient gets passed the templated, and not the final, URI:

webClient.get().uri("/v2/products/{id}", productId)

Indeed, when we construct the URI like that, the memory leak disappears. Also, the response times are back to normal again.
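
For completeness, here is a slightly fuller sketch of the templated variant in context. The base URL, the Product class and the surrounding wiring are made up for illustration:

import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

public class ProductClientSketch {

    private final WebClient webClient = WebClient.create("https://example.org");

    Mono<Product> getProduct(long productId) {
        return webClient.get()
                // The template and the variable are passed separately, so the
                // metrics are tagged with "/v2/products/{id}" rather than with
                // one value per product id.
                .uri("/v2/products/{id}", productId)
                .retrieve()
                .bodyToMono(Product.class);
    }

    record Product(String id, String title) {}
}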

When could the memory leak affect you? – a simple answer

Do you need to worry about this memory leak? Well, let's start with the most obvious case. If your application exposes its HTTP client metrics, and uses a method that takes a URI builder to set a templated URI onto a WebClient, you should definitely be worried.

You can easily check whether your application exposes HTTP client metrics in two different ways:

  1. Inspect the "/actuator/metrics/http.client.requests" endpoint of your Spring Boot application after your application has made at least one external call. A 404 means your application does not expose them.
  2. Check whether the value of the application property management.metrics.enable.http.client.metrics is set to true, in which case your application does expose them.

However, this doesn't mean you're safe if you're not exposing the HTTP client metrics. We've been passing templated URIs to the WebClient using a builder for ages, and we have never exposed our HTTP client metrics. Yet, all of a sudden this memory leak reared its ugly head after an application upgrade.

So, could this memory leak affect you then? Just don't use URI builders with your WebClient and you should be protected against this potential memory leak. That would be the simple answer. You don't take simple answers? Fair enough, read on to find out what really caused this for us.

When could the memory leak affect you? – a more complete answer

So, how did a simple application upgrade cause this memory leak to rear its ugly head? As it turns out, the addition of a transitive Prometheus (https://prometheus.io/) dependency, an open source monitoring and alerting framework, caused the memory leak in our particular case. To understand why, let's go back to the situation before we added Prometheus.

Before we dragged in the Prometheus library, we pushed our metrics to statsd (https://github.com/statsd/statsd), a network daemon that listens to and aggregates application metrics sent over UDP or TCP. The StatsdMeterRegistry that is part of the Spring framework is responsible for pushing metrics to statsd. The StatsdMeterRegistry only pushes metrics that are not filtered out by a MeterFilter. The management.metrics.enable.http.client.metrics property is an example of such a MeterFilter. In other words, if

management.metrics.enable.http.client.metrics = false

the StatsdMeterRegistry won't push any HTTP client metric to statsd and won't store these metrics in memory either. So far, so good.
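
Conceptually, such a property boils down to a deny filter on the individual registry. A minimal sketch of that idea, using a plain Micrometer SimpleMeterRegistry as a stand-in for the StatsdMeterRegistry rather than Spring Boot's actual auto-configuration, could look like this:

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.config.MeterFilter;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

public class DenyHttpClientMetricsSketch {

    public static void main(String[] args) {
        // Stand-in for the StatsdMeterRegistry in our setup.
        MeterRegistry registry = new SimpleMeterRegistry();

        // Deny every meter whose name starts with "http.client": timers for
        // outgoing WebClient calls are neither stored nor pushed anywhere.
        registry.config().meterFilter(MeterFilter.denyNameStartsWith("http.client"));

        registry.timer("http.client.requests", "uri", "/v2/products/{id}")
                .record(() -> {});

        System.out.println(registry.getMeters().size()); // 0: the timer was filtered out
    }
}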

By adding the transitive Prometheus dependency, we added yet another meter registry to our application, the PrometheusMeterRegistry. When there is more than one meter registry to expose metrics to, Spring instantiates a CompositeMeterRegistry bean. This bean keeps track of all individual meter registries, collects all metrics and forwards them to all the delegates it holds. It's the addition of this bean that caused the issue.

The issue is that MeterFilter instances aren't applied to the CompositeMeterRegistry, but only to the MeterRegistry instances inside the CompositeMeterRegistry (see this commit for more information). That explains why the AutoConfiguredCompositeMeterRegistry accumulates all the HTTP client metrics in memory, even when we explicitly set management.metrics.enable.http.client.metrics to false.
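
A rough sketch of that composite setup (illustrative wiring, not Spring Boot's exact auto-configuration) shows why this matters: filters attached to the delegates do not stop the composite itself from creating and holding a meter per unique tag combination:

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.composite.CompositeMeterRegistry;
import io.micrometer.core.instrument.config.MeterFilter;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;

public class CompositeFilterSketch {

    public static void main(String[] args) {
        // Stand-ins for the statsd and Prometheus registries in our application.
        MeterRegistry statsdLike = new SimpleMeterRegistry();
        MeterRegistry prometheusLike = new SimpleMeterRegistry();

        // The deny filter is applied to the delegates only.
        statsdLike.config().meterFilter(MeterFilter.denyNameStartsWith("http.client"));
        prometheusLike.config().meterFilter(MeterFilter.denyNameStartsWith("http.client"));

        CompositeMeterRegistry composite = new CompositeMeterRegistry();
        composite.add(statsdLike);
        composite.add(prometheusLike);

        // Metrics are recorded against the composite, which keeps its own meter
        // per unique tag combination even though both delegates drop it.
        composite.timer("http.client.requests", "uri", "/v2/products/9200000109074941")
                 .record(() -> {});

        System.out.println(composite.getMeters().size());      // 1: kept by the composite
        System.out.println(statsdLike.getMeters().size());     // 0: filtered out
        System.out.println(prometheusLike.getMeters().size()); // 0: filtered out
    }
}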

Still confused? No worries, just don't use URI builders with your WebClient and you should be protected against this memory leak.

Conclusion

In this blog post we explained that this approach of defining the URLs of your request with Spring Boot's WebClient is best avoided:

webClient.get()
         .uri(uriBuilder -> uriBuilder.path("/v2/products/{id}")
                                      .build(productId))

We showed that this approach, which you might have come across in some online tutorial, is prone to memory leaks. We elaborated on why these memory leaks happen and showed that they can be avoided by defining parameterised request URLs like this:

webClient.get().uri("/v2/products/{id}", productId)
