Reduce memory allocation #127

Merged
ogaca-dd merged 12 commits into master from olivierg/ReduceMemoryAllocation on Jul 29, 2020
Conversation

@ogaca-dd (Contributor)
This PR reduces memory allocations. Sending a counter is now about twice as fast as before.

Reviewing commit by commit is recommended.

@truthbk (Member) left a comment
I like this; it has some similarities with the Java approach, but it looks even cleaner to me.

I just have a doubt regarding the concurrent queue where we drop the SerializedMetric: it looks like we will always hold a reference, so the GC will never collect those objects. Obviously that is part of the goal, to reuse those buffers, but I am wondering whether under heavy load the RSS of the application might balloon up a little bit.

Let me know your thoughts.

public void Dispose()
{
    Builder.Clear();
    _pool.Enqueue(this);
}
Member
Interesting approach, I like it.

internal class SerializerHelper
{
    private static readonly string[] EmptyArray = new string[0];
    private readonly ConcurrentQueue<SerializedMetric> _pool = new ConcurrentQueue<SerializedMetric>();
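The two snippets above can be combined into a minimal sketch of the pooling pattern under discussion. The type and member names follow the PR, but the bodies here are simplified assumptions for illustration, not the actual code from this branch:

```csharp
using System;
using System.Collections.Concurrent;
using System.Text;

// Sketch only: SerializedMetric returns itself to the shared pool on
// Dispose, so the buffer is reused instead of reallocated per metric.
internal class SerializedMetric : IDisposable
{
    private readonly ConcurrentQueue<SerializedMetric> _pool;

    public SerializedMetric(ConcurrentQueue<SerializedMetric> pool) => _pool = pool;

    public StringBuilder Builder { get; } = new StringBuilder();

    public void Dispose()
    {
        Builder.Clear();
        _pool.Enqueue(this);
    }
}

internal class SerializerHelper
{
    private readonly ConcurrentQueue<SerializedMetric> _pool = new ConcurrentQueue<SerializedMetric>();

    public SerializedMetric GetSerializedMetric()
    {
        // Reuse a pooled instance when one is available; allocate otherwise.
        return _pool.TryDequeue(out var metric) ? metric : new SerializedMetric(_pool);
    }
}
```

This is the allocation-free steady state the PR aims for: after warm-up, serialization reuses the same StringBuilder instances rather than allocating a new one per metric.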
Member
There is no upper bound on this queue, right? It should eventually reach a state of equilibrium, but that state is dependent on load. I'm wondering if this is perhaps too optimistic an approach?

@ogaca-dd (Contributor, Author) Jul 22, 2020
Indeed, if the processing of the metrics is not fast enough, it can lead to an out-of-memory condition. I will fix it.

> I just have a doubt regarding the concurrent queue where we drop the SerializedMetric: it looks like we will always hold a reference, so the GC will never collect those objects. Obviously that is part of the goal, to reuse those buffers, but I am wondering whether under heavy load the RSS of the application might balloon up a little bit.

If we limit the size of the pool, it should be OK. What do you think?
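A size-limited pool along the lines suggested could look like the following. This is a hypothetical sketch of the idea being discussed, not the generic pool implemented in the follow-up PR; the name BoundedPool and the counter-based cap are assumptions:

```csharp
using System.Collections.Concurrent;
using System.Threading;

// Hypothetical bounded pool: once the cap is reached, returned objects
// are dropped and left to the GC, which bounds steady-state memory
// even when metric processing falls behind.
internal class BoundedPool<T> where T : class
{
    private readonly ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
    private readonly int _maxSize;
    private int _count;

    public BoundedPool(int maxSize) => _maxSize = maxSize;

    public bool TryRent(out T item)
    {
        if (_queue.TryDequeue(out item))
        {
            Interlocked.Decrement(ref _count);
            return true;
        }
        return false;
    }

    public void Return(T item)
    {
        // Only keep the object if the pool is not already full.
        if (Interlocked.Increment(ref _count) <= _maxSize)
        {
            _queue.Enqueue(item);
        }
        else
        {
            Interlocked.Decrement(ref _count);
        }
    }
}
```

With a cap like this, the pool reaches equilibrium at min(load, maxSize) pooled buffers instead of growing without bound under load.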

@ogaca-dd (Contributor, Author)
I am working on creating a generic pool to limit the number of allocated objects. I am doing it in a separate PR as it introduces several changes.


Base automatically changed from olivierg/SplitStatsdUnitTest to master July 29, 2020 08:59
@ogaca-dd ogaca-dd merged commit c498393 into master Jul 29, 2020
@ogaca-dd ogaca-dd deleted the olivierg/ReduceMemoryAllocation branch July 29, 2020 09:13
