
Conversation

kjnsn commented Sep 15, 2016

When the reader/writer returned an io.EOF, it was passed back to the client instead of marking the TCP connection as bad and attempting to redial. This implements an exponential backoff redial system.
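
Roughly, the change means an io.EOF from the socket triggers a redial with exponentially increasing delays rather than being returned to the caller. A minimal sketch of that backoff loop (the helper name, timeouts, and attempt limit are illustrative, not the actual code in this diff):

```go
package redial

import (
	"net"
	"time"
)

// dialWithBackoff keeps redialing the memcached server with exponentially
// increasing delays. Names and constants here are illustrative; the real
// change wires this behaviour into the client's connection handling.
func dialWithBackoff(network, addr string, maxAttempts int) (net.Conn, error) {
	backoff := 100 * time.Millisecond
	var lastErr error
	for attempt := 0; attempt < maxAttempts; attempt++ {
		conn, err := net.DialTimeout(network, addr, time.Second)
		if err == nil {
			return conn, nil
		}
		lastErr = err
		time.Sleep(backoff)
		backoff *= 2 // double the wait between attempts
	}
	return nil, lastErr
}
```

On io.EOF the existing connection would be closed and replaced by whatever the backoff dial returns, instead of surfacing the EOF to the caller.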

jonaz commented Dec 30, 2016

I just ran into this. I was running a Get and suddenly got EOF instead of memcache.ErrCacheMiss.

@googlebot

Thanks for your pull request. It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

📝 Please visit https://cla.developers.google.com/ to sign.

Once you've signed, please reply here (e.g. I signed it!) and we'll verify. Thanks.


  • If you've already signed a CLA, it's possible we don't have your GitHub username or you're using a different email address. Check your existing CLA data and verify that your email is set on your git commits.
  • If you signed the CLA as a corporation, please let us know the company's name.

jonaz commented Nov 2, 2017

@kjnsn @bradfitz any chance of getting this in? Otherwise I need to fork. We have an HAProxy between the app and memcached that doesn't let connections live forever :(

@tcolgate

@bradfitz we're hitting this as well. I think I can work around it by disabling the idle connection timeout on the server, but an EOF should probably mark the connection dead. I'm happy to bring this PR, or something similar, up to date.
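
Concretely, "marking the connection dead" means never handing a connection back to the free pool after an io.EOF, only after errors that leave the protocol stream intact. A sketch of that pattern (the pool and helper names are made up, not gomemcache's actual internals):

```go
package redial

import (
	"io"
	"net"
)

// releaseConn returns a connection to a hypothetical free pool only when the
// error left it usable. io.EOF means the server closed the socket, so the
// connection is closed instead of being reused.
func releaseConn(freelist chan net.Conn, conn net.Conn, err error) {
	if err != nil && !resumable(err) {
		conn.Close() // EOF or another fatal error: drop the dead connection
		return
	}
	select {
	case freelist <- conn: // connection still healthy, make it reusable
	default:
		conn.Close() // free list is full
	}
}

// resumable reports whether err leaves the connection in a usable state.
// A real implementation would whitelist protocol-level errors such as a
// cache miss; io.EOF is deliberately not on that list.
func resumable(err error) bool {
	return err != io.EOF
}
```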

@tcolgate

Okay, so io.EOF isn't a resumableError, so the connection shouldn't get reused, which is why my service isn't degenerating into a pile of cache errors over time. It should be safe to retry from the caller without any danger of clogging up the connection pool with closed connections.
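
So until something like this PR lands, a caller-side retry is enough; because the EOF'd connection has already been discarded, the retry dials a fresh one. A sketch against the public API (the retry count and backoff are arbitrary):

```go
package redial

import (
	"errors"
	"io"
	"time"

	"github.com/bradfitz/gomemcache/memcache"
)

// getWithRetry retries a Get when the server closed the connection and the
// client surfaced that as io.EOF. The retry policy here is illustrative.
func getWithRetry(mc *memcache.Client, key string, attempts int) (*memcache.Item, error) {
	var lastErr error
	for i := 0; i < attempts; i++ {
		item, err := mc.Get(key)
		if err == nil || errors.Is(err, memcache.ErrCacheMiss) {
			return item, err // success, or a genuine cache miss
		}
		if !errors.Is(err, io.EOF) {
			return nil, err // some other failure; don't retry blindly
		}
		lastErr = err
		time.Sleep(time.Duration(i+1) * 50 * time.Millisecond) // brief pause before retrying
	}
	return nil, lastErr
}
```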

davidchua added a commit to davidchua/gomemcache-retry that referenced this pull request Aug 11, 2022
frikky commented Feb 2, 2023

Hey! Is there any way to get this, or something similar, merged in?

I've had some request errors leading to EOF, and would love at least some kind of retry system.

bradfitz removed the cla: no label Sep 4, 2023
@QuChen88

Ping. Any plans on getting this PR merged?
