Performance issue caused by socket timeouts for Sentry.io

Keywords: PrestaShop - Google Cloud Platform - Technical issue - Other

bnsupport ID: b9be171b-af5b-02a6-3426-6067edfeb393

bndiagnostic output:

? Apache: Found possible issues
? Resources: Found possible issues
? Mariadb: Found possible issues
https://docs.bitnami.com/general/apps/wordpress/troubleshooting/debug-errors-apache/
https://docs.bitnami.com/bch/apps/moodle/troubleshooting/deny-connections-bots-apache/
https://docs.bitnami.com/installer/faq/linux-faq/administration/increase-memory-linux/
https://docs.bitnami.com/aws/apps/wordpress/troubleshooting/debug-errors-mariadb/

bndiagnostic failure reason: The suggested guides are not related to my issue

Description:
Test case:
Shop URL:
https://deviousdevices.ca/

  1. Add some items into the cart (check Most popular category)
  2. Open cart, press on Checkout
  3. Press on Checkout one more time

Actual result: a performance issue affecting the entire app server.

Upon investigation I found that:

  1. The server freezes because a PHP-FPM process consumes 100% CPU.
  2. That process is waiting on a socket (I skipped 700k+ rows with similar timeouts):
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
poll([{fd=7, events=POLLIN|POLLPRI|POLLRDNORM|POLLRDBAND}], 1, 0) = 0 (Timeout)
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
poll([{fd=7, events=POLLIN}, {fd=7, events=POLLOUT}], 2, 1000) = 1 ([{fd=7, revents=POLLOUT}])
rt_sigaction(SIGPIPE, NULL, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, 8) = 0
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
poll([{fd=7, events=POLLIN|POLLPRI|POLLRDNORM|POLLRDBAND}], 1, 0) = 0 (Timeout)
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
poll([{fd=7, events=POLLIN}, {fd=7, events=POLLOUT}], 2, 1000) = 1 ([{fd=7, revents=POLLOUT}])
rt_sigaction(SIGPIPE, NULL, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, 8) = 0
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
poll([{fd=7, events=POLLIN|POLLPRI|POLLRDNORM|POLLRDBAND}], 1, 0) = 0 (Timeout)
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
poll([{fd=7, events=POLLIN}, {fd=7, events=POLLOUT}], 2, 1000) = 1 ([{fd=7, revents=POLLOUT}])
rt_sigaction(SIGPIPE, NULL, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, 8) = 0
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
poll([{fd=7, events=POLLIN|POLLPRI|POLLRDNORM|POLLRDBAND}], 1, 0) = 1 ([{fd=7, revents=POLLIN|POLLRDNORM}])
read(7, "\27\3\3\0%", 5)                = 5
read(7, "\263\358\206\360@\207\331\361\332\0043O\4L\350\251\3022\37y\241\341\235\206\35166\244\203?M"..., 37) = 37
write(7, "\37D\221E\216\243kh\243\323\36\344\20\212%\274\335\200\223\332\327\247\222\2\204\24\271}[\" Q"..., 3172) = 3172
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
rt_sigaction(SIGPIPE, NULL, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, 8) = 0
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
rt_sigaction(SIGPIPE, NULL, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, 8) = 0
rt_sigaction(SIGPIPE, {sa_handler=SIG_IGN, sa_mask=[PIPE], sa_flags=SA_RESTORER|SA_RESTART, sa_restorer=0x7f72ef574730}, NULL, 8) = 0
write(7, "\25\3\3\0\32;y\275v\325d0\333\315?')\213\325qY\2673\273\232\353s\200\344\6\0", 31) = 31
close(7)                                = 0
  3. fd 7 corresponds to the socket:
provisioner-local:41596->15.42.188.35.bc.googleusercontent.com:https (ESTABLISHED)
  4. That address seems to be related to the Sentry.io tracing tool, which apparently is bundled with the Bitnami package.
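The repeated `poll(..., 1, 0) = 0 (Timeout)` lines above are the signature of polling with a zero timeout inside a tight loop: with timeout 0 the call returns immediately when no data is available, so instead of blocking, the process spins and pins a CPU core. A minimal Python sketch of that syscall pattern (illustrative only; this is not the PrestaShop or Sentry code, just the same poll behaviour on an idle pipe):

```python
import os
import select

# An empty pipe stands in for the silent TLS socket (fd 7 in the strace).
r, w = os.pipe()

poller = select.poll()
poller.register(r, select.POLLIN | select.POLLPRI)

# timeout=0 means "return immediately". With nothing to read, every call
# is one `poll(...) = 0 (Timeout)` line -- repeated 700k+ times above.
spins = 0
for _ in range(100_000):
    if not poller.poll(0):  # returns [] straight away, never blocks
        spins += 1

print(spins)  # 100000: each iteration timed out instantly, burning CPU

os.close(r)
os.close(w)
```

A blocking call such as `poller.poll(1000)` would instead sleep in the kernel until data arrives, which is why the occasional 1000 ms polls in the trace are not the ones costing CPU.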

And here is the stuck process's stack:

#0  0x00007f72ec333819 __GI___poll
#1  0x00007f72ee45c359
#2  0x00007f72ee456120
#3  0x00007f72ee44f6ca curl_easy_perform
#4  0x00005617902d9e46
#5  0x00005617906522bc execute_ex
#6  0x0000561790657db2 zend_execute
#7  0x00005617905d50c4 zend_execute_scripts
#8  0x00005617905792d0 php_execute_script
#9  0x000056179015b0a0
#10 0x00007f72ec26909b __libc_start_main
#11 0x000056179015c6da _start
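The stack shows the worker stuck inside `curl_easy_perform`, i.e. an outbound HTTP request (presumably Sentry's transport) with no effective timeout. Wherever the fix ultimately lands, the general mitigation is a hard timeout on outbound connections so a dead endpoint fails fast instead of hanging the worker. A hedged illustration with Python's socket module (the address is a reserved TEST-NET-3 documentation address chosen so the connect cannot succeed; this is not the Sentry SDK's API):

```python
import socket
import time

start = time.monotonic()
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(0.5)  # bound the wait; the stuck worker effectively had none
err = None
try:
    # 203.0.113.1 is TEST-NET-3 (RFC 5737): reserved, never answers.
    s.connect(("203.0.113.1", 443))
except OSError as exc:  # raised quickly instead of hanging indefinitely
    err = exc
finally:
    s.close()

elapsed = time.monotonic() - start
print("failed fast after", round(elapsed, 2), "s:", type(err).__name__)
```

With curl the equivalent knobs are `CURLOPT_CONNECTTIMEOUT` and `CURLOPT_TIMEOUT`; without them a request to an unresponsive peer can block for the system default, which is far too long for a request-serving FPM worker.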

Hello @mrpavlenkonc,

Thanks for using our PrestaShop stack! Could you tell me what I would need to reproduce this issue on my side? Does this happen with the modules and themes shipped by default?

Regards,
Francisco de Paz

Hi Francisco,
No, it’s not reproducible on a clean package installation.

I also contacted PrestaShop team support to help with the investigation on their side.

So far I would assume it’s some kind of concurrent resource acquisition issue: one FPM process from the pool acquired the socket and didn’t release it properly, while another concurrent process just waits for it in a really inefficient way, like a mutex in a loop.
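The "mutex in a loop" hypothesis above can be shown in miniature: a blocking wait sleeps in the kernel and uses almost no CPU, while a spin wait (re-checking a condition in a loop, like the zero-timeout poll() calls in the strace) burns a core for as long as it waits. A small Python sketch (an assumption-laden illustration, unrelated to the actual FPM internals):

```python
import threading
import time

ready = threading.Event()  # never set: both waiters run for the full 0.2 s


def cpu_used(fn) -> float:
    """Run fn and return the CPU time it consumed in this process."""
    start = time.process_time()
    fn()
    return time.process_time() - start


# Blocking wait: sleeps in the kernel until the timeout expires; ~0 CPU.
blocking_cpu = cpu_used(lambda: ready.wait(timeout=0.2))


# Spin wait: re-checks the condition in a loop, like the repeated
# zero-timeout poll() calls in the strace; burns CPU the whole time.
def spin():
    deadline = time.monotonic() + 0.2
    while not ready.is_set() and time.monotonic() < deadline:
        pass


spinning_cpu = cpu_used(spin)

print(f"blocking: {blocking_cpu:.3f}s CPU, spinning: {spinning_cpu:.3f}s CPU")
```

Both waits last the same 0.2 s of wall time, but only the spinning one shows up as CPU usage, matching the 100% CPU FPM worker seen here.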

I think for now we can put this thread on hold, because I don’t really see any evidence that it’s caused by the Bitnami package.

Thanks,

Hello @mrpavlenkonc,

Let’s do that then and wait for any updates in case there is something related to the stack configuration. I’ll create a reminder to check on this in 2 weeks, as that is when the bot will close the thread.

Regards,
Francisco de Paz