In case you experience a delay of a few seconds when creating a post or submitting a comment: I've noticed it as well and am trying to find the source. Server resources are more than sufficient.
The delay could be up to 30s. Apologies for the inconvenience!
Update 1: The good news is that I found out that some database settings weren’t correctly applied when deploying lemmy, and the page should now feel much snappier. The bad news is that this doesn’t seem to solve the delay when creating a post. I’ll do a deep dive into the logs now. Wish me luck!
Update 2: I have to leave it here for today. I’ll try to see if I get more info from the lemmy devs. Fortunately it isn’t too bad.
Update 3: It might actually be a problem in the UI part of lemmy. You should be able to force a refresh of the page, and even if the loading animation is still spinning, the post will have gone through.
killin’ it wander, thank you for checking into it 💙
Thank you for your support!
you got this :D i know you’re super knowledgeable yourself, but please let me know if i can help somehow. i do linux devops for a living, so i’m pretty good in this realm 💙 thank you for working on this stuff for us!
Thanks a lot for your offer! I don’t do this directly for a living, so I’m pretty sure you’re better at it than I am :P The biggest problem I have right now is that nothing really looks wrong in the logfiles. I’ve also noticed that posting takes longer the more subscribers the community has. At this point I’m inclined to believe it might be lemmy doing all the federation work while the spinner is still loading, instead of telling the user that the post has gone through and then doing all the federation announcing in the background.
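Just to make that theory a bit more concrete, here’s a rough sketch of the difference I mean (none of these functions exist in lemmy, the names are made up purely for illustration):

    // purely an illustrative sketch - these names don't exist in lemmy, they only
    // show the difference between announcing before vs. after answering the user
    interface Post { id: number; title: string; }

    declare function insertPost(post: Post): Promise<Post>;                      // local DB write (fast)
    declare function sendActivity(instance: string, post: Post): Promise<void>;  // federation HTTP call (slow)

    // Variant A: what I suspect is happening - the reply to the user waits until
    // every subscribed instance has been contacted, so the spinner runs longer
    // the more subscribers the community has.
    async function createPostBlocking(post: Post, followers: string[]): Promise<Post> {
      const saved = await insertPost(post);
      for (const instance of followers) {
        await sendActivity(instance, saved);
      }
      return saved; // the user only gets feedback here
    }

    // Variant B: store the post, answer immediately, announce in the background.
    async function createPostBackground(post: Post, followers: string[]): Promise<Post> {
      const saved = await insertPost(post);
      void Promise.allSettled(followers.map(f => sendActivity(f, saved))); // fire and forget
      return saved; // the user sees success right away
    }

If it’s the first variant, that would also explain why the delay grows with the subscriber count of the community.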
Fortunately all this led me to notice that my PGTune settings for postgres hadn’t been applied when I installed lemmy, and now the site is much faster, except for those actions that require some sort of “announcing”. I’ve also increased the federation worker count to 512, which is what lemmy.ml currently uses and is surely overkill.
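For reference, this is the kind of block PGTune generates and what hadn’t made it into postgresql.conf (the values below are just an example for a small dedicated database machine, the real numbers depend on your RAM, cores and storage):

    # example PGTune-style output - illustrative values only, generate your own for your hardware
    max_connections = 100
    shared_buffers = 2GB
    effective_cache_size = 6GB
    maintenance_work_mem = 512MB
    checkpoint_completion_target = 0.9
    wal_buffers = 16MB
    default_statistics_target = 100
    random_page_cost = 1.1
    effective_io_concurrency = 200
    work_mem = 16MB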
I’ve also noticed that the Jerboa app itself takes some time to post when using a lemmy.ml account (the flagship instance) and only posts something after timing out. So… I’m inclined to believe it’s a bug that hasn’t been noticed much before because most communities don’t have this many subscribers.
I have to leave now, but I’ll keep you posted! Thank you so much <3
of course 💙 all of your findings sound sensible and i’m sure if you’re tuned closer to how lemmy.ml is tuned then yeah, it might be all you can do for now. i’m sure some expert rust programmer or something will look at the lemmy code soon and be able to help with some performance fixes :D this, strangely, is the best case scenario for lemmy development: people are hammering it and that’s when the bugs and slowness come out
I just noticed that it probably is a coding error in the UI. If I close the browser tab right after posting, the post will still have been published correctly, even though I didn’t wait for the spinning animation to finish.
Maybe the UI isn’t getting a success callback, but it definitely looks like something I can’t fix by merely changing performance settings.
from what i can see in the code, the web interface seems to keep a websocket open with the back-end and i can’t find any wait/sleep/pause statements. here’s a snippet of saving a comment, for example:
    handleSaveCommentClick(i: CommentNode) {
      const cv = i.props.node.comment_view;
      // toggle the saved state
      const save = cv.saved == undefined ? true : !cv.saved;
      const auth = myAuth();
      if (auth) {
        const form: SaveComment = {
          comment_id: cv.comment.id,
          save,
          auth,
        };
        // send the request over the websocket and start the spinner;
        // nothing here waits for a response
        WebSocketService.Instance.send(wsClient.saveComment(form));
        i.setState({ saveLoading: true });
      }
    }
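and as far as i can tell, the spinner started there only gets cleared when a matching reply comes back over that websocket. roughly this pattern (just a sketch of what i mean, not the actual lemmy-ui handler, the names are made up):

    // sketch only - hypothetical handler, not real lemmy-ui code
    interface SaveCommentResponse { comment_view: unknown; }

    function onWebsocketMessage(event: MessageEvent, setState: (s: object) => void) {
      const msg = JSON.parse(event.data as string);
      // the saveLoading flag set in handleSaveCommentClick is only cleared here,
      // once a reply for this operation shows up on the websocket
      if (msg.op === "SaveComment") {
        const data = msg.data as SaveCommentResponse;
        setState({ saveLoading: false, comment_view: data.comment_view });
      }
      // if the server only answers after doing all the federation work, or the
      // reply never gets matched, the spinner keeps spinning even though the
      // comment was already saved
    }

which would fit your observation about closing the tab: the backend is already done, the UI is just still waiting for its reply 💙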
test
test
test