ID:110681
 
Resolved
BYOND Version:480
Operating System:Windows 7 Home Premium 64-bit
Web Browser:Firefox 3.6.15
Applies to:Website
Status: Resolved (web)

This issue has been resolved.
http://www.byond.com/members/BYONDHelp?command=view_tracker_issue&tracker_issue=2818
The "Error: slow down! You must wait before posting again to this site." issue doesn't seem resolved. If anything, it seems even worse now, and it's a pain in the butt. Trying to manage multiple pages is practically impossible. I could (maybe) understand if this prevented people from spamming a single page, but having it affect an entire site just seems foolish. I get hindered by this message almost every day.
It also seems like attempting to post again before the timer has reset starts the delay all over, though I can't be sure of that.
Also, it may require you to reload the page for the restriction to clear? Again, not sure about that.
Also (again!), from the looks of it, it only lets you post 1 message every 5+ minutes? That's a bit ridiculous.
I just posted a comment on a blog post, then tried to comment on a hub entry and couldn't. The blog post and comment were both made by me. I'm not sure what the point of this feature is, but it isn't working. Get rid of it.

Damn, now that I posted here, I probably can't post the comment I was originally trying to post on the hub entry for another five minutes :-/

Edit: It's probably worth mentioning that the comments I posted or tried to post were on a blog post I made and a hub entry I made. I guess you're trying to stop spam, but what was I doing here, spamming myself? You had good intentions, but it backfired; just fix it so we can forget it ever happened.

Edit #2: It happened again. I could understand enforcing a delay of a few seconds to limit what bots could do, but it was a couple of minutes between postings at least. Also, when this error message is shown, the comment I tried to post gets erased, so I have to re-type it. I hope the staff's development news posts will give people the chance to say "whoa, that's a terrible idea" before you add more features like this one.
It sometimes locks me out for hours at a time, for whatever reason.
Forum_account,

No need to be condescending. This is a bug. It is supposed to enforce a 10s delay between posts on the same blog, to prevent spamming.
There's a level of condescension in limiting our post rates because you're afraid we're all a bunch of spammers. If you think I'm being rude, then I guess you get how annoying this bug is.
The spam prevention thing has been around for a long time and is not new; it's just malfunctioning now. The normal limit is 10 seconds which isn't enough to be a problem for a legitimate poster. The problem is something has gone wrong with the way the website is checking that limit. The feature itself isn't really a problem when it works correctly.

So far I have not found out why it's been confused, but I'm still looking.
The first time I noticed it was when I posted a comment on a blog post, then tried to post a comment on a hub entry. The second time, I posted a comment on a blog itself, then tried to post a comment on a hub entry. In the second case I was typing the comments from a phone, so there was definitely a few minutes' delay in between. In both cases, I was the owner of both entities I was trying to comment on. Hope this helps!
I think I found the issue; your info did help quite a bit. It turns out that when this got fixed earlier it didn't get fixed for hub entries, only for blog posts.
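For illustration only (BYOND's server code isn't public, so every name here is hypothetical), a bug like this typically arises when duplicate copies of a rate-limit check live in separate code paths, and a fix lands in one path but not the other. A minimal sketch of the corrected shape, where one shared check keyed per user and per target enforces the stated 10-second delay, might look like:

```python
import time

POST_DELAY = 10.0  # the stated limit: 10 seconds between posts
_last_post = {}    # (user, target) -> timestamp of last accepted post

def check_delay(user, target, now):
    """Shared check: rate-limit per (user, target), not site-wide."""
    key = (user, target)
    if now - _last_post.get(key, 0.0) < POST_DELAY:
        return False  # still inside the delay window
    _last_post[key] = now
    return True

def post_blog_comment(user, blog_id, now=None):
    # Blog comments and hub comments call the SAME check, so a fix
    # here can't silently miss the hub-entry path.
    now = time.time() if now is None else now
    return check_delay(user, ("blog", blog_id), now)

def post_hub_comment(user, hub_id, now=None):
    now = time.time() if now is None else now
    return check_delay(user, ("hub", hub_id), now)
```

In this sketch, commenting on a blog post and then immediately on a hub entry is allowed (different targets), while rapid-fire posts on the same page are blocked; the reported bug behaves as if the key were effectively site-wide in the hub-entry path.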
Forum_account wrote:
There's a level of condescension in limiting our post rates because you're afraid we're all a bunch of spammers. If you think I'm being rude, then I guess you get how annoying this bug is.

No, it has nothing to do with "thinking" we have spammers. We DO have a bunch of spammers, and this feature was put in retroactively to correct that. And that was some time ago. As Lummox said, a glitch has caused it to act incorrectly recently.

And it has nothing to do with "thinking" you are being rude. You ARE being rude. Cut it out.
No, it has nothing to do with "thinking" we have spammers. We DO have a bunch of spammers,

While I'm sure you have spammers, the problem is that everyone is treated as a potential spammer. That's rude, so I'm not going to be nice about it. If you used the site more and had some of your comments eaten by the spam filter, I'm sure it would have been fixed or removed sooner. I'm not sure you realize how frustrating BYOND can be to use.
I'm sure BYOND can be frustrating, but venting that through condescending remarks helps no one. All I am asking you (repeatedly now) is to cut out your little insults, or I will do it for you. We are trying to improve the culture of this community, and it isn't helpful when one of the better developers here feels some need to make comments like this.

When you experience a bug, just post the details and we'll do what we can. Just because this isn't working properly at the moment doesn't mean the intent was bad. In this particular case, the problem was exacerbated by a recent change, which is why we didn't fix it sooner (and why you didn't post about it sooner). The spam filter has been in place for years now.
I understand your frustration--truly. But you're not being treated like a criminal; it's a simple matter of the KISS principle. Making a spam filter make exceptions for certain users is neither foolproof nor entirely simple; even if it ignored users who had no history of spam and were not newbies, it wouldn't catch the stolen-account or bad-little-brother scenarios. It's not unheard of for a user with a previously clean, normal posting history to suddenly go berserk with spam. Heck, as we've seen recently even human judgment that "X wouldn't do Y" is plenty fallible; how much more so an algorithm? Nor can we simply make an exception for the hub owner or post author; if they really are rapid-fire spamming that's still a bad thing we need to catch, because it's not "victimless" spam if they're doing it to bump up their visibility and ultimately just cluttering our site.

So adding a layer of complexity would only make for a weaker filter. Meanwhile the simple filter is completely invisible to legitimate users when it's working properly; if it's not working right then it still has to be fixed either way. The only good thing complicating the filter would achieve is making it less likely to produce a false positive when it's broken, but that assumes the extra complexity itself isn't subject to breakage. If a simple thing like a timing check can break, a more complex identity/trustworthiness check certainly can. Overengineering just leads to more bugs and slower fixes, and nobody wants that.

The bottom line is that an algorithm failed. We all agree it sucks, and it's frustrating. However the algorithm's simplicity is a good thing, and there's no easy way to make it fail more gracefully without either ruining its effectiveness outright, or risking even more failures. But the fix will go live soon and this will go back to being invisible to legitimate users like yourself. And I do appreciate your help providing further information that helped get to the bottom of this.