ID:2162075
 
Resolved
The persist value in world.Export() had been disabled since version 487, but it has now been re-enabled. Several issues causing incorrect results from server-to-server Export() calls have also been identified and fixed.
BYOND Version:510
Operating System:Windows 10 Pro 64-bit
Web Browser:Firefox 49.0
Applies to:Dream Daemon
Status: Resolved (511.1361)

This issue has been resolved.
As the title says, world.Export()'s persist flag doesn't actually keep the socket open; it closes right after the message is sent. I'll provide more details if requested.
http://www.byond.com/forum/?post=1405894

3 years we've been asking about this problem.
Different thing entirely.
Same end cause: slowness that should not be present.
This is a bug with sockets not persisting, not with speed. They may lead to the same problem, but one is an entirely different issue.
one is an entirely different issue.

You definitely can't know this.

It stands to reason that if there is a handshake blocking the desired message, continually re-shaking could be the root of the speed issues on repeated messages.

But I dunno, just me, you know, thinking about stuff critically.
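To illustrate the argument for anyone outside DM: here's a rough Python sketch of reconnect-per-message versus a single persistent socket, against a plain local echo server. Everything here is made up for illustration (BYOND's actual handshake is its own protocol); the point is only that the persistent path pays the TCP connect/handshake cost once instead of once per message.

```python
# Hypothetical illustration (not BYOND code): reconnect-per-message vs. one
# persistent socket against a local echo server.
import socket
import threading

HOST, PORT = "127.0.0.1", 0  # port 0: let the OS pick a free port

def run_echo_server(sock, n_messages):
    """Accept connections and echo back newline-terminated messages."""
    served = 0
    while served < n_messages:
        conn, _ = sock.accept()
        with conn:
            buf = b""
            while served < n_messages:
                data = conn.recv(1024)
                if not data:
                    break  # client closed; go back and accept the next one
                buf += data
                while b"\n" in buf:
                    line, buf = buf.split(b"\n", 1)
                    conn.sendall(line + b"\n")
                    served += 1

def send_reconnecting(addr, messages):
    """One TCP connect (and handshake) per message."""
    replies = []
    for msg in messages:
        with socket.create_connection(addr) as s:
            s.sendall(msg + b"\n")
            replies.append(s.makefile("rb").readline().strip())
    return replies

def send_persistent(addr, messages):
    """A single connection reused for every message."""
    replies = []
    with socket.create_connection(addr) as s:
        f = s.makefile("rb")
        for msg in messages:
            s.sendall(msg + b"\n")
            replies.append(f.readline().strip())
    return replies

def demo():
    msgs = [b"hello-%d" % i for i in range(5)]
    srv = socket.socket()
    srv.bind((HOST, PORT))
    srv.listen(8)
    addr = srv.getsockname()
    t = threading.Thread(target=run_echo_server,
                         args=(srv, 2 * len(msgs)), daemon=True)
    t.start()
    a = send_reconnecting(addr, msgs)  # 5 connects, 5 handshakes
    b = send_persistent(addr, msgs)    # 1 connect, 1 handshake
    srv.close()
    return a, b
```

Both paths deliver identical replies; the reconnecting one just does five TCP handshakes where the persistent one does a single handshake.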
In response to Ter13
Ter13 wrote:
one is an entirely different issue.

You definitely can't know this.

It stands to reason that if there is a handshake blocking the desired message, continually re-shaking could be the root of the speed issues on repeated messages.

But I dunno, just me, you know, thinking about stuff critically.

BYOND doesn't send multiple handshakes over a single connection. But because persist doesn't work, it can't get by with one handshake; it has to re-handshake on every reconnect.
I'll do some testing to see if I can find out more.
Hello! Received @ (2.045); Delta: 0.082; time since accept: 0.059
Hello! Received @ (2.108); Delta: 0.063; time since accept: 0.026
Hello! Received @ (2.18); Delta: 0.072; time since accept: 0.044
Hello! Received @ (2.235); Delta: 0.055; time since accept: 0.025
Hello! Received @ (2.299); Delta: 0.064; time since accept: 0.028
Hello! Received @ (2.352); Delta: 0.053; time since accept: 0.025
Hello! Received @ (2.416); Delta: 0.064; time since accept: 0.025
Hello! Received @ (2.469); Delta: 0.053; time since accept: 0.024
Hello! Received @ (2.528); Delta: 0.059; time since accept: 0.029
Hello! Received @ (2.609); Delta: 0.081; time since accept: 0.039
Hello! Received @ (2.675); Delta: 0.066; time since accept: 0.027
Hello! Received @ (2.733); Delta: 0.058; time since accept: 0.031
Hello! Received @ (2.801); Delta: 0.068; time since accept: 0.031
Hello! Received @ (2.885); Delta: 0.084; time since accept: 0.034
Hello! Received @ (2.948); Delta: 0.063; time since accept: 0.028
Hello! Received @ (3.003); Delta: 0.055; time since accept: 0.025
Hello! Received @ (3.073); Delta: 0.07; time since accept: 0.026
Hello! Received @ (3.131); Delta: 0.058; time since accept: 0.03
Hello! Received @ (3.181); Delta: 0.05; time since accept: 0.025
Hello! Received @ (3.262); Delta: 0.081; time since accept: 0.046
Hello! Received @ (3.314); Delta: 0.052; time since accept: 0.023
Hello! Received @ (3.361); Delta: 0.047; time since accept: 0.025
Hello! Received @ (3.404); Delta: 0.043; time since accept: 0.021
Hello! Received @ (3.455); Delta: 0.051; time since accept: 0.024
Hello! Received @ (3.516); Delta: 0.061; time since accept: 0.031
Hello! Received @ (3.57); Delta: 0.054; time since accept: 0.025
Hello! Received @ (3.626); Delta: 0.056; time since accept: 0.028
Hello! Received @ (3.683); Delta: 0.057; time since accept: 0.029
Hello! Received @ (3.737); Delta: 0.054; time since accept: 0.022


Going to leave this here to see if Persist being fixed will make the delta time not be as significant.
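For reference, the arithmetic behind those log lines is just the gap between consecutive receive timestamps (plus the accept-to-receive gap). A minimal Python sketch of that bookkeeping; the helper names are my own, not from the actual test harness:

```python
# Hypothetical helpers (assumed names) reproducing the arithmetic behind the
# log above: each "Delta" is the gap between consecutive receive timestamps.
def receive_deltas(timestamps):
    """Given receive times in seconds, return the gap before each message
    after the first, rounded to 3 decimals like the log output."""
    return [round(b - a, 3) for a, b in zip(timestamps, timestamps[1:])]

def format_line(received, prev_received, accepted):
    # Mirrors the log format:
    # "Hello! Received @ (t); Delta: d; time since accept: a"
    return "Hello! Received @ (%g); Delta: %g; time since accept: %g" % (
        received,
        round(received - prev_received, 3),
        round(received - accepted, 3))
```

With persist working, the accept timestamp would stop resetting on every message, so "time since accept" (and with it much of the delta) should shrink.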
Are you using persist to communicate with a BYOND world or an HTTP server?

because....
http://www.byond.com/docs/notes/415.html

world.Export() has an optional (currently undocumented) third argument: "flags". Currently the only flag is 1 (tentatively WORLD_PERSIST), eg:
 usr << world.Export("foo.com:1000",null,1) // pass a "0" in a subsequent call to this address to close the connection 

When this is set, the server will keep the connection open so that subsequent world.Export() calls to that address are more efficient. For now, this doesn't work with HTTP world.Export() calls. It also doesn't support a convenient way for the recipient server to communicate back to the host server through the persistent connection (outside of the world.Topic() return value). For now, if the recipient wants to contact the host at an arbitrary time, it must open a separate connection through a world.Export() to the host's port. Yes, this is stupid.
In theory, this system can be used to communicate between BYOND and outside (eg, C++/Java/etc) servers. We'll document the communication protocol to make this easier.
Looking at the code, I'm seeing many ways that a server-to-server link can be closed when persist is true, but all of them are cases where bad data has been passed. The receiving server should never shutdown its link unless it's gotten a bad message. The sending server will only shutdown in response to a bad message, or a completed message when persist is false.

One thing that I think would help would be if you could show the code you're using for world.Export() here, and also how you're handling world/Topic() for those cases. That might shed some light and lend me a better idea of what to test for.
In response to Lummox JR
Lummox JR wrote:
Looking at the code, I'm seeing many ways that a server-to-server link can be closed when persist is true, but all of them are cases where bad data has been passed. The receiving server should never shutdown its link unless it's gotten a bad message. The sending server will only shutdown in response to a bad message, or a completed message when persist is false.

One thing that I think would help would be if you could show the code you're using for world.Export() here, and also how you're handling world/Topic() for those cases. That might shed some light and lend me a better idea of what to test for.

Those timed cases are against my own coded server, but regardless of whether I connect to DD or to my server, both seem to close the socket a few lines of code (according to my debugger, anyway; mileage may vary) after "Double response in SRV2SRV_MSG".

In response to Somepotato
Ooh! That's useful info. While the double return actually shouldn't be a problem that can shut things down, I can see how that situation might come up in a persistent socket. That at least is something I can approach and fix.
It's not actually that error; it's just the closest "landmark" I could find to it. The socket is closed in the same function where that error is emitted.
Lummox:

Based on my testing, a "receiver" with:
/world/Topic(a, b, c, d)
	world.log << json_encode(list(a, b, c, d))
	return "foo"


and a "sender" with:
/world/New()
	world.log << Export("byond://localhost:5001/?thing=value", null, 1)


will exchange the following packets (all values are hex):
s->r: [      15] FE 01 00 00 DB 01 00 00 59 09 87 79 9A 77 F4 3C D3 2A E8 69 89 40 35 5E 37 CE C2 2A E8 50 2E 75
r->s: [ 15] (possibly encoded?) 9E 9C 00 FF FF 30 C5 40 51 E7 CC 16 4E DD 9C 1E A6 AE 86 E6 6F EF C8 53 D7 FF
s->r: [ 83] (sent encoded) 00 92 13 00 00 "/?thing=value" 00
r->s: [ 83] (sent encoded) 06 "foo" 00


followed by the sender closing the connection.
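For comparison, here's a Python sketch of the unencrypted topic framing as pieced together from public community implementations; this is an assumption, not taken from the dump above (which is the encoded variant), though the 0x06 string type byte does match the `06 "foo" 00` reply shown. Assumed layout: `00 83`, a 2-byte big-endian length, five `00` padding bytes, the query string, and a trailing `00`.

```python
# Sketch of the community-documented *unencrypted* world.Topic() framing
# (packet type 0x83). The layout below is an assumption from public
# implementations; the hex dump in this thread uses the encoded variant.
import struct

def build_topic_packet(query: bytes) -> bytes:
    # Length field covers the 5 padding bytes, the query, and the final NUL.
    body = b"\x00" * 5 + query + b"\x00"
    return b"\x00\x83" + struct.pack(">H", len(body)) + body

def parse_topic_response(packet: bytes) -> str:
    # A text reply carries type byte 0x06 followed by a NUL-terminated
    # string, matching the '06 "foo" 00' response in the dump above.
    assert packet[:2] == b"\x00\x83", "not a topic packet"
    (length,) = struct.unpack(">H", packet[2:4])
    body = packet[4:4 + length]
    if body[:1] == b"\x06":
        return body[1:].rstrip(b"\x00").decode()
    raise ValueError("unsupported response type")
```

Under that assumed framing, `build_topic_packet(b"?thing=value")` would produce the plaintext analogue of the third packet in the dump.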
We know; I pointed him to where in the code (disassembled code from the debugger, anyway) the socket is being closed. It's something that shouldn't be happening, but is for some reason.
Oh dang. I just ran across a big ol' comment on this that says persistent connections were stopped in 487 because of a nasty bug (id:115176). Doh!

I believe however that the nasty bug in question has everything to do with the way Export connections handle requests, which is something I'm overhauling in this new build. So I'm going to try turning persist back on and running some tests.
Well then. I do look forward to the overhaul!
I opened a can of worms here, but it looks like it was a can worth opening. I think I also figured out a lot of why world.Export() was crapping the bed in threaded mode.
That's hilarious; I had no idea about the old bug.