Data races #18

Open
FatalReadError opened this issue May 31, 2020 · 10 comments · May be fixed by #30
@FatalReadError

With [email protected], when testing publishing from multiple browsers with the race detector running, I encountered race conditions.

WARNING: DATA RACE
Write at 0x00c000456258 by goroutine 68:
  github.com/alexandrevicenzi/go-sse.(*Channel).SendMessage()
      /home/fat/go/pkg/mod/github.com/alexandrevicenzi/[email protected]/channel.go:26 +0x87
  github.com/alexandrevicenzi/go-sse.(*Server).SendMessage()
      /home/fat/go/pkg/mod/github.com/alexandrevicenzi/[email protected]/sse.go:117 +0x219
  projects/lib/channels.GenChannel.func1()
      /home/fat/projects/lib/channels/channels.go:121 +0x377

Previous write at 0x00c000456258 by goroutine 61:
  github.com/alexandrevicenzi/go-sse.(*Channel).SendMessage()
      /home/fat/go/pkg/mod/github.com/alexandrevicenzi/[email protected]/channel.go:26 +0x87
  github.com/alexandrevicenzi/go-sse.(*Server).SendMessage()
      /home/fat/go/pkg/mod/github.com/alexandrevicenzi/[email protected]/sse.go:117 +0x219
  projects/lib/channels.GenChannel.func1()
      /home/fat/projects/lib/channels/channels.go:121 +0x377

Sorry, I don't have the dump handy, but I hit another data race on the msg.retry write:

go-sse/sse.go, line 94 at fcf9dce:

msg.retry = s.options.RetryInterval

(which is being fed by SSE.SendMessage(chName, sse.NewMessage(strconv.Itoa(time.Now().Second()), string(msg), "msg")))

I'm using some complex code as well, so it's possible that I'm not 'doing it right', but these two data races under load seem to point at these two locations (channel.go:26 and sse.go:94).
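
Not my actual code, but the load that surfaces this boils down to concurrent publishers on one channel while a client is connected. Below is a minimal sketch of that pattern; it assumes the v1.x API from the README (sse.NewServer(nil), Server.Shutdown, the server mounted as an http.Handler with the channel name taken from the request path) plus the SendMessage/NewMessage calls quoted above, so treat it as an approximation rather than a guaranteed repro.

// race_repro.go: run with `go run -race race_repro.go`.
package main

import (
    "io"
    "net/http"
    "net/http/httptest"
    "strconv"
    "sync"
    "time"

    "github.com/alexandrevicenzi/go-sse"
)

func main() {
    s := sse.NewServer(nil)
    defer s.Shutdown()

    // Mount the SSE server and connect one streaming client so the channel exists.
    ts := httptest.NewServer(s)
    defer ts.Close()

    resp, err := http.Get(ts.URL + "/events/channel-1")
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    go io.Copy(io.Discard, resp.Body) // keep draining the event stream

    // Publish from several goroutines at once, mimicking several browsers.
    var wg sync.WaitGroup
    for i := 0; i < 8; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for j := 0; j < 100; j++ {
                s.SendMessage("/events/channel-1",
                    sse.NewMessage(strconv.Itoa(time.Now().Second()), "payload", "msg"))
            }
        }()
    }
    wg.Wait()
}

If the timing lines up, the race detector should flag the same write at channel.go:26.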

@FatalReadError
Author

These two data races seem rare and hard to trigger.

@alexandrevicenzi
Owner

Could you share some of your code?

@FatalReadError
Author

Unfortunately I can't right now, but I will try to dig up more information and possibly even a reproduction test case.

@alexandrevicenzi alexandrevicenzi added this to the dev milestone Dec 8, 2020
@alexandrevicenzi
Owner

There are many things wrong. I'm working on a proper fix for the entire lib, but it will take a bit more time.

@codewinch

codewinch commented Jun 24, 2021

Don't know if this is relevant, but I've experienced some of this:

WARNING: DATA RACE
Write at 0x00c000284018 by goroutine 13:
  github.com/alexandrevicenzi/go-sse.(*Channel).SendMessage()
      /home/cw/go/pkg/mod/github.com/alexandrevicenzi/[email protected]/channel.go:26 +0x8c
  github.com/alexandrevicenzi/go-sse.(*Server).SendMessage()
      /home/cw/go/pkg/mod/github.com/alexandrevicenzi/[email protected]/sse.go:119 +0x21b

I was able to fix it by just wrapping a Lock() and Unlock() around that c.lastEventID = message.id in channel.go:26 like this:


// SendMessage broadcast a message to all clients in a channel.
func (c *Channel) SendMessage(message *Message) {

    c.mu.Lock()   // <-- new
    c.lastEventID = message.id
    c.mu.Unlock() // <-- new

    c.mu.RLock()

    for c, open := range c.clients {
        if open {
            c.send <- message
        }
    }

    c.mu.RUnlock()
}

At least in initial testing, it resolves the data race.
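
A variant of the same idea (just a sketch against the same struct fields, not a patch from the library) holds the write lock for the whole broadcast. It avoids the unlock/relock window, at the cost of blocking readers while the clients are being fed:

// Alternative sketch: one critical section covering both the lastEventID write
// and the broadcast loop.
func (c *Channel) SendMessage(message *Message) {
    c.mu.Lock()
    defer c.mu.Unlock()

    c.lastEventID = message.id

    for client, open := range c.clients {
        if open {
            client.send <- message
        }
    }
}

Either version serializes the lastEventID write, which is what the race detector flags.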

@codewinch

In my case, it was very reproducible and this instantly fixed it.

@codewinch

codewinch commented Jun 24, 2021

Spoke too soon -- have another on sse.go:96:

msg.retry = s.options.RetryInterval

@alexandrevicenzi
Owner

Yes, it can happen anywhere. The lib is using goroutines where they're not needed, and because of that it needs locks everywhere. The way to fix it is to remove most of them and leave them only where required.
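
Roughly, that direction looks like the sketch below. These are hypothetical names, an illustration of the principle rather than the actual fix: dispatch synchronously, read shared options once under the lock, and let the caller own the message.

// sketch.go: an illustration of "fewer goroutines, minimal locking".
// Every name here is a hypothetical stand-in, not go-sse's real internals.
package sketch

import "sync"

type message struct {
    data  string
    retry int
}

type channel struct {
    mu      sync.RWMutex
    clients map[chan *message]bool
}

func (c *channel) send(m *message) {
    c.mu.RLock()
    defer c.mu.RUnlock()
    for client, open := range c.clients {
        if open {
            client <- m
        }
    }
}

type server struct {
    mu       sync.RWMutex
    channels map[string]*channel
    retry    int // shared option, guarded by mu
}

func (s *server) sendMessage(name string, m *message) {
    s.mu.RLock()
    ch, ok := s.channels[name]
    retry := s.retry // read the shared option once, while holding the lock
    s.mu.RUnlock()
    if !ok {
        return
    }
    m.retry = retry // the caller owns m, so no lock is needed for the message itself
    ch.send(m)      // synchronous dispatch: no per-message goroutine
}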

@iscander

iscander commented Jul 1, 2021

I can reproduce the issue with the lastEventID write. My usage pattern sends multiple messages of different types to the same client, like multiple topic subscriptions. It shows up easily in tests, due to the absence of any significant delay.

The race log is as follows:

WARNING: DATA RACE
Write at 0x00c000436258 by goroutine 49:
  github.com/alexandrevicenzi/go-sse.(*Channel).SendMessage()
      /vendor/github.com/alexandrevicenzi/go-sse/channel.go:26 +0x8c
  github.com/alexandrevicenzi/go-sse.(*Server).SendMessage()
      /vendor/github.com/alexandrevicenzi/go-sse/sse.go:116 +0x21b
  /rest/handlers.(*eventsHandler).trackFlow.func1()
      /rest/handlers/events.go:137 +0x124

Previous write at 0x00c000436258 by goroutine 48:
  github.com/alexandrevicenzi/go-sse.(*Channel).SendMessage()
      /vendor/github.com/alexandrevicenzi/go-sse/channel.go:26 +0x8c
  github.com/alexandrevicenzi/go-sse.(*Server).SendMessage()
      /vendor/github.com/alexandrevicenzi/go-sse/sse.go:116 +0x21b
  /rest/handlers.(*eventsHandler).trackFlow.func1()
      /rest/handlers/events.go:137 +0x124

and the patch is exactly the same:


diff --git a/go/vendor/github.com/alexandrevicenzi/go-sse/channel.go b/go/vendor/github.com/alexandrevicenzi/go-sse/channel.go
index 82be4197..c66f948e 100644
--- a/go/vendor/github.com/alexandrevicenzi/go-sse/channel.go
+++ b/go/vendor/github.com/alexandrevicenzi/go-sse/channel.go
@@ -23,7 +23,9 @@ func newChannel(name string) *Channel {

 // SendMessage broadcast a message to all clients in a channel.
 func (c *Channel) SendMessage(message *Message) {
+       c.mu.Lock()
        c.lastEventID = message.id
+       c.mu.Unlock()

        c.mu.RLock()
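
Since it shows up so easily in tests, a regression test under the race detector is a cheap guard against it coming back. A rough sketch (hypothetical, same API assumptions as the reproduction earlier in the thread), with each goroutine publishing a different topic to the same client; run it with go test -race:

// sse_race_test.go: hypothetical test; it only uses the go-sse calls already
// quoted in this issue.
package app // hypothetical package; put the test wherever the SSE wiring lives

import (
    "io"
    "net/http"
    "net/http/httptest"
    "strconv"
    "sync"
    "testing"

    "github.com/alexandrevicenzi/go-sse"
)

func TestConcurrentSendMessage(t *testing.T) {
    s := sse.NewServer(nil)
    defer s.Shutdown()

    ts := httptest.NewServer(s)
    defer ts.Close()

    // One streaming client, kept drained, so the channel exists.
    resp, err := http.Get(ts.URL + "/topics")
    if err != nil {
        t.Fatal(err)
    }
    defer resp.Body.Close()
    go io.Copy(io.Discard, resp.Body)

    // Several topics published concurrently to the same client.
    var wg sync.WaitGroup
    for i := 0; i < 4; i++ {
        wg.Add(1)
        go func(topic string) {
            defer wg.Done()
            for j := 0; j < 50; j++ {
                s.SendMessage("/topics", sse.NewMessage("", "data", topic))
            }
        }("topic-" + strconv.Itoa(i))
    }
    wg.Wait()
}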

iscander added a commit to iscander/go-sse that referenced this issue Jul 5, 2021
Protect the option by wrappring into a mutex. The datarace is gone.
It closes alexandrevicenzi#18
@iscander iscander linked a pull request Jul 5, 2021 that will close this issue
@iscander

iscander commented Jul 5, 2021

https://gist.github.com/iscander/9a144528e563dc5969346d5e0b61cd4f is yet another sample that raises the same data race, but as part of an HTTP handler.

isdzulqor pushed a commit to Haraj-backend/go-sse that referenced this issue Nov 22, 2022
Protect the option by wrappring into a mutex. The datarace is gone.
It closes alexandrevicenzi#18