I am diving into Go and have a problem I have been working on for a few days; I just can't seem to grasp the concept of goroutines and how they are used.
Basically, I am trying to generate millions of random records. I have functions that make the random data and create a giant .csv file containing that data.
My code basically generates a random string and writes it to a file, up to N times (where N is whatever you want). My question is whether it is possible to do this concurrently in order to reduce execution time. No matter how I approach the problem, I get the same benchmark as if I did it without goroutines.
This is a sample of what I have so far:
func worker(c chan string) {
    for {
        c <- /* generate random data using other functions here */
    }
    close(c) // note: never reached, since the loop above never exits
}

func writer(s string) {
    csvfile.WriteString(s)
}

func main() {
    receive := make(chan string)
    for i := 0; i < 100; i++ {
        go worker(receive)
    }
    for i := 0; i < 10000; i++ {
        go writer(<-receive)
    }
}
Where I generate data, I am using tons and tons of function calls from https://github.com/Pallinder/go-randomdata. Do you think that could be where I am losing all this time?
Any help would be appreciated.