I am getting a "pq: sorry, too many clients already" error when I call GetMessages() multiple times.
Please find the updated code:
main() code:
func main() {
    dbConn, err := InitDB()
    if err != nil {
        Log.Error("Connection Error: ", err.Error())
        return
    }
    defer dbConn.Close()

    go run()
    var input string
    fmt.Scanln(&input)
}
Database connection code is:
func InitDB() (*sql.DB, error) {
    connectionString := fmt.Sprintf("user=%v password='%v' dbname=%v sslmode=disable", USER, PASSWORD, DATABASE)
    var err error
    db, err = sql.Open(DRIVER, connectionString) // db is a package-level *sql.DB shared by GetMessages
    return db, err
}
run goroutine:
func run() {
    for {
        messages, err := GetMessages()
        if err != nil {
            Log.Error("Connection Error: ", err.Error())
            return
        }
        Log.Info(messages)
    }
}
GetMessages() function code:
func GetMessages() (messages []string, err error) {
    rows, err := db.Query(`SELECT message1, message2, message3, message4, message5,
        message6, message7, message8, message9, message10, message11, message12, message13,
        message14, message15, message16, message17, message18, message19, message20, message21,
        message22, message23, message24, message25, message26, message27, message28, message29,
        message30, message31, message32, message33, message34, message35, message36, message37,
        message38, message39, message40, message41, message42, message43, message44, message45,
        message46, message47, message48 FROM table1 WHERE id=1`)
    if err != nil {
        Log.Error("Query error", err)
        return messages, err
    }
    defer rows.Close()

    for rows.Next() {
        // Scan all 48 columns into messages via a slice of pointers.
        messages = make([]string, 48)
        pointers := make([]interface{}, 48)
        for i := range pointers {
            pointers[i] = &messages[i]
        }
        if err = rows.Scan(pointers...); err != nil {
            Log.Error("Failed to scan row", err)
            return messages, err
        }
    }
    return messages, rows.Err()
}
I checked this answer and I am using Scan, but it still isn't working.
UPDATE
The issue was in another function: I was calling db.Query without closing the returned rows object, and I was calling that function repeatedly. I've updated my code to use db.Exec instead of db.Query, and it's working now. Thank you so much @mkopriva for this answer. :)
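For anyone hitting the same thing, the leaking pattern looked roughly like this (the function below is a simplified placeholder, not my actual code):

func saveStatus(status string) error {
    // BAD: db.Query is for statements that return rows; the returned
    // *sql.Rows must be closed, or its connection stays checked out of
    // the pool. Calling this in a loop eventually exhausts Postgres's
    // max_connections:
    //
    //     rows, err := db.Query(`UPDATE table1 SET status = $1 WHERE id = 1`, status)
    //
    // GOOD: db.Exec is for statements that return no rows, and it
    // releases the connection back to the pool as soon as it completes.
    _, err := db.Exec(`UPDATE table1 SET status = $1 WHERE id = 1`, status)
    return err
}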
Try setting SetMaxOpenConns. The default is 0 (unlimited), which may be causing the issue. It also helps to set SetConnMaxLifetime; without it, connections are held open indefinitely, and you will notice Postgres's memory usage grow over time.
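For example, applied to the InitDB function from the question (a minimal sketch; the limits below are illustrative, so pick values that stay under your Postgres max_connections setting):

func InitDB() (*sql.DB, error) {
    connectionString := fmt.Sprintf("user=%v password='%v' dbname=%v sslmode=disable", USER, PASSWORD, DATABASE)
    var err error
    db, err = sql.Open(DRIVER, connectionString)
    if err != nil {
        return nil, err
    }
    db.SetMaxOpenConns(20)                  // cap open connections; the default 0 means unlimited
    db.SetMaxIdleConns(10)                  // optional: how many idle connections to keep around
    db.SetConnMaxLifetime(30 * time.Minute) // recycle connections so none are held forever (needs the "time" import)
    return db, nil
}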