Dbx

Wrappers around database/sql for Postgres, SQLite3 and MySQL.

This is not an ORM but a set of granular wrappers for devs that enjoy
hand-tailoring SQL queries but don't want to get bogged down in tedious
CRUD operations and mappings. It also smooths over the fact that
different database engines don't agree on how to quote identifiers or
how to define parameters in prepared statements.
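Placeholders are one concrete example of that divergence: Postgres uses
positional $1, $2 markers while SQLite3 and MySQL use ?. The sketch
below shows the kind of rewrite a wrapper has to do; rewritePlaceholders
is a hypothetical helper, not part of dbx, and for brevity it doesn't
skip ? characters inside quoted strings.

```go
package main

import (
	"fmt"
	"strings"
)

// rewritePlaceholders converts MySQL/SQLite3-style `?` placeholders
// into Postgres-style `$1, $2, ...` positional ones. Illustrative
// only: it treats every `?` as a placeholder, even inside quotes.
func rewritePlaceholders(query string) string {
	var b strings.Builder
	n := 0
	for _, r := range query {
		if r == '?' {
			n++
			fmt.Fprintf(&b, "$%d", n)
		} else {
			b.WriteRune(r)
		}
	}
	return b.String()
}

func main() {
	fmt.Println(rewritePlaceholders("SELECT * FROM users WHERE id = ? AND age > ?"))
	// SELECT * FROM users WHERE id = $1 AND age > $2
}
```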
From the rawest to the finest:

Main library methods:

    dbx.Open(driver string, dsn string) *DB

(The driver argument here is redundant but other libraries use it too.)

DB operates with raw SQL strings and variadic any params:

    DB.Read(query string, params ...any) (data []map[string]any)
    DB.Write(query string, params ...any) (affected int64)
    DB.Close() error

Convenience methods:

    DB.Query(table string) *Query
    DB.Entry(table string) *Entry
    DB.Record(table string) *Record

Each of these returns a new struct that is connected to the opened
database connection and a table.

Query is a fluent query builder, used by the other resources, but it
can also be used to assemble SQL queries on its own. It doesn't do type
safety or joins. Yet.
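As a rough illustration of the shape such a builder takes, here is a
self-contained sketch; the type and the Where, Limit and SQL method
names are assumptions for illustration, not dbx's documented API.

```go
package main

import (
	"fmt"
	"strings"
)

// query is a minimal fluent-builder sketch. The real dbx Query is
// bound to an open *DB and a table; this one only assembles SQL.
type query struct {
	table  string
	wheres []string
	args   []any
	limit  int
}

func newQuery(table string) *query { return &query{table: table} }

// Where appends a condition; conditions are joined with AND.
func (q *query) Where(cond string, args ...any) *query {
	q.wheres = append(q.wheres, cond)
	q.args = append(q.args, args...)
	return q
}

func (q *query) Limit(n int) *query { q.limit = n; return q }

// SQL assembles the final statement; params stay separate so the
// statement can be prepared.
func (q *query) SQL() (string, []any) {
	sql := "SELECT * FROM " + q.table
	if len(q.wheres) > 0 {
		sql += " WHERE " + strings.Join(q.wheres, " AND ")
	}
	if q.limit > 0 {
		sql += fmt.Sprintf(" LIMIT %d", q.limit)
	}
	return sql, q.args
}

func main() {
	sql, args := newQuery("users").Where("age > ?", 30).Limit(10).SQL()
	fmt.Println(sql)  // SELECT * FROM users WHERE age > ? LIMIT 10
	fmt.Println(args) // [30]
}
```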
Entry builds on top of Query and does CRUD operations over Go maps. You
probably won't use these directly and this might become private.

Record builds on top of Entry and is probably the most useful for now,
until the Model arrives.
- Record.Load(id int64) loads the record with that id into the .Data
  map.
- Record.Save() bool updates or creates the loaded record in the
  database, based on the value of its private .id field.
- Record.Delete() bool returns true if the record was deleted.
- Record.Find(criteria ...any) bool returns true if a first row matches
  the query criteria and loads it into the .Data map.
Examples

Using SQLite3 and a record:

    package main

    import (
        "fmt"

        "github.com/jpedro/dbx"
    )

    func main() {
        db := dbx.Open(dbx.DriverSqlite3, "example.db")
        defer db.Close()

        record := db.Record("users")
        record.Data["name"] = "some record"
        record.Data["password"] = "you should hash this"

        saved := record.Save()
        fmt.Println("Saved:", saved)
        fmt.Println("Record Id:", record.Id())

        record.Data["password"] = "UPDATED"
        updated := record.Save()
        fmt.Println("Updated:", updated)
        fmt.Println("Record Id:", record.Id())

        deleted := record.Delete()
        fmt.Println("Deleted:", deleted)

        rows := db.Read("SELECT * FROM users")
        fmt.Println("All users:", rows)
    }
Using Postgres and an entry:

    package main

    import (
        "fmt"

        "github.com/jpedro/dbx"
    )

    func main() {
        db := dbx.Open(dbx.DriverPostgres,
            "postgres://tests:tests@127.0.0.1:5432/tests?sslmode=disable")
        defer db.Close()

        entry := db.Entry("users")
        newId := entry.Create(map[string]any{
            "name": "some entry",
            "age":  456,
        })
        fmt.Println("Created user:", newId)

        found := entry.Get(newId)
        fmt.Println("Found entry:", found)

        rows := db.Read("SELECT * FROM users")
        fmt.Println("All users:", rows)
    }
Todos

- Add an embeddable Model that understands its parent struct and
  supports validation. It relies on scanning the struct fields and
  calling sql.Rows.Scan(refs...) into them. The main issue is that the
  parent struct's name and fields can't be reached via reflection from
  the embedded Model, so this needs to be done differently. Still in
  the experiments.

      type User struct {
          Model
          Id    int64
          Email string
      }

      // user := db.Model(User) // This `User` here is a problem because
      //                        // Go doesn't accept types as arguments,
      //                        // just values. Even more strangely,
      //                        // methods can't be parameterized. Again,
      //                        // reflection to the rescue.
      //                        // http://u.jpedro.net/meth/gen/is/a/no/go

      user := db.Model(&User{}) // Correct way. Via reflection, this sets
                                // the `.Model.Table` correctly, the
                                // `.Model.Child` pointer and the fields'
                                // `.Model.Mappings`, because Postgres
                                // likes you to lower or quote everything.
      user.Id = 123
      user.Save() // Call forwarded to `Model.Save()`.
- Add type ValidateRule func(value any, args ...any) and maybe
  dynamically load rules from a plugin directory.
- Add dbx.Validate(value any, rules []ValidateRule) []error.
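  A minimal sketch of how those two pieces could fit together, assuming
  rules return an error (the proposed ValidateRule type doesn't specify
  a return type, so that part is a guess):

```go
package main

import (
	"errors"
	"fmt"
)

// ValidateRule follows the proposed signature, with an assumed error
// return so a failing rule can describe what went wrong.
type ValidateRule func(value any, args ...any) error

// Validate runs every rule against the value and collects failures.
func Validate(value any, rules []ValidateRule) []error {
	var errs []error
	for _, rule := range rules {
		if err := rule(value); err != nil {
			errs = append(errs, err)
		}
	}
	return errs
}

// notEmpty is an example rule: the value must be a non-empty string.
func notEmpty(value any, args ...any) error {
	s, ok := value.(string)
	if !ok || s == "" {
		return errors.New("value must be a non-empty string")
	}
	return nil
}

func main() {
	fmt.Println(Validate("alice", []ValidateRule{notEmpty})) // []
	fmt.Println(Validate("", []ValidateRule{notEmpty}))
}
```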
- Add dbx.Hash(text string, method dbx.HashMethod, options ...any)
  string.
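  The method dispatch could look like this sketch. The HashMethod
  values and the sha256 choice are assumptions; for passwords (as in
  the example above) a real implementation should use a salted, slow
  hash such as bcrypt or argon2.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// HashMethod selects the algorithm; only one is sketched here.
type HashMethod int

const (
	HashSha256 HashMethod = iota
)

// Hash dispatches on method and returns a hex digest. Note: a plain
// sha256 digest is NOT suitable for password storage.
func Hash(text string, method HashMethod, options ...any) string {
	switch method {
	case HashSha256:
		sum := sha256.Sum256([]byte(text))
		return hex.EncodeToString(sum[:])
	}
	return ""
}

func main() {
	fmt.Println(Hash("abc", HashSha256))
	// ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
}
```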
- Add dbx.Rand32(length int) string for a random base32 string and
  dbx.Rand58(length int) string for a random base58 string.
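  A sketch of the base32 variant using crypto/rand; the alphabet choice
  is an assumption, and the modulo mapping introduces a tiny bias that
  a production version would want to avoid (32 divides 256 evenly here,
  so in this case it happens to be unbiased).

```go
package main

import (
	"crypto/rand"
	"fmt"
)

// rand32 returns a random string over a 32-character alphabet.
// Illustrative sketch of the proposed dbx.Rand32.
func rand32(length int) string {
	const alphabet = "abcdefghijklmnopqrstuvwxyz234567"
	buf := make([]byte, length)
	if _, err := rand.Read(buf); err != nil {
		panic(err)
	}
	for i, b := range buf {
		// Map each random byte onto the alphabet.
		buf[i] = alphabet[int(b)%len(alphabet)]
	}
	return string(buf)
}

func main() {
	fmt.Println(rand32(16)) // e.g. "q7m2xk4o..." (random each run)
}
```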
- Extract the differences between database engines into dialects that
  can be re-used by a future migrate (mdb) CLI tool. Something like
  this, which could eventually be defined in a YAML file:

      schema := Migrate("postgres", func(s *Schema) {
          s.Table(func(t *Table) {
              t.Name = "user"
              t.Add(t.Id())
              t.Add(t.Varchar("email", 30).NotNull().Unique())
              t.Add(t.Int32("account_id").NotNull().References("account", "id"))
          })

          s.Drop("table", "temp")
          s.ExecFile("path/to/migrations/users.sql")
      })

      schema.Preview()
      schema.Apply()