Amplitude Golang SDK


An unofficial Amplitude client for Go, inspired by the official Node SDK.

For reference, see the HTTP API v2 documentation.


$ go get


// startup
client := amplitude.NewDefaultClient("<your-api-key>")

// logging events
client.LogEvent(&data.Event{
    UserID: "[email protected]",
    EventType: "test-event",
    EventProperties: map[string]interface{}{
        "source": "notification",
    },
    UserProperties: map[string]interface{}{
        "age": 25,
        "gender": "female",
    },
})

// gracefully shutdown, waiting for pending events to be sent
client.Shutdown()

The Event (doc) structure is based on API V2 request properties.

Events are not sent synchronously; the client keeps a goroutine responsible for batching events and issuing uploads. This routine uploads events:

  • after the upload interval (every 10ms by default).
  • as soon as we accumulate enough events to batch (256 events by default).
  • when Flush is explicitly invoked.
  • during shutdown process.

A LogEvent call, therefore, never blocks. It returns an error if the event could not be queued, in which case the event is dropped without being sent. This should not happen unless uploads are failing to get through for some reason (e.g. a misconfiguration).

Check advanced parameters to learn how to tweak the default behaviour.

Advanced parameters

The default client behaviour can be configured through a set of custom Options (doc).

client := amplitude.NewClient("<your-api-key>", amplitude.Options{ ... })


  1. If you want to configure your client to issue uploads every second:

client := amplitude.NewClient("<your-api-key>", amplitude.Options{
    UploadInterval: time.Second,
})
  2. If you want to disable retries:

client := amplitude.NewClient("<your-api-key>", amplitude.Options{
    MaxUploadAttempts: 1,
})
  3. If you want to hook your own Datadog metrics for amplitude events:

client := amplitude.NewClient("<your-api-key>", amplitude.Options{
    UploadDelegate: func(_ *amplitude.Uploader, events []*data.Event, err error) {
        count := len(events)
        if err != nil {
            statsd.Incr("", []string{"status:failure"}, count)
        } else {
            statsd.Incr("", []string{"status:success"}, count)
        }
    },
})
  4. If you want to allow more upload batches in parallel and a larger queue, in case you anticipate a higher throughput of events:

client := amplitude.NewClient("<your-api-key>", amplitude.Options{
    MaxParallelUploads: 16,
    MaxCachedEvents: 32000,
})

