Consider the following snippet:
```go
func NewA(x ...interface{}) error {
	if len(x) > 0 {
		return errors.New(x[0].(string))
	}
	return nil
}

func NewB(s ...string) error {
	if len(s) > 0 {
		return errors.New(s[0])
	}
	return nil
}
```
```go
var sink error

func BenchmarkA(b *testing.B) {
	b.ReportAllocs()
	for i := 0; i < b.N; i++ {
		var s = "hello"
		sink = NewA(s)
	}
}

func BenchmarkB(b *testing.B) {
	b.ReportAllocs()
	for i := 0; i < b.N; i++ {
		var s = "hello"
		sink = NewB(s)
	}
}
```
Running benchmarks on this, you find:
```
BenchmarkA-8    20000000    84.3 ns/op    32 B/op    2 allocs/op
BenchmarkB-8    30000000    42.9 ns/op    16 B/op    1 allocs/op
```
Version A costs one extra allocation per call. This occurs because the call to `NewA` involves a hidden call to `runtime.convT2Estring`, which heap-allocates a copy of the string header.

However, I contend that the compiler should be able to prove that this allocation is unnecessary. The call to `NewA` is concrete, so the compiler should be able to prove that a string header passed through the variadic interface slice neither escapes nor is modified. When constructing the interface header, it should be able to point directly at the string header on the stack.
\cc @randall77 @neild