[2024-07-02 13:17:40.281][23][critical][backtrace] [./source/server/backtrace.h:127] Caught Segmentation fault, suspect faulting address 0x0
[2024-07-02 13:17:40.281][23][critical][backtrace] [./source/server/backtrace.h:111] Backtrace (use tools/stack_decode.py to get line numbers):
[2024-07-02 13:17:40.281][23][critical][backtrace] [./source/server/backtrace.h:112] Envoy version: 32113313a357829ba3a5dce0795b6780bf8cbf4d/1.30.4/Clean/RELEASE/BoringSSL
[2024-07-02 13:17:40.281][23][critical][backtrace] [./source/server/backtrace.h:114] Address mapping: 5f81639dd000-5f816638b000 /usr/local/bin/envoy
[2024-07-02 13:17:40.282][23][critical][backtrace] [./source/server/backtrace.h:121] #0: [0x7d4b951f4520]
[2024-07-02 13:17:40.305][23][critical][backtrace] [./source/server/backtrace.h:119] #1: luaL_unref [0x5f81653129f9]
[2024-07-02 13:17:40.328][23][critical][backtrace] [./source/server/backtrace.h:119] #2: Envoy::Extensions::Router::Lua::RouteHandleWrapper::~RouteHandleWrapper() [0x5f81652a2f80]
[2024-07-02 13:17:40.352][23][critical][backtrace] [./source/server/backtrace.h:119] #3: Envoy::Extensions::Filters::Common::Lua::BaseLuaObject<>::registerType()::{lambda()#1}::operator()() [0x5f81652a2a16]
[2024-07-02 13:17:40.375][23][critical][backtrace] [./source/server/backtrace.h:119] #4: lj_BC_FUNCC [0x5f81652aaf0b]
AsyncClient 0x43cbf3c8000, stream_id_: 2729926164092997584
&stream_info_:
StreamInfoImpl 0x43cbf3c82c0, protocol_: 1, response_code_: 200, response_code_details_: via_upstream, attempt_count_: 1, health_check_request_: 0, getRouteName(): upstream_info_:
UpstreamInfoImpl 0x43cbf923a58, upstream_connection_id_: 6
Http1::ConnectionImpl 0x43cbf51d010, dispatching_: 1, dispatching_slice_already_drained_: 1, reset_stream_called_: 0, handling_upgrade_: 0, deferred_end_stream_headers_: 0, processing_trailers_: 0, buffered_body_.length(): 0, header_parsing_state_: Done, current_header_field_: , current_header_value_:
absl::get<ResponseHeaderMapPtr>(headers_or_trailers_): null
Dumping corresponding downstream request:
decoder:
UpstreamRequest 0x43cbf03f400
request_headers:
':authority', 'envoy-gateway'
':path', '/'
':method', 'GET'
':scheme', 'http'
'cookie', 'random=hello'
'content-length', '67'
'x-envoy-internal', 'true'
'x-forwarded-for', '<some-ip>'
'x-envoy-expected-rq-timeout-ms', '15000'
FilterManager 0x43cbf923e30, state_.has_1xx_headers_: 0
filter_manager_callbacks_.requestHeaders():
':authority', 'envoy-gateway'
':path', '/'
':method', 'GET'
':scheme', 'http'
'cookie', 'random=hello'
'content-length', '67'
'x-envoy-internal', 'true'
'x-forwarded-for', '<some-ip>'
'x-envoy-expected-rq-timeout-ms', '15000'
filter_manager_callbacks_.requestTrailers(): null
filter_manager_callbacks_.responseHeaders(): null
filter_manager_callbacks_.responseTrailers(): null
&streamInfo():
StreamInfoImpl 0x43cbf3c82c0, protocol_: 1, response_code_: 200, response_code_details_: via_upstream, attempt_count_: 1, health_check_request_: 0, getRouteName(): upstream_info_:
UpstreamInfoImpl 0x43cbf923a58, upstream_connection_id_: 6
, current_dispatching_buffer_: drained
ConnectionImpl 0x43cbf2e9600, connecting_: 0, bind_error_: 0, state(): Open, read_buffer_limit_: 1048576
socket_:
ListenSocketImpl 0x43cbf32f880, transport_protocol_:
connection_info_provider_:
ConnectionInfoSetterImpl 0x43cbf34d518, remote_address_: 127.0.0.1:8090, direct_remote_address_: 127.0.0.1:8090, local_address_: 127.0.0.1:56358, server_name_:
node:
  cluster: test
  id: test
dynamic_resources:
  cds_config:
    path: /tmp/cds.yaml
  lds_config:
    path: /tmp/lds.yaml
admin:
  access_log_path: /dev/null
  address:
    socket_address: { address: 0.0.0.0, port_value: 9901 }
static_resources:
  listeners:
  - name: http
    per_connection_buffer_limit_bytes: "3221225472"
    address:
      socket_address:
        address: 0.0.0.0
        port_value: 8089
    filter_chains:
    - filters:
      - name: envoy.filters.network.http_connection_manager
        typed_config:
          "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
          stat_prefix: ingress_http
          generate_request_id: true
          codec_type: auto
          upgrade_configs:
          - upgrade_type: websocket
          http_filters:
          - name: envoy.filters.http.lua
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.lua.v3.Lua
              default_source_code:
                inline_string: |
                  function envoy_on_request(request_handle)
                    local requestPath = "/"
                    local headers, body = request_handle:httpCall(
                      "server",
                      {
                        [":path"] = requestPath,
                        [":method"] = "GET",
                        [":authority"] = "envoy-gateway",
                      }, requestPath, 15000)
                    local pattern = '"token":"([^"]+)"'
                    local token = string.match(body, pattern) or ""
                    request_handle:streamInfo():dynamicMetadata():set("envoy.filters.http.lua", "request.info", {
                      token = token,
                    })
                    pattern = '"server":"([^"]+)"'
                    local server = string.match(body, pattern) or ""
                    request_handle:headers():add("x-server", server)
                  end
                  function envoy_on_response(response_handle)
                    local meta = response_handle:streamInfo():dynamicMetadata():get("envoy.filters.http.lua")["request.info"]
                    response_handle:headers():add("set-cookie", "gs="..meta.token)
                  end
          - name: envoy.filters.http.router
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router
          route_config:
            name: route
            virtual_hosts:
            - name: service
              domains: ["*"]
              routes:
              - match:
                  prefix: "/"
                route:
                  inline_cluster_specifier_plugin:
                    extension:
                      name: envoy.router.cluster_specifier_plugin.lua
                      typed_config:
                        "@type": type.googleapis.com/envoy.extensions.router.cluster_specifiers.lua.v3.LuaConfig
                        source_code:
                          inline_string: |
                            function envoy_on_route(route_handle)
                              local header_value = route_handle:headers():get("x-server") or ""
                              return header_value
                            end
                        default_cluster: server
  clusters:
  - name: server
    connect_timeout: 5s
    type: LOGICAL_DNS
    load_assignment:
      cluster_name: server
      endpoints:
      - lb_endpoints:
        - endpoint:
            address:
              socket_address:
                address: 127.0.0.1
                port_value: 8090
Title: segmentation fault with lua_cluster_specifier plugin during object destruction
Description:
I am facing an issue while running the Envoy Docker container. The container randomly stops while serving requests. The stack trace, worker state dump, and Envoy configuration file are included above.
Envoy image: envoyproxy/envoy:debug-v1.30-latest
From the stack trace, it looks like something goes wrong during destruction of the members of RouteHandleWrapper. Can someone please help debug this?
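As a side note, the extraction logic in the inline Lua filter can be sanity-checked outside Envoy: the Lua patterns used in the config happen to be character-for-character valid regular expressions with the same meaning, so a minimal Python sketch works (the JSON body below is an assumed example payload, not captured from the failing requests):

```python
import re

# Hypothetical example of the JSON body the "server" cluster might return;
# the patterns are copied verbatim from the inline Lua filter above.
body = '{"token":"abc123","server":"backend-1"}'

token_match = re.search(r'"token":"([^"]+)"', body)
server_match = re.search(r'"server":"([^"]+)"', body)

# Mirror the Lua `or ""` fallback when nothing matches.
token = token_match.group(1) if token_match else ""
server = server_match.group(1) if server_match else ""

print(token)   # abc123
print(server)  # backend-1
```

Note that this fallback only protects against a non-matching body; if `httpCall` were to yield a nil body in the filter, `string.match(nil, ...)` would raise a Lua error, which is separate from the C++ segfault seen in the backtrace.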