Invalid AS923 downlink modulation (loraserver v3.2.0)

Hi,

I’m working with an AS923 setup on loraserver v3.2.0 and noticed that the network server has been sending downlinks on frequencies that are not only incorrect according to the standard, but also incorrectly modulated for the region. In the example below, the NS is sending a frame on SF10/500 kHz, which corresponds to none of the AS923 data rates. Furthermore, 923.3 MHz is not even defined as one of the frequencies on my network server, as far as I understand.
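For reference, the AS923 data rates are all 125 kHz LoRa (DR0 SF12 through DR5 SF7), plus DR6 at SF7/250 kHz and DR7 FSK; a 500 kHz bandwidth appears nowhere in the plan. SF10/500 kHz at 923.3 MHz does, however, exactly match the AU915 downlink channel plan (923.3 MHz is its first downlink channel, and all AU915 downlink data rates use 500 kHz), which turns out to be relevant below.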

Channels part of the network server configuration:

[network_server.band]
name="AS_923"
uplink_dwell_time_400ms=false
downlink_dwell_time_400ms=false
uplink_max_eirp=27
repeater_compatible=false

[network_server.network_settings]
installation_margin=10
rx_window=0
rx1_delay=1
rx1_dr_offset=0
rx2_dr=-1
rx2_frequency=-1
downlink_tx_power=27
disable_mac_commands=false
disable_adr=false

enabled_uplink_channels=[]

[[network_server.network_settings.extra_channels]]
frequency=922200000
min_dr=0
max_dr=5

[[network_server.network_settings.extra_channels]]
frequency=922400000
min_dr=0
max_dr=5

[[network_server.network_settings.extra_channels]]
frequency=922600000
min_dr=0
max_dr=5

[[network_server.network_settings.extra_channels]]
frequency=922800000
min_dr=0
max_dr=5

[[network_server.network_settings.extra_channels]]
frequency=923000000
min_dr=0
max_dr=5

[[network_server.network_settings.extra_channels]]
frequency=923600000
min_dr=0
max_dr=5

[[network_server.network_settings.extra_channels]]
frequency=923400000
min_dr=0
max_dr=6

[[network_server.network_settings.extra_channels]]
frequency=923900000
min_dr=7
max_dr=7
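
Side note: rx2_dr=-1 and rx2_frequency=-1 above mean the server falls back to the regional RX2 defaults, which for AS923 are 923.2 MHz / DR2 (SF10/125 kHz). If you want to rule RX2 out while debugging, here is a minimal sketch that pins it explicitly, using the same keys as the [network_server.network_settings] block above:

# Pin RX2 instead of relying on the -1 "use regional default" values.
# 923200000 Hz / DR2 are the AS923 defaults from the LoRaWAN Regional Parameters.
[network_server.network_settings]
rx2_dr=2
rx2_frequency=923200000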

Is anyone able to assist?

I ran some more tests on this issue. My setup has two network servers configured against the same application server: one NS on the AS region and the other on the AU region. My test device is configured on the AS network server.

When I shut down the NS in the AU region completely, the problem does not occur. However, when I re-launch the AU network server, I can see in its logs that it has a session for an AS device(?):

time="2019-10-10T05:41:32Z" level=info msg="adr request added to mac-command queue" dev_eui=70b3d57050001395 dr=2 nb_trans=1 req_dr=2 req_nb_trans=1 req_tx_power_idx=5 tx_power=0
time="2019-10-10T05:41:32Z" level=info msg="requesting device-status" dev_eui=70b3d57050001395
time="2019-10-10T05:41:32Z" level=info msg="pending mac-command block set" cid=LinkADRReq commands=2 dev_eui=70b3d57050001395
time="2019-10-10T05:41:32Z" level=info msg="pending mac-command block set" cid=DevStatusReq commands=1 dev_eui=70b3d57050001395
time="2019-10-10T05:41:32Z" level=info msg="pending mac-command block set" cid=RXParamSetupReq commands=1 dev_eui=70b3d57050001395
time="2019-10-10T05:41:32Z" level=info msg="pending mac-command block set" cid=TXParamSetupReq commands=1 dev_eui=70b3d57050001395
time="2019-10-10T05:41:32Z" level=info msg="gateway/mqtt: publishing gateway command" command=down gateway_id=00800000a0003e52 qos=0 topic=gateway/00800000a0003e52/command/down
time="2019-10-10T05:41:32Z" level=info msg="device-session saved" dev_addr=003ddabb dev_eui=70b3d57050001395
time="2019-10-10T05:41:32Z" level=info msg="downlink-frames saved" dev_eui=70b3d57050001395 token=38459

Additionally, the logs show the downlink acknowledgements inverted between the AU and AS network servers. On the AS NS (gateway 00800000a0003e52):

time="2019-10-10T05:55:56Z" level=info msg="gateway/mqtt: publishing gateway command" command=down gateway_id=00800000a0003e52 qos=0 topic=gateway/00800000a0003e52/command/down
time="2019-10-10T05:55:56Z" level=info msg="downlink-frames saved" dev_eui=70b3d57050001395 token=56873
time="2019-10-10T05:56:35Z" level=info msg="gateway/mqtt: publishing gateway command" command=down gateway_id=00800000a0003e52 qos=0 topic=gateway/00800000a0003e52/command/down
time="2019-10-10T05:56:35Z" level=info msg="downlink-frames saved" dev_eui=70b3d57050001395 token=10135
time="2019-10-10T05:57:57Z" level=info msg="backend/gateway: downlink tx acknowledgement received" gateway_id=00800000a00045da
time="2019-10-10T05:58:05Z" level=info msg="gateway/mqtt: publishing gateway command" command=down gateway_id=00800000a0003e52 qos=0 topic=gateway/00800000a0003e52/command/down
time="2019-10-10T05:58:05Z" level=info msg="downlink-frames saved" dev_eui=70b3d57050001395 token=31476
time="2019-10-10T05:58:05Z" level=info msg="backend/gateway: downlink tx acknowledgement received" gateway_id=00800000a0003e52

And on the AU NS (gateway 00800000a00045da):

time="2019-10-10T05:55:56Z" level=info msg="backend/gateway: downlink tx acknowledgement received" gateway_id=00800000a0003e52
time="2019-10-10T05:56:35Z" level=info msg="backend/gateway: downlink tx acknowledgement received" gateway_id=00800000a0003e52
time="2019-10-10T05:57:57Z" level=info msg="gateway/mqtt: publishing gateway command" command=down gateway_id=00800000a00045da qos=0 topic=gateway/00800000a00045da/command/down
time="2019-10-10T05:57:57Z" level=info msg="downlink-frames saved" dev_eui=00000000820001b5 token=9011

I did some more testing, and it seems I cannot have two network servers using the same MQTT broker with the topic templates left at their defaults. Because both servers receive data from all gateways through their gateway bridges, each NS is unable to ignore gateways that are not configured for it, and ends up interfering with the downlink frames of the other NS that actually has the gateway configured.
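
For context, the network-server side subscribes with a wildcard gateway ID by default, so every NS on the broker sees every gateway's events. From memory, the v3 defaults look roughly like this (double-check the key names against your loraserver.toml):

[network_server.gateway.backend.mqtt]
event_topic="gateway/+/event/+"
command_topic_template="gateway/{{ .GatewayID }}/command/{{ .CommandType }}"

The + wildcard in event_topic is what allows an NS to consume events from gateways it does not have registered.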

So in my case, the solution was to either use separate topic prefixes for each NS, or a separate broker for each NS. E.g. on the LoRa Gateway Bridge of an AU gateway:

[integration.mqtt]
event_topic_template="nsau/gateway/{{ .GatewayID }}/event/{{ .EventType }}"
command_topic_template="nsau/gateway/{{ .GatewayID }}/command/#"

And the AS gateway:

[integration.mqtt]
event_topic_template="nsas/gateway/{{ .GatewayID }}/event/{{ .EventType }}"
command_topic_template="nsas/gateway/{{ .GatewayID }}/command/#"

And configure the matching topic templates on the corresponding network servers.
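
For completeness, a sketch of what the matching network-server side would look like for the AU NS (the AS NS would use the nsas/ prefix; again, verify the key names against your loraserver.toml):

[network_server.gateway.backend.mqtt]
event_topic="nsau/gateway/+/event/+"
command_topic_template="nsau/gateway/{{ .GatewayID }}/command/{{ .CommandType }}"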


You are correct. A similar question was asked previously, and the recommended approach was per-region topics.
