Clearer comments regarding enabled uplink channels for US configuration


#1

After a lot of tinkering and head-scratching I was able to solve the issue I was having with specifying the channel sub-band on my gateway and the network server. Some minor changes to the comments would help a lot in case someone runs into the same issues in the future.

Currently the comment about enabled uplink channels looks like this:

https://www.loraserver.io/loraserver/install/config/

  # Enable only a given sub-set of channels
  #
  # Use this when only a sub-set of the by default enabled channels are being
  # used. For example when only using the first 8 channels of the US band.
  # Note: when left blank, all channels will be enabled.
  #
  # Example:
  # enabled_uplink_channels=[0, 1, 2, 3, 4, 5, 6, 7]
  enabled_uplink_channels=[]

In the quick-start guide we have this:

US915 configuration example

[network_server]
net_id="000000"

  [network_server.band]
  name="US_902_928"
  enabled_uplink_channels=[0, 1, 2, 3, 4, 5, 6, 7]

I copied the example when setting my server up but didn’t fully realize that channels != sub-bands. Through much troubleshooting I had to figure out why sensors were joining on sub-band 2 but sending data on sub-band 1. (Sub-band terminology is used by Laird Sentrius sensors as well as Multitech gateways.)

Adding a comment clarifying that in the US band channels 0-7 correspond to sub-band 1 and, for example, channels 8-15 correspond to sub-band 2 would have helped a lot in understanding how the configuration works.

To clarify for myself as well: are the 8 upstream 500 kHz bandwidth channels (64 to 71) listed in the regional parameters included in this list, and what are those channels used for?
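For reference, here is the sub-band to channel mapping as I eventually pieced it together. This is just my own sketch; the sub-band numbering follows the Multitech / Laird convention and the 500 kHz channel association is my reading of the regional parameters, not anything from the loraserver docs:

    # My own sketch of the US915 sub-band -> channel index mapping
    # (sub-bands numbered 1-8, Multitech / Laird style).
    def us915_subband_channels(subband):
        if not 1 <= subband <= 8:
            raise ValueError("US915 sub-bands are numbered 1 through 8")
        first = (subband - 1) * 8
        channels_125khz = list(range(first, first + 8))  # eight 125 kHz uplink channels
        channel_500khz = 63 + subband                    # the associated 500 kHz uplink channel
        return channels_125khz + [channel_500khz]

    print(us915_subband_channels(1))  # [0, 1, 2, 3, 4, 5, 6, 7, 64]
    print(us915_subband_channels(2))  # [8, 9, 10, 11, 12, 13, 14, 15, 65]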


#2

500 kHz uplink channels can’t realistically be used for much with current SX1301 / SX1308 based gateways, because those only have a single 500 kHz demodulator (“IF8”), which can only be tuned within range of one of the two existing front ends and can only receive at a single, fixed spreading factor.

That is in contrast to the 8 variable-SF 125 kHz demodulators typically used to implement the 8 regular uplink channels.

With that hardware they’re really only useful for gateway-to-gateway (or otherwise heavily coordinated) links. For other usage, the IF8 doesn’t offer much advantage over a node-class radio at about 10x the cost, so you might as well just use node-class radios and get fully independent tuning.

Note also that these overlap the usual range of uplink channels and so are basically an alternate allocation of the same spectrum. LoRa is imperfectly orthogonal between pairs of bandwidth / spreading-factor combinations that give the chirps the same “slope”, so in theory there could be some degree of interference, though likely within reason.
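To make the overlap concrete, here is a quick sketch of the US915 uplink center frequencies as given in the regional parameters (the function name is just for illustration):

    # Quick sketch of US915 uplink channel center frequencies, in MHz,
    # per the LoRaWAN regional parameters.
    def us915_uplink_center_mhz(channel):
        if 0 <= channel <= 63:            # 125 kHz channels, 200 kHz spacing
            return 902.3 + 0.2 * channel
        if 64 <= channel <= 71:           # 500 kHz channels, 1.6 MHz spacing
            return 903.0 + 1.6 * (channel - 64)
        raise ValueError("US915 uplink channels are numbered 0-71")

    print(round(us915_uplink_center_mhz(3), 1))   # 902.9, inside the sub-band 1 span
    print(round(us915_uplink_center_mhz(64), 1))  # 903.0, the 500 kHz channel sits right on top of it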


#3

If you’re using a gateway that can reliably receive 500 kHz uplinks, do you need to enable that channel? If I have my channels set to 0-7, do I need to add channel 64 as well?


#4

“Need” is a subjective word: need it for what?

Perhaps some spec officially requires it; in practice, things should work fine without it. Regulatory frameworks don’t assume such capability, though with it you may be better able to use what they permit in some situations.

However, out of the box LMIC may assert() if it finds no 500 kHz channels enabled. That is easily prevented by clamping the data-rate-setting functions (there are two: one for joins and one for everything else). Other node stacks may or may not have similar behavior.


#5

You get better battery life, and you get off the air faster, if you can send the same number of bits at a higher data rate.

Having said that… if I want to have loraserver.io send a channel mask down to enable all of the channels in the first sub-band, should I set that to 0, 1, 2, 3, 4, 5, 6, 7, 64?
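In other words (just sketching what I mean, not a confirmed answer), I would build the list like this:

    # Sketch of the channel list I have in mind: sub-band 1 plus its
    # 500 kHz channel. Whether channel 64 belongs in it is my question.
    channels = list(range(0, 8)) + [64]
    print(channels)  # [0, 1, 2, 3, 4, 5, 6, 7, 64]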