No matter the version of ASL I’ve tried, “playback” does not work as advertised: using either localplay or playback results in the requested audio file being played only on the local node on which it is run. I’ve also noticed there have been multiple threads started about this, and not a single one has been addressed by the development team (if they have been, I can’t find them). Nor can I find any definitive answer in the Wiki.
Could someone with experience/knowledge PLEASE respond to this?
So, why not give us more background on it so we understand your exact situation:
Duplex mode?
linktolink?
Duplex modes of the connected nodes that the playback is not reaching (public and private)?
Anything else you might find important in your config.
Out of the box, I have no educated answer for you without details.
But if you want to find a bug, you will need details to replicate it.
I can say it works for me! No bug out of the box: ASL 1.01, 4 boxes. I don’t use it on my beta 6 boxes.
Perhaps letting us look at the full command string you are using in rpt.conf would be the best first step.
Worry about the other details after; it’s likely a typo or improper use of the command.
All are private nodes (4 total), all configured with duplex = 0 and linktolink = yes. One node acts as a hub for the other 3. All nodes reside on their own servers, all are remotely located from each other and connected via our local AREDN mesh network. Also, all are ASL 2.0.
Take the following command, executed from bash on Node 1103:
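(The exact command string from this post is not reproduced here; a hypothetical example of the kind of call being tested, using the stock sounds path, would be:)

```shell
# Ask Asterisk on node 1103 to play a sound file.  "rpt playback" is
# meant to go to the node and its links, "rpt localplay" to the local
# node only.  The file path here is illustrative, not the original
# poster's actual command.
asterisk -rx "rpt playback 1103 /var/lib/asterisk/sounds/digits/2"
```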
So, while I’m trying to look at this in the big picture:
What file formats are used, and are they the same in all instances?
I might suggest a test as well while I wait on the format reply.
Put this operation in a command in rpt.conf, like:
8900=Playback,/var/lib/asterisk/sounds/digits/2 ; test
In fact, test that exact line for playback across other nodes. Call it by executing *8900 from the radio end of a local node (change the command number if it conflicts).
load => format_g723.so ; G.723.1 Simple Timestamp File Format
load => format_g726.so ; Raw G.726 (16/24/32/40kbps) data
load => format_g729.so ; Raw G729 data
load => format_gsm.so ; Raw GSM data
load => format_h263.so ; Raw H.263 data
load => format_h264.so ; Raw H.264 data
load => format_ilbc.so ; Raw iLBC data
noload => format_jpeg.so ; JPEG (Joint Picture Experts Group) Image
load => format_pcm.so ; Raw/Sun uLaw/ALaw 8KHz (PCM,PCMA,AU), G.722
load => format_sln.so ; Raw Signed Linear Audio support (SLN)
load => format_vox.so ; Dialogic VOX (ADPCM) File Format
load => format_wav_gsm.so ; Microsoft WAV format (Proprietary GSM)
load => format_wav.so ; Microsoft WAV format (8000Hz Signed Linear)
While it may seem I am running you in circles, I am not.
There are a lot of little details I have run into in the past and never reached a final decision on; I just worked with a good workaround that presented itself, which is a bit like what you have done, unsuccessfully.
Just a couple of things to know about audio formats.
For example, .wav and .WAV are not the same exact format, and if one is typo’d for the other, it will not play.
If you have the codec loaded and play a .WAV without the extension, it will not play, since the extension determines the exact format (I believe a raw wav without some header info).
So, I suspect some issue with .ul vs. .ulaw, but I’m not spending time on that.
But you can; it was your pick. I would say you need to properly use the extension on the file and in your calling of it from your script, for this is the only method to distinguish different formats of similar codecs.
For it is not native to the system except where you installed the extra software.
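For comparison, core Asterisk applications are normally handed the file path without any extension and pick a matching on-disk format themselves; a hypothetical dialplan line illustrating that convention (extension number and file name are placeholders):

```
; extensions.conf sketch: Playback() gets the base name only; Asterisk
; then looks for net-announcement.ulaw, net-announcement.wav, etc. in
; /var/lib/asterisk/sounds/ and chooses a format it can play.
exten => 100,1,Playback(net-announcement)
```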
However, it might be interesting to see the results if you transfer the file to one of the other nodes and play it there. It’s possible it will play, but I don’t know.
If it were me, I would want to play the file back in .WAV
Honestly, I can’t tell you what happens in the codec process between formats in transfer. Not my thing.
While one can assume Asterisk rules apply here when using Playback (the command itself is an Asterisk command, not app_rpt), app_rpt controls the audio gating.
As I originally thought/assumed (since this issue was talked about less than a year ago), your issue is a result of AUDIO GATING in the use of duplex = 0 and linktolink = yes, and we ran the one test that pretty much proves that.
Since you are doing this from a bash line/script, you might try making your shell call directly to ‘asterisk’ for playback of your file; perhaps the result will be different.
From that position you may not have control of who does not get the audio stream... not sure. It may not matter, since app_rpt controls the audio gates.
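A sketch of what such a direct call might look like from a shell script (node number and file name are placeholders, not from the original posts; the file is assumed to live in the default /var/lib/asterisk/sounds/ directory, so no path is given):

```shell
# Invoke the app_rpt playback commands directly through the Asterisk CLI.
asterisk -rx "rpt localplay 1103 net-announcement"   # local node only
asterisk -rx "rpt playback 1103 net-announcement"    # local node plus its links
```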
If all else fails and you are willing to try a workaround, try feeding the script audio to a private node actuated by VOX and connected to your network, so you have a genuine COS. That should open all the proper audio gates within that duplex = 0 and linktolink = yes setup, because without it, it’s not likely going to pass.
Thanks for the detailed and thoughtful response, Mike.
This whole incarnation of wanting to use playback arose because I am interested in getting AutoSkyWarn up and running; I want its audio files heard system-wide (on all nodes).
The newstonight examples I gave are actually commands used as part of my weekly NewsLine broadcast mechanism, where the audio files reside on each node and the system SSHes to each node to start the playback on each (all automated via cron). It’s worked well for me (http://www.ah6le.net/index.php/allstar-links/broadcast-newsline-on-multiple-allstar-nodes).
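That mechanism, as described, might be sketched roughly like this (hostnames, node numbers, and file name are placeholders; the linked article describes the author’s actual setup):

```shell
#!/bin/bash
# Run from cron: SSH to each node and start a local playback of the
# copy of the audio file that already resides on that node.
# Entries are "host:nodenumber" placeholders.
for entry in node-a:1101 node-b:1102 node-c:1103; do
    host=${entry%%:*}
    node=${entry##*:}
    ssh "$host" "asterisk -rx 'rpt localplay $node newstonight'" &
done
wait   # start all playbacks in parallel, then return
```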
I am only using the commands I’ve posted as a test to prove out the concept of using playback for multiple nodes and here we are.
I’m not exactly sure where I “got” the idea that sound files were to be .ul, but I will rethink that. I have placed the .wav versions of those files on my local node for testing purposes and, interestingly, they don’t play at all (the node keys the TX but dead air is heard).
As some of these posts are looked at forever, I think it might be interesting to note what is happening with what you are doing, and some history as I remember it (my memory may not be so good).
Our whole system resides on asterisk, a telephony PBX. An app_rpt node is basically a phone connection extension with special rules to work with radio.
When one node is connected to another, you are in a conference phone call. So that audio is top priority.
linktolink was created early in this package to allow users, who were almost entirely repeater owners, to have a tie-in without replacing their existing hard-wired controllers. They would often use a remote base port to feed app_rpt connectivity, but it is hard to have any control over that hard-wired system from the ASL side.
It can be done, though, with scripted DTMF re-generators and crafty command separation. I did this on my 220 system at first, too, as I did not have internet at the repeater site and used a pair of RF link radios for the audio connectivity. Cross-inverting COS and PTT with linktolink made it work.
So all of the audio gating for linktolink was in effect for this purpose. And still is.
Playback was likely not figured into that gate because it was not around at the time the function was created. But it would be very hard to account for every scenario of how this software might be used: make one thing work and you break 3 others that now need work in some other scenario, or in ones not yet tried. It’s just not one-size-fits-all software; one size fits many is more like it.
Being open source software, anyone with specialized needs can alter it themselves and recompile the package. Nobody ever claimed it is everything, and it’s FREE thanks to a lot of volunteers.
Perhaps someone who was on the original dev team might chime in here for better understanding.
But I fear there are not many left. Leaving you with dummies like me.
So, as I think about it, and as I used to say all the time:
with asterisk, anything is possible with a little rethink and a redo workaround.
So, here is one more avenue for you to try.
With the use of a USRP channel driver in your links, you might be able to do this and have the results you want. However, to do this, you ‘might’ need to install the dvswitch server package, since the proper interfaces should be there for you to do it without writing code yourself.
I will not go through the details as I likely will say something wrong no doubt.
But some reading might be in order at groups.io/dvswitch.
But I also think that if you create enough private nodes on your central server, connected together using a USRP channel driver in an ‘X’ connection, you might be able to do this straight up. One private node per link.
Never tried it.
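I haven’t verified this, but one such private-node stanza might look something like the following in rpt.conf. The address/port syntax is an assumption based on DVSwitch-style setups; the node number and ports are placeholders:

```
[1998]
rxchannel = USRP/127.0.0.1:34001:32001  ; USRP/host:txport:rxport (placeholder values)
duplex = 0
hangtime = 0
; ...remaining settings as for any other private node
```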
Mike, in what I think is the last post in this thread on this: indeed, changing duplex= to anything other than 0 does allow the playback command to work properly, and audio is sent to all connected nodes.
In my case, since the node this runs on (my hub) is radioless, this is not a hindrance to my operation. A workaround to be sure, but one that works for my needs.
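In rpt.conf terms, the workaround amounts to something like this in the hub’s node stanza (node number is a placeholder; per the posts above, any non-0 duplex value reportedly lets playback reach the links):

```
[1100]              ; radioless hub node (placeholder number)
duplex = 1          ; was duplex = 0; playback now reaches connected nodes
linktolink = yes
```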
The net-announcement.wav file is located in /var/lib/asterisk/sounds/, so the full path is not needed. The file name and node number above are fictitious.
My nodes are public but I don’t think private nodes should be a problem.
Edit: Do you have the same issue using the command from the Asterisk CLI? Maybe try one of the words from /var/lib/asterisk/sounds/. For example: rpt playback <your node> megahertz
I had him do a test to create the same from DTMF to be sure it was not a command line issue as well.
8900=Playback,/var/lib/asterisk/sounds/digits/2 ; test
And yes, from the default directory the path does not have to be included. Same with .ulaw from there.
It does play on the local node.
The audio just is not gated through linked nodes when you use duplex=0 and linktolink.
I remember the issue vaguely, as I had it as well when I used linktolink, but I ran through most possibilities with him to be sure.
I am left with putting this on a wish list for the dev team. Not sure how complicated it might get, but I can imagine.
Theory would be…
Format between nodes should not matter. If it plays on the local node, it should be automatically transcoded to the format broadcast to the links. It’s just that no mixer action is applied to it (gates not open; think of it as a switch).
Have you tried it with duplex=0 without linktolink? linktolink is supposed to stop all telemetry, so I could see where this is expected behavior. If linktolink is the culprit and you feel strongly the behavior should be changed, feel free to write it up over on GitHub - AllStarLink/ASL-Asterisk: Version of Asterisk used for AllStarLink.
Why are these nodes connected with linktolink? I can see an argument for not using linktolink to connect nodes. As I see it, linktolink was designed to connect repeater controller ports together.