Question

Periodically reading output from async background scripts

Context: I'm making my own i3-Bar script to read output from other (asynchronous) scripts running in the background, concatenate it and then echo it to i3-Bar itself.

I'm passing the outputs around in plain files, and my guess is that the problem is the files sometimes being read and written at the same time. The most reliable way to reproduce the behavior is to suspend the computer and then wake it back up - I don't know the exact cause; I can only go on what I see in my debug log files.

Main Code: Added comments for clarity

#!/usr/bin/env bash
cd "${0%/*}";

trap "kill -- -$$" EXIT; #The bg. scripts are on a while [ 1 ] loop, have to kill them.

rm -r ../input/*;
mkdir ../input/; #Just in case.

for tFile in ./*; do
#Run all of the available scripts in the current directory in the background.
if [ $(basename $tFile) != "main.sh" ]; then ("$tFile" &); fi;
done;

echo -e '{ "version": 1 }\n['; #I3-Bar can use infinite array of JSON input.

while [ 1 ]; do

input=../input/*; #All of the scripts put their output in this folder as separate text files

input=$(sort -nr <(printf "%s\n" $input));

output="";

for tFile in $input; do
#Read and add all of the files to one output string.
if [ $tFile == "../input/*" ]; then break; fi;
output+="$(cat $tFile),";
done;

if [ "$output" == "" ]; then
echo -e "[{\"full_text\":\"ERR: No input files found\",\"color\":\"#ff0000\"}],\n";
else
echo -e "[${output::-1}],\n";
fi;

sleep 0.2s;
done;


Example Input Script:

#!/usr/bin/env bash
cd "${0%/*}";

while [ 1 ]; do
echo -e "{" \
"\"name\":\"clock\"," \
"\"separator_block_width\":12," \
"\"full_text\":\"$(date +"%H:%M:%S")\"}" > ../input/0_clock;
sleep 1;
done;


The Problem

The problem isn't the script itself, but the fact that i3-Bar receives malformed JSON input (which causes a parse error) and terminates - I show such a log below.

Another problem is that the background scripts should run asynchronously, because some need to update every second and some only every minute, etc. So using a FIFO isn't really an option, unless I build some ugly, inefficient, hacky workaround.

I know there is a need for IPC here, but I have no idea how to do this efficiently.

Script output from a random crash (the error after waking from suspend looks the same):

[{ "separator_block_width":12, "color":"#BAF2F8", "full_text":"192.168.1.104 "},{ "separator_block_width":12, "color":"#BAF2F8", "full_text":"100%"}],

[{ "separator_block_width":12, "color":"#BAF2F8", "full_text":"192.168.1.104 "},,],


(The error is caused by the second line.)
As you can see, the main script tries to read the file, gets no output, but the comma is still appended -> malformed JSON.

Answer

The immediate error is easy to fix: don't append an entry to output if the corresponding file is empty:

# Skip the unexpanded glob (empty directory) and any empty or half-written file.
for tFile in $input; do
    [[ $tFile != "../input/*" ]] &&
      [[ -s $tFile ]] &&
      output+="$(<"$tFile"),"
done
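
Alternatively, since this is bash anyway, shopt -s nullglob makes an empty ../input/ expand to nothing, so you don't need to compare against the literal glob string at all. A rough sketch that keeps your sort -nr ordering:

shopt -s nullglob    # an empty ../input/ now expands to nothing instead of the literal "*"

output=""
for tFile in $(printf '%s\n' ../input/* | sort -nr); do
    [[ -s $tFile ]] && output+="$(<"$tFile"),"
done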

There is a potential race condition here, though. Just because a particular input file exists doesn't mean that the data is fully written to it yet. I would change your input scripts to look something like

#!/usr/bin/env bash
cd "${0%/*}";

while true; do
    # Write into a hidden temp file in the *same* directory, so the final mv is an
    # atomic rename (and the main loop's ../input/* glob never sees a half-written file).
    o=$(mktemp ../input/.0_clock.XXXXXX)
    printf '{"name": "clock", "separator_block_width": 12, "full_text": "%(%H:%M:%S)T"}\n' -1 > "$o"
    mv "$o" ../input/0_clock
    sleep 1
done
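
Because the temp file lives in the same directory as the target, the mv is a plain rename: the main loop only ever sees either the old complete file or the new complete one. Each script also keeps its own sleep interval, so the blocks stay fully asynchronous. For example, a hypothetical once-a-minute block (I'm using uptime -p and the name 1_uptime only as placeholders) would follow the same pattern:

#!/usr/bin/env bash
cd "${0%/*}";

# Same atomic-rename pattern as above, just a longer refresh interval.
while true; do
    o=$(mktemp ../input/.1_uptime.XXXXXX)
    printf '{"name": "uptime", "separator_block_width": 12, "full_text": "%s"}\n' "$(uptime -p)" > "$o"
    mv "$o" ../input/1_uptime
    sleep 60
done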

Also, ${output%,} is a safer way to trim a trailing comma when necessary.
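
For example, the output section of your main loop could become (same behavior, but ${output%,} is a no-op on an empty string and works on older bash versions too):

if [ -z "$output" ]; then
    echo -e "[{\"full_text\":\"ERR: No input files found\",\"color\":\"#ff0000\"}],\n"
else
    echo -e "[${output%,}],\n"
fi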
