areuz - 3 years ago
JSON Question

Periodically reading output from async background scripts

Context: I'm making my own i3-Bar script that reads output from other (asynchronous) scripts running in the background, concatenates it, and then passes it to i3-Bar itself.

The way I'm passing outputs is in plain files, and I guess (logically) the problem is that the files are sometimes read and written at the same time. The most reliable way to reproduce this behavior is to suspend the computer and then wake it back up - I don't know the exact cause; I can only go on what I see in my debug log files.

Main Code: Added comments for clarity

#!/usr/bin/env bash
cd "${0%/*}";

trap "kill -- -$$" EXIT; #The bg. scripts run in an endless while-loop, so they have to be killed on exit.

rm -r ../input/*;
mkdir ../input/; #Just in case.

for tFile in ./*; do
    #Run all of the available scripts in the current directory in the background,
    #skipping this script itself (the original compared against "", which is always true).
    if [ "$(basename "$tFile")" != "$(basename "$0")" ]; then ("$tFile" &); fi;
done;

echo -e '{ "version": 1 }\n['; #I3-Bar can consume an infinite array of JSON input.

while [ 1 ]; do

    input=../input/*; #All of the scripts put their output in this folder as separate text files.
    input=$(sort -nr <(printf "%s\n" $input));

    output="";
    for tFile in $input; do
        #Read and append all of the files to one output string.
        if [ "$tFile" == "../input/*" ]; then break; fi;
        output+="$(cat "$tFile"),";
    done;

    if [ "$output" == "" ]; then
        echo -e "[{\"full_text\":\"ERR: No input files found\",\"color\":\"#ff0000\"}],\n";
    else
        echo -e "[${output::-1}],\n"; #Trim the trailing comma.
    fi;

    sleep 0.2s;
done;

Example Input Script:

#!/usr/bin/env bash
cd "${0%/*}";

while [ 1 ]; do
    echo -e "{" \
        "\"name\":\"clock\"," \
        "\"separator_block_width\":12," \
        "\"full_text\":\"$(date +"%H:%M:%S")\"}" > ../input/0_clock;
    sleep 1;
done;

The Problem

The problem isn't the script itself, but the fact that i3-Bar receives malformed JSON input (which causes a parse error) and terminates - I'll show such a log later.

Another problem is that the background scripts should run asynchronously, because some need to update every second and some only every minute, etc. So the use of a FIFO isn't really an option, unless I create some ugly, inefficient, hacky workaround.

I know there is a need for IPC here, but I have no idea how to do this efficiently.
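The original scripts don't use any locking, but as a hedged sketch of one lightweight coordination approach (assuming `flock(1)` from util-linux is available): writers take an exclusive lock and readers a shared lock on a sidecar lock file, so a reader can never observe a half-written file.

```shell
# Hypothetical sketch, not from the original post: serialize access with flock(1).
out=$(mktemp)
lock="$out.lock"

# Writer: hold an exclusive lock on fd 9 while writing the file.
( flock -x 9; printf '%s' '{"full_text":"ok"}' > "$out" ) 9>"$lock"

# Reader: a shared lock blocks until the writer has finished.
read_back=$( ( flock -s 9; cat "$out" ) 9>"$lock" )
echo "$read_back"

rm -f "$out" "$lock"
```

Each input script would wrap its write in the exclusive-lock subshell, and the main loop would wrap its `cat` in the shared-lock one; shared locks don't block each other, so readers still run concurrently.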

Script output from a random crash - the wake-up error looks the same:

[{ "separator_block_width":12, "color":"#BAF2F8", "full_text":" "},{ "separator_block_width":12, "color":"#BAF2F8", "full_text":"100%"}],

[{ "separator_block_width":12, "color":"#BAF2F8", "full_text":" "},,],

(The error is caused by the second line.)
As you can see, the main script tries to read the file, gets no output, but the comma is still appended -> malformed JSON.
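This failure mode can be reproduced in isolation. In this hypothetical minimal reproduction (not from the post), an empty file stands in for an input file that hasn't been written yet:

```shell
# An empty file contributes nothing to the output, but the comma is still appended.
tmp=$(mktemp)
output=""
output+="$(cat "$tmp"),"
output+="$(cat "$tmp"),"
echo "[$output]"   # prints "[,,]" - malformed JSON
rm -f "$tmp"
```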

Answer Source

The immediate error is easy to fix: don't append an entry to output if the corresponding file is empty:

for tFile in $input; do
    [[ $tFile != "../input/*" ]] &&
      [[ -s $tFile ]] &&
        output+="$(cat "$tFile"),";
done;

There is a potential race condition here, though. Just because a particular input file exists doesn't mean that the data is fully written to it yet. I would change your input scripts to look something like

#!/usr/bin/env bash
cd "${0%/*}";

while true; do
    o=$(mktemp ../input/.0_clock.XXXXXX)  # temp file on the same filesystem; the leading dot keeps it out of the main loop's glob
    printf '{"name": "clock", "separator_block_width": 12, "full_text": "%(%H:%M:%S)T"}\n' -1 > "$o"
    mv "$o" ../input/0_clock               # rename is atomic, so readers see the old or new file, never a partial one
    sleep 1
done

Also, ${output%,} is a safer way to trim a trailing comma when necessary: it is a no-op if there is no trailing comma, and it doesn't fail on an empty string.
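To make the difference concrete, here is a small comparison (in bash; note that ${output::-1} errors on an empty string, and negative substring lengths need bash 4.2+):

```shell
# ${output%,} removes one trailing comma if present, and is a no-op otherwise.
output='{"a":1},{"b":2},'
trimmed="${output%,}"
echo "$trimmed"          # {"a":1},{"b":2}

empty=""
echo "x${empty%,}x"      # safe on empty input: prints "xx"
```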
