#!/bin/bash
export DISPLAY=:0

INPUT="$1"
ID="$2"

# Strip double quotes from a jq-extracted JSON string
unquote() {
    sed 's/"//g' <<< "$1"
}

URL=$(unquote "$(jq .url <<< "$INPUT")")
DOMAINS=$(node array-to-lines.js "$(jq .third_party_domains <<< "$INPUT")")

source ./utils.sh

PREVIEW="TRUE"  # set to "TRUE" in order to enable automatic screenshots kept in preview.png
if [ "$PREVIEW" = "TRUE" ]; then
    (while true; do
        grab_screen_to_public "$ID"
        sleep 1
    done) &
    refresher_pid=$!
fi

# Extract the host part of the URL (strips scheme, credentials, port and path)
ORIGIN_DOMAIN=$(sed -e 's/[^/]*\/\/\([^@]*@\)\?\([^:/]*\).*/\2/' <<< "$URL")

while IFS= read -r DOMAIN; do
    load_website "$DOMAIN"
    open_console
    grab "$DOMAIN before"
    # Paste the cookie-banner-clicking script into the console and run it
    tr '\n' ' ' < click-accept-all.js | xclip -sel clip
    keycombo Control_L v
    sleep 0.3
    xdotool key Return
    sleep 1.5
    grab "$DOMAIN after"
done <<< "$DOMAINS"

load_website "$URL"
echo '{"current_action": "Website loaded"}'
grab load_website
open_network_inspector
grab open_network_inspector

declare -a pids=()
index=0
mkdir -p "/opt/static/$ID"

while IFS= read -r DOMAIN; do
    if [ -z "$DOMAIN" ]; then
        continue
    fi
    echo "{\"current_action\": \"Scanning scripts from domain $DOMAIN...\"}"
    # Can filter with more granularity:
    # https://developer.mozilla.org/en-US/docs/Tools/Network_Monitor/request_list#filtering_by_properties
    network_inspector_search "domain:$DOMAIN "
    # grab ni_search
    count=0
    while network_inspector_has_more_entries; do
        filename="$ID/${index}.png"
        scrot "/opt/static/$filename"
        grab "searching $DOMAIN"
        BASE_URL="$BASE_URL" python annotate_header.py "$filename" "$DOMAIN" \
            "set-cookie" "internet identifier from a cookie" 11 "" \
            "Cookie" "internet identifier from a cookie" 11 "" \
            "Referer" "part of my browsing history" 0 "$ORIGIN_DOMAIN" &
        pids+=($!)
        network_inspector_next_entry
        ((index++))
        ((count++))
        if [ "$count" -gt 10 ]; then
            break
        fi
    done
done <<< "$DOMAINS"

if [ "$PREVIEW" = "TRUE" ]; then
    kill "$refresher_pid"
fi

echo '{"current_action": "Finishing..."}'
for PID in "${pids[@]}"; do
    wait "$PID"
done
kill -2 %%
cleanup
echo "Done!"