How can I easily monitor several webpages for changes on Linux?

I mean, for example, when some new image or HTML element (without visible text) appears on a webpage, I will get notified.
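Since the goal is to catch things like a new image or an element with no visible text, one idea is to diff only the extracted tags instead of the whole page, so plain text edits are ignored. A rough, untested sketch meant to be run from cron (the URL, the address and the tag list are just placeholders):

Code:
#!/bin/bash
# tagwatch.sh - untested sketch: mail when the set of <img>/<iframe>/<script> tags on a page changes
url="https://example.com"       # placeholder - the page to watch
to="[email protected]"           # placeholder - notification address
state=/tmp/tagwatch.old

curl -s "$url" | grep -oiE '<(img|iframe|script)[^>]*>' | sort > /tmp/tagwatch.new
if [ -f "$state" ] && ! cmp -s "$state" /tmp/tagwatch.new; then
    echo "New or changed tags on $url" | mail -s "$url changed" "$to"
fi
mv /tmp/tagwatch.new "$state"
The last snapshot is kept in /tmp/tagwatch.old, so the first run only creates the baseline and later runs mail on any difference.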

Update:
1) Linux desktop application: http://specto.sourceforge.net/

2) Linux URLWatch: https://thp.io/2008/urlwatch/ (not tried)
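I have not tried it yet, but urlwatch seems meant to be run from cron, which then mails any diff output to the local user, e.g. an hourly check:
Code:
0 * * * * urlwatch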

3)
Code:
watch -g curl -s example.com && echo Changed
OR
Code:
watch -g curl -s example.com &> /dev/null && echo 'Body of email' | mail -s 'Subject' [email protected]
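Note that watch re-runs the command every 2 seconds by default, which is quite aggressive against a remote site, and -g makes it exit after the first change. An untested variation with a 5-minute interval, wrapped in a loop so every change sends a mail, not just the first one:
Code:
while watch -n 300 -g 'curl -s example.com' &> /dev/null; do
    echo 'Body of email' | mail -s 'Subject' [email protected]
done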
4)
Code:
#!/bin/bash
# Usage: checksite.sh <URL> <interval in seconds>                                                                     

site=$1
interval=$2
get_cmd='curl -s'
contents=$( $get_cmd "$site" )
subject="$site changed"
body="HONK HONK HONK"
to="[email protected]"
# poll until the page differs from the first snapshot, then mail once and exit
while true ; do
    new_contents=$( $get_cmd "$site" )
    if [ "$contents" != "$new_contents" ] ; then
        echo "$body" | mail -s "$subject" "$to"
        exit
    fi
    sleep "$interval"
done
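Example run, with the URL and the poll interval in seconds as arguments (assumes local mail is set up):
Code:
./checksite.sh https://example.com 300 &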
5)
Code:
#!/bin/bash

# monitor.sh - Monitors a web page for changes
# and sends an email notification if the page changes

USERNAME="[email protected]"
PASSWORD="itzasecret"
URL="http://thepage.com/that/I/want/to/monitor"

for (( ; ; )); do
    # keep the previous download as old.html (fails harmlessly on the first pass)
    mv new.html old.html 2> /dev/null
    curl "$URL" -L --compressed -s > new.html
    DIFF_OUTPUT="$(diff new.html old.html)"
    if [ -n "$DIFF_OUTPUT" ]; then
        sendEmail -f $USERNAME -s smtp.gmail.com:587 \
            -xu $USERNAME -xp $PASSWORD -t $USERNAME \
            -o tls=yes -u "Web page changed" \
            -m "Visit it at $URL"
    fi
    # sleep outside the if, otherwise unchanged pages get re-fetched in a tight loop
    sleep 10
done
run: nohup ./monitor.sh &
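Note: sendEmail is a stand-alone Perl SMTP client, not the PHP mail() function; on Debian/Ubuntu it should be available as the sendemail package.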

6)
Code:
#!/bin/bash

# Copyright 2009 Ruben Berenguel

# ruben /at/ maia /dot/ ub /dot/ es

# PageDiffs: fill in an array of webpages below. With the "write"
# option it will download them; with the "diff" option it will
# re-download them and check the new copies against the old ones for
# differences. With the "diff mail" option it will send an email to
# $MAILRECIPIENT, assuming mail works.
# You can find the most up-to-date version of this file (and the GPL) at
# http://rberenguel.googlecode.com/svn/trunk/Bash/PageDiffs.sh

# 20091226@00:24

MAILRECIPIENT="[email protected]"

j=0
Pages[j++]="http://www.maia.ub.es/~ruben/"
Pages[j++]="http://www.google.es"
#Add more pages as above

if [ "$1" = "write" ]; then
echo Generate files
count=0
for i in "${Pages[@]}"
do
echo Getting "$i" into File$count
wget "$i" -v -O "File$count"
let count=$count+1
done
fi
if [ "$1" = "diff" ]; then
count=0
for i in "${Pages[@]}"
do
# echo Getting "$i" into Test$count
wget "$i" -q -O "Test$count"
Output=$(diff -q "Test$count" "File$count" | grep differ)
Result=$?
if [ "$Result" = "0" ]; then
if [ "$2" = "mail" ]; then
echo Page at "$i" has changed since last check! >> MailCont
mail=1
fi
echo Page at "$i" has changed since last check!
else
echo Page at "$i" has not changed since last check!
fi
#rm Test$count
let count=$count+1
done
if [ "$mail" = "1" ]; then
mail -s "Page changed alert!" $MAILRECIPIENT
run by cronjob: 00 13 1,7,14,21,28 * * /home/user/PageDiff.sh diff
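Before the first cron run, generate the baseline FileN snapshots once with the write option; pass mail as the second argument if you want the alert emailed instead of only printed:
Code:
./PageDiffs.sh write
./PageDiffs.sh diff mail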

--
Other (non-Linux) webpage monitoring tools: http://internetlifeforum.com/internet/792-how-monitor-some-website-changes-tool/