Tag Archives: script

Shell script to monitor SSD wear level

A few days ago I bought some new computer hardware (i7 6700, GTX 970, etc.), including a Samsung 250 GB SSD on which Xubuntu was installed. Before buying the SSD I read reviews and benchmarks which gave some durability numbers (the number of bytes written before the SSD becomes unusable). What is rather cool with SSDs is that you can get a proper estimate of when the drive will become unusable; SMART and other tools can give insight, but I wanted a simple way to get an estimate of my SSD wear level, so I looked for a way to get the amount of written data and started to write some shell code.

With the ext4 filesystem (I don't know if it is possible with others) you can get the number of kilobytes written to the file system since it was created (cat /sys/fs/ext4/your_ext4_partition/lifetime_write_kbytes). So I wrote a small shell script which indicates the SSD wear level as a percentage and the estimated days left before it becomes unusable, so I can start backups and buy another one before it is too late. The script also shows the SSD lifetime writes in MB/GB/TB units in case something wrong happens šŸ˜›

Before using it, the script needs some inputs:

  • sys_fs_fs_device_path: Where lifetime_write_kbytes is located
  • ssd_date: The date on which the SSD was first used
  • ssd_tb_endurance: Endurance of the SSD in terabytes (taken from reviews/benchmarks, slightly decreased to be safe)

Note: The script needs slight modifications depending on the SSD's partitioning scheme; right now it handles just a single partition (a sketch for summing several partitions is shown at the end of this post).

#!/bin/sh
# Script displaying the SSD wear level as a percentage and estimating the SSD days left
# This makes use of ext4 "lifetime_write_kbytes", which indicates the number of kbytes written to the file system since it was created
# grz- / 2016

########
# lifetime_write_kbytes path (the one you use on your SSD)
sys_fs_fs_device_path="/sys/fs/ext4/sda1/lifetime_write_kbytes"
# date the SSD was first used
ssd_date='09/17/2016'
# max ssd endurance (tb)
ssd_tb_endurance=30
########

ssd_date_as_timestamp=`date -d "$ssd_date" +%s`
now_timestamp=`date +%s`
ssd_days_elapsed="$((($now_timestamp - $ssd_date_as_timestamp)/(60*60*24)))"

lifetime_write=`cat $sys_fs_fs_device_path`

echo "\033[36mSSD wear levels:"
echo "\033[37m specified endurance: \033[32m${ssd_tb_endurance}tb"
echo "\033[37m ssd lifetime write: \033[32m\n\t$(echo "($lifetime_write/1024)" | bc)Mb\n\t$(echo "($lifetime_write/1048576)" | bc)Gb\n\t$(echo "($lifetime_write/1073741824)" | bc)Tb\n"
echo "\033[37m ~ssd wear: \033[32m$(echo "scale=2; (($lifetime_write / 1073741824) / $ssd_tb_endurance) * 100" | bc)%"

days_left=$(echo "($ssd_tb_endurance * 1073741824) / ($lifetime_write / $ssd_days_elapsed)" | bc)

echo "\033[37m ~ssd days left: \033[32m$days_left days"

To also output the SMART “Wear_Leveling_Count” attribute value, append this at the end of the script:

# device path
ssd_device_path=/dev/sda

wear_leveling_count=$(smartctl --format=brief -a $ssd_device_path | grep Wear_Leveling_Count)

echo "\n\033[37m SMART Wear_Leveling_Count attr.: \033[32m$(echo $wear_leveling_count | awk '{print $4}') (value) $(echo $wear_leveling_count | awk '{print $6}') (treshold) $(echo $wear_leveling_count | awk '{print $8}') (raw value)"


Monitoring files change with PHP

This article is about monitoring all file changes on a regular basis with a little PHP script. It follows the malicious code removal article, but it can be useful for a huge number of things, even if the usage of PHP may limit its applications.

Monitoring file changes is a good way to catch any kind of PHP infection before it is too late and to be aware of anything suspicious happening to your files. This article is not only about PHP infections, even if the script was made to prevent them; the solution presented here can be used for many other things and not only for PHP files, as it is quite generic.

A way of doing it on Unix systems is to make use of “find” and any hash generator like “md5sum”, like so:

find . -type f -exec md5sum {} \; > md5sums_list
md5sum -c md5sums_list

Combined with some kind of storage for the hashes, this is a fast and very simple way of checking whether files differ (it can also be done easily in PHP) and knowing which files have been changed, but if you want to see the actual changes, you will need a more advanced method.
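
As an illustration of the PHP route, here is a minimal sketch (the directory and output file names are just examples): it walks a directory recursively, hashes every file with md5 and writes one line per file, so two runs can be compared to spot changed files.

<?php
// minimal sketch: recursively hash every file of a directory (md5),
// one "hash  path" line per file
$target_directory = '/var/www/'; // example directory
$hashes_file = 'md5sums_list';   // example output file

$iterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($target_directory));

$lines = array();
foreach ($iterator as $filename => $cur) {
    if (!$cur->isFile()) {
        continue;
    }
    $lines[] = md5_file($filename)."  ".$filename;
}

file_put_contents($hashes_file, implode(PHP_EOL, $lines).PHP_EOL);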

There is pretty good software which does that already, and in a much better way (like auditd), but I wanted a simple cross-platform solution to be aware of any file modification, to prevent infections before everything is screwed… this is actually what is under the hood of popular protection plugins for WordPress like Wordfence, and you could create a similar plugin quite easily if you dump the data into a database.

The PHP script presented here scans all files in a specified directory recursively on a regular basis and logs the files which were modified between two scans, along with their content (which can be very memory hungry, so it is restricted based on the file size) and the content difference (thanks to the finediff library). It is very fast because it only checks for differences in file size, and it is able to do log rotation. I set it to run as a cron job every hour on a server and it works like a charm.
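
To give an idea of the core mechanism, here is a simplified sketch (not the actual downloadable script; the paths and file names are placeholders) of change detection based on file sizes stored between two runs:

<?php
// simplified sketch: compare current file sizes against the ones recorded by the previous scan
$target_directory = '/var/www/';       // example directory to watch
$state_file = '/tmp/files_sizes.json'; // example storage for the previous scan

$previous_sizes = file_exists($state_file) ? json_decode(file_get_contents($state_file), true) : array();
$current_sizes = array();

$iterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($target_directory));
foreach ($iterator as $filename => $cur) {
    if (!$cur->isFile() || pathinfo($filename, PATHINFO_EXTENSION) !== 'php') {
        continue;
    }
    $current_sizes[$filename] = $cur->getSize();

    // new file or size changed since the previous scan: report it
    if (!isset($previous_sizes[$filename]) || $previous_sizes[$filename] !== $current_sizes[$filename]) {
        echo date('c')." changed: ".$filename.PHP_EOL;
    }
}

file_put_contents($state_file, json_encode($current_sizes));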

But an issue shows up rapidly… how do you make sense of the vast amount of logs produced?

This is where the web log viewer kicks in. It is a rather simple app made with the DataTables library, just a file input field to load the log file and a table, but this is enough to quickly see what is going on with your files for each scan, and you can even see the differences quickly, thanks to the HTML rendering of the finediff library, when you click on a file row.

PHP files monitoring logs web viewer (live version)

Now, to start monitoring your files, you have to call files_monitor.php

But before that, you will likely have to configure it for your usage:

  • Change the $target_directory value to the directory you want to monitor (all .php files in this directory will be monitored)
  • Change the $log_directory value to the directory where you want to store the produced log files
  • Change the $filesize_content_to_log_limit value to the amount of bytes you want; all files under or equal to that size will have their content and content diff logged (I use 50000 for ~50kb, which may be enough for .php files)
  • Change the $log_rotation_filesize value to the amount of bytes you want before log rotation kicks in (like 10000000 for 10mb)
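
For example, a hypothetical configuration (the paths are placeholders, the variable names are the ones listed above) could look like this at the top of the script:

// hypothetical configuration values
$target_directory = '/var/www/mysite/';
$log_directory = '/var/log/php_files_monitor/';
$filesize_content_to_log_limit = 50000; // ~50kb: content and diff are logged for files up to this size
$log_rotation_filesize = 10000000;      // ~10mb: rotate the log when it reaches this size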

Idea : It could be fun to add a pretty report by mail or message; if you have any suggestions, please share them.

Idea : Could be very fun to serve the log data in real-time to the viewer.

Note : If you want to monitor other kinds of files than .php files, you have to change the conditional at line 67, extend it, or remove it if you want to log every file.

Note : Due to the use of the file_get_contents function and many other things, the script may get very memory hungry for very large files, and it may get very resource hungry as well if you log diff content and the content of large files (although this can be tweaked by changing the way the diff is produced; right now it is word based).
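
For instance, the diff could be produced at a coarser granularity to save memory and CPU on big files. A sketch assuming the classic finediff.php FineDiff class (check the class and property names against your copy of the library):

<?php
// sketch: produce an HTML diff at paragraph granularity, which is much cheaper than word granularity
require_once 'finediff.php';

$old_content = file_get_contents('old_version.php'); // example files
$new_content = file_get_contents('new_version.php');

$diff = new FineDiff($old_content, $new_content, FineDiff::$paragraphGranularity);
echo $diff->renderDiffToHTML();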

Disclaimer : I take no responsibility for any loss or damage suffered as a result of using the scripts presented here.

Download :
php_files_monitor.tar
php_files_monitor_viewer.tar

 


How to remove malicious code infecting every PHP files of a website

A few months ago I stumbled upon some malicious replicating code (PHP.Anuna and PHP/Agent.GC) infecting all of my PHP files by inserting the same malicious code at the start of every file, slowing down the whole website and gathering information. It was a mess, but it can be solved easily.

If you encounter malicious code from these or variants using the same signature, I will show you how you can clean all of your website files very quickly and monitor them to prevent any other threats (see the next article), so you can react before it is too late and never let it happen again.

Here is a little PHP script to run on your server which recursively removes the malicious code from every .php file in a specified folder, based on a virus signature. It is very simple but also very effective for this kind of infection.

If any infection is found, it will show which file was cleaned up and at which position in the file the malicious code was; there is a report at the end of the decontamination process which shows how many files were treated.

You will likely have to configure it before using it because the signature will be different for each infection:

  • Change the $target_directory value to the directory you want to clean up
  • Change the $virus_begin_signature_str value to the first few characters of the malicious code
  • Change the $virus_end_signature_str value to the end of the malicious code

Disclaimer : I take no responsibility for any loss or damage suffered as a result of using this script, always backup your stuff first.

Note : Due to the use of the file_get_contents function, the script may get very memory hungry for large files; some adaptation of the code with more clever functions may be needed in your case.

<?php
/////////////////
$target_directory = '/var/www/';

//$virus_begin_signature_str = '$pmtccnmmns = \'341]88M4P8]37]278]225]241]334]';
$virus_begin_signature_str = 'd($n)-1);} @error_rP6]';
$virus_end_signature_str = '$pkvpncwqkl(""); $pkvpncwqkl=(468-347); $pmtccnmmns=$pkvpncwqkl-1; ?>';
$virus_name = 'PHP Agent.GC'; // just for pretty printing, works with PHP.Anuna and variants
/////////////////

$recursive_directory_iter = new RecursiveDirectoryIterator($target_directory);
$iterator = new RecursiveIteratorIterator($recursive_directory_iter);

$decontaminated_files_count = 0;

foreach ($iterator as $filename => $cur)
{
    $path_info = pathinfo($filename);

    // only .php files are checked
    if (!isset($path_info['extension'])) {
        continue;
    }

    if ($path_info['extension'] !== 'php') {
        continue;
    }

    echo "Checking: '".$filename."'".PHP_EOL;

    $contents = file_get_contents($filename);

    $virus_begin_pos = strpos($contents, $virus_begin_signature_str);

    if ($virus_begin_pos !== false && $filename !== __FILE__) {
        echo $virus_name." found in '".$filename."'".PHP_EOL;

        $virus_end_pos = strpos($contents, $virus_end_signature_str);
        if ($virus_end_pos === false) {
            // end signature not found: skip the file instead of truncating it
            echo $virus_name." end signature not found in '".$filename."', file skipped".PHP_EOL;
            continue;
        }
        $virus_end_pos += strlen($virus_end_signature_str);

        $before_virus_content = substr($contents, 0, $virus_begin_pos);
        $after_virus_content = substr($contents, $virus_end_pos);

        echo $virus_name." content between start pos ".$virus_begin_pos." and end pos ".$virus_end_pos." deleted".PHP_EOL;

        $contents = $before_virus_content.$after_virus_content;

        file_put_contents($filename, $contents);

        $decontaminated_files_count++;
    }
}

echo $decontaminated_files_count." files were infected and decontaminated in the directory '".$target_directory."'".PHP_EOL;
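
To run it, save the script on the server (the file name below is just an example) and launch it from the command line, which avoids web server timeouts on large directories:

php remove_malicious_code.php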